Comparison of different computational intelligence classifiers to autonomously detect cardiac pathologies diagnosed by ECG

The electrocardiogram (ECG) is a noninvasive technique that reflects underlying heart conditions by measuring the electrical activity of the heart. Nowadays, with just a few leads (even only two), it is possible to obtain enough information for an expert to recognize abnormal heart rhythms (a heart rate that is very fast, very slow, or irregular) or a heart attack (myocardial infarction), and to tell whether the infarction was recent or occurred some time ago. In this paper, several intelligent classifiers are developed for diagnosing cardiac diseases from the ECG, or more precisely, for differentiating several arrhythmias using a large data set. We study and reproduce the ECG processing methodologies and the features extracted from the electrocardiograms by the researchers who obtained the best results in the PhysioNet Challenge (www.physionet.org/). We extract a large number of features: partly those used by these researchers, plus additional ones we consider important for the distinction mentioned above. A new method based on different paradigms of computational intelligence (such as extreme learning machines, support vector machines, decision trees, genetic algorithms, and feature selection) is used to select the most relevant features and to obtain a classifier capable of autonomously distinguishing the different types of arrhythmia from the ECG signal.
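The feature-selection-plus-classifier pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' actual method: the feature matrix here is synthetic stand-in data (the real work extracts features such as RR intervals and QRS morphology from ECG recordings), and the selector/classifier choices (mutual-information ranking, an RBF-kernel SVM) are just one of the paradigm combinations the abstract names.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for an ECG feature matrix: 300 beats x 40 candidate
# features (in the real setting: RR intervals, QRS width, wavelet
# coefficients, ...), with 3 hypothetical arrhythmia classes.
X = rng.normal(size=(300, 40))
y = rng.integers(0, 3, size=300)
# Make the first two features carry class information so the
# selection step has something meaningful to find.
X[:, 0] += y
X[:, 1] -= y

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

pipe = Pipeline([
    # Keep the 10 features most relevant to the class label,
    # ranked by mutual information.
    ("select", SelectKBest(mutual_info_classif, k=10)),
    # RBF-kernel support vector machine as the final classifier.
    ("clf", SVC(kernel="rbf")),
])
pipe.fit(X_tr, y_tr)
acc = pipe.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

Any of the other paradigms mentioned (extreme learning machine, decision tree) could be swapped in as the `clf` step without changing the selection stage.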
