An Ensemble of Deep Auto-Encoders for Healthcare Monitoring

Ambient Intelligence (AmI) is a new paradigm that redefines the interaction between humans, sensors, and the flow of data and information. In AmI, the environment is more sensitive and responsive to the user, who acts spontaneously in the foreground, while sensors, machines, and intelligent methods act in the background. Behind the AmI interfaces, a huge volume of data is collected and analysed to make decisions in real time. In the field of health care, AmI solutions help protect patients from emergency situations by using data mining techniques. This paper assesses the performance of deep learning against traditional dimensionality reduction and classification algorithms for healthcare monitoring in a hospital environment. An ensemble method is proposed that combines three classifiers on a reduced version of the dataset, where the dimensionality reduction technique is based on deep learning. Evaluation with several performance metrics, including accuracy, precision, recall, and F-measure, demonstrates the reliable performance of the proposed ensemble method.
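The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the abstract does not name the three classifiers or the auto-encoder architecture, so the choices below (a single-hidden-layer auto-encoder trained with full-batch gradient descent, and a hard-voting ensemble of k-NN, a decision tree, and logistic regression on synthetic data) are assumptions for demonstration only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for the healthcare monitoring dataset (assumption).
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# --- Minimal single-hidden-layer auto-encoder (numpy, full-batch GD) ---
n_in, n_hid = X.shape[1], 5          # compress 20 features to 5
W1 = rng.normal(0.0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.1, (n_hid, n_in)); b2 = np.zeros(n_in)
lr = 0.01
for _ in range(500):
    H = np.tanh(X_train @ W1 + b1)   # encoder: nonlinear low-dim code
    X_hat = H @ W2 + b2              # linear decoder: reconstruction
    err = X_hat - X_train            # reconstruction residual
    # Backpropagate the mean squared reconstruction error
    gW2 = H.T @ err / len(X_train);  gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H**2)
    gW1 = X_train.T @ dH / len(X_train); gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def encode(X_):
    """Project data onto the learned low-dimensional representation."""
    return np.tanh(X_ @ W1 + b1)

# --- Majority-vote ensemble of three classifiers on encoded features ---
ensemble = VotingClassifier(
    estimators=[("knn", KNeighborsClassifier()),
                ("tree", DecisionTreeClassifier(random_state=0)),
                ("logreg", LogisticRegression(max_iter=1000))],
    voting="hard")
ensemble.fit(encode(X_train), y_train)
acc = accuracy_score(y_test, ensemble.predict(encode(X_test)))
print(f"ensemble accuracy on encoded features: {acc:.2f}")
```

In practice the hand-rolled auto-encoder would be replaced by a deep (possibly denoising) auto-encoder from a deep learning framework, and the encoded features evaluated with precision, recall, and F-measure in addition to accuracy, as the paper does.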
