Multiwindow Fusion for Wearable Activity Recognition

Human activity recognition has been extensively investigated over the last decades. Typically, wearable sensors are used to register body motion signals, which are then analyzed through a sequence of signal processing and machine learning steps to recognize the activity performed by the user. One of the most important steps is signal segmentation, which is mainly performed through windowing approaches. In fact, the choice of window size has been shown to directly condition the performance of the recognition system. Thus, rather than restricting the analysis to a single window configuration, this work proposes the use of multiple recognition systems operating on multiple window sizes. The suggested model employs a weighted decision fusion mechanism to fairly exploit the potential of each recognition system according to the target activity set. This novel technique is benchmarked on a well-known activity recognition dataset, and the results show a significant performance improvement over conventional systems operating on a single window size.
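
To make the fusion idea concrete, the following Python sketch illustrates one plausible realization of weighted decision fusion across several window sizes. The base classifier (k-NN), the mean/std window features, and the use of per-class validation F1-scores as fusion weights are illustrative assumptions, not details taken from the paper; the original model may differ in all of these choices.

```python
# Minimal sketch of multiwindow weighted decision fusion (illustrative only).
# Assumptions: k-NN base recognizers, mean/std window features, and per-class
# validation F1-scores used as fusion weights. These are NOT taken from the paper.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import f1_score

def segment(signal, labels, window, step):
    """Slice a (T, channels) signal into fixed-size windows with mean/std features."""
    X, y = [], []
    for start in range(0, len(signal) - window + 1, step):
        seg = signal[start:start + window]
        X.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0)]))
        y.append(labels[start + window - 1])          # label at window end
    return np.array(X), np.array(y)

# Synthetic 3-axis accelerometer stream with 4 activity classes (toy data).
rng = np.random.default_rng(0)
T, n_classes = 6000, 4
labels = np.repeat(np.arange(T // 500) % n_classes, 500)
signal = rng.normal(labels[:, None], 1.0, size=(T, 3))

window_sizes = [50, 100, 200]                         # samples per window
classifiers, weights = {}, {}

# Train one recognizer per window size; weight each class by its validation F1.
for w in window_sizes:
    X, y = segment(signal[:4000], labels[:4000], w, w // 2)
    Xv, yv = segment(signal[4000:5000], labels[4000:5000], w, w // 2)
    clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
    classifiers[w] = clf
    weights[w] = f1_score(yv, clf.predict(Xv), average=None,
                          labels=list(range(n_classes)))

def fuse(test_signal, test_labels):
    """Weighted sum of class-probability vectors across all window sizes."""
    score = np.zeros(n_classes)
    for w, clf in classifiers.items():
        Xw, _ = segment(test_signal, test_labels, w, w // 2)
        proba = clf.predict_proba(Xw).mean(axis=0)    # average over windows
        score += weights[w] * proba                   # per-class weighting
    return int(np.argmax(score))

print("fused prediction:", fuse(signal[5000:5500], labels[5000:5500]))
```

In this sketch each window-specific recognizer votes with its class-probability vector, and the fusion stage scales those votes by how well that window size discriminates each class on held-out data, so no single window configuration dominates the final decision.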
