On the Use of Sensor Fusion to Reduce the Impact of Rotational and Additive Noise in Human Activity Recognition

The main objective of fusion mechanisms is to increase the reliability of individual systems by exploiting collective knowledge. Moreover, fusion models are also intended to guarantee a certain level of robustness. This is particularly important for problems such as human activity recognition, where runtime changes in the sensor setup seriously disturb the reliability of the initially deployed systems. For commonly used recognition systems based on inertial sensors, these changes primarily take the form of sensor rotations, displacements, or faults related to batteries or calibration. In this work we show the robustness capabilities of a sensor-weighted fusion model when dealing with such disturbances under different circumstances. With the proposed method, improvements of up to 60% are obtained when a minority of the sensors are artificially rotated or degraded, independently of the level of disturbance (noise) imposed. These robustness capabilities also hold when any number of sensors is affected by a low to moderate noise level. The presented fusion mechanism compensates for the poor performance that would otherwise be obtained when only a single sensor is considered.
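The abstract does not detail the fusion scheme, but the idea can be illustrated with a minimal sketch, assuming a simple decision-level setup in which each sensor's classifier yields class scores that are combined with per-sensor reliability weights (e.g., validation accuracies), and in which rotational and additive noise are simulated as a rotation matrix plus Gaussian noise. All function names and parameter values below are hypothetical and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rotate_and_degrade(signal, angle_deg=30.0, noise_std=0.1):
    """Apply a rotation about the z-axis plus additive Gaussian noise
    to a (T, 3) tri-axial accelerometer signal (hypothetical disturbance model)."""
    theta = np.deg2rad(angle_deg)
    rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0],
                      [0.0,            0.0,           1.0]])
    return signal @ rot_z.T + rng.normal(0.0, noise_std, signal.shape)

def weighted_fusion(class_scores, sensor_weights):
    """Fuse per-sensor class-score vectors using per-sensor reliability weights.

    class_scores: (n_sensors, n_classes) array of posterior-like scores.
    sensor_weights: (n_sensors,) reliability estimates, e.g. validation accuracies.
    Returns the index of the winning class.
    """
    w = np.asarray(sensor_weights, dtype=float)
    w = w / w.sum()                       # normalise weights
    fused = w @ np.asarray(class_scores)  # weighted sum of scores across sensors
    return int(np.argmax(fused))

# Toy usage: three sensors voting over four activity classes.
scores = np.array([[0.70, 0.10, 0.10, 0.10],   # reliable sensor
                   [0.20, 0.50, 0.20, 0.10],   # reliable sensor
                   [0.05, 0.05, 0.05, 0.85]])  # rotated/degraded sensor
weights = [0.9, 0.8, 0.3]                      # hypothetical reliability estimates
print(weighted_fusion(scores, weights))        # -> 0; the degraded sensor is down-weighted
```

In this toy example the low weight assigned to the disturbed sensor prevents its misleading scores from dominating the fused decision, which is the behaviour the abstract attributes to the sensor-weighted fusion model.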
