Optimised Convolutional Neural Networks for Heart Rate Estimation and Human Activity Recognition in Wrist Worn Sensing Applications

Wrist-worn smart devices are providing increased insights into human health, behaviour and performance through sophisticated analytics. However, battery life, device cost and sensor performance in the face of movement-related artefacts present challenges which must be further addressed to enable effective applications and wider adoption through commoditisation of the technology. We address these challenges by demonstrating that a single, simple optical measurement, photoplethysmography (PPG), conventionally used for heart rate detection in wrist-worn sensors, can provide improved heart rate estimation and human activity recognition (HAR) simultaneously at low sample rates, without an inertial measurement unit. This simplifies hardware design and reduces costs and power budgets. We apply two deep learning pipelines, one for human activity recognition and one for heart rate estimation. HAR is achieved through a visual classification approach capable of robust performance at low sample rates: transfer learning is used to retrain a convolutional neural network (CNN) to distinguish characteristics of the PPG signal during different human activities. For heart rate estimation we use a CNN adapted for regression, which maps noisy optical signals to heart rate estimates. In both cases, comparisons are made with leading conventional approaches. Our results demonstrate that low sampling frequencies can achieve good performance without significant degradation of accuracy: sampling at 5 Hz and 10 Hz yielded HAR classification accuracies of 80.2% and 83.0%, respectively. These same sampling frequencies also yielded robust heart rate estimates comparable with those achieved at the more energy-intensive rate of 256 Hz.
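
To make the heart rate pipeline concrete, the following is a minimal sketch (not the authors' implementation) of a 1-D CNN adapted for regression, mapping a short window of low-sample-rate PPG to a single beats-per-minute estimate. The window length (8 seconds at 10 Hz), layer sizes and the choice of PyTorch are illustrative assumptions only.

import torch
import torch.nn as nn

class PPGHeartRateCNN(nn.Module):
    """Illustrative 1-D CNN regressor: PPG window -> heart rate (BPM)."""
    def __init__(self, window_len: int = 80):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2),   # local pulse-shape features
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (window_len // 4), 64),
            nn.ReLU(),
            nn.Linear(64, 1),                              # single heart rate output (BPM)
        )

    def forward(self, x):                                  # x: (batch, 1, window_len)
        return self.regressor(self.features(x))

model = PPGHeartRateCNN()
dummy = torch.randn(4, 1, 80)          # four 8-second PPG windows sampled at 10 Hz
print(model(dummy).shape)              # torch.Size([4, 1])

For the HAR pipeline, the fragment below sketches the transfer-learning step: a pretrained image CNN is retrained to classify visual renderings of PPG windows into activity classes. The ResNet-18 backbone from torchvision and the number of activity classes are assumptions for illustration; the backbone and class set used in the paper may differ.

import torch.nn as nn
from torchvision import models

NUM_ACTIVITIES = 5                     # illustrative; depends on the dataset used

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():    # freeze the pretrained feature extractor
    param.requires_grad = False
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_ACTIVITIES)  # new trainable head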
