Avoid Touching Your Face: A Hand-to-face 3D Motion Dataset (COVID-away) and Trained Models for Smartwatches

The World Health Organisation (WHO) advises that people should avoid touching their eyes, nose and mouth, as this is an effective way to slow the spread of viral diseases. This advice has become even more prominent with the coronavirus disease (COVID-19) outbreak, which resulted in a global pandemic. However, humans touch their face (eyes, nose and mouth) 10-20 times an hour on average [22] [12], and such contact is often a primary route [15] of infection by a variety of viruses, including seasonal influenza, coronaviruses, swine flu and the Ebola virus. Touching our face all day long is a quirk of human nature [13], and it is extremely difficult to train people out of the habit. However, wearable devices can continuously monitor our movements and trigger a timely event that reminds people to avoid touching their face. In this work, we collected a hand-to-face multi-sensor 3D motion dataset, which we name the COVID-away dataset. Using this dataset, we trained models that continuously monitor human arm/hand movement via a wearable device and trigger a timely notification (e.g. a vibration) to warn the device's users when their hands move (unintentionally) towards their face. Our trained COVID-away models can be easily integrated into an app for smartwatches or fitness bands. Our evaluation shows that the Minimum Covariance Determinant (MCD) model produces the highest F1-score (0.93) using just the smartwatch's accelerometer data (39 features). Both the dataset and the trained models are openly available on the Web at https://github.com/bharathsudharsan/COVID-away.
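As a rough illustration of how such an MCD-based detector could work on windowed accelerometer data, the sketch below fits scikit-learn's EllipticEnvelope (an MCD-based outlier estimator) on features extracted from hand-to-face movement windows and flags face-touch-like motion at inference time. This is a minimal sketch under our own assumptions: the 16 statistical features, the 2 s / 50 Hz window, the placeholder training data and the trigger_vibration() helper are illustrative, not the paper's exact 39-feature pipeline or smartwatch API.

# Minimal sketch of an MCD-based hand-to-face gesture detector.
# Assumptions (not from the paper): scikit-learn's EllipticEnvelope as the
# MCD estimator, simple per-axis statistical features, and 2 s windows of
# accelerometer data sampled at 50 Hz (100 samples per window).
import numpy as np
from sklearn.covariance import EllipticEnvelope

def extract_features(window):
    """window: (n_samples, 3) array of accelerometer x/y/z readings."""
    mag = np.linalg.norm(window, axis=1)          # acceleration magnitude
    feats = []
    for signal in (window[:, 0], window[:, 1], window[:, 2], mag):
        feats += [signal.mean(), signal.std(), signal.min(), signal.max()]
    return np.array(feats)                         # 16 illustrative features

# Fit on hand-to-face movement windows only (one-class training), so an
# "inlier" at inference time means the motion resembles a face touch.
train_windows = [np.random.randn(100, 3) for _ in range(200)]  # placeholder data
X_train = np.vstack([extract_features(w) for w in train_windows])
detector = EllipticEnvelope(contamination=0.01).fit(X_train)

def trigger_vibration():
    # Hypothetical stand-in for a smartwatch haptic/notification call.
    print("Warning: hand moving towards face!")

def on_new_window(window):
    x = extract_features(window).reshape(1, -1)
    if detector.predict(x)[0] == 1:               # +1 = inlier (face-touch-like)
        trigger_vibration()

Because the estimator is fitted only on hand-to-face movement windows, a predicted inlier indicates that the incoming motion resembles a face touch, which is what triggers the warning notification.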

[1] S. J. Henly et al., "Understanding adherence to hand hygiene recommendations: the theory of planned behavior," American Journal of Infection Control, 2001.

[2] Mark Nicas et al., "A Study Quantifying the Hand-to-Face Contact Rate and Its Potential Application to Predicting Respiratory Tract Infection," Journal of Occupational and Environmental Hygiene, 2008.

[3] Martin Grunwald et al., "EEG changes caused by spontaneous facial self-touch may represent emotion regulating processes and working memory maintenance," Brain Research, 2014.

[4] Mary-Louise McLaws et al., "Face touching: A frequent habit that has implications for hand hygiene," American Journal of Infection Control, 2015.

[5] Nasser Kehtarnavaz et al., "UTD-MHAD: A multimodal dataset for human action recognition utilizing a depth camera and a wearable inertial sensor," 2015 IEEE International Conference on Image Processing (ICIP), 2015.

[6] Frederick Aardema et al., "The impact of emotions on body-focused repetitive behaviors: evidence from a non-treatment-seeking sample," Journal of Behavior Therapy and Experimental Psychiatry, 2015.

[7] Ani Nahapetian et al., "AirDraw: Leveraging smart watch motion sensors for mobile human computer interactions," 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC), 2016.

[8] He Wang et al., "I am a Smartwatch and I can Track my User's Arm," MobiSys, 2016.

[9] Nicholas D. Lane et al., "From smart to deep: Robust activity recognition on smartwatches using deep learning," 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), 2016.

[10] Takuya Maekawa et al., "Tree-structured classifier for acceleration-based activity and gesture recognition on smartwatches," 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), 2016.

[11] Anind K. Dey et al., "Serendipity: Finger Gesture Recognition using an Off-the-Shelf Smartwatch," CHI, 2016.

[12] Sanjay Ghosh et al., "Evaluation of Microgesture Recognition Using a Smartwatch," 2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA), 2017.

[13] Gert R. G. Lanckriet et al., "Recognizing Detailed Human Context in the Wild from Smartphones and Smartwatches," IEEE Pervasive Computing, 2016.

[14] Mateus M. Luna et al., "Wrist Player: A Smartwatch Gesture Controller for Smart TVs," 2017 IEEE 41st Annual Computer Software and Applications Conference (COMPSAC), 2017.

[15] Panlong Yang et al., "Control with Gestures: A Hand Gesture Recognition System Using Off-the-Shelf Smartwatch," 2018 4th International Conference on Big Data Computing and Communications (BIGCOM), 2018.

[16] Xinjun Sheng et al., "Feasibility of Wrist-Worn, Real-Time Hand, and Surface Gesture Recognition via sEMG and IMU Sensing," IEEE Transactions on Industrial Informatics, 2018.

[17] Hasan Ogul et al., "HANDY: A Benchmark Dataset for Context-Awareness via Wrist-Worn Motion Sensors," Data, 2018.

[18] Chen Chen et al., "Deep Fisher discriminant learning for mobile hand gesture recognition," Pattern Recognition, 2017.

[19] Muhammad Intizar Ali et al., "Edge2Train: a framework to train machine learning models (SVMs) on resource-constrained IoT edge devices," IoT, 2020.

[20] Muhammad Intizar Ali et al., "RCE-NN: a five-stage pipeline to execute neural networks (CNNs) on resource-constrained IoT edge devices," IoT, 2020.

[21] Yongyan Wang et al., "Diagnosis, treatment, and prevention of 2019 novel coronavirus infection in children: experts' consensus statement," World Journal of Pediatrics, 2020.