An early characterisation of wearing variability on motion signals for wearables

We explore a new source of variability observed in motion signals acquired from modern wearables: wearing variability, i.e., the variation in device orientation and placement across wearing events. We collect accelerometer data from a smartwatch and an earbud and analyse how the motion signals change due to wearing variability. Our analysis shows that wearing variability can introduce unexpected changes in motion signals, not only across different users but also across different wearing sessions of the same user. We also provide empirical ranges of the changes in device orientation across wearing sessions.
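To make the effect concrete, the following minimal sketch (not the paper's analysis pipeline) simulates how a change in device orientation between two wearing sessions re-maps the same physical motion onto different accelerometer axes; the sampling rate, the synthetic arm-swing signal, and the 20/10-degree wearing offset are illustrative assumptions.

# Illustrative sketch, assuming a smartwatch-like accelerometer:
# the same motion recorded under two hypothetical wearing orientations.
import numpy as np
from scipy.spatial.transform import Rotation as R

fs = 50                                    # assumed sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)

# Synthetic motion in the device frame of session A: gravity on z plus a
# small oscillation on x (e.g. an arm swing picked up at the wrist).
acc_session_a = np.column_stack([
    0.5 * np.sin(2 * np.pi * 1.5 * t),     # x (m/s^2)
    np.zeros_like(t),                      # y
    9.81 * np.ones_like(t),                # z (gravity)
])

# Hypothetical wearing offset for session B: the strap sits rotated by
# 20 degrees about one axis and tilted by 10 degrees about another.
wearing_offset = R.from_euler("xyz", [20, 10, 0], degrees=True)
acc_session_b = wearing_offset.apply(acc_session_a)

# The per-axis signals differ even though the underlying motion is identical,
# which is the kind of change wearing variability introduces.
per_axis_shift = np.mean(np.abs(acc_session_b - acc_session_a), axis=0)
print("mean per-axis change (m/s^2):", np.round(per_axis_shift, 2))

Even this small, fixed orientation offset shifts signal energy between axes, which is why the empirical orientation ranges reported in the paper matter for any per-axis feature or model trained on wearable motion data.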
