An Easily Customized Gesture Recognizer for Assisted Living Using Commodity Mobile Devices

Automatic gesture recognition is an important field in the area of human-computer interaction. Until recently, gesture recognition relied mainly on real-time video processing. This work proposes the use of commodity smartwatches for this purpose. Smartwatches embed accelerometer sensors and are endowed with wireless communication capabilities (primarily Bluetooth), so they can connect to mobile phones on which gesture recognition algorithms can be executed. The approach proposed in this paper takes the readings of the smartwatch accelerometer as input and processes them on the mobile phone. As a case study, a gesture recognition application was developed for Android devices and the Pebble smartwatch. The application allows the user to define a set of gestures and to train the system to recognize them. Three alternative methodologies were implemented and evaluated on a set of six natural 3-D gestures. All three yielded satisfactory results, with the method based on SAX (Symbolic Aggregate approXimation) proving the most efficient.
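The SAX-based method condenses each axis of an accelerometer time series into a short symbol string, so that gestures can be compared as words rather than raw signals. The following is a minimal, illustrative Python sketch of the standard SAX pipeline (z-normalization, piecewise aggregate approximation, discretization via Gaussian breakpoints); it is not the paper's actual implementation, and the alphabet size and word length shown are arbitrary choices for illustration.

```python
import statistics

# Breakpoints that split the standard normal N(0,1) into four
# equiprobable regions (alphabet size 4), per the SAX lookup table.
BREAKPOINTS = [-0.6745, 0.0, 0.6745]
ALPHABET = "abcd"

def sax(series, word_length):
    """Convert a 1-D sensor series to a SAX word of `word_length` symbols."""
    # 1. z-normalize so amplitude and offset differences between
    #    users (or sensor axes) do not affect the symbols.
    mean = statistics.fmean(series)
    std = statistics.pstdev(series) or 1.0  # guard against constant input
    z = [(x - mean) / std for x in series]

    # 2. Piecewise Aggregate Approximation (PAA): average the
    #    normalized series over `word_length` equal-width frames.
    n = len(z)
    paa = []
    for i in range(word_length):
        frame = z[i * n // word_length:(i + 1) * n // word_length]
        paa.append(sum(frame) / len(frame))

    # 3. Map each PAA value to a symbol: count how many breakpoints
    #    it exceeds, and index into the alphabet.
    return "".join(ALPHABET[sum(1 for b in BREAKPOINTS if v > b)]
                   for v in paa)

# A steadily rising signal maps to monotonically increasing symbols.
print(sax(list(range(16)), 4))  # → abcd
```

In a recognizer built on this idea, each recorded gesture template would be stored as one SAX word per accelerometer axis, and an incoming gesture would be classified by string distance (e.g. the SAX MINDIST measure or plain Hamming distance) to the stored templates.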
