Detecting Mid-Air Gestures for Digit Writing With Radio Sensors and a CNN

In this paper, we classify digits written in mid-air with hand gestures. Impulse radio ultra-wideband (IR-UWB) radar sensors are used for data acquisition, with three radar sensors placed in a triangular geometry. Conventional radar-based gesture recognition methods feed whole raw data matrices, or a set of extracted features, to convolutional neural networks (CNNs) or other machine learning classifiers. However, these methods lose accuracy when the training and testing data differ in distance, orientation, hand shape, hand size, gesture speed, or the radar setup environment. To develop a more robust gesture recognition method, we propose classifying not the raw data but the hand's mid-air trajectory. The hand trajectory has a stereotypical shape for a given digit, regardless of the hand's orientation or speed, making its classification easy and robust. Our proposed method consists of three stages: signal preprocessing; hand motion localization and tracking; and transforming the trajectory data into an image that is classified by a CNN. The proposed method outperforms conventional approaches because it is robust to changes in orientation, distance, and hand shape and size. Moreover, it does not require building a huge training database of digits drawn by different users in different orientations; instead, we can use training databases already available in the image processing field. Overall, the proposed mid-air handwritten digit recognition system provides a user-friendly and accurate mid-air handwriting modality that places no restrictions on users.
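The localization and image-transformation stages described above can be illustrated with a minimal Python sketch. This is not the paper's implementation: it assumes 2-D trilateration from three range measurements (solved by linearizing the circle equations into a least-squares system) and a naive rasterization of the tracked trajectory into a 28×28 binary image of the kind a digit CNN could consume. The function names, the image size, and the sensor geometry are all hypothetical.

```python
import numpy as np

def trilaterate_2d(sensors, ranges):
    """Estimate a 2-D hand position from three known sensor positions
    and their measured ranges, by subtracting the first circle equation
    from the other two to obtain a linear system."""
    (x1, y1), (x2, y2), (x3, y3) = sensors
    r1, r2, r3 = ranges
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)

def rasterize_trajectory(points, size=28):
    """Map a sequence of (x, y) trajectory points onto a size-by-size
    binary image, normalizing the trajectory's bounding box so the
    result is invariant to where and how large the digit was drawn."""
    pts = np.asarray(points, dtype=float)
    mn, mx = pts.min(axis=0), pts.max(axis=0)
    span = np.maximum(mx - mn, 1e-9)          # avoid division by zero
    norm = (pts - mn) / span                  # scale into [0, 1] x [0, 1]
    img = np.zeros((size, size))
    cols = np.clip((norm[:, 0] * (size - 1)).round().astype(int), 0, size - 1)
    rows = np.clip(((1 - norm[:, 1]) * (size - 1)).round().astype(int), 0, size - 1)
    img[rows, cols] = 1.0                     # image row 0 is the top
    return img

# Example: a hand at (0.3, 0.4) m seen by sensors at the triangle corners.
sensors = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
ranges = [0.5, 0.65 ** 0.5, 0.45 ** 0.5]
position = trilaterate_2d(sensors, ranges)    # -> approximately (0.3, 0.4)
```

The bounding-box normalization in `rasterize_trajectory` is one simple way to realize the abstract's claim that the trajectory shape, not the hand's distance or position, drives classification.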
