Gesture recognition method based on deep learning

With the rapid development of science and technology, human-computer interaction appears more and more frequently in daily life. Human motion analysis and recognition based on attitude sensors is an emerging field that overcomes many of the shortcomings and limitations of video-based motion recognition and is more practical. In this paper, we propose a new method for temporal gesture recognition. By analyzing the kinematics of gestures, gesture features are extracted and classified using Recurrent Neural Networks and their variant networks. The method achieved an accuracy of over 98% across 16 experimenters. The results show that the algorithm can identify gestures quickly and accurately.
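To illustrate the classification stage described above, the following is a minimal sketch of a vanilla recurrent network running over a sequence of 3-axis sensor frames and producing softmax scores over gesture classes. This is a hypothetical illustration with toy random weights, not the paper's exact architecture (the paper uses RNN variants, and real weights would come from training):

```python
import math
import random

def rnn_gesture_scores(sequence, Wx, Wh, Wo, hidden_size):
    """Vanilla RNN forward pass over a gesture sequence.

    sequence: list of sensor frames (each a list of floats, e.g. 3-axis
              accelerometer readings). Returns softmax class probabilities.
    Hypothetical sketch; the paper's method uses RNN variants (e.g. LSTM).
    """
    h = [0.0] * hidden_size
    for x in sequence:
        # h_t = tanh(Wx @ x_t + Wh @ h_{t-1})
        h = [math.tanh(sum(w * xi for w, xi in zip(Wx[j], x))
                       + sum(w * hi for w, hi in zip(Wh[j], h)))
             for j in range(hidden_size)]
    # Linear readout from the final hidden state, then softmax.
    logits = [sum(w * hi for w, hi in zip(row, h)) for row in Wo]
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy demo: 3-axis accelerometer frames, 4 hidden units, 2 gesture classes.
random.seed(0)
H, D, C = 4, 3, 2
Wx = [[random.uniform(-0.5, 0.5) for _ in range(D)] for _ in range(H)]
Wh = [[random.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(H)]
Wo = [[random.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(C)]
seq = [[0.1, -0.2, 9.8], [0.3, 0.0, 9.7], [0.5, 0.1, 9.6]]
probs = rnn_gesture_scores(seq, Wx, Wh, Wo, H)
```

In practice an LSTM or GRU cell would replace the tanh recurrence to better capture long gesture sequences, and the weights would be learned from labeled sensor recordings of each gesture.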
