Touchless interaction for future mobile applications

We present a lightweight, real-time 3D gesture recognition system for mobile devices aimed at improved human-machine interaction. We use time-of-flight data from a single sensor and implement the complete gesture recognition pipeline on two different devices, demonstrating the potential of integrating such sensors into mobile devices. The main components crop the data to the essential region, compute meaningful features, train and classify with neural networks, and provide a GUI on the device. With our system we achieve recognition rates of up to 98% on a 10-gesture set at frame rates of up to 20 Hz, more than sufficient for real-time applications.
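The pipeline stages named in the abstract (cropping, feature calculation, neural-network classification) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the depth threshold, the centroid/spread features, and the tiny MLP with random weights are all placeholder assumptions standing in for the paper's actual cropping rules, feature set, and trained network.

```python
import math
import random

def crop_to_hand(frame, depth_threshold=0.6):
    # Keep only points closer than the threshold (assumed to be the hand
    # region in front of the time-of-flight sensor; threshold is illustrative).
    return [(x, y, z) for (x, y, z) in frame if z < depth_threshold]

def extract_features(points):
    # Centroid plus radial spread: simple stand-ins for the paper's
    # (unspecified here) "meaningful features".
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    spread = math.sqrt(
        sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2
            for p in points) / n
    )
    return [cx, cy, cz, spread]

class TinyMLP:
    """One-hidden-layer classifier; weights are random here, whereas the
    paper's network would be trained on labeled gesture data."""

    def __init__(self, n_in, n_hidden, n_classes, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.w2 = [[rng.uniform(-1, 1) for _ in range(n_hidden)]
                   for _ in range(n_classes)]

    def predict(self, feats):
        # ReLU hidden layer, linear output, argmax over the 10 gesture classes.
        hidden = [max(0.0, sum(w * f for w, f in zip(row, feats)))
                  for row in self.w1]
        scores = [sum(w * h for w, h in zip(row, hidden)) for row in self.w2]
        return scores.index(max(scores))
```

Per frame, the device would run `extract_features(crop_to_hand(frame))` and pass the result to the classifier; keeping each stage this cheap is what makes the reported ~20 Hz frame rate plausible on mobile hardware.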

[1]  Uwe Handmann,et al.  A pragmatic approach to multi-class classification , 2015, 2015 International Joint Conference on Neural Networks (IJCNN).

[2]  Huidong Bai,et al.  Natural Gesture Based Interaction for Handheld Augmented Reality , 2013 .

[3]  Lixin Fan,et al.  Finger Tracking for Gestural Interaction in Mobile Devices , 2013, SCIA.

[4]  Joe Marshall,et al.  Experiments in 3D interaction for mobile phone AR , 2007, GRAPHITE '07.

[5]  Heng Tao Shen,et al.  Principal Component Analysis , 2009, Encyclopedia of Biometrics.

[6]  Nico Blodow,et al.  Aligning point cloud views using persistent feature histograms , 2008, 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems.