iHand: an interactive bare-hand-based augmented reality interface on commercial mobile phones

Abstract. The performance of mobile phones has improved rapidly, and they are emerging as a powerful computing platform. In many vision-based applications, human hands play a key role in natural interaction. However, relatively little attention has been paid to the interaction between human hands and the mobile phone. We therefore propose a vision- and hand-gesture-based interface in which the user holds the mobile phone in one hand while viewing the other hand's palm through the built-in camera. Virtual contents are faithfully rendered on the user's palm through palm pose estimation, and interaction via hand and finger movements is enabled through hand shape recognition. Since the proposed interface is based on hand gestures familiar to humans and requires no additional sensors or markers, the user can freely interact with virtual contents anytime and anywhere without any training. We demonstrate that the proposed interface runs at over 15 fps on a commercial mobile phone with a 1.2-GHz dual-core processor and 1 GB of RAM.
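Although the abstract does not detail the recognition pipeline, bare-hand interfaces of this kind typically begin by segmenting skin-colored pixels before estimating palm pose or hand shape. A minimal sketch, assuming a fixed chrominance-box classifier in YCbCr space (the function names and thresholds are illustrative assumptions, not the authors' actual method):

```python
# Illustrative sketch only: a fixed-threshold YCbCr skin classifier, a common
# first step in bare-hand detection pipelines. The chrominance box below is an
# assumption for illustration, not the paper's model.

def rgb_to_ycbcr(r, g, b):
    """Convert an 8-bit RGB pixel to YCbCr (ITU-R BT.601 conversion)."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """Classify a pixel as skin if its chrominance falls inside a fixed box."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77.0 <= cb <= 127.0 and 133.0 <= cr <= 173.0

def skin_mask(image):
    """image: rows of (r, g, b) tuples -> rows of booleans (the skin mask)."""
    return [[is_skin(r, g, b) for (r, g, b) in row] for row in image]
```

A fixed chrominance box is cheap enough for mobile frame rates, but it is fragile under varying illumination; production systems usually replace it with a learned statistical color model or adapt the thresholds online.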
