A natural click interface for AR systems with a single camera

Clicking on a virtual object is the most fundamental and important interaction in augmented reality (AR). However, existing AR systems do not support natural click interfaces, because AR is typically realized with head-mounted displays equipped with only a single camera, and recognizing an arbitrary gesture without accurate depth information is difficult. To ease detection, some systems force users to make unintuitive gestures, such as pinching with the thumb and forefinger. This paper presents a new natural click interface for AR systems. Through a study investigating how users intuitively click virtual objects in AR, we found that the speed and acceleration of the fingertip provide cues for detecting click gestures. Based on these findings, we developed a technique that recognizes natural click gestures with a single camera by focusing on temporal differentials between adjacent frames. We further validated the effectiveness of the recognition algorithm and the usability of the new interface through experiments.
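The core idea, fingertip speed and acceleration computed as temporal differentials between adjacent frames, can be sketched as finite differences over a tracked fingertip trajectory. The thresholds and the decision rule below (a fast forward motion followed by a sharp deceleration, as when the finger "stops" on a virtual surface) are illustrative assumptions, not the authors' published algorithm.

```python
import math

def fingertip_kinematics(positions, dt):
    """First and second finite differences of 2D fingertip positions (pixels)."""
    speeds = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)  # px/s
    accels = [(s1 - s0) / dt for s0, s1 in zip(speeds, speeds[1:])]  # px/s^2
    return speeds, accels

def detect_click(positions, dt=1 / 30.0, speed_min=200.0, decel_min=3000.0):
    """Flag a click when fast motion is followed by a sharp deceleration.

    speed_min and decel_min are hypothetical thresholds chosen for
    illustration; a real system would calibrate them per user and camera.
    """
    speeds, accels = fingertip_kinematics(positions, dt)
    for i, a in enumerate(accels):
        if speeds[i] >= speed_min and a <= -decel_min:
            return True
    return False
```

For example, a fingertip moving 10 px per frame at 30 fps and then halting abruptly produces a deceleration spike that the rule above flags, while a slow drift does not.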
