WeARHand: Head-worn, RGB-D camera-based, bare-hand user interface with visually enhanced depth perception

We introduce WeARHand, which allows a user to manipulate virtual 3D objects with a bare hand in a wearable augmented reality (AR) environment. Our method requires no environmentally tethered tracking devices: it localizes both a pair of near-range and far-range RGB-D cameras mounted on a head-worn display and the moving bare hand in 3D space by exploiting the depth input data. Depth perception is enhanced through egocentric visual feedback, including a semi-transparent proxy hand. We implement a virtual hand interaction technique and feedback approaches, and evaluate their performance and usability. The proposed method can be applied to many 3D interaction scenarios that use the hands in a wearable AR environment, such as AR information browsing, maintenance, design, and games.
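The abstract describes localizing the bare hand in 3D from egocentric depth input. As a rough illustration only (not the paper's actual pipeline), the first step of such a system is often a depth-band segmentation: in a head-worn view, the user's hand is typically the closest surface, so pixels within an assumed arm's-reach depth band can be taken as hand candidates. The function below is a minimal sketch of that idea; the band limits `near` and `far` are hypothetical values, not parameters from the paper.

```python
import numpy as np

def segment_hand(depth_mm, near=150, far=600):
    """Sketch of depth-band hand segmentation for an egocentric camera.

    depth_mm: 2-D array of depth readings in millimetres (0 = no reading).
    Returns a boolean hand mask and a rough (u, v, depth) hand position,
    or (mask, None) when no pixel falls inside the [near, far] band.
    """
    # Keep only pixels within the assumed arm's-reach depth band.
    mask = (depth_mm >= near) & (depth_mm <= far)
    if not mask.any():
        return mask, None
    ys, xs = np.nonzero(mask)
    z = float(depth_mm[ys, xs].mean())          # average depth of hand pixels
    centroid = (float(xs.mean()), float(ys.mean()), z)  # image-space position
    return mask, centroid
```

A real system would follow this with connected-component filtering and hand-model fitting; the sketch only recovers a coarse hand position suitable for seeding a tracker.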
