Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality

Mobile augmented reality (AR) has been widely used in smart and mobile device-based applications such as entertainment, games, visual experiences, and information visualization. However, most mobile AR applications offer limited natural user interaction and do not fully support the direct manipulation of 3D AR objects. This paper proposes a new method for naturally and directly manipulating 3D AR objects through touch and hand gesture-based interactions on handheld devices. The touch gesture is used for AR object selection, and the natural hand gesture is used for direct and interactive manipulation of the selected objects. This hybrid interaction enables the user to interact with and manipulate AR objects more accurately in real 3D space rather than in 2D space. In particular, natural hand gestures are detected by a Leap Motion sensor attached to the front or the back of the mobile device, so the user can easily apply 3D transformations to AR objects, enhancing usability and usefulness. In this research, comprehensive comparative analyses were performed among the proposed approach, the widely used screen touch-based approach, and a vision-based approach in both quantitative and qualitative terms. Quantitative analysis measured task completion time and failure rate on given tasks such as 3D object matching and a grasp-hang-release operation, both of which require simultaneous 3D translation and 3D rotation. In addition, we compared gesture performance depending on whether the gesture sensor is located on the front or the back of the mobile device. Furthermore, to cover more complex operations, an assembly task, consisting of a sequence of steps that combine parts into a sub-assembly, was also evaluated. Qualitative analysis was performed through a post-experiment questionnaire examining factors such as ease of use and ease of natural interaction.
Both analyses showed that the proposed approach provides more natural and intuitive interaction with and manipulation of mobile AR objects. Several implementation results are also given to demonstrate the advantages and effectiveness of the proposed approach.
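The hybrid scheme described above can be illustrated with a minimal sketch: a 2D touch event selects an AR object, after which per-frame hand data (palm translation and wrist rotation deltas, as a Leap Motion-style sensor might report) drives simultaneous 3D translation and rotation of the selection. All class and method names here (`ARObject`, `HybridInteractionManager`, `on_touch`, `on_hand_frame`) are hypothetical and not taken from the paper; this is only a conceptual outline of the interaction flow, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class ARObject:
    # Hypothetical stand-in for a 3D AR object's transform state.
    name: str
    position: tuple = (0.0, 0.0, 0.0)   # x, y, z in scene units
    rotation: tuple = (0.0, 0.0, 0.0)   # Euler angles in degrees

class HybridInteractionManager:
    """Illustrative sketch (not the paper's code): touch selects,
    hand gestures directly manipulate the selected object in 3D."""

    def __init__(self, objects):
        self.objects = {o.name: o for o in objects}
        self.selected = None

    def on_touch(self, hit_name):
        # Touch gesture: assume the 2D touch was ray-cast against the
        # scene and resolved to an object name (or None on a miss).
        self.selected = self.objects.get(hit_name)
        return self.selected is not None

    def on_hand_frame(self, palm_delta, wrist_rotation_delta):
        # Hand gesture: apply simultaneous 3D translation and rotation,
        # as required by the matching and grasp-hang-release tasks.
        if self.selected is None:
            return
        px, py, pz = self.selected.position
        dx, dy, dz = palm_delta
        self.selected.position = (px + dx, py + dy, pz + dz)
        rx, ry, rz = self.selected.rotation
        ax, ay, az = wrist_rotation_delta
        self.selected.rotation = (rx + ax, ry + ay, rz + az)

mgr = HybridInteractionManager([ARObject("cube")])
mgr.on_touch("cube")                                  # select via touch
mgr.on_hand_frame((0.1, 0.0, -0.2), (0.0, 15.0, 0.0))  # manipulate via hand
print(mgr.objects["cube"].position, mgr.objects["cube"].rotation)
```

In a real system the per-frame deltas would come from the gesture sensor's tracking API and the selection from a screen-space ray cast, but the separation of concerns, touch for selection versus hand gesture for manipulation, is the point being sketched.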
