A Comparison of Surface and Motion User-Defined Gestures for Mobile Augmented Reality

Advancements in Augmented Reality (AR) technologies and in the processing power of mobile devices have created a surge in the number of mobile AR applications. Nevertheless, many AR applications have adopted surface gestures as the default method for interacting with virtual content. In this paper, we investigate two gesture modalities, surface and motion, for operating mobile AR applications. To identify optimal gestures for various interactions, we conducted an elicitation study with 21 participants across 12 tasks, which yielded a total of 504 gestures. We classified and illustrated the two sets of gestures and compared them in terms of goodness, ease of use, and engagement. The elicitation process produced two distinct sets of user-defined gestures: legacy surface gestures, which participants found familiar and easy to use, and motion gestures, which were more engaging. From the interaction patterns of the motion gesture set, we propose a new interaction class, TMR (Touch-Move-Release), for mobile AR.
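
As a rough illustration of the proposed Touch-Move-Release pattern, the sketch below models a TMR gesture as a touch-down, followed by device motion while the finger stays on the screen, and committed on release. The event names, the Vector3 type, and the recognizer itself are hypothetical illustrations written for this summary, not code or APIs from the paper.

```kotlin
// Hypothetical sketch of a Touch-Move-Release (TMR) interaction:
// touch anchors the gesture, device motion drives it, release commits it.

data class Vector3(val x: Float, val y: Float, val z: Float)

sealed class InputEvent {
    object TouchDown : InputEvent()
    data class DeviceMoved(val delta: Vector3) : InputEvent()
    object TouchUp : InputEvent()
}

class TmrGestureRecognizer(
    private val onCommit: (Vector3) -> Unit // receives accumulated motion on release
) {
    private enum class State { IDLE, TOUCHING, MOVING }

    private var state = State.IDLE
    private var accumulated = Vector3(0f, 0f, 0f)

    fun handle(event: InputEvent) {
        when (event) {
            is InputEvent.TouchDown -> {
                // Touch phase: finger down starts a potential TMR gesture.
                state = State.TOUCHING
                accumulated = Vector3(0f, 0f, 0f)
            }
            is InputEvent.DeviceMoved -> {
                // Move phase: while touching, physical device motion is accumulated.
                if (state == State.TOUCHING || state == State.MOVING) {
                    state = State.MOVING
                    accumulated = Vector3(
                        accumulated.x + event.delta.x,
                        accumulated.y + event.delta.y,
                        accumulated.z + event.delta.z
                    )
                }
            }
            is InputEvent.TouchUp -> {
                // Release phase: lifting the finger commits the manipulation.
                if (state == State.MOVING) onCommit(accumulated)
                state = State.IDLE
            }
        }
    }
}

fun main() {
    val recognizer = TmrGestureRecognizer { motion ->
        println("TMR gesture committed with total motion $motion")
    }
    recognizer.handle(InputEvent.TouchDown)
    recognizer.handle(InputEvent.DeviceMoved(Vector3(0.10f, 0f, 0.20f)))
    recognizer.handle(InputEvent.DeviceMoved(Vector3(0.05f, 0f, 0.10f)))
    recognizer.handle(InputEvent.TouchUp)
}
```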
