The Role of Haptics in User Input for Simple 3D Interaction Tasks - An Analysis of Interaction Performance and User Experience

Traditionally, input devices have provided at least some degree of haptic experience by involving direct physical contact between the user and the device. Recently, touchless interaction has gained popularity through readily available, inexpensive devices such as the Leap Motion controller or the Microsoft Kinect. These devices typically support more than two degrees of freedom and are thus especially suitable for interaction tasks in three-dimensional space. However, despite their high potential, touchless input techniques also introduce new challenges (e.g., the lack of borders and natural haptic guidance). In this paper, we aim to identify the potentials and limitations inherent to three input techniques that involve varying amounts of haptics (i.e., touchful, touchless, and semi-touchless input). We present a study conducted with 25 users that focuses on simple input tasks in a 3D interaction space and analyzes objective interaction performance metrics (e.g., regularity or time) as well as subjective User Experience (UX) aspects (e.g., dependability or efficiency). The study reveals parallels as well as contrasts between the users' actual interaction performance and their perceived UX (e.g., several metrics suggested that haptic input outperforms touchless input, while differences regarding UX were not significant). The results are intended to inform other researchers when designing interactive environments.
