Investigating mid-air gestures and handhelds in motion tracked environments

Smart spaces with multiple interactive devices and motion-tracking capabilities are becoming more common. However, there is little research on how interaction with one device affects the use of other devices in the same space. We investigate how mobile devices and physical interactive devices affect gestural interaction in motion-tracked environments. For our user study, we built a smart space consisting of a gesture-controlled large display, an NFC reader, and a mobile device to simulate a system in which users can transfer information between the space and their personal devices. The study with 13 participants revealed that (1) holding a mobile device affects both gesturing and passive stance; (2) users may stop moving entirely when they intend to stop interacting with a display; and (3) interactive devices with overlapping interaction spaces make unintentional interaction significantly more frequent. Our findings have implications for gestural interaction design as well as for the design of motion-tracked smart spaces.