A User-Defined Gesture Set for Music Interaction in Immersive Virtual Environment

In recent years, hand-tracking technologies have been implemented in Virtual Reality applications, allowing users to interact with the environment through natural hand gestures. However, little effort has been devoted to understanding users' preferences when they use their hands to interact with the VR world. In this paper, we present the results of a guessability study of hand gestures for operating musical tasks, in order to support music interaction within immersive Virtual Environments. A total of 750 gestures were elicited from 15 participants for 50 selected tasks, including 10 musical tasks. Our results enable a smaller gesture set to be derived from the users' proposals. The implications of this work are relevant to hand gesture design, gesture interaction, and the design of gestural interfaces for music interaction, all of which are highlighted in this study.
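The abstract does not state which consensus metric underlies the guessability study; a common choice in gesture elicitation work of this kind is the agreement score introduced by Wobbrock et al. for user-defined surface gestures, sketched below under that assumption:

$$A = \frac{1}{|T|} \sum_{t \in T} \sum_{P_i \subseteq P_t} \left( \frac{|P_i|}{|P_t|} \right)^2$$

where $T$ is the set of tasks (referents), $P_t$ is the set of gestures proposed for task $t$, and $P_i$ ranges over the groups of identical gestures within $P_t$. Higher values of $A$ indicate stronger consensus among participants, which is what allows a compact, user-defined gesture set to be distilled from the 750 elicited proposals.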
