A Methodology to Introduce Gesture-Based Interaction into Existing Consumer Products

The continuous progress of interaction technologies shows that we are witnessing a revolution that is redefining the concept of the "user interface" and producing new ways to interact with electronic devices of all sizes and capabilities. Current research trends in Human-Machine Interaction (HMI) show considerable interest in gesture-based, motion-based, and full-body interactions. In this context, a User-Centered Design (UCD) methodology for implementing these novel interaction paradigms in consumer products is proposed, with the aim of improving their usability, intuitiveness, and overall user experience. A case study is used to validate the methodology and to measure the resulting improvements in user performance.
