Suggestions for Interface Design Using Head Tracking and Voice Commands

Multimodal interaction has been applied in many fields, such as medicine, the operation of assistive technologies, and interaction in public environments. It is important not only to develop the technologies (hardware and software) but also to study and propose ways to optimize the use of these innovative interfaces. This work aims to identify the main problems that arise when head-tracking interaction is combined with voice commands, a form of multimodal interaction. The evaluation focuses on the lowest level of interaction, that is, actions that are more physical than cognitive, such as clicking, dragging and dropping, and scrolling a page. As a result of this research, we propose a set of suggestions for improving the design of interfaces that use this form of interaction.