Analyzing the benefits of the combined interaction of head and eye tracking in 3D information visualization

This work evaluates the joint use of eye tracking and head tracking in a 3D information visualization environment. A task-based evaluation was conducted in a prototype built on 3D scatter plots, covering navigation, selection, filtering, and other interactions typical of information visualization tools. The tasks were performed using head tracking for navigation and eye tracking for selection, and were assessed with quantitative metrics (completion time and questionnaire responses) as well as qualitative ones (gathered through the Think-Aloud protocol). The results show that the "click by blinking" configuration was unstable, while head tracking as a navigation mechanism provided greater accuracy in the interaction.
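A minimal sketch of how a "click by blinking" selection might be detected, not the authors' implementation: all thresholds and names below are assumptions for illustration. The idea is to treat an eye-closed interval whose duration falls within a fixed band as a deliberate click, which also hints at why such a configuration can be unstable, since involuntary blinks near the lower threshold are easily misclassified.

```python
# Hypothetical blink-to-click detector (illustrative only, not from the paper).
# A closure shorter than MIN_CLICK_S is assumed to be an involuntary blink;
# one longer than MAX_CLICK_S is assumed to be the eyes resting.
MIN_CLICK_S = 0.3
MAX_CLICK_S = 1.0

def detect_clicks(samples):
    """samples: list of (timestamp_s, eye_open: bool) from an eye tracker.
    Returns the timestamps at which a deliberate blink-click is recognized."""
    clicks = []
    closed_at = None
    for t, eye_open in samples:
        if not eye_open and closed_at is None:
            closed_at = t                      # eyes just closed
        elif eye_open and closed_at is not None:
            duration = t - closed_at
            if MIN_CLICK_S <= duration <= MAX_CLICK_S:
                clicks.append(t)               # closure in the "click" band
            closed_at = None                   # eyes reopened
    return clicks

# Simulated stream: a 0.1 s involuntary blink, a 0.5 s deliberate blink,
# and a 2.0 s eye-rest; only the middle closure should register as a click.
samples = [(0.0, True), (1.0, False), (1.1, True),
           (2.0, False), (2.5, True),
           (3.0, False), (5.0, True)]
print(detect_clicks(samples))  # → [2.5]
```

In practice the two thresholds would need per-user calibration, and noise in the tracker's open/closed signal would call for debouncing, which is consistent with the instability the evaluation reports for this configuration.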
