StARe: Gaze-Assisted Face-to-Face Communication in Augmented Reality

This research explores the use of eye tracking during Augmented Reality (AR)-supported conversations. In this scenario, users can obtain information that supports the conversation without augmentations distracting from the conversation itself. We propose using gaze to let users gradually reveal information on demand: information is indicated around a user's head and becomes fully visible when the other person's visual attention explicitly falls upon that area. We describe the design of such an AR UI and present an evaluation of the feasibility of the concept. Results show that, despite gaze inaccuracies, users were positive about augmenting their conversations with contextual information and gaze interactivity. We provide insights into the trade-off between focusing on the task at hand (i.e., the conversation) and consuming AR information. These findings inform future eye-based AR interactions by contributing to a better understanding of the intricate balance between informative AR and information overload.
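The gradual-reveal mechanism described above can be illustrated with a minimal sketch: an information panel anchored near the conversation partner's head ramps its opacity up while the viewer's gaze dwells on it and fades out otherwise. This is a hypothetical illustration, not the paper's implementation; all names (InfoPanel, update_panel, the rates and radius) are assumptions for the example.

```python
# Hypothetical sketch (not the authors' code): gaze-contingent gradual reveal of an
# AR information panel anchored around a conversation partner's head.

from dataclasses import dataclass


@dataclass
class InfoPanel:
    center_x: float          # panel centre in the viewer's field of view (degrees)
    center_y: float
    radius: float = 4.0      # activation region around the panel (degrees of visual angle)
    opacity: float = 0.0     # 0 = subtle hint, 1 = fully revealed


def update_panel(panel: InfoPanel, gaze_x: float, gaze_y: float, dt: float,
                 reveal_rate: float = 2.0, fade_rate: float = 1.0) -> None:
    """Ramp opacity up while gaze dwells inside the panel region, fade it otherwise."""
    dx, dy = gaze_x - panel.center_x, gaze_y - panel.center_y
    inside = (dx * dx + dy * dy) ** 0.5 <= panel.radius
    if inside:
        panel.opacity = min(1.0, panel.opacity + reveal_rate * dt)
    else:
        panel.opacity = max(0.0, panel.opacity - fade_rate * dt)


# Example: gaze lands on the panel region for roughly 0.5 s at 60 Hz.
panel = InfoPanel(center_x=10.0, center_y=5.0)
for _ in range(30):
    update_panel(panel, gaze_x=10.5, gaze_y=5.2, dt=1 / 60)
print(f"opacity after dwell: {panel.opacity:.2f}")
```

Using a continuous opacity ramp rather than an instant toggle is one way to keep the augmentation unobtrusive and tolerant of the gaze inaccuracies noted in the evaluation.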
