Look There! Be Social and Share

This paper discusses the social challenges posed by a future wherein humans use Head-Mounted Devices (HMDs) in their everyday lives. In this future, factors that cause negative attitudes towards HMDs today, such as privacy concerns and hardware limitations, no longer pose a problem, owing to technological advances and the devices' growing prevalence and familiarity within society. The debate on the social acceptance of HMDs has thus moved towards their impact on human-human interaction. We explore the potential of utilizing gaze as an implicit input to HMDs in order to enhance social engagement among individuals and groups. In particular, we identify three crucial aspects that can benefit from gaze input towards this goal.
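To make the notion of gaze as an implicit input concrete, the sketch below shows one way an HMD could detect joint attention between two wearers: both users fixating roughly the same point at roughly the same time. This is an illustrative example only, not the paper's method; the data format, function names, and thresholds are all hypothetical assumptions.

```python
# Illustrative sketch (not from the paper): detecting joint attention
# between two HMD wearers from streamed gaze data. Assumes each device
# reports the estimated 3D point the user fixates, expressed in a
# shared world coordinate frame; all names and thresholds are made up.
from dataclasses import dataclass
import math


@dataclass
class GazeSample:
    timestamp: float                    # seconds since session start
    point: tuple[float, float, float]   # fixated 3D point (world frame)


def joint_attention(stream_a, stream_b,
                    max_offset=0.3,     # max temporal misalignment (s)
                    max_dist=0.25):     # max spatial separation (m)
    """Yield timestamps at which both users fixate (nearly) the same point."""
    for sa in stream_a:
        for sb in stream_b:
            close_in_time = abs(sa.timestamp - sb.timestamp) <= max_offset
            close_in_space = math.dist(sa.point, sb.point) <= max_dist
            if close_in_time and close_in_space:
                yield sa.timestamp
                break  # one match per sample from stream_a is enough
```

A detected joint-attention event could then drive social feedback implicitly, for instance highlighting the shared object for both wearers, without either user issuing an explicit command.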
