Look together: using gaze for assisting co-located collaborative search

Gaze information indicates a user's focus of attention, which has been shown to aid remote collaboration, where distant partners can see each other's focus. In this paper, we apply gaze to co-located collaboration: users' gaze locations are shown on the same shared display to support coordination between partners. We integrated several types of gaze indicators into the user interface of a collaborative search system and conducted two user studies to understand how gaze affects coordination and communication between co-located users. Our results show that gaze indeed enhances co-located collaboration, but with a trade-off between the visibility of gaze indicators and user distraction. Participants acknowledged that seeing gaze indicators eased communication because it made them aware of their partner's interests and attention. However, users can be reluctant to share their gaze information for reasons of trust and privacy, as gaze can divulge their interests.
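As a loose illustration only (not taken from the paper), rendering a co-located gaze indicator typically involves smoothing noisy eye-tracker samples into a stable on-screen cursor per user; the smoothing factor embodies the visibility-versus-distraction trade-off described above. All names in this hypothetical Python sketch are illustrative assumptions:

```python
# Hypothetical sketch: smoothing each user's raw gaze samples into an
# on-screen indicator position. Names and parameters are illustrative,
# not from the paper's actual system.

class GazeIndicator:
    """Exponentially smoothed gaze cursor for one co-located user."""

    def __init__(self, alpha=0.3):
        # alpha near 1 -> responsive but jittery (potentially distracting);
        # alpha near 0 -> stable but laggy (less salient motion).
        self.alpha = alpha
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Feed one raw gaze sample; return the smoothed cursor position."""
        if self.x is None:  # first sample initializes the cursor
            self.x, self.y = float(raw_x), float(raw_y)
        else:
            self.x += self.alpha * (raw_x - self.x)
            self.y += self.alpha * (raw_y - self.y)
        return self.x, self.y

# Two partners' indicators rendered on the same shared display would each
# hold their own smoothing state:
partner_a = GazeIndicator(alpha=0.3)
partner_b = GazeIndicator(alpha=0.3)
```

Tuning `alpha` (or swapping in fixation filtering) is one concrete way a design could navigate the trade-off the study reports between indicator visibility and distraction.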
