Designing for augmented attention: Towards a framework for attentive user interfaces

Abstract

Attentive user interfaces are user interfaces that aim to support the user's attentional capacities. By sensing users' attention to objects and people in their everyday environment, and by treating user attention as a limited resource, these interfaces avoid today's ubiquitous patterns of interruption. Focusing on attention as a central interaction channel allows the development of more sociable methods of communication and repair with ubiquitous devices. Our methods are analogous to human turn taking in group communication: turn taking improves a listener's ability to process conversations in the foreground. Attentive user interfaces bridge the gap between the foreground and periphery of user activity in a similar fashion, allowing users to move smoothly between the two. We present a framework for augmenting user attention through attentive user interfaces. We propose five key properties of attentive systems: (i) to sense attention; (ii) to reason about attention; (iii) to regulate interactions; (iv) to communicate attention; and (v) to augment attention.
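To make the five proposed properties concrete, the minimal sketch below models them as an abstract interface. This is our own illustrative reading, not an API from the paper: the names AttentiveSystem, AttentionSample, and all method signatures are assumptions introduced here for clarity.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class AttentionSample:
    """A hypothetical reading from an attention sensor (e.g., an eye tracker)."""
    target: str        # object or person the user appears to attend to
    confidence: float  # sensor confidence in [0, 1]
    timestamp: float   # seconds since epoch


class AttentiveSystem(ABC):
    """Illustrative interface mapping the five proposed properties to methods."""

    @abstractmethod
    def sense_attention(self) -> AttentionSample:
        """(i) Sense where the user's attention is currently directed."""

    @abstractmethod
    def reason_about_attention(self, sample: AttentionSample) -> float:
        """(ii) Estimate the cost of interrupting the user right now."""

    @abstractmethod
    def regulate_interaction(self, interruption_cost: float) -> bool:
        """(iii) Decide whether to interrupt now or defer to the periphery."""

    @abstractmethod
    def communicate_attention(self, attending: bool) -> None:
        """(iv) Signal to the user what the system is attending to."""

    @abstractmethod
    def augment_attention(self, sample: AttentionSample) -> None:
        """(v) Support the user's focus, e.g., by suppressing peripheral alerts."""
```

Under these assumptions, a notification service built against such an interface would call sense_attention() and reason_about_attention() before raising an alert, and defer delivery to the periphery whenever regulate_interaction() reports that interrupting would be costly.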
