The conductor interaction method

Computers have increasingly become part of our everyday lives, with many activities either involving their direct use or being supported by them. This has prompted research into methods and mechanisms that help humans interact with computers, the field of human-computer interaction (HCI). A number of HCI techniques have been developed over the years; some are quite old but remain in use, while others are more recent and still evolving. Many of these techniques, however, are not natural to use and typically require the user to learn a new means of interaction. Inconsistencies within these techniques, and the restrictions they impose on user creativity, can also make them difficult to use, especially for novice users. This article proposes an alternative, the conductor interaction method (CIM), which aims to provide a more natural and easier-to-learn interaction technique. The CIM extends existing HCI methods by drawing on techniques found in human-human interaction. It is argued that a two-phased multimodal interaction mechanism, using gaze for selection and gesture for manipulation, incorporated within a metaphor-based environment, can provide a viable alternative means of interacting with a computer, especially for novice users. Both the CIM model and an implementation of it within a system are presented in this article. This system formed the basis of a number of user studies performed to assess the effectiveness of the CIM, the findings of which are also discussed.
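The two-phased mechanism described above, gaze to select, gesture to manipulate, can be sketched as a small state machine. This is an illustrative sketch only, not the CIM implementation described in the article: the class and event names, and the dwell-time threshold commonly used in gaze-based selection, are all assumptions introduced here.

```python
from dataclasses import dataclass
from typing import Optional

DWELL_THRESHOLD = 0.5  # seconds of sustained gaze needed to select (assumed value)

@dataclass
class ConductorStateMachine:
    """Two-phase interaction: gaze selects an object, gesture then manipulates it."""
    selected: Optional[str] = None
    _gaze_target: Optional[str] = None
    _gaze_time: float = 0.0

    def on_gaze(self, target: str, dt: float) -> None:
        # Phase 1 (selection): accumulate dwell time on one target;
        # looking at a different target resets the dwell clock.
        if target == self._gaze_target:
            self._gaze_time += dt
        else:
            self._gaze_target, self._gaze_time = target, dt
        if self._gaze_time >= DWELL_THRESHOLD:
            self.selected = target

    def on_gesture(self, gesture: str) -> str:
        # Phase 2 (manipulation): gestures act only on a gaze-selected object.
        if self.selected is None:
            return "ignored: nothing selected"
        return f"apply '{gesture}' to {self.selected}"

sm = ConductorStateMachine()
sm.on_gaze("painting", 0.3)          # dwell below threshold, no selection yet
print(sm.on_gesture("move"))         # gesture ignored in phase 1
sm.on_gaze("painting", 0.3)          # cumulative dwell 0.6 s >= threshold
print(sm.on_gesture("move"))         # gesture now manipulates the selection
```

Separating the two phases this way means an accidental gesture cannot affect anything the user has not first looked at, which is one rationale for combining the two modalities.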
