Comparison of multimodal interactions in perspective-corrected multi-display environment

This paper compares multimodal interaction techniques in a perspective-corrected multi-display environment (MDE). The performance of multimodal interactions using gestures, eye gaze, and head direction is experimentally examined in an object manipulation task in MDEs and compared with that of a mouse-operated perspective cursor. Experimental results showed that gesture-based multimodal interactions match the mouse-operated perspective cursor in task completion time. A technique utilizing the user's head direction received positive comments from subjects even though it was not as fast. Based on the experimental results and observations, we discuss the potential of multimodal interaction techniques in MDEs.
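
The perspective cursor corrects the pointer's position for the user's tracked viewpoint so that it appears to move continuously across displays. As a rough illustration of the underlying geometry only (not the implementation evaluated in the paper), the Python sketch below casts a viewing ray from an assumed tracked eye position and intersects it with a display plane; the function and variable names and the planar-display simplification are our own.

```python
import numpy as np

def perspective_cursor_hit(eye, direction, plane_origin, plane_normal):
    """Intersect a viewing ray from the user's eye with a display plane.

    Hypothetical helper: returns the 3D point where the ray hits the
    plane, or None if the ray is parallel to or points away from it.
    """
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # ray runs parallel to the display plane
    t = np.dot(plane_normal, plane_origin - eye) / denom
    if t < 0:
        return None  # the display lies behind the viewer
    return eye + t * direction

# Example: eye at the origin looking down -z toward a monitor at z = -1.
eye = np.array([0.0, 0.0, 0.0])
direction = np.array([0.1, 0.0, -1.0])
hit = perspective_cursor_hit(eye, direction,
                             plane_origin=np.array([0.0, 0.0, -1.0]),
                             plane_normal=np.array([0.0, 0.0, 1.0]))
print(hit)  # point on the display plane: [ 0.1  0.  -1. ]
```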
