Perspective cursor: perspective-based interaction for multi-display environments

Multi-display environments and smart meeting rooms are becoming increasingly common. These environments build a shared display space from a variety of devices: tablets, projected surfaces, tabletops, and traditional monitors. Since the different display surfaces are usually not organized in a single plane, traditional schemes for stitching the displays together can cause problems for interaction. However, there is a more natural way to compose display space -- using perspective. In this paper, we develop interaction techniques for multi-display environments that are based on the user's perspective of the room. We designed the Perspective Cursor, a mapping of the cursor to display space that appears natural and logical from wherever the user is located. We conducted an experiment comparing two perspective-based techniques, the Perspective Cursor and a beam-based technique, with traditional stitched displays. We found that both perspective techniques were significantly faster for targeting tasks than the traditional technique, and that Perspective Cursor was the most preferred method. Our results show that integrating perspective into the design of multi-display environments can substantially improve performance.
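The core idea of a perspective mapping is that the cursor's position on any display is determined by a view ray from the user's eye, not by a 2D stitching of screen edges. A minimal sketch of one ingredient of such a mapping -- intersecting a ray from the eye with a planar display and converting the hit point to display-local coordinates -- could look like the following. All names, geometry, and the specific parameterization are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def intersect_display(eye, direction, origin, normal, u_axis, v_axis):
    """Cast a ray from the user's eye and intersect it with a planar display.

    eye, direction   -- ray start point and (not necessarily unit) direction
    origin, normal   -- a point on the display plane and its surface normal
    u_axis, v_axis   -- orthonormal in-plane axes defining display coordinates

    Returns (u, v) display-local coordinates of the hit point, or None if
    the ray is parallel to the plane or the display lies behind the user.
    """
    eye, direction = np.asarray(eye, float), np.asarray(direction, float)
    origin = np.asarray(origin, float)
    denom = np.dot(direction, normal)
    if abs(denom) < 1e-9:
        return None  # ray runs parallel to the display plane
    t = np.dot(origin - eye, normal) / denom
    if t <= 0:
        return None  # intersection is behind the viewer
    hit = eye + t * direction          # 3D point where the ray meets the plane
    rel = hit - origin
    return float(np.dot(rel, u_axis)), float(np.dot(rel, v_axis))

# Example: user's eye 2 m in front of a display lying in the z = 0 plane.
uv = intersect_display(eye=(0, 0, 2), direction=(1, 1, -2),
                       origin=(0, 0, 0), normal=(0, 0, 1),
                       u_axis=(1, 0, 0), v_axis=(0, 1, 0))
```

In a full multi-display setup, this test would be repeated against every registered surface and the nearest valid intersection would receive the cursor, so the mapping stays consistent from the user's viewpoint even across non-coplanar screens.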
