Touch Projector: mobile interaction through video

In 1992, Tani et al. proposed remotely operating machines in a factory by manipulating a live video image on a computer screen. In this paper we revisit this metaphor and investigate its suitability for mobile use. We present Touch Projector, a system that enables users to interact with remote screens through a live video image on their mobile device. The handheld device tracks itself with respect to the surrounding displays, and touch input on the video image is "projected" onto the target display in view, as if it had occurred there. This literal adaptation of Tani's idea, however, fails because handheld video does not offer enough stability and control for precise manipulation. We address this with a series of improvements, including zooming and freezing the video image. In a user study, participants selected targets and dragged them between displays using the literal version and the three improved versions. We found that participants achieved the highest performance with automatic zooming and temporary image freezing.
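The abstract describes the projection step in prose only. As a rough illustration, the sketch below maps a touch point from the phone's camera frame onto the coordinate system of the target display via a homography, assuming the display's four corners can be located in the frame (e.g., with marker tracking in the style of Kato et al. [34]). All coordinates and names are hypothetical; this is a sketch, not the authors' implementation.

```python
import numpy as np
import cv2

# Hypothetical correspondences: where the target display's four corners
# appear in the camera frame (pixels), and the display's own resolution.
# In the actual system these would come from live display tracking.
corners_in_frame = np.float32([[212, 148], [588, 160], [575, 391], [201, 380]])
corners_on_display = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

# 3x3 homography mapping camera-frame coordinates to display coordinates.
H = cv2.getPerspectiveTransform(corners_in_frame, corners_on_display)

def project_touch(touch_xy, H):
    """'Project' a touch on the live video onto the remote display:
    apply the homography, then divide by the homogeneous coordinate."""
    q = H @ np.array([touch_xy[0], touch_xy[1], 1.0])
    return (q[0] / q[2], q[1] / q[2])

# A touch at (400, 270) on the phone's video image lands at this display
# position, as if the user had touched the remote display directly.
print(project_touch((400, 270), H))
```

Under this reading, the paper's improvements slot in naturally: zooming enlarges the display's image in the frame so each display pixel spans more touch pixels (a finer effective control-display ratio), while freezing keeps the last frame and homography fixed so that hand tremor no longer perturbs the mapping.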

[1] Ben Shneiderman et al. Direct Manipulation: A Step Beyond Programming Languages, 1983, Computer.

[2] Thomas Seifried et al. CRISTAL: Design and Implementation of a Remote Control System Based on a Multi-touch Display, 2009.

[3] Lynn Wilcox et al. Shared interactive video for teleconferencing, 2003, MULTIMEDIA '03.

[4] William Buxton et al. Boom chameleon: simultaneous capture of 3D viewpoint, voice and gesture annotations on a spatially-aware display, 2002, UIST '02.

[5] Jun Rekimoto et al. Pick-and-drop: a direct manipulation technique for multiple computer environments, 1997, UIST '97.

[6] Johannes Schöning et al. Map navigation with mobile devices: virtual versus physical movement with and without visual context, 2007, ICMI '07.

[7] Anthony Tang et al. Shadow reaching: a new perspective on interaction for large displays, 2007, UIST '07.

[8] Charles L. A. Clarke et al. Bimanual and unimanual image alignment: an evaluation of mouse-based techniques, 2005, UIST '05.

[9] Daniel Jackson et al. Smart Phone Interaction with Registered Displays, 2009, IEEE Pervasive Computing.

[10] W. Buxton et al. A study in two-handed input, 1986, CHI '86.

[11] Kimiya Yamaashi et al. Object-oriented video: interaction with real-world objects through live video, 1992, CHI '92.

[12] Andreas Butz et al. Shoot & copy: phonecam-based information transfer from public displays onto mobile phones, 2007, Mobility '07.

[13] George W. Fitzmaurice et al. Situated information spaces and spatially aware palmtop computers, 1993, CACM.

[14] Daniel Vogel et al. HybridPointing: fluid switching between absolute and relative pointing with a direct input device, 2006, UIST '06.

[15] Steven A. Shafer et al. XWand: UI for intelligent spaces, 2003, CHI '03.

[16] Carl Gutwin et al. Perspective cursor: perspective-based interaction for multi-display environments, 2006, CHI '06.

[17] Michael Rohs et al. The smart phone: a ubiquitous input device, 2006, IEEE Pervasive Computing.

[18] Mary Czerwinski et al. Drag-and-Pop and Drag-and-Pick: Techniques for Accessing Remote Screen Content on Touch- and Pen-Operated Systems, 2003, INTERACT.

[19] Jeffrey Nichols et al. Interacting at a distance: measuring the performance of laser pointers and other devices, 2002, CHI '02.

[20] Randy Pausch et al. Virtual reality on a WIM: interactive worlds in miniature, 1995, CHI '95.

[21] Ka-Ping Yee et al. Peephole displays: pen interaction on spatially aware handheld computers, 2003, CHI '03.

[22] Ivan Poupyrev et al. The go-go interaction technique: non-linear mapping for direct manipulation in VR, 1996, UIST '96.

[23] Abigail Sellen et al. Two-handed input in a compound task, 1994, CHI '94.

[24] Takeo Igarashi et al. Sketch and run: a stroke-based interface for home robots, 2009, CHI '09.

[25] Patrick Baudisch et al. Introduction to this Special Issue on Ubiquitous Multi-Display Environments, 2009, Hum. Comput. Interact.

[26] Terry Winograd et al. Benefits of merging command selection and direct manipulation, 2005, TCHI.

[27] Daisuke Sakamoto et al. CRISTAL: a collaborative home media and device controller based on a multi-touch display, 2009, ITS '09.

[28] Carl Gutwin et al. There and Back Again: Cross-Display Object Movement in Multi-Display Environments, 2009, Hum. Comput. Interact.

[29] Desney S. Tan et al. WinCuts: manipulating arbitrary window regions for more effective use of screen space, 2004, CHI EA '04.

[30] Ben Shneiderman et al. High Precision Touchscreens: Design Strategies and Comparisons with a Mouse, 1991, Int. J. Man Mach. Stud.

[31] Desney S. Tan et al. The large-display user experience, 2005, IEEE Computer Graphics and Applications.

[32] Michael Rohs et al. Sweep and point and shoot: phonecam-based interactions for large public displays, 2005, CHI EA '05.

[33] Patrick Baudisch et al. Stitching: pen gestures that span multiple displays, 2004, AVI '04.

[34] Hirokazu Kato et al. Marker tracking and HMD calibration for a video-based augmented reality conferencing system, 1999, IWAR '99.

[35] Terry Winograd et al. PointRight: experience with flexible input redirection in interactive workspaces, 2002, UIST '02.

[36] Andrew S. Forsberg et al. Image plane interaction techniques in 3D immersive environments, 1997, SI3D '97.

[37] Daniel Vogel et al. Shift: a technique for operating pen-based interfaces using touch, 2007, CHI '07.