A camera-based interface for interaction with mobile handheld computers

Recent advances in mobile computing let users work with interactive 3D graphics on handheld computers. Yet while computing resources and screen resolutions grow steadily, user interfaces for handheld computers have not changed significantly. We therefore designed a new 3-DOF interface adapted to the characteristics of handheld computers. The interface tracks the movement of a target that the user holds behind the screen by analyzing the video stream from the handheld computer's camera. The position of the target is inferred directly from color-codes printed on it, using an efficient algorithm. Users can interact easily in real time in a mobile setting. Because the target does not occlude the screen, the visualization of data remains unobstructed, and the interaction techniques do not depend on the orientation of the handheld computer. We used the interface in several test applications: visualizing large images such as maps, manipulating 3D models, and navigating 3D scenes. This new interface fosters the development of 2D and 3D interactive applications on handheld computers.
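The paper does not give the tracking algorithm in the abstract, but the idea of recovering three degrees of freedom from a color-coded target in a video frame can be sketched as follows. This is a minimal illustration, not the authors' method: it assumes a uniformly colored patch, thresholds the frame against that color, takes the centroid for the two image-plane translations, and estimates depth from the patch's apparent size under a pinhole-camera model. The names `REF_AREA`, `REF_DEPTH`, and `track_target`, and the calibration values, are hypothetical.

```python
import numpy as np

# Hypothetical calibration: pixel area of the target at a known depth.
REF_AREA = 400.0   # pixels^2 at the reference depth
REF_DEPTH = 20.0   # reference depth in cm (assumed)

def track_target(frame, color, tol=30):
    """Return (x, y, depth_cm) of the colored patch, or None if absent."""
    # Mask pixels whose RGB value is within `tol` of the target color.
    mask = np.all(np.abs(frame.astype(int) - np.array(color)) <= tol, axis=-1)
    area = mask.sum()
    if area == 0:
        return None
    ys, xs = np.nonzero(mask)
    # The centroid gives the two translational DOF on the image plane.
    x, y = xs.mean(), ys.mean()
    # Apparent size shrinks linearly with distance, so area shrinks
    # quadratically: depth = REF_DEPTH * sqrt(REF_AREA / area).
    depth = REF_DEPTH * np.sqrt(REF_AREA / area)
    return float(x), float(y), float(depth)

# Synthetic 64x64 frame with a 20x20 red patch standing in for the target.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
frame[10:30, 22:42] = (255, 0, 0)
pose = track_target(frame, color=(255, 0, 0))
```

A real implementation would run this per frame on the camera stream and would need several distinctly colored codes on the target to disambiguate orientation and improve robustness to lighting, which simple RGB thresholding does not handle.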
