Evaluating touching and pointing with a mobile terminal for physical browsing

Physical browsing is a user interaction paradigm in which the user interacts with physical objects by selecting them with a mobile terminal for some action. The objects carry links to digital services and information related to them, and the links are implemented with tags that the mobile terminal can read. We have built a system that supports selecting objects for interaction by touching them and by pointing at them. Our physical browsing system emulates passive, sensor-equipped, long-range RFID tags and a mobile terminal equipped with an RFID reader. We compared different system configurations for touching and pointing, and additionally evaluated other parameters of physical selection, such as the conditions under which users choose one selection method over the other. In our evaluation, we found touching and pointing to be useful and complementary methods for selecting an object for interaction.
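
To make the distinction between the two selection methods concrete, the sketch below shows one way a tag reader emulator might classify a tag read as a touch (very short read range) or a point (longer range, tag within a narrow pointing beam). This is a minimal illustration under assumed conditions: the class, field names, and threshold values are hypothetical and are not the emulator's actual API or parameters.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: names, fields, and thresholds below are assumptions,
# not the API of the physical browsing system described in this work.

@dataclass
class TagReading:
    tag_id: str
    distance_m: float   # emulated distance between reader and tag
    bearing_deg: float  # emulated angle between the pointing axis and the tag

# Hypothetical thresholds: a very short read range suggests the user touched
# the tag, while a narrow bearing at longer range suggests pointing at it.
TOUCH_RANGE_M = 0.05    # ~5 cm: tag practically in contact with the reader
POINT_RANGE_M = 3.0     # long-range tags readable up to a few metres
POINT_BEAM_DEG = 10.0   # half-angle of the emulated pointing beam

def classify_selection(reading: TagReading) -> Optional[str]:
    """Classify an emulated tag read as a touch or point selection."""
    if reading.distance_m <= TOUCH_RANGE_M:
        return "touch"
    if reading.distance_m <= POINT_RANGE_M and abs(reading.bearing_deg) <= POINT_BEAM_DEG:
        return "point"
    return None  # tag visible to a general scan, but not explicitly selected

if __name__ == "__main__":
    print(classify_selection(TagReading("door-42", 0.03, 55.0)))  # -> touch
    print(classify_selection(TagReading("poster-7", 2.1, 4.5)))   # -> point
```

The design choice illustrated here is that touching and pointing can be distinguished purely from emulated read range and pointing direction, which is one plausible way to compare the two methods within a single reader configuration.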
