Augmenting Looking, Pointing and Reaching Gestures to Enhance the Searching and Browsing of Physical Objects

In this paper we present a framework for attaching information to physical objects so that it can be interactively browsed and searched in a hands-free, multi-modal, and personalized manner, leveraging users' natural looking, pointing, and reaching behaviors. The system uses small infrared transponders, placed on objects in the environment and worn by the user, to achieve the dense, on-object visual feedback usually possible only in augmented reality systems, while improving on the interaction style and wearable-gear requirements of such systems. We discuss two implemented applications: a tutorial on the parts of an automobile engine and a personalized supermarket assistant. The paper continues with a user study investigating browsing and searching behaviors in the supermarket scenario, and concludes with a discussion of findings and future work.
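To make the browsing and searching interaction concrete, here is a minimal sketch of how detected transponder IDs might be resolved to tagged objects and used to drive on-object feedback. All names (TaggedObject, ObjectBrowser, browse, search) and the data model are hypothetical illustrations, not the paper's implementation; the sketch only assumes that each object carries a transponder with a unique ID and that the worn sensor reports the ID of whatever the user looks at, points at, or reaches for.

```python
# Hypothetical sketch: resolving transponder IDs (seen by a body-worn sensor)
# to object records, for browsing and for search-driven on-object feedback.
from dataclasses import dataclass, field


@dataclass
class TaggedObject:
    tag_id: int                      # ID of the infrared transponder on the object
    name: str
    info: str                        # description shown/spoken when browsed
    keywords: set = field(default_factory=set)


class ObjectBrowser:
    """Maps transponder IDs to object records for browsing and searching."""

    def __init__(self, objects):
        self._by_id = {obj.tag_id: obj for obj in objects}

    def browse(self, seen_tag_id):
        """Return info for the object the user is looking at, pointing at,
        or reaching for, identified by the transponder ID the sensor saw."""
        obj = self._by_id.get(seen_tag_id)
        return f"{obj.name}: {obj.info}" if obj else None

    def search(self, query):
        """Return tag IDs of matching objects, e.g. so their on-object
        indicators can be lit to guide the user's attention."""
        q = query.lower()
        return [obj.tag_id for obj in self._by_id.values()
                if q in obj.name.lower()
                or any(q in kw.lower() for kw in obj.keywords)]


# Usage in a supermarket-style scenario: a search lights matching products;
# glancing or pointing at a product then retrieves its details.
catalog = ObjectBrowser([
    TaggedObject(1, "Rice crackers", "Gluten-free snack", {"gluten-free", "snack"}),
    TaggedObject(2, "Wheat bread", "Contains gluten", {"bread"}),
])
assert catalog.search("gluten-free") == [1]
print(catalog.browse(1))  # -> "Rice crackers: Gluten-free snack"
```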
