Robot-assisted shopping for the blind: issues in spatial cognition and product selection

Research on spatial cognition and blind navigation suggests that a device aimed at helping blind people shop independently should provide the shopper with effective interfaces to the locomotor and haptic spaces of the supermarket. In this article, we argue that robots can act as effective interfaces to haptic and locomotor spaces in modern supermarkets. We also present the design and evaluation of three product selection modalities (browsing, typing, and speech) that allow a blind shopper to select a desired product from a repository of thousands of products.
