GIST: a gestural interface for remote nonvisual spatial perception

Spatial perception is challenging for people who are blind because the hands have a limited sensing range. We present GIST, a wearable gestural interface that provides spatial perception by appropriating the user's hands as versatile sensing rods. Using a wearable depth-sensing camera, GIST analyzes the visible physical space and lets blind users query spatial information about that space through different hand gestures. Because users explore the physical space directly with gestures, GIST achieves a close mapping between augmented and physical reality, which facilitates spatial interaction. A user study with eight blind participants evaluated GIST's ability to support everyday tasks that rely on spatial perception, such as grabbing an object or interacting with a person. Our results may inform the development of new gesture-based assistive applications.
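The core interaction the abstract describes, pointing a hand at something and receiving spatial feedback about it, can be illustrated with a minimal sketch. The following is not the authors' implementation: it assumes a depth frame arrives as a 2D grid of millimetre distances (as Kinect-style cameras provide) and that a hand tracker supplies a fingertip position; all function and parameter names are hypothetical.

```python
# Hypothetical sketch of a "hand as sensing rod" query, assuming a depth frame
# is a 2D grid of millimetre readings and hand tracking gives (row, col).
# Names and thresholds are illustrative, not the GIST authors' actual API.

def depth_at_fingertip(depth_frame, fingertip, window=2):
    """Median depth (mm) in a small window around the tracked fingertip.

    Sampling a window rather than a single pixel smooths sensor noise;
    zero values (no reading) are skipped, mimicking common depth-camera
    conventions for invalid pixels.
    """
    row, col = fingertip
    samples = []
    for r in range(max(0, row - window), min(len(depth_frame), row + window + 1)):
        for c in range(max(0, col - window), min(len(depth_frame[0]), col + window + 1)):
            if depth_frame[r][c] > 0:  # skip invalid (zero) pixels
                samples.append(depth_frame[r][c])
    if not samples:
        return None  # nothing sensed where the user is pointing
    samples.sort()
    return samples[len(samples) // 2]  # median is robust to outliers

def describe_distance(mm):
    """Turn a raw depth reading into the kind of verbal cue such a system
    might speak aloud (an assumed feedback style, not the paper's)."""
    if mm is None:
        return "nothing detected"
    if mm < 500:
        return "within reach"
    return f"about {mm / 1000:.1f} metres away"
```

In use, a gesture recognizer would decide *when* to query (e.g. on a pointing gesture) and route the resulting phrase to speech or vibration output; the sketch only covers the depth-lookup step.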
