Predicting Successful Tactile Mapping of Virtual Objects

Improving the spatial ability of blind and visually impaired people is the main goal of orientation and mobility (O&M) programs. In this study, we use a minimalistic mouse-shaped haptic device to demonstrate a new approach to evaluating devices that provide tactile representations of virtual objects. We consider psychophysical, behavioral, and subjective parameters to clarify under which circumstances mental representations of space (cognitive maps) can be efficiently constructed through touch by blindfolded sighted subjects. We study two complementary processes that determine map construction: low-level perception (in a passive stimulation task) and high-level information integration (in an active exploration task). We show that jointly considering a behavioral measure of information acquisition and a subjective measure of cognitive load yields an accurate prediction and a practical interpretation of mapping performance. Our simple TActile MOuse (TAMO) uses haptics to assess spatial ability; this may help individuals who are blind or visually impaired to be better evaluated by O&M practitioners, or to evaluate their own performance.
