Multimodal Trajectory Playback for Teaching Shape Information and Trajectories to Visually Impaired Computer Users

Presenting nontextual or dynamic information to blind or visually impaired computer users remains difficult. This article examines the potential of haptic and auditory trajectory playback as a method of teaching shapes and gestures to visually impaired people. Two studies are described that test how successfully simple shapes can be taught. The first study examines haptic trajectory playback alone, played through a force-feedback device, and compares the performance of visually impaired users with that of sighted users. It demonstrates that the task is significantly harder for visually impaired users. The second study builds on these results, combining force feedback with audio to teach visually impaired users to recreate shapes. The results suggest that users performed significantly better when presented with multimodal haptic and audio playback of the shape rather than haptic playback alone. Finally, an initial test of these ideas in an application context is described, in which sighted participants described drawings to visually impaired participants through touch and sound. This study demonstrates the situations in which trajectory playback can play a useful role in a collaborative setting.
