Multimodal collaborative handwriting training for visually-impaired people

"McSig" is a multimodal teaching and learning environment in which visually-impaired students learn character shapes, handwriting and signatures collaboratively with their teachers. It combines haptic and audio output to realize the teacher's pen input in parallel non-visual modalities. McSig is intended for teaching visually-impaired children how to handwrite characters (and, from that, signatures), a task that is very difficult without visual feedback. We conducted an evaluation with eight visually-impaired children: a pre-test to assess their existing skills with a set of character shapes, a training phase using McSig, and then a post-test of the same character shapes to measure any improvement. All of the children could use McSig, and we saw significant improvements in the character shapes drawn, particularly by the completely blind children (many of whom could draw almost none of the characters before training). In particular, the blind participants all expressed enjoyment and excitement about the system and about using a computer to learn to handwrite.
