Multisource Sonification for Visual Substitution in an Auditory Memory Game: One, or Two Fingers?

The See ColOr project aims at developing a mobility system for blind persons based on image color sonification. Within this project, the present work addresses the optimal use of auditory multi-touch interaction, and in particular the question of how many fingers are needed for efficient exploration. To determine the actual influence of mono- versus multi-touch interaction on the auditory feedback, a color-matching memory game was implemented. The sounds of this game were generated by touching a tablet with one or two fingers. A group of 20 blindfolded users was tasked with finding color matches in an image grid represented on the tablet by listening to the associated color-sound representations. Our results show that for an easy task involving the matching of few objects, the use of two fingers is moderately more efficient than the use of one finger. However, contrary to our intuition, this advantage could not be statistically confirmed for similar tasks of increasing difficulty.
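The abstract does not specify the color-to-sound mapping used in the game. Purely as an illustration of the sonification idea, the sketch below quantizes a touched pixel's hue into a small set of color bands and assigns each band a fixed pitch; the band boundaries, note choices, and function names are assumptions, not the See ColOr encoding (which uses musical instrument timbres).

```python
# Hypothetical sketch of image-color sonification: a touched pixel's hue
# angle is quantized into one of six 60-degree bands, and each band is
# assigned a fixed MIDI note. All values here are illustrative assumptions.

COLOR_NOTES = {
    "red": 60,      # C4
    "yellow": 62,   # D4
    "green": 64,    # E4
    "cyan": 65,     # F4
    "blue": 67,     # G4
    "magenta": 69,  # A4
}

HUE_ORDER = ["red", "yellow", "green", "cyan", "blue", "magenta"]

def sonify_hue(hue_degrees: float) -> int:
    """Quantize a hue angle (0-360) to a color band and return its note."""
    band = int((hue_degrees % 360) / 60)  # six equal 60-degree bands
    return COLOR_NOTES[HUE_ORDER[band]]

def sonify_touches(hues):
    """One finger yields one note; two fingers yield two concurrent notes."""
    return [sonify_hue(h) for h in hues]
```

Under this sketch, a single touch on a reddish cell would play one note, while a two-finger exploration would play two notes simultaneously, which is the auditory-feedback difference the experiment evaluates.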
