Sensorial substitution system from vision to audition using transparent digital earplugs
Since the TVSS (Tactile Vision Substitution System) developed by Bach-y-Rita in the 1960s, several sensorial substitution systems have been developed. In general, a so-called "sensorial substitution" system transforms stimuli characteristic of one sensory modality (for example, vision) into stimuli of another sensory modality (for example, audition). These systems are developed to assist people with disabilities. We developed a sensorial substitution system from vision to audition. An artificial neural network is used to identify the important parts of the image. The Virtual Acoustic Space technique is used to generate localizable sounds, and a sound is associated with each important part of the image. The entire real-time system has been implemented on iOS platforms (iPhone/iPad/iPod Touch™). We combined our system with transparent digital earplugs, so that the user remains aware of the acoustic scene around them. The system has been tested on sighted persons and the results are presented.
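The abstract does not detail how the salient image regions found by the neural network are turned into localizable sounds. Purely as an illustration, the sketch below maps hypothetical normalized image coordinates onto virtual source positions and renders them binaurally with Apple's AVAudioEnvironmentNode and its HRTF option from AVFoundation, used here as a stand-in for the paper's Virtual Acoustic Space stage; the SalientPoint type, the coordinate-to-position mapping, and the sine-tone choice are assumptions, not taken from the paper.

```swift
import AVFoundation

/// A salient image region produced by the recognition stage.
/// Coordinates are assumed normalized to [0, 1] (x: left to right,
/// y: top to bottom); this layout is a hypothetical choice, not the paper's.
struct SalientPoint {
    let x: Float
    let y: Float
}

/// Plays one looping tone per salient point, spatialized with HRTF
/// rendering as a stand-in for the Virtual Acoustic Space stage.
final class SpatialSonifier {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    // Spatialized sources must be mono.
    private let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)!

    init() {
        engine.attach(environment)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)
        environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    }

    func sonify(_ points: [SalientPoint]) throws {
        var players: [AVAudioPlayerNode] = []
        for point in points {
            let player = AVAudioPlayerNode()
            engine.attach(player)
            engine.connect(player, to: environment, format: monoFormat)
            player.renderingAlgorithm = .HRTF   // binaural rendering over headphones
            // Map image coordinates onto a plane one metre in front of the
            // listener: horizontal axis -> left/right, vertical axis -> up/down.
            player.position = AVAudio3DPoint(x: point.x * 2 - 1,
                                             y: 1 - point.y * 2,
                                             z: -1)
            player.scheduleBuffer(makeTone(frequency: 440), at: nil,
                                  options: .loops, completionHandler: nil)
            players.append(player)
        }
        try engine.start()
        players.forEach { $0.play() }
    }

    /// One second of a mono sine tone at the given frequency.
    private func makeTone(frequency: Float) -> AVAudioPCMBuffer {
        let frameCount = AVAudioFrameCount(monoFormat.sampleRate)
        let buffer = AVAudioPCMBuffer(pcmFormat: monoFormat, frameCapacity: frameCount)!
        buffer.frameLength = frameCount
        let samples = buffer.floatChannelData![0]
        for n in 0..<Int(frameCount) {
            samples[n] = 0.25 * sinf(2 * Float.pi * frequency * Float(n) / Float(monoFormat.sampleRate))
        }
        return buffer
    }
}
```

Under these assumptions, `try SpatialSonifier().sonify([SalientPoint(x: 0.8, y: 0.3)])` would place a looping tone up and to the right of the listener; a real system would of course refresh the source positions as the camera image changes.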