An Investigation into Incorporating Visual Information in Audio Processing

The number of persons with hearing and vision loss is rising as lifespans increase. Vision plays an important role in communication, particularly in the presence of background noise or for persons with hearing loss, but persons with vision loss cannot draw on this extra modality to compensate for their hearing deficits. We propose automatically incorporating visual information into hearing aids through the addition of a small wearable camera. Our initial results show potentially significant benefits from incorporating low-level, robust visual cues when background noise is high. This technique could benefit all persons with hearing loss, with substantial improvements possible in the speech perception of persons with dual sensory loss.
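To make the idea of "low-level, robust visual cues" concrete, the sketch below shows one plausible way a wearable-camera signal could be fused with an acoustic cue: a simple audio-visual voice activity detector that weights a lip-motion measure more heavily as the estimated SNR drops. This is not the method described in the paper; every function name, threshold, and frame-rate assumption here is illustrative only, and the visual cue is reduced to a crude mouth-region frame difference.

```python
import numpy as np

def frame_energy(audio, frame_len=400, hop=160):
    """Short-time log energy per frame (16 kHz audio: 25 ms frames, 10 ms hop)."""
    n_frames = 1 + (len(audio) - frame_len) // hop
    frames = np.stack([audio[i * hop:i * hop + frame_len] for i in range(n_frames)])
    return 10.0 * np.log10(np.sum(frames ** 2, axis=1) + 1e-12)

def lip_motion(mouth_rois):
    """Mean absolute frame-to-frame difference over a cropped mouth region
    (one grayscale image per video frame) as a crude visual speech cue."""
    diffs = np.abs(np.diff(mouth_rois.astype(np.float32), axis=0))
    motion = diffs.mean(axis=(1, 2))
    return np.concatenate([[motion[0]], motion])  # pad back to original length

def audio_visual_vad(audio, mouth_rois, snr_db, audio_fps=100, video_fps=25):
    """Fuse acoustic and visual cues into a per-frame speech/non-speech decision.

    The visual weight alpha grows as the estimated SNR falls, so in high noise
    the decision leans on the (noise-robust) visual cue.
    """
    e = frame_energy(audio)
    e = (e - e.min()) / (np.ptp(e) + 1e-12)            # normalise to [0, 1]
    v = lip_motion(mouth_rois)
    v = (v - v.min()) / (np.ptp(v) + 1e-12)
    # Upsample the visual cue to the audio frame rate by simple repetition.
    v = np.repeat(v, audio_fps // video_fps)[:len(e)]
    v = np.pad(v, (0, len(e) - len(v)), mode="edge")
    alpha = np.clip((20.0 - snr_db) / 20.0, 0.0, 1.0)  # assumed: alpha = 1 at 0 dB SNR
    score = (1.0 - alpha) * e + alpha * v
    return score > 0.5                                 # boolean speech mask per frame
```

The resulting speech mask could then gate a conventional enhancement or noise-estimation stage in the hearing aid; the 20 dB pivot and the 0.5 decision threshold are arbitrary placeholders, not values taken from the paper.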
