Sonically-Enhanced Tabular Screen-Reading

The World Wide Web has made more information readily available than at any time in human history. This information is often presented visually, a medium that can be inaccessible to people who are blind or have low vision. Screen readers can currently verbalize on-screen text using text-to-speech (TTS) synthesis, but much of this verbalization is inadequate for browsing the Internet because it cannot properly convey the structure and relationships present in a visual layout. We have created and tested an auditory interface that incorporates auditory-spatial orientation within a tabular structure. When information is structured as a two-dimensional table, links can be semantically grouped as cells in a row within the auditory table, providing a consistent structure for auditory navigation. Our auditory display prototype was tested with sixteen legally blind participants, each of whom navigated four sonified tables enhanced with prepended tones that varied in stereo spatialization and tonal quality. The sonified tables were presented in a randomized order to avoid ordering and learning effects. Results from the experiment showed that stereo panning was an effective technique for audio-spatially orienting non-visual navigation in a five-row, six-column HTML table, compared with a centered, stationary synthesized voice.
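
As a rough illustration only (not the authors' implementation, which the abstract does not detail), the sketch below shows one way a prepended cue tone could encode a cell's position: the column index mapped to a stereo pan position and the row index to a pitch. The abstract does not specify which table dimension maps to which auditory cue, so that mapping, along with all function names and parameter values, is a hypothetical assumption.

import numpy as np

SAMPLE_RATE = 44100  # samples per second

def column_pan(col_index: int, num_cols: int = 6) -> float:
    """Map a 0-based column index to a stereo pan position in [-1.0, 1.0],
    where -1.0 is full left and 1.0 is full right (assumed mapping)."""
    if num_cols < 2:
        return 0.0
    return -1.0 + 2.0 * col_index / (num_cols - 1)

def row_pitch(row_index: int, base_hz: float = 440.0) -> float:
    """Map a 0-based row index to a tone frequency; each row is roughly
    two semitones above the previous one (assumed mapping)."""
    return base_hz * (2.0 ** (2.0 * row_index / 12.0))

def prepended_tone(row_index: int, col_index: int,
                   duration: float = 0.15) -> np.ndarray:
    """Generate a short stereo tone to play before a cell's text is spoken.
    Pan encodes the column position; pitch encodes the row."""
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    tone = 0.5 * np.sin(2.0 * np.pi * row_pitch(row_index) * t)

    # Constant-power panning: split energy between channels by pan angle.
    pan = column_pan(col_index)
    angle = (pan + 1.0) * np.pi / 4.0       # 0 (full left) .. pi/2 (full right)
    left = tone * np.cos(angle)
    right = tone * np.sin(angle)
    return np.stack([left, right], axis=1)  # shape: (samples, 2)

# Example: a cue for row 2, column 5 of a 5x6 table would sound
# near the right channel at a pitch indicating the third row.
cue = prepended_tone(row_index=2, col_index=5)

In an actual screen-reading interface, such a cue would be played immediately before the TTS rendering of the cell's contents, so the listener hears the spatial and tonal orientation information before the spoken link text.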