Audio-Visual Information Clues about Geographic Data on Mobile Interfaces

The auditory channel is a primary means of human interaction, and several research efforts have sought to exploit this modality in the design of usable interactive systems. The goal of our present research has been to exploit interactive sonification capabilities to enhance the visual-only version of Framy so that it conveys the same information clues as those visualized on its interface. The basic version of Framy exploits a visual metaphor to provide hints about off-screen objects. By adopting tactile input and non-speech sound output as alternative interaction modalities, the enhanced prototype offers an appropriate trade-off between zoom level and the amount of information provided. This work has motivated the design of multimodal interfaces that support end users by providing them with an additional means of accessing information.
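To make the sonification idea concrete, the sketch below maps off-screen point-of-interest (POI) counts per frame segment to non-speech sound parameters, mirroring how Framy's visual frame maps counts to colour intensity. The function names, the four-segment frame, and the specific count-to-pitch/volume mappings are illustrative assumptions, not Framy's actual implementation.

```python
import math

def sonify_segment(poi_count, max_count, base_freq=220.0, max_freq=880.0):
    """Return (frequency_hz, volume) for one frame segment.

    Volume grows linearly with the segment's share of off-screen POIs;
    pitch rises exponentially with that share, so denser segments sound
    both louder and clearly higher."""
    if max_count == 0:
        return base_freq, 0.0          # nothing off-screen: stay silent
    share = min(poi_count / max_count, 1.0)
    volume = share                     # 0.0 = silent, 1.0 = full volume
    freq = base_freq * (max_freq / base_freq) ** share
    return freq, volume

# Example: four frame segments (N, E, S, W) with off-screen POI counts.
counts = {"N": 12, "E": 3, "S": 0, "W": 6}
peak = max(counts.values())
for direction, n in counts.items():
    freq, vol = sonify_segment(n, peak)
    print(f"{direction}: {freq:6.1f} Hz at volume {vol:.2f}")
```

In a deployed prototype these (frequency, volume) pairs would drive a synthesizer or audio API in response to tactile input on each frame segment; the mapping itself is the interchangeable design choice.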