Combining auditory and visual menus

Despite growing interest in touch screen and gesture interfaces for auditory menus, interoperability between visual and auditory menus is usually lacking. Using the same control logic in both the visual and auditory domains could facilitate switching to eyes-free use when needed and improve accessibility for visually impaired users. This paper presents an efficient control interface that works in both domains, together with usability tests of three interaction methods. The results show that auditory and visual menus sharing the same control logic can provide a fast and usable interface for controlling devices. Furthermore, the same auditory menu can be accessed with a gesture interface. Overall, touch screen interaction with a visual display was fastest, touch screen interaction with an auditory display was almost as fast, and the gesture interface with an auditory display was slowest. The novel interface paradigm is illustrated with an example application that provides eyes-free touch screen and gesture access to a music collection on a mobile phone.
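
As a rough illustration of the shared-control-logic idea, the sketch below pairs one hierarchical navigation controller with interchangeable visual and auditory renderers, so switching to eyes-free use only swaps the presentation layer. The paper does not include code; all names here (MenuNode, MenuController, VisualRenderer, AuditoryRenderer) are hypothetical and the auditory output is stubbed with a printed "speak" line in place of real speech or spearcon playback.

```python
# Hypothetical sketch: one control logic, two menu presentations.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class MenuNode:
    """One item in a hierarchical menu (e.g. an artist, album, or track)."""
    label: str
    children: List["MenuNode"] = field(default_factory=list)


class Renderer:
    """Presentation layer; the controller is agnostic to which one is used."""
    def present(self, items: List[MenuNode], focused: int) -> None:
        raise NotImplementedError


class VisualRenderer(Renderer):
    def present(self, items, focused):
        for i, item in enumerate(items):
            marker = ">" if i == focused else " "
            print(f"{marker} {item.label}")


class AuditoryRenderer(Renderer):
    def present(self, items, focused):
        # A real implementation would synthesize speech or play an auditory cue;
        # here we only show what would be spoken for the focused item.
        print(f"[speak] {items[focused].label} ({focused + 1} of {len(items)})")


class MenuController:
    """Shared control logic: the same next/previous/select/back operations
    drive the menu regardless of whether output is visual or auditory."""
    def __init__(self, root: MenuNode, renderer: Renderer):
        self.renderer = renderer
        self.stack: List[MenuNode] = [root]
        self.focus = 0

    def _items(self) -> List[MenuNode]:
        return self.stack[-1].children

    def show(self) -> None:
        self.renderer.present(self._items(), self.focus)

    def next(self) -> None:
        self.focus = (self.focus + 1) % len(self._items())
        self.show()

    def previous(self) -> None:
        self.focus = (self.focus - 1) % len(self._items())
        self.show()

    def select(self) -> Optional[MenuNode]:
        chosen = self._items()[self.focus]
        if chosen.children:            # descend into a submenu
            self.stack.append(chosen)
            self.focus = 0
            self.show()
            return None
        return chosen                  # leaf item, e.g. a track to play

    def back(self) -> None:
        if len(self.stack) > 1:
            self.stack.pop()
            self.focus = 0
        self.show()


if __name__ == "__main__":
    music = MenuNode("Music", [
        MenuNode("Artist A", [MenuNode("Song 1"), MenuNode("Song 2")]),
        MenuNode("Artist B", [MenuNode("Song 3")]),
    ])
    # Swapping AuditoryRenderer for VisualRenderer switches between eyes-free
    # and visual use without changing any navigation code.
    ctrl = MenuController(music, AuditoryRenderer())
    ctrl.show(); ctrl.next(); ctrl.select(); ctrl.next()
```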
