Eyes-free Methods for Accessing Large Auditory Menus

Two interaction methods for eyes-free control of a mobile phone or media player are introduced: a gestural pointing interface and a touchscreen interface to a spherical auditory menu, in which feedback is given as spatially reproduced speech. These methods can facilitate eyes-free use of devices and also make them accessible to visually impaired users. The effectiveness of gestural and touchscreen interaction is compared with that of a traditional visual interface when accessing large menus. Evaluation results show that moderately fast and accurate selection of menu items is possible without visual feedback. Combining eyes-free interfaces, 3D positioning of menu items, and a browsing method with dynamically adjustable target sizes enables intuitive, easy access to large menus.
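
To make the "dynamically adjustable target size" idea concrete, here is a minimal Python sketch that places menu items on a horizontal ring of a spherical auditory menu and widens the selectable angular width of items near the current pointing direction, fisheye style. The ring layout, function names, and widening formula are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import math

def ring_positions(n_items, radius=1.0):
    """Place n_items evenly on a horizontal ring around the listener.

    Returns (x, y, z) coordinates with azimuth 0 straight ahead.
    (Hypothetical layout: the paper's spherical menu may differ.)
    """
    positions = []
    for i in range(n_items):
        azimuth = 2.0 * math.pi * i / n_items
        positions.append((radius * math.sin(azimuth), 0.0,
                          radius * math.cos(azimuth)))
    return positions

def dynamic_target_widths(n_items, pointer_azimuth, base_width, gain=3.0):
    """Fisheye-style widening: items angularly closer to the pointing
    direction get a larger selectable width, so large menus stay usable
    without visual feedback.
    """
    widths = []
    for i in range(n_items):
        item_azimuth = 2.0 * math.pi * i / n_items
        # Smallest angular distance between pointer and item, in [0, pi].
        delta = abs((pointer_azimuth - item_azimuth + math.pi)
                    % (2.0 * math.pi) - math.pi)
        closeness = max(0.0, 1.0 - delta / math.pi)  # 1 at pointer, 0 opposite
        widths.append(base_width * (1.0 + gain * closeness))
    return widths
```

For example, with 50 items and the pointer at azimuth 0, the item directly ahead gets a target width up to four times the base width, while items behind the listener keep the base width; the `gain` parameter trades selection speed against precision.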
