Mobile Web Browsing with Aural Flows: An Exploratory Study

Existing web applications force users to focus their visual attention on their mobile devices while browsing content and services on the go. To support eyes-free mobile experiences, designers can minimize interaction with the device by leveraging the auditory channel. Although acoustic interfaces have proven effective in reducing visual attention, designing aural information architectures for the web remains a perplexing challenge. To address this problem, techniques to remodel existing information architectures as linear, aural flows were introduced and evaluated. Mobile web browsing with aural flows is exemplified in ANFORA News, a semiaural mobile site designed for browsing large collections of news stories. An exploratory study involving frequent news readers (n = 20) investigated the usability of and navigation experience with ANFORA News in a mobile setting. Initial evidence suggests that aural flows are a promising paradigm for supporting eyes-free mobile navigation, but users still require assistance and additional learning to fully master the aural mechanics of the flows. Future work will improve the mechanisms for customizing content and controlling aural navigation.
