It's All About the Message: Visual Experience is a Precursor to Accurate Auditory Interaction

Anecdotal evidence suggests there is a disconnect between the interaction experiences of sighted and visually disabled web users. We argue, however, that this disconnect is not inherent but is created by a lack of understanding of the interplay between the two domains. Current research shows that, in web interaction, there is a single locus of attention at any given time; sighted users therefore form a serialisation of the items they look at and attend to, as exemplified by their eye movement sequences. We also suggest that web designers have a narrative in mind that they intend users to experience, and that they create a visual sequence for their audience to perceive in support of this narrative. However, this sequence is typically lost when we move from visual presentations to auditory ones. Current audio interaction centres on page linearisation based on the order of the underlying source code, and this linearisation typically falls short of the comprehensive interaction that can be expected in the visual domain. In this paper, we use an eye-tracking dataset to show that the linearisation of web page components based on the underlying source code differs from what sighted users actually experience. We then show that the web experience of visually disabled users can be improved by re-ordering the most commonly used web page components according to the order in which they are used. We also suggest that it is critical to conduct formative experimentation with sighted users to establish the visual narrative and its serialisation, thereby informing the design of the auditory conversation.
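To illustrate the kind of re-ordering described above, the following minimal Python sketch serialises page components according to an observed usage order rather than their source-code order. The component labels, HTML fragments, and the observed order are illustrative assumptions only, not the paper's dataset or its transcoding implementation.

```python
# Minimal sketch: serialise page components in an observed usage order
# (e.g. derived from aggregated eye-tracking scanpaths) rather than in
# the order of the underlying source code. All labels and fragments
# below are hypothetical examples.

from typing import List, Tuple

# Components in underlying source-code order: (label, content fragment).
SOURCE_ORDER: List[Tuple[str, str]] = [
    ("header", "<header>Site title</header>"),
    ("left menu", "<nav>Navigation links</nav>"),
    ("advert", "<aside>Banner advert</aside>"),
    ("main content", "<main>Article text</main>"),
    ("footer", "<footer>Contact details</footer>"),
]

# Hypothetical order in which sighted users attend to the components.
OBSERVED_ORDER = ["main content", "header", "left menu", "footer", "advert"]


def reorder_components(
    components: List[Tuple[str, str]], observed: List[str]
) -> List[Tuple[str, str]]:
    """Return components in the observed usage order; components that were
    never attended to keep their original relative order and sort last."""
    rank = {label: i for i, label in enumerate(observed)}
    unseen = len(observed)  # sort key for components absent from the observed order
    return sorted(components, key=lambda item: rank.get(item[0], unseen))


if __name__ == "__main__":
    for label, fragment in reorder_components(SOURCE_ORDER, OBSERVED_ORDER):
        print(f"{label}: {fragment}")
```

Because Python's sort is stable, components that never appear in the observed order retain their source-code ordering at the end of the serialisation, which keeps the transformation conservative for content the eye-tracking data says nothing about.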
