Older adults and visual impairment: what do exposure times and accuracy tell us about performance gains associated with multimodal feedback?

This study examines the effects of multimodal feedback on the performance of older adults with differing visual abilities. Older adults with normal vision (n=29) and older adults diagnosed with Age-Related Macular Degeneration (AMD) (n=30) performed a series of drag-and-drop tasks under varying forms of feedback. User performance was assessed using measures of feedback exposure time and accuracy. Results indicated that, in some cases, non-visual (e.g., auditory or haptic) and multimodal (bi- and trimodal) feedback yielded significant performance gains over visual-only feedback for both AMD and normally sighted users. In addition to visual acuity, the effects of manual dexterity and computer experience are considered.
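
As a minimal illustration (not taken from the paper), the sketch below shows one way the two reported performance measures, feedback exposure time and drag-and-drop accuracy, could be computed from per-trial logs. The log fields and sample values are hypothetical and assumed purely for illustration.

```python
# Minimal sketch, assuming hypothetical trial logs with feedback onset/offset
# timestamps (ms) and a flag for whether the object was dropped on the target.
from statistics import mean

trials = [
    {"feedback_on_ms": 1200, "feedback_off_ms": 1850, "dropped_on_target": True},
    {"feedback_on_ms": 900,  "feedback_off_ms": 1700, "dropped_on_target": False},
    {"feedback_on_ms": 1100, "feedback_off_ms": 1600, "dropped_on_target": True},
]

# Feedback exposure time: duration between feedback onset and offset per trial.
exposure_times = [t["feedback_off_ms"] - t["feedback_on_ms"] for t in trials]
mean_exposure_ms = mean(exposure_times)

# Accuracy: proportion of trials ending with a correct drop.
accuracy = sum(t["dropped_on_target"] for t in trials) / len(trials)

print(f"Mean feedback exposure: {mean_exposure_ms:.0f} ms")
print(f"Drag-and-drop accuracy: {accuracy:.0%}")
```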
