Comparing Haptic Pattern Matching on Tablets and Phones: Large Screens Are Not Necessarily Better.

Touchscreen-based, multimodal graphics represent an area of increasing research in digital access for individuals with blindness or visual impairments, yet little empirical research exists on the effects of screen size on graphical exploration. This work probes whether, and when, more screen area is necessary to support a pattern-matching task.

PURPOSE: Larger touchscreens are thought to hold a distinct benefit over smaller touchscreens because of the greater amount of space available for conveying graphical information nonvisually. The current study investigates two questions: (1) Do screen size and grid density affect a user's accuracy on pattern-matching tasks? (2) Do screen size and grid density affect a user's time on task?

METHODS: Fourteen blind and visually impaired individuals completed a pattern-matching task on either a 10.5-in tablet or a 5.1-in phone. The patterns consisted of five vibrating targets superimposed on sonified grids that varied in density (higher density = more grid squares). At test, participants compared the touchscreen pattern with a set of physical, embossed patterns and selected the matching pattern. Participants were evaluated on the time spent exploring the pattern on the device and on their pattern-matching accuracy. Multiple regression and logistic regression analyses were performed on the data.

RESULTS: Device size, grid density, and age had no statistically significant effects in the model of pattern-matching accuracy. However, all three had significant effects in the model of grid exploration time: using the phone, exploring low-density grids, and being older were each associated with faster exploration.

CONCLUSIONS: A time-accuracy trade-off exists between devices and appears to be task dependent. Users may find a tablet most useful in situations where accurate interpretation of a graphic is important and time is not limited. Smaller screens afforded accuracy comparable to tablets and were faster to explore overall.
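
For readers who want to see how the two analyses described above could be set up, the sketch below fits a logistic regression for pattern-matching accuracy and a multiple (linear) regression for exploration time. It is an illustrative assumption only, not the authors' code: the variable names (device, density, age, correct, explore_time) and the synthetic data are hypothetical stand-ins for the predictors and outcomes named in the abstract.

# Illustrative sketch only; variable names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 140  # e.g., 14 participants x 10 trials (hypothetical trial count)

df = pd.DataFrame({
    "device": rng.choice(["phone", "tablet"], size=n),   # screen-size condition
    "density": rng.choice(["low", "high"], size=n),      # grid-density condition
    "age": rng.integers(20, 70, size=n),                 # participant age in years
    "correct": rng.integers(0, 2, size=n),               # 1 = correct pattern match
    "explore_time": rng.gamma(2.0, 15.0, size=n),        # seconds exploring the grid
})

# Logistic regression: does accuracy depend on device, grid density, and age?
acc_model = smf.logit("correct ~ C(device) + C(density) + age", data=df).fit()
print(acc_model.summary())

# Multiple regression: does exploration time depend on the same predictors?
time_model = smf.ols("explore_time ~ C(device) + C(density) + age", data=df).fit()
print(time_model.summary())

With real trial-level data in place of the synthetic frame, the coefficient signs and p-values in the two summaries would correspond to the effects reported in the RESULTS paragraph (no significant predictors of accuracy; device, density, and age all predictive of exploration time).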
