Leveraging proprioception to make mobile phones more accessible to users with visual impairments

Accessing the advanced functions of a mobile phone is not a trivial task for users with visual impairments, who typically rely on screen readers and voice commands to discover and execute functions. In mobile situations, however, screen readers are not ideal because users may depend on their hearing for safety, and voice commands are difficult for a system to recognize in noisy environments. In this paper, we extend Virtual Shelves, an interaction technique that leverages proprioception to access application shortcuts, to visually impaired users. We measured the directional accuracy of visually impaired participants and found that they were less accurate than sighted people. We then built a functional prototype that uses an accelerometer and a gyroscope to sense its position and orientation. Finally, we evaluated the interaction and the prototype by allowing participants to customize the placement of seven shortcuts within 15 regions. Participants accessed shortcuts in their personalized layouts with 88.3% accuracy, taking an average of 1.74 seconds.
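The abstract describes the prototype only at a high level, so the sketch below is illustrative rather than the authors' implementation: a complementary filter, one standard way to fuse accelerometer and gyroscope readings into an orientation estimate, followed by a hypothetical mapping from pointing direction to one of 15 shelf regions. The 5x3 grid geometry, the angle ranges, the axis conventions, and the filter constant ALPHA are all assumptions for illustration, not values from the paper.

```python
import math

ALPHA = 0.98  # gyro weight; a common starting point, not from the paper


def complementary_filter(pitch, roll, gyro, accel, dt):
    """Update the (pitch, roll) estimate in radians from one sensor sample.

    gyro:  (gx, gy, gz) angular rates in rad/s
    accel: (ax, ay, az) acceleration in m/s^2, gravity included
    dt:    sample interval in seconds
    Axis conventions are illustrative; real devices differ.
    """
    gx, gy, _ = gyro
    ax, ay, az = accel

    # Short-term estimate: integrate the gyroscope's angular rates.
    pitch_gyro = pitch + gx * dt
    roll_gyro = roll + gy * dt

    # Long-term reference: recover absolute tilt from the gravity vector.
    pitch_accel = math.atan2(ay, math.hypot(ax, az))
    roll_accel = math.atan2(-ax, az)

    # Blend: trust the gyro at high frequency and the accelerometer at
    # low frequency, cancelling gyro drift without accelerometer noise.
    pitch = ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_accel
    roll = ALPHA * roll_gyro + (1.0 - ALPHA) * roll_accel
    return pitch, roll


def shelf_region(azimuth_deg, elevation_deg):
    """Map a pointing direction to one of 15 regions (indices 0-14).

    Purely illustrative layout: a 5-column x 3-row grid spanning
    -75..75 degrees of azimuth and -45..45 degrees of elevation.
    The paper does not specify this geometry.
    """
    col = min(4, max(0, int((azimuth_deg + 75) // 30)))
    row = min(2, max(0, int((elevation_deg + 45) // 30)))
    return row * 5 + col
```

Note that azimuth cannot be recovered from gravity alone; in a sketch like this it would come from integrating the gyroscope's yaw rate, which drifts over time, so a real prototype would likely re-zero the azimuth at the start of each selection gesture or correct it with an additional sensor.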
