Variability in Reactions to Instructional Guidance during Smartphone-Based Assisted Navigation of Blind Users

'Turn slightly to the left,' the navigation system announces, aiming to direct a blind user to turn into a corridor. Yet, because of a long reaction time, the user turns too late and proceeds into the wrong hallway. Observations of such behavior in real-world navigation settings motivate us to study how blind users react to the instructional feedback of a turn-by-turn guidance system. We found little previous work analyzing the extent of variability among blind users in their reactions to different instructional guidance during assisted navigation. To gain insight into how navigation interfaces can be better designed to accommodate the information needs of different users, we conduct a data-driven analysis of reaction variability, defined in terms of motion and timing measures. Based on continuously tracked user motion during real-world navigation with a deployed system, we find significant variability between users in their reaction characteristics. In particular, the statistical analysis reveals this variability during crucial elements of navigation (e.g., turning and encountering obstacles). With the end-user experience in mind, we identify the need to adjust interface timing and content not only to each user's personal walking pace but also to their individual navigation skill and style. The design implications of our study inform the development of assistive systems that account for such user-specific behavior to ensure successful navigation.
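
The abstract only names the motion and timing measures at a high level. Purely as an illustration, the sketch below shows one way a reaction delay to a turn instruction could be estimated from a timestamped compass-heading trace; the function name, the 15° onset threshold, and the signal format are assumptions for this example, not the authors' method.

```python
import numpy as np

def reaction_measures(timestamps, headings, instruction_time, turn_threshold_deg=15.0):
    """Estimate when a user starts turning after a spoken turn instruction.

    Hypothetical illustration (not the paper's pipeline):
    timestamps       -- 1-D array of sample times in seconds
    headings         -- 1-D array of compass headings in degrees, same length
    instruction_time -- time (s) at which the instruction was announced
    Returns (reaction_delay_s, total_rotation_deg), or (None, rotation) if no
    turn exceeding the threshold is detected.
    """
    timestamps = np.asarray(timestamps, dtype=float)
    headings = np.asarray(headings, dtype=float)

    # Only consider samples after the instruction was given.
    after = timestamps >= instruction_time
    t, h = timestamps[after], headings[after]
    if len(t) < 2:
        return None, 0.0

    # Unwrap headings so a 350° -> 10° change reads as +20°, not -340°.
    h = np.degrees(np.unwrap(np.radians(h)))
    rotation = h - h[0]

    # Reaction delay: first time the cumulative rotation exceeds the threshold.
    turned = np.abs(rotation) >= turn_threshold_deg
    if not turned.any():
        return None, float(rotation[-1])
    onset = t[np.argmax(turned)]
    return float(onset - instruction_time), float(rotation[-1])


# Example: instruction at t = 2.0 s; the user begins a ~90° turn at t = 3.5 s,
# so the estimated reaction delay is roughly 1.8 s.
ts = np.arange(0.0, 8.0, 0.1)
hd = np.where(ts < 3.5, 180.0, np.minimum(180.0 + (ts - 3.5) * 60.0, 270.0))
print(reaction_measures(ts, hd, instruction_time=2.0))
```

Per-user distributions of such delay and rotation values, collected over many instructions, would be the kind of input a variability analysis of this sort operates on.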
