Being Aware of the World: Toward Using Social Media to Support the Blind With Navigation

This paper lays the groundwork for assistive navigation that uses wearable sensors and social sensors to foster situational awareness for the blind. Our system acquires social media messages to gauge the relevant aspects of an event and to create alerts. We propose social semantics that capture the parameters required for querying and reasoning about an event of interest, such as what, where, who, when, severity, and action, from the Internet of Things, using an event summarization algorithm. Our approach integrates wearable sensors in the physical world to estimate user location based on metric and landmark localization. Streaming data from the cyber world provide awareness by summarizing the events around the user according to a situation awareness factor; we illustrate this with disaster and socialization event scenarios. Discovered local events are fed back to the user through sound localization so that the user can actively participate in a social event or receive early warning of hazardous events. A feasibility evaluation of the proposed algorithm included a comparison of its output against ground truth, a survey of sighted participants about that output, and a sound-localization user-interface study with blindfolded sighted participants. Our framework thus addresses the navigation problem for the blind by combining the advantages of our real-time localization technologies so that the user is made aware of the world, a necessity for independent travel.
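To make the social semantics concrete, the sketch below models the event tuple (what, where, who, when, severity, action) and a toy summarizer that fills the tuple from keyword matches in social media messages. This is a minimal illustration only, not the paper's implementation: the names SocialEvent and summarize_event, the keyword lexicons, and the severity scoring are all our own assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Hypothetical container for the social-semantics tuple named in the
# abstract: what, where, who, when, severity, and action.
@dataclass
class SocialEvent:
    what: Optional[str] = None                     # event type, e.g. "fire"
    where: Optional[str] = None                    # place name or geotag
    who: List[str] = field(default_factory=list)   # actors mentioned in messages
    when: Optional[datetime] = None                # earliest message timestamp
    severity: float = 0.0                          # crude score in [0, 1]
    action: Optional[str] = None                   # suggested user response

# Assumed keyword lexicons; a deployed system would use trained classifiers.
HAZARD_TERMS = {"fire": "avoid area", "flood": "avoid area", "concert": "join event"}
SEVERITY_TERMS = {"evacuate", "danger", "emergency", "injured"}

def summarize_event(messages):
    """Toy event summarization: scan timestamped, geotagged messages and
    fill the social-semantics tuple from keyword matches."""
    event = SocialEvent()
    for msg in messages:
        text = msg["text"].lower()
        for term, action in HAZARD_TERMS.items():
            if term in text:
                event.what = event.what or term       # keep first match
                event.action = event.action or action
        if any(term in text for term in SEVERITY_TERMS):
            event.severity = min(1.0, event.severity + 0.25)
        event.where = event.where or msg.get("place")
        ts = msg.get("timestamp")
        if ts and (event.when is None or ts < event.when):
            event.when = ts                           # earliest report wins
        event.who.extend(u for u in msg.get("mentions", []) if u not in event.who)
    return event

if __name__ == "__main__":
    msgs = [
        {"text": "Fire near Main St, evacuate now!", "place": "Main St",
         "timestamp": datetime(2014, 5, 1, 14, 3), "mentions": ["@fdny"]},
        {"text": "Smoke everywhere, this looks like danger", "place": None,
         "timestamp": datetime(2014, 5, 1, 14, 5), "mentions": []},
    ]
    print(summarize_event(msgs))
```

Running the example yields a single SocialEvent with what="fire", where="Main St", the earliest timestamp, an accumulated severity score, and the "avoid area" action, which is the kind of alert the framework would then render to the user via sound localization.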
