Augmented Reality for People with Low Vision: Symbolic and Alphanumeric Representation of Information

Many individuals with visual impairments have residual vision that often remains underused by assistive technologies. Head-mounted augmented reality (AR) devices can provide assistance by recoding difficult-to-perceive information into a more accessible visual format. Here, we evaluate symbolic and alphanumeric information representations for their efficiency and usability in two prototypical AR applications: recognizing facial expressions of conversational partners and reading the time. We find that while AR provides a general benefit, the complexity of the visual representation must be matched to the user's visual acuity.
