An interdisciplinary approach to the design, development and deployment of person-centered accessible technologies

Over the last few years, significant strides have been made toward enhancing the naturalness and acceptance of multimedia systems through the principles of Human-Centered Multimedia Computing (HCMC), a field of computational science in which user needs, expectations, and adoption and adaptation preferences guide interface and system design. While this progress has greatly benefited the broader "able" population, individuals with disabilities have been largely overlooked; they must often force-fit available solutions or wait for add-on features that only partially address their usability needs. Given the diversity of disabilities, a person-centered rather than human-centered approach is needed. Previously, we proposed enriching the design philosophy of HCMC by incorporating perspectives from disability research. More recently, we have proposed Person-Centered Multimedia Computing (PCMC) to emphasize individual user needs and co-adaptive systems. In this paper, we present an interdisciplinary approach to realizing person-centered accessible technologies. The approach rests on three complementary research thrusts: human-centered design, socio-personal dynamics, and socio-technological practices. These thrusts are interconnected through three perspectives of disability research: technology, adaptation, and policy. The results of several case studies are presented to highlight how this approach has aided the development of person-centered accessible technologies from early conceptualization to commercialization.
