Person-centered accessible technologies and computing solutions through interdisciplinary and integrated perspectives from disability research

Abstract

Human-centered computing and human-centered multimedia computing (HCMC) have emerged as important subfields of computing that leverage the social and behavioral sciences to improve the usability of technologies and multimedia systems. While technological solutions have made significant strides for the broader population, individuals with disabilities have been largely overlooked, often forced to fit or adapt themselves to available solutions. The authors first introduced a methodology that enriches HCMC by considering the perspectives of individuals with disabilities, and subsequently introduced a person-centered approach to HCMC known as person-centered multimedia computing (PCMC). In the proposed work, they seek to further enrich the PCMC methodology by incorporating interdisciplinary inspirations that account for the diverse challenges of assistive technology design and deployment. Several applications are presented, highlighting how considerations of technology, adaptation, and policy from a disability perspective can enrich the design of person-centered accessible technologies. This approach has been implemented through ongoing work on an NSF IGERT project, "Alliance for Person-centered Accessible Technologies," details of which are also provided in this paper.
