Coaching through smart objects

We explore the ways in which smart objects can be used to cue actions as part of coaching for Activities of Daily Living (ADL) following brain damage or injury, such as might arise after a stroke. In this approach, appropriate actions are cued for a given context. The context is defined by the user's intention, the state of the objects, and the tasks for which those objects can be used. This requires objects to be instrumented so that they can recognize the actions that users perform. In order to provide appropriate cues, the objects also need to be able to display information to users, e.g., by changing their physical appearance or by providing auditory output. We discuss the ways in which such information can be displayed to cue user action.
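To make the cueing idea concrete, the sketch below shows one possible way a smart object could combine the three elements of context named above (user intention, sensed object state, and task steps) to decide which action to prompt and how to display the prompt. This is a minimal illustration only: the task model, step names, object states, and cue modalities are assumptions for the example and are not taken from the work described here.

```python
from dataclasses import dataclass
from enum import Enum


class Cue(Enum):
    """Output modalities a smart object might use to prompt the user."""
    NONE = "no cue"
    LIGHT = "change physical appearance (e.g. illuminate a handle)"
    SOUND = "play an action-relevant sound"


@dataclass
class Context:
    """Context as described above: user intention, object state, task progress."""
    intended_task: str          # hypothetical, e.g. "make tea"
    object_state: str           # hypothetical sensed state, e.g. "kettle_full"
    completed_steps: list[str]  # actions already recognised by the instrumented object


# Hypothetical task model: ordered steps and the object state each step expects.
TASK_STEPS = {
    "make tea": [
        ("fill_kettle", "kettle_empty"),
        ("switch_on_kettle", "kettle_full"),
        ("pour_water", "water_boiled"),
    ]
}


def select_cue(ctx: Context) -> tuple[str, Cue]:
    """Return the next step to cue and a modality with which to display it."""
    for step, expected_state in TASK_STEPS[ctx.intended_task]:
        if step in ctx.completed_steps:
            continue
        # Cue directly on the object when its sensed state matches the step's
        # precondition; otherwise fall back to an auditory prompt.
        if ctx.object_state == expected_state:
            return step, Cue.LIGHT
        return step, Cue.SOUND
    return "task_complete", Cue.NONE


if __name__ == "__main__":
    ctx = Context("make tea", "kettle_full", ["fill_kettle"])
    print(select_cue(ctx))  # -> ('switch_on_kettle', Cue.LIGHT)
```

In a real system the task model and state recognition would come from the instrumented objects themselves; the point of the sketch is simply that cue selection can be driven by the same context triple described in the abstract.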
