Recognition of Activities of Daily Living in natural “at home” scenario for assessment of Alzheimer's disease patients

In this paper we tackle the problem of Instrumental Activities of Daily Living (IADL) recognition from wearable videos in a home clinical scenario. The aim of this research is to provide doctors and caregivers with an accessible yet detailed video-based navigation interface for patients with dementia/Alzheimer's disease. Joint work between a memory clinic and computer vision scientists enabled the study of real-life scenarios of a dyad consisting of a caregiver and a patient with Alzheimer's disease. As a result of this collaboration, a new @Home real-life video dataset was recorded, from which a truly relevant taxonomy of activities was extracted. Following a state-of-the-art activity recognition framework, we further studied and assessed these IADLs in terms of recognition performance under different calibration approaches.
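As an illustration of what "calibration approaches" can look like in practice, the sketch below contrasts Platt scaling (sigmoid) with isotonic regression applied to SVM decision scores using scikit-learn. This is not the authors' pipeline; the feature dimensions, activity labels, and data are synthetic placeholders used only to show the mechanics of turning raw classifier scores into per-activity probabilities.

```python
# Minimal sketch (assumed setup, not the paper's implementation): calibrating
# SVM scores into class probabilities for activity recognition, comparing
# Platt scaling ("sigmoid") with isotonic regression.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.calibration import CalibratedClassifierCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_clips, n_features, n_activities = 600, 128, 5     # hypothetical sizes
X = rng.normal(size=(n_clips, n_features))          # stand-in for per-clip video descriptors
y = rng.integers(0, n_activities, size=n_clips)     # stand-in for IADL labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for method in ("sigmoid", "isotonic"):               # Platt scaling vs. isotonic (PAVA-based)
    clf = CalibratedClassifierCV(LinearSVC(C=1.0, max_iter=5000), method=method, cv=3)
    clf.fit(X_train, y_train)
    proba = clf.predict_proba(X_test)                # calibrated per-activity probabilities
    print(f"{method}: accuracy = {clf.score(X_test, y_test):.3f}")
```

In a recognition interface for caregivers, calibrated probabilities of this kind are what allow per-activity confidence scores to be compared and thresholded consistently across classes.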
