Egocentric video: a new tool for capturing hand use of individuals with spinal cord injury at home

Background
Current upper extremity outcome measures for persons with cervical spinal cord injury (cSCI) lack the ability to directly collect quantitative information in home and community environments. A wearable first-person (egocentric) camera system is presented that aims to monitor functional hand use outside of clinical settings.

Methods
The system is based on computer vision algorithms that detect the hand, segment the hand outline, distinguish the user's left or right hand, and detect functional interactions of the hand with objects during activities of daily living. The algorithm was evaluated using egocentric video recordings from 9 participants with cSCI, obtained in a home simulation laboratory. The system produces a binary hand-object interaction decision for each video frame, based on features reflecting motion cues of the hand, hand shape, and colour characteristics of the scene.

Results
The output from the algorithm was compared with a manual labelling of the video, yielding F1-scores of 0.74 ± 0.15 for the left hand and 0.73 ± 0.15 for the right hand. From the resulting frame-by-frame binary data, functional hand use measures were extracted: the amount of total interaction as a percentage of testing time, the average duration of interactions in seconds, and the number of interactions per hour. Moderate and significant correlations were found when comparing these output measures to the results of the manual labelling, with ρ = 0.40, 0.54, and 0.55, respectively.

Conclusions
These results demonstrate the potential of a wearable egocentric camera for capturing quantitative measures of hand use at home.
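To make the Results concrete, the sketch below shows one way the three functional hand use measures could be derived from the algorithm's frame-by-frame binary output. This is not the authors' implementation: the function name, the 0/1 array representation of the interaction decisions, and the fixed frame rate parameter are all assumptions introduced here for illustration.

```python
import numpy as np

def hand_use_measures(interaction, fps):
    """Summarize a per-frame binary hand-object interaction sequence.

    interaction : 1-D sequence of 0/1 flags, one per video frame
                  (hypothetical representation of the algorithm's output)
    fps         : video frame rate in frames per second (assumed known)

    Returns (percent_interaction, mean_duration_s, interactions_per_hour).
    """
    interaction = np.asarray(interaction, dtype=int)
    n_frames = interaction.size
    total_seconds = n_frames / fps

    # Locate contiguous runs of 1s (individual interactions) by padding
    # with zeros and differencing: +1 marks a run start, -1 a run end.
    edges = np.diff(np.concatenate(([0], interaction, [0])))
    starts = np.flatnonzero(edges == 1)
    ends = np.flatnonzero(edges == -1)
    durations_s = (ends - starts) / fps

    # Total interaction as a percentage of testing time.
    percent_interaction = 100.0 * interaction.sum() / n_frames
    # Average duration of an interaction, in seconds.
    mean_duration_s = durations_s.mean() if durations_s.size else 0.0
    # Number of interactions per hour of recording.
    per_hour = durations_s.size / (total_seconds / 3600.0)
    return percent_interaction, mean_duration_s, per_hour

# Toy example: 10 frames of 30 fps video with two brief interactions.
flags = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]
print(hand_use_measures(flags, fps=30))
```

In practice, such per-frame decisions would likely be smoothed (e.g., by merging runs separated by only a few frames) before computing these summaries, since isolated misclassified frames would otherwise inflate the interaction count; the paper does not specify this step, so it is omitted above.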
