Can machine learning assess trunk alignment directly from raw video?

Background: Current physical therapy assessments of head and trunk control in children with cerebral palsy are subjective. Previous work has established that an existing clinical test, the Segmental Assessment of Trunk Control (SATCo), can be replicated objectively using a 2D video-based method with semi-automated tracking of the video sequences. A markerless, fully automated analysis of live camera data would provide an objective and clinically friendly tool for both assessor and patient. The use of high-definition depth (HD+D) cameras would also address the limitations of 2D video, such as body movement out of the camera plane.

Research question: This study examined whether HD+D analysis is suitable for classifying the alignment of given head and trunk segments in sitting, by comparing expert opinion (labelling) with machine learning classification.

Methods: Sixteen healthy male adults were recruited; a SATCo was conducted for each participant and recorded with a Kinect V2. Two trials were collected (Control and No-Control) to simulate the physical therapy test with children. Three of the seven SATCo segmental levels were selected for this feasibility analysis. The alignment classification produced by the machine learning method (convolutional neural networks) for all frames was compared with an expert clinician's labelling and with a randomly selected reference aligned frame.

Results: At the optimal operating point of the Receiver Operating Characteristic curve, the neural network correctly classified alignment and misalignment with an accuracy of 79.15%, a precision of 64.66% and a recall of 76.79%.

Significance: This communication demonstrates, for the first time, automated classification of trunk alignment directly from raw images (HD+D) requiring minimal operator interaction. It shows the potential of machine learning to provide a fully automated, objective tool, suitable for clinical use, for classifying the alignment component of head/trunk control in sitting.
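The Results report performance at the optimal operating point of the Receiver Operating Characteristic curve. As an illustration only (the abstract does not state how that point was selected), the sketch below picks an operating point with Youden's J statistic and computes accuracy, precision and recall from per-frame classifier scores against expert labels using scikit-learn; the function name and the random data are hypothetical, not the authors' code.

```python
# Minimal sketch (assumptions noted above): select the ROC operating point by
# Youden's J and report accuracy/precision/recall at that threshold.
import numpy as np
from sklearn.metrics import roc_curve, accuracy_score, precision_score, recall_score

def evaluate_at_optimal_operating_point(y_true, y_score):
    """y_true: expert labels per frame (1 = aligned, 0 = misaligned);
    y_score: per-frame alignment scores from the classifier."""
    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    j = tpr - fpr                       # Youden's J statistic at each threshold
    best = int(np.argmax(j))            # operating point maximising J
    y_pred = (y_score >= thresholds[best]).astype(int)
    return {
        "threshold": float(thresholds[best]),
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
    }

if __name__ == "__main__":
    # Hypothetical random data, only to show the expected input shapes.
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 2, size=1000)   # expert labelling per frame
    scores = rng.random(1000)                # classifier scores per frame
    print(evaluate_at_optimal_operating_point(labels, scores))
```

In the study's setting, the scores would come from the convolutional neural network's per-frame output and the labels from the expert clinician's annotation of aligned versus misaligned frames.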
