For almost ten years, the commercial availability of low-cost motion capture devices such as the Microsoft Kinect sensor has enabled numerous studies in real-life settings (Mousavi Hondori & Khademi, 2014; Webster & Celik, 2014; Springer & Yogev-Seligmann, 2016). Whereas most of this work with elderly people concerns gait and fall risk (see, for example, Rougier, Auvinet, Rousseau, Mignotte, & Meunier, 2011), in the present paper we focus on the building of the SignAge corpus, dedicated to the study of signing in elderly deaf participants with low-cost motion capture devices. Up to now, a (preferably multi-)camera setup has been considered a basic requirement in sign language studies, sometimes complemented with far more intrusive or expensive equipment such as data gloves or optical motion capture systems (Channon, 2015, pp. 132–133). Recent technological advances, however, allow us to quantify 3D motion and its time derivatives at a reasonable price. Our newly built SignAge corpus of interactions between elderly deaf signers in LSF takes advantage of these advances. The SignAge corpus combines data acquired by:

- 2 digital video cameras, one framing the signing interviewee's upper body and one framing the whole interaction (similarly to the protocol developed in CorpAGEst; Bolly & Boutet, 2016)
- 2 Noitom Perception Neuron body straps, each equipped with 25 IMUs (inertial measurement units), recorded with the Axis Neuron software
- 1 Kinect for Windows v2 (also known as Kinect for Xbox One) depth sensor centered on the interviewee, recorded with Brekel Pro Body v2

Thus, for each participant, we obtain 5 timed data flows: 2 video streams at 25 fps, synchronised, visualised and annotated in the ELAN software (Sloetjes & Seibert, 2016), and 3 BioVision Hierarchy (BVH) files (see Meredith & Maddock, 2001).
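A BVH file stores a skeleton hierarchy followed by a MOTION section giving the frame count, the frame time, and one line of channel values per frame. As a minimal illustration of how such data can be read, the following hedged sketch extracts the MOTION section of a BVH export; the parser, the sample data, and the channel layout are illustrative assumptions, not part of the corpus pipeline (which used Motion Inspector and Matlab).

```python
# Illustrative sketch only: a minimal reader for the MOTION section of a
# BVH file, as produced by tools such as Axis Neuron or Brekel Pro Body.
# The sample data below is invented for demonstration.

def parse_bvh_motion(lines):
    """Return (frame_time, frames) from the MOTION section of a BVH file.

    `lines` is an iterable of text lines; `frames` is a list of lists of
    channel values (one inner list per recorded frame).
    """
    it = iter(lines)
    for line in it:
        if line.strip() == "MOTION":
            break
    n_frames = int(next(it).split(":")[1])      # e.g. "Frames: 2"
    frame_time = float(next(it).split(":")[1])  # e.g. "Frame Time: 0.04"
    frames = [list(map(float, next(it).split())) for _ in range(n_frames)]
    return frame_time, frames

# Minimal invented example with two frames of three channels each:
sample = [
    "MOTION",
    "Frames: 2",
    "Frame Time: 0.04",
    "0.0 0.0 0.0",
    "1.0 2.0 3.0",
]
frame_time, frames = parse_bvh_motion(sample)
print(frame_time, frames[1])  # -> 0.04 [1.0, 2.0, 3.0]
```

A frame time of 0.04 s corresponds to 25 fps, matching the video streams; real exports may use a higher sampling rate.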
These BVH files are visualised in Motion Inspector, and the following descriptors are computed in Matlab for the three age groups in our sample: the global quantity of motion (Sarasúa & Guaus, 2014) and the variation of the signing amplitude for each joint of the upper limbs, in an attempt to establish a correlation between age and the articulatory segment involved. For now, after manually post-synchronising our flows using the start and end claps performed by both the interviewer and the interviewee during the recording session, we are assessing the quality of our data. Preliminary results show that temporal resolution can be questionable for the Kinect used in conjunction with Brekel, whereas for the Perception Neuron, spatial drift is a problem. Hence, these two devices seem complementary: the Kinect gives the absolute movements of the body, whereas the Neuron is much more precise for relative movements. Furthermore, post-interview feedback from each subject showed a high acceptance of the protocol by elderly signers (see figure below). In conclusion, given these technical limitations, the Kinect and the Neuron still appear to be an interesting choice for obtaining usable additional 3D data in aging studies, thanks to their portability and the ease with which participants become accustomed to wearing the body straps.
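A common way to operationalise a global quantity-of-motion (QoM) descriptor, in the spirit of Sarasúa & Guaus (2014), is to sum, for each frame transition, the Euclidean displacement of every tracked joint since the previous frame. The sketch below illustrates this idea under that assumption; the function, joint layout, and data are ours for demonstration, not the actual Matlab implementation used in the study.

```python
# Hedged sketch of a global quantity-of-motion descriptor: for each pair
# of consecutive frames, sum the Euclidean displacement of all joints.
# Joint positions are illustrative; real input would come from BVH data.
from math import dist

def quantity_of_motion(positions):
    """positions: list of frames; each frame is a list of (x, y, z) joints.

    Returns one QoM value per frame transition (length = len(positions) - 1).
    """
    qom = []
    for prev, curr in zip(positions, positions[1:]):
        qom.append(sum(dist(p, c) for p, c in zip(prev, curr)))
    return qom

# Two joints over three frames: the second joint moves 1 unit per frame.
frames = [
    [(0, 0, 0), (1, 0, 0)],
    [(0, 0, 0), (2, 0, 0)],
    [(0, 0, 0), (3, 0, 0)],
]
print(quantity_of_motion(frames))  # -> [1.0, 1.0]
```

Restricting the joint list to the upper limbs would yield the per-segment variant discussed above, where displacement per joint can also be examined separately to relate age to the articulatory segment involved.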
[1] Bolly, C., et al. (2018). The multimodal CorpAGEst corpus: keeping an eye on pragmatic competence in later life. Corpora.
[2] Mignotte, M., et al. (2011). Fall Detection from Depth Map Video Sequences. ICOST.
[3] Springer, S., et al. (2016). Validity of the Kinect for Gait Assessment: A Focused Review. Sensors.
[4] Sloetjes, H., et al. (2016). Measuring by marking; the multimedia annotation tool ELAN.
[5] Sarasúa, Á., et al. (2014). Dynamics in Music Conducting: A Computational Comparative Study Among Subjects. NIME.
[6] Channon, R. (2015). Research Methods for Studying the Form of Signs.
[7] Mousavi Hondori, H., et al. (2014). A Review on Technical and Clinical Impact of Microsoft Kinect on Physical Therapy and Rehabilitation. Journal of Medical Engineering.
[8] Celik, O., et al. (2014). Systematic review of Kinect applications in elderly care and stroke rehabilitation. Journal of NeuroEngineering and Rehabilitation.