One Small Step for a Man: Estimation of Gender, Age and Height from Recordings of One Step by a Single Inertial Sensor

A number of previous works have shown that information about a subject is encoded in sparse kinematic data, such as that revealed by so-called point-light walkers. With the work at hand, we extend these results to the classification of soft biometrics from inertial sensor recordings of a single step at a single body location. We recorded accelerations and angular velocities of 26 subjects using inertial measurement units (IMUs) attached at four locations (chest, lower back, right wrist and left ankle) while they performed standardized gait tasks. The collected data were segmented into individual walking steps, and we trained random forest classifiers to estimate soft biometrics (gender, age and height). We validated the process with two different methods: 10-fold cross-validation and subject-wise cross-validation. For all three classification tasks, we achieved high accuracy values at all four sensor locations. From these results we conclude that the data of a single walking step (6D: accelerations and angular velocities) allow for a robust estimation of the gender, height and age of a person.
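The distinction between the two validation schemes mentioned above matters for step-segmented gait data: in plain 10-fold cross-validation, steps from the same subject can land in both the training and the test fold, whereas subject-wise cross-validation always tests on unseen subjects. The following sketch illustrates this with scikit-learn's `KFold` and `GroupKFold`; the data are purely synthetic stand-ins (random 6-D vectors with a per-subject offset, one binary label per subject), not the features or results of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, KFold, cross_val_score

rng = np.random.default_rng(0)
n_subjects, steps_per_subject = 26, 20

# Hypothetical data: one 6-D feature vector per segmented step
# (3 accelerations + 3 angular velocities), with a subject-specific
# offset so that steps of the same subject resemble each other.
subject_offsets = rng.normal(scale=3.0, size=(n_subjects, 6))
X = rng.normal(size=(n_subjects * steps_per_subject, 6))
X += np.repeat(subject_offsets, steps_per_subject, axis=0)
y = np.repeat(rng.integers(0, 2, size=n_subjects), steps_per_subject)  # e.g. gender
groups = np.repeat(np.arange(n_subjects), steps_per_subject)           # subject IDs

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# Plain 10-fold CV: steps of one subject may appear in both train and
# test folds, which tends to inflate the estimated accuracy.
acc_10fold = cross_val_score(
    clf, X, y, cv=KFold(n_splits=10, shuffle=True, random_state=0)
).mean()

# Subject-wise CV: all steps of a subject stay in one fold, so the
# classifier is always evaluated on subjects it has never seen.
acc_subjectwise = cross_val_score(
    clf, X, y, cv=GroupKFold(n_splits=10), groups=groups
).mean()

print(f"10-fold: {acc_10fold:.2f}, subject-wise: {acc_subjectwise:.2f}")
```

On data with strong per-subject structure like this, the pooled 10-fold score is typically much higher than the subject-wise score, which is why reporting both, as done here, gives a more honest picture of generalization to new people.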
