Identifying Emotion from Natural Walking

Emotion identification from gait aims to automatically determine a person's affective state. It has attracted considerable interest and offers substantial potential value in action-tendency analysis, health care, psychological assessment, and human-computer (robot) interaction. In this paper, we propose a new method of identifying emotion from natural walking and analyze the relevance between walking traits and affective states. After obtaining pure acceleration data from the wrist and ankle, we apply a moving-average filter with windows of different sizes w, extract 114 features (time-domain, frequency-domain, power, and distribution features) from each data slice, and run principal component analysis (PCA) to reduce dimensionality. In experiments, we train SVM, Decision Tree, multilayer perceptron, Random Tree, and Random Forest classification models, and compare classification accuracy on wrist and ankle data with respect to different values of w. Emotion identification performs better on ankle acceleration data than on wrist data. Comparing the classification models, SVM achieves the best accuracy: 90.31% for identifying anger and 89.76% for happiness, with an anger-happiness identification rate of 87.10%. Anger-neutral-happiness classification reaches 85%-78%-78%. The results show that personal emotional states can be identified from walking gait.
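The pipeline described above (moving-average smoothing, per-slice feature extraction, and PCA for dimensionality reduction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window size, the handful of time-domain features, and the PCA-via-SVD routine are assumptions standing in for the paper's full set of 114 features and its trained classifiers.

```python
import numpy as np

def moving_average(signal, w):
    """Smooth a 1-D acceleration signal with a moving-average window of size w."""
    kernel = np.ones(w) / w
    return np.convolve(signal, kernel, mode="valid")

def extract_features(data_slice):
    """A few illustrative time-domain features (the paper extracts 114 in total)."""
    return np.array([
        data_slice.mean(),
        data_slice.std(),
        data_slice.min(),
        data_slice.max(),
        np.abs(np.diff(data_slice)).mean(),  # mean absolute first difference
    ])

def pca_reduce(X, n_components):
    """PCA via SVD on the mean-centered feature matrix X (rows = data slices)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # project onto leading principal components

# Toy example: 10 synthetic "acceleration" slices, smoothing window w = 5
rng = np.random.default_rng(0)
slices = [rng.standard_normal(128) for _ in range(10)]
X = np.vstack([extract_features(moving_average(s, 5)) for s in slices])
X_reduced = pca_reduce(X, 2)
print(X_reduced.shape)  # (10, 2)
```

In practice the reduced feature matrix would then be fed to the classifiers compared in the paper (SVM, Decision Tree, multilayer perceptron, Random Tree, Random Forest), repeating the procedure for each window size w.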
