Emotion is a natural, instinctive state of mind arising from one's circumstances, mood, or relationships with others. It is expressed primarily through psycho-physiological responses, biological reactions, body movement, and mental states, and it plays an important role in social interaction by supporting communication, response, and the conveying of information. Difficulty in controlling and regulating emotion can lead to emotional disorders; according to the National Institute of Mental Health (NIMH), approximately 10-15% of children have an emotional or behavioral disorder. In this paper, the discrete wavelet transform (DWT) is proposed for recognizing human emotions from gait patterns. Four discrete categories of emotion, namely fear, happy, normal, and sad, were analyzed, with data extracted from a single stride of gait. Daubechies wavelets of order 1 and order 4 were used to compare their performance in recognizing emotional expression in gait patterns. Six statistical features, namely mean, maximum, minimum, standard deviation, skewness, and kurtosis, were derived from both the approximation and detail coefficients at every level of decomposition. The discrete emotions were classified using kNN and fkNN classifiers. The maximum classification accuracy of 96.07% was obtained at the first level of decomposition using kNN.

Human personality is often described along trait dimensions such as extraversion–introversion, neuroticism, and psychoticism. Problems with human thoughts and behavior arising from uncontrolled emotion can lead to personality disorders such as borderline, antisocial, and narcissistic disorders (Zuckerman, 1991). An article from the PACER Center explains that a teenager who experiences an emotional disorder will most likely have difficulties growing up. In addition, the DSM-IV-TR diagnostic criteria list several types of emotional disorder that affect children and youth. In psychosocial terms, emotional disorders are quite complex. Psychologists normally investigate a patient's emotional state through questionnaires and counseling, but these techniques are subjective. To develop an objective method for understanding the emotional state of an individual, several studies using physiological signals, facial expressions, acoustic analysis of speech, and gesture and body motion have been conducted (Hassan et al., 2010; Reddy et al., 2011; Russo et al., 2009; Yamada & Watanabe, 2007; Asha et al., 2005; Friberg, 2004; Kobayashi, 2007; Kleinsmith et al., 2011; Omlor et al., 2006; Karg et al., 2010; Roether et al., 2009; Janssen et al., 2008; Venture, 2010). Although several studies are available in the literature, limitations remain. Physiological signals are difficult to work with because they are affected by internal factors of the user, while acoustic and other externally measured signals are easily contaminated by environmental noise during experimental sessions. To overcome these limitations, researchers have turned to marker-based motion capture techniques.
The data obtained are considered more accurate, since they represent the orientations of the joints and the bone structure, as shown in previous works. Developing machine learning models for recognizing human emotion is far more challenging, and it is an active research field generally referred to as affective computing. Table 1 summarizes some of the significant research works on human emotion recognition for various types of motion; as the table shows, a range of different methods has been presented in previous work. Body expressions have recently been recognized as an important channel of nonverbal communication, and many studies have examined the configurations of body expressions to identify specific features of the body that can be attributed to the recognition of emotional states (Kleinsmith & Bianchi, 2012). Two main aspects are highlighted in the previous work: feature extraction and classification. In this paper, feature extraction techniques based on the Discrete Wavelet Transform (DWT) are proposed to investigate emotional states from gait patterns. Gait was chosen for this study because of its potential as a source of useful social information, as explained by Montepare et al. (1987). Gait patterns recorded under different emotional conditions were collected from the Carnegie Mellon University (CMU) human gait database, and four emotional expressions were selected. Statistical features were extracted from the decomposed gait patterns, and to evaluate the usefulness of these features, kNN and fkNN were used as classifiers (illustrative sketches of the feature extraction and classification steps follow this section). The results show that the proposed method can recognize human emotions from gait patterns efficiently. The database used in this paper is described in the “Database” section. In the “Feature Extraction using Discrete Wavelet Transform (DWT)” section, the design of the DWT and the feature extraction are explained. In the “Classification” section, the fundamentals of kNN and fkNN are presented. The experimental results of emotion recognition from gait are discussed in the “Result and Discussion” section. The conclusion of the paper is given in the “Conclusion” section.
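To make the feature-extraction step concrete, the following is a minimal sketch of DWT-based feature extraction from a single gait stride. The use of PyWavelets and SciPy, the decomposition depth, and the synthetic test signal are illustrative assumptions, not details taken from this paper; only the wavelet choices (db1 and db4) and the six statistics per coefficient set come from the text above.

```python
# A minimal sketch of the DWT feature extraction described above, using
# PyWavelets and SciPy (library choice and decomposition depth are assumptions).
import numpy as np
import pywt
from scipy.stats import skew, kurtosis

def dwt_features(stride_signal, wavelet="db4", level=4):
    """Decompose one gait stride and return six statistics per coefficient set.

    `wavelet` may be "db1" or "db4", matching the two Daubechies orders
    compared in the paper; the default `level` here is a hypothetical choice.
    """
    # wavedec returns [cA_level, cD_level, ..., cD_1]: one approximation set
    # plus one detail set per decomposition level.
    coeffs = pywt.wavedec(stride_signal, wavelet, level=level)
    features = []
    for c in coeffs:
        features.extend([
            np.mean(c),   # mean
            np.max(c),    # maximum
            np.min(c),    # minimum
            np.std(c),    # standard deviation
            skew(c),      # skewness
            kurtosis(c),  # kurtosis
        ])
    return np.asarray(features)

# Example: a synthetic stand-in for one stride of a joint-angle trajectory.
stride = np.sin(np.linspace(0, 2 * np.pi, 128)) + 0.05 * np.random.randn(128)
print(dwt_features(stride, wavelet="db1", level=1).shape)  # (12,) at level 1
```

At the first decomposition level this yields one approximation and one detail set, hence twelve features per stride; each additional level appends six more detail-coefficient statistics.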
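The classification step can be sketched in the same spirit: a standard kNN classifier together with the fuzzy kNN decision rule of Keller et al. [10], in which each neighbor's vote is weighted by inverse distance. The feature matrix, the value of k, the train/test split, and the fuzzifier m = 2 below are placeholder assumptions, not the paper's actual experimental settings.

```python
# A sketch of the classification stage: plain kNN via scikit-learn plus the
# fuzzy kNN rule of Keller et al. (reference [10]) with crisp training labels.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, NearestNeighbors

def fknn_predict(X_train, y_train, X_test, k=5, m=2):
    """Fuzzy kNN: class memberships from inverse-distance-weighted neighbor votes."""
    classes = np.unique(y_train)
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    dist, idx = nn.kneighbors(X_test)
    # Inverse-distance weights 1 / d^(2/(m-1)); guard against zero distances.
    w = 1.0 / np.maximum(dist, 1e-12) ** (2.0 / (m - 1))
    preds = []
    for weights, neighbors in zip(w, idx):
        # Membership in each class = normalized weighted vote of the k neighbors.
        memberships = [
            weights[y_train[neighbors] == c].sum() / weights.sum()
            for c in classes
        ]
        preds.append(classes[int(np.argmax(memberships))])
    return np.array(preds)

# Placeholder data standing in for the DWT feature vectors of each stride.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 12))
y = np.repeat(["fear", "happy", "normal", "sad"], 20)
perm = rng.permutation(len(y))
X, y = X[perm], y[perm]

knn = KNeighborsClassifier(n_neighbors=5).fit(X[:60], y[:60])
print("kNN accuracy :", (knn.predict(X[60:]) == y[60:]).mean())
print("fkNN accuracy:", (fknn_predict(X[:60], y[:60], X[60:], k=5) == y[60:]).mean())
```

Unlike plain kNN, the fuzzy variant returns a graded membership for every class before taking the argmax, which softens the effect of outlier neighbors near class boundaries.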
[1] Gentiane Venture. Human characterization and emotion characterization from gait. 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, 2010.
[2] Takashi Yamada, et al. Virtual Facial Image Synthesis with Facial Color Enhancement and Expression under Emotional Change of Anger. RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication, 2007.
[3] Sazali Yaacob, et al. Classification of speech dysfluencies with MFCC and LPCC features. Expert Systems with Applications, 2012.
[4] Frank A. Russo, et al. Facial Expressions and Emotional Singing: A Study of Perception and Production with Motion Capture and Electromyography. 2009.
[5] R. Adolphs, et al. Cortical Regions for Judgments of Emotions and Personality Traits from Point-light Walkers. Journal of Cognitive Neuroscience, 2004.
[6] Bhavani M. Thuraisingham, et al. Face Recognition Using Multiple Classifiers. 2006 18th IEEE International Conference on Tools with Artificial Intelligence (ICTAI'06), 2006.
[7] Peter F. Driessen, et al. Gesture-Based Affective Computing on Motion Capture Data. ACII, 2005.
[8] Mamiko Sakata, et al. An Analysis of Body Movement on Music Expressivity Using Motion Capture. 2009 Fifth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, 2009.
[9] Andrea Kleinsmith, et al. Affective Body Expression Perception and Recognition: A Survey. IEEE Transactions on Affective Computing, 2013.
[10] James M. Keller, et al. A fuzzy K-nearest neighbor algorithm. IEEE Transactions on Systems, Man, and Cybernetics, 1985.
[11] Yuichi Kobayashi, et al. The EMOSIGN - analyzing the emotion signature in human motion. 2007 IEEE International Conference on Systems, Man and Cybernetics, 2007.
[12] Amarjot Singh, et al. The decisive emotion identifier? 2011 3rd International Conference on Electronics Computer Technology, 2011.
[13] B. M. Nigg, et al. Identification of individual walking patterns using time discrete and time continuous data sets. Gait & Posture, 2002.
[14] D. Janssen, et al. Recognition of Emotions in Gait Patterns by Means of Artificial Neural Nets. 2008.
[15] Michelle Karg, et al. A Two-fold PCA-Approach for Inter-Individual Recognition of Emotions in Natural Walking. MLDM Posters, 2009.
[16] M. Hariharan, et al. Luminance Sticker Based Facial Expression Recognition Using Discrete Wavelet Transform for Physically Disabled Persons. Journal of Medical Systems, 2012.
[17] Peter E. Hart, et al. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 1967.
[18] Claire L. Roether, et al. Critical features for the perception of emotion from gait. Journal of Vision, 2009.
[19] J. Montepare, et al. The identification of emotions from gait information. 1987.
[20] Anthony Steed, et al. Automatic Recognition of Non-Acted Affective Postures. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 2011.
[21] Neamat El Gayar, et al. Emotions analysis of speech for call classification. 2010 10th International Conference on Intelligent Systems Design and Applications, 2010.
[22] R. Adolphs. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews, 2002.
[23] Michelle Karg, et al. Recognition of Affect Based on Gait Patterns. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 2010.
[24] Peter Robinson, et al. Detecting Affect from Non-stylised Body Motions. ACII, 2007.
[25] Frank E. Pollick, et al. The Features People Use to Recognize Human Movement Style. Gesture Workshop, 2003.
[26] Bernadette Bouchon-Meunier, et al. Characterizing player's experience from physiological signals using fuzzy decision trees. Proceedings of the 2010 IEEE Conference on Computational Intelligence and Games, 2010.
[27] N. Troje. Decomposing biological motion: a framework for analysis and synthesis of human gait patterns. Journal of Vision, 2002.