Identifying Kinematic Cues for Action Style Recognition

Shohei Hidaka (shhidaka@jaist.ac.jp)
Japan Advanced Institute of Science and Technology, 1-1 Asahidai, Nomi, Ishikawa 923-1292, Japan

Abstract

Recognizing emotional states from others' actions is a key capability for smooth social interaction. The present study provides a computational-theory-level analysis of which features may play a crucial role in recognizing emotional attributes of human actions represented as point-light displays. Guided by previous theoretical work and empirical findings, velocity and acceleration profiles were investigated as major features for classifying emotional attributes. The results showed that emotional attributes of actions, as well as action types, could be identified from the covariance of velocity profiles among multiple body parts. Because these features of emotional attributes were found in common across multiple actions despite the actions' different velocity profiles, the results suggest that action styles may be conveyed through an information channel parallel to the action types themselves.

Keywords: Action style recognition; biological motion; emotion; social cognition.

Introduction

Our bodily motion is coherent, smooth, and effortless. From bodily motion, we perceive others' states such as mood, emotional expression, and intention (Blake & Shiffrar, 2007). Perceiving others' states plays a crucial role in social contexts. Although most of us can easily "read" what others intend to do through their actions, there is a significant gap between such interpretations and the physical motion itself: a set of trajectories of multiple body parts with many degrees of freedom (Bernstein, 1967). Recognition of motion is vitally important to any animal. Detecting another animal, possibly prey, a predator, or a conspecific, and then identifying what it is and how it may behave, is essential for taking prompt action toward it (Johnson, Bolhuis, & Horn, 1985). Humans are social animals, and, not surprisingly, our visual system is highly specialized for recognizing others' states. The present study aims to provide a computational-level description of how people recognize emotional states in others' actions.

Perception of biological motion

How do we recognize implicit patterns in different styles of action? The experimental literature has explored the capacity of motion perception using point-light displays (Johansson, 1973), in which only point-lights attached to the major joints are visible against a dark background (Figure 1a). Thus the available information is the point-wise kinematic motion of multiple body parts. Despite this limited information, people can recognize the identity (Troje, Westhoff, & Lavrov, 2005), gender (Kozlowski & Cutting, 1977; Troje, 2002), emotions (Pollick et al., 2001; Atkinson, 2009; Hobson & Lee, 1999), and dynamics, such as the weight of a lifted object (Bingham, 1987), of actions from point-light displays. Beyond demonstrating human capacity, studies using point-light displays have suggested which features are extracted in action perception.
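To make this representation concrete, the following sketch shows one way to turn point-light trajectories into velocity profiles and to summarize their covariation among body parts, the kind of feature highlighted above. It is only an illustrative sketch, not the analysis used in this study: the array layout (frames by joints by spatial dimensions), the 60 Hz sampling rate, the synthetic data, and the function names are all assumptions introduced here.

import numpy as np

def velocity_profiles(positions: np.ndarray, dt: float) -> np.ndarray:
    """First-order finite difference of joint positions over time.
    positions: (n_frames, n_joints, n_dims); returns (n_frames - 1, n_joints, n_dims)."""
    return np.diff(positions, axis=0) / dt

def velocity_covariance(velocity: np.ndarray) -> np.ndarray:
    """Covariance of speed profiles among body parts: each joint's speed over
    time is treated as one variable, giving an (n_joints, n_joints) matrix."""
    speeds = np.linalg.norm(velocity, axis=2)   # (n_frames - 1, n_joints)
    return np.cov(speeds, rowvar=False)

# Toy usage: 13 point-lights tracked over 200 frames at an assumed 60 Hz.
rng = np.random.default_rng(0)
positions = np.cumsum(rng.normal(size=(200, 13, 2)), axis=0) * 0.01
vel = velocity_profiles(positions, dt=1.0 / 60.0)
print(velocity_covariance(vel).shape)           # (13, 13)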
Accumulating empirical studies on action perception have suggested that velocity and its higher-order derivatives, in a single body part or across multiple body parts, are among the major correlates of emotional attributes in actions: the duration of an action (Pollick et al., 2001), velocity (DeMeijer, 1989), acceleration (force, or the second-order time derivative) (Chang & Troje, 2008; 2009), jerk (the third-order time derivative) (Cook, Saygin, Swain, & Blakemore, 2009), and pairwise counter-phase oscillation (Chang & Troje, 2008; 2009). In particular, we highlight the contribution of the higher-order derivatives of velocity and the importance of their covariational structure. Of relevance, Chang & Troje (2009) found that the motion of the pair of feet, rather than either foot alone, was a major cue for discriminating walking direction. A sketch of these kinematic quantities is given after the next subsection.

Past computational models of action recognition

Consistent with these empirical findings, most theoretical approaches work on some kind of statistical regularity among motion profiles. According to a recent review (Troje, 2008), the perception of biological motion involves multi-level processing of local and global motion properties. This feature processing consists of four layers, from early (low-level) to late (high-level) processing: life detection, structure-from-motion, action recognition, and style recognition. The system detects an autonomous agent, constructs the body structure from a more detailed analysis, and then carries out still more detailed action analysis. In vision science, a couple of computational models are available for structure-from-motion and action recognition (Giese & Poggio, 2003; Lange & Lappe, 2006), and a few for style perception at the post-action-recognition level (Troje, 2002; Pollick, Lestou, Ryu, & Cho, 2002; Davis & Gao, 2004). Models of structure-from-motion and action recognition identify the body structure, and subsequently the action, from the pixel-based visualization of point-light displays. In Giese & Poggio (2003), the model was built on neurophysiological findings about the visual cortex and was applied to the recognition of action types and action direction in normal, masked, or scrambled point-light displays. In contrast, the post-action-recognition-level models of style perception typically assume that 2D or 3D point-light positions on the major joints are given, and that the action being executed is already known prior to the recognition of action style (Troje, 2002; Davis & Gao, 2004; Pollick et al., 2001). For example, Troje (2002) proposed a computational model of gender identification in gait.
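As a concrete illustration of the higher-order derivatives and the pairwise counter-phase cue listed above, the sketch below differences a velocity profile into acceleration and jerk, and uses the correlation between two joints' trajectories (for example, the two feet) as a simple counter-phase index. This is again only an illustrative sketch under assumed data shapes and a 60 Hz sampling rate, not the model used in this study or in the cited work.

import numpy as np

def higher_order_derivatives(velocity: np.ndarray, dt: float):
    """Acceleration (second derivative of position) and jerk (third derivative),
    obtained by repeated finite differencing of the velocity profile."""
    acceleration = np.diff(velocity, axis=0) / dt
    jerk = np.diff(acceleration, axis=0) / dt
    return acceleration, jerk

def counterphase_index(traj_a: np.ndarray, traj_b: np.ndarray) -> float:
    """Pearson correlation between two 1-D trajectories (e.g., the vertical
    positions of the left and right foot); values near -1 indicate
    counter-phase oscillation, values near +1 in-phase motion."""
    return float(np.corrcoef(traj_a, traj_b)[0, 1])

# Toy usage: two anti-phase sinusoids standing in for the two feet.
t = np.linspace(0.0, 2.0, 121)                      # 2 s sampled at 60 Hz
left_foot = np.sin(2.0 * np.pi * t)
right_foot = np.sin(2.0 * np.pi * t + np.pi)
print(counterphase_index(left_foot, right_foot))    # close to -1.0

positions = np.stack([left_foot, right_foot], axis=1)[:, :, None]  # (121, 2, 1)
vel = np.diff(positions, axis=0) * 60.0
acc, jerk = higher_order_derivatives(vel, dt=1.0 / 60.0)
print(acc.shape, jerk.shape)                        # (119, 2, 1) (118, 2, 1)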
References

Atkinson, A. P. (2009). Impaired recognition of emotions from body movements is associated with elevated motion coherence thresholds in autism spectrum disorders. Neuropsychologia.
Bernstein, N. A. (1967). The co-ordination and regulation of movements.
Bingham, G. P. (1987). Kinematic form and scaling: Further investigations on the visual perception of lifted weight. Journal of Experimental Psychology: Human Perception and Performance.
Blake, R., & Shiffrar, M. (2007). Perception of human motion. Annual Review of Psychology.
Chang, D. H. F., & Troje, N. F. (2008). Perception of animacy and direction from local biological motion signals. Journal of Vision.
Chang, D. H. F., & Troje, N. F. (2009). Acceleration carries the local inversion effect in biological motion perception. Journal of Vision.
Cook, J., Saygin, A. P., Swain, R., & Blakemore, S.-J. (2009). Reduced sensitivity to minimum-jerk biological motion in autism spectrum conditions. Neuropsychologia.
De Meijer, M. (1989). The contribution of general features of body movement to the attribution of emotions.
Deruelle, C., et al. (2007). Recognition of emotional and non-emotional biological motion in individuals with autistic spectrum disorders.
Giese, M. A., & Poggio, T. (2003). Cognitive neuroscience: Neural mechanisms for the recognition of biological movements. Nature Reviews Neuroscience.
Hobson, R. P., et al. (1997). Components of person perception: An investigation with autistic, non-autistic retarded and typically developing children and adolescents.
Hobson, R. P., & Lee, A. (1999). Imitation and identification in autism. Journal of Child Psychology and Psychiatry, and Allied Disciplines.
Johansson, G. (1973). Visual perception of biological motion and a model for its analysis.
Johnson, M. H., Bolhuis, J. J., & Horn, G. (1985). Interaction between acquired preferences and developing predispositions during imprinting. Animal Behaviour.
Lange, J., & Lappe, M. (2006). A model of biological motion perception from configural form cues. The Journal of Neuroscience.
Pollick, F. E., et al. (2001). Perceiving affect from arm movement. Cognition.
Pollick, F. E., Lestou, V., Ryu, J., & Cho, S.-B. (2002). Estimating the efficiency of recognizing gender and affect from biological motion. Vision Research.
Pollick, F. E., et al. (2006). A motion capture library for the study of identity, gender, and emotion perception from biological motion. Behavior Research Methods.
Pollick, F. E., et al. (2008). Movement style, movement features, and the recognition of affect from human movement.
Rosch, E. (1978). Principles of categorization.
Sato, M., et al. (2008). Sparse estimation automatically selects voxels relevant for the decoding of fMRI activity patterns. NeuroImage.
Troje, N. F. (2002). Decomposing biological motion: A framework for analysis and synthesis of human gait patterns. Journal of Vision.
Troje, N. F. (2008). Biological motion perception.
Troje, N. F., Westhoff, C., & Lavrov, M. (2005). Person identification from biological motion: Effects of structural and kinematic cues. Perception & Psychophysics.
Zacks, J. M., et al. (2008). Understanding events: From perception to action.