A database of whole-body action videos for the study of action, emotion, and untrustworthiness

We present a database of high-definition (HD) videos for the study of traits inferred from whole-body actions. Twenty-nine actors (19 female) were filmed performing different actions—walking, picking up a box, putting down a box, jumping, sitting down, and standing and acting—while conveying different traits, including four emotions (anger, fear, happiness, sadness), untrustworthiness, and neutral, where no specific trait was conveyed. Actions conveying the four emotions and untrustworthiness were filmed multiple times, with the actor portraying the trait at different levels of intensity. In total, we made 2,783 action videos (in both two-dimensional and three-dimensional format), each lasting 7 s at a frame rate of 50 fps. All videos were filmed in a green-screen studio in order to isolate the action information from all contextual detail and to provide a flexible stimulus set for future use. To validate the traits conveyed, we asked participants to rate each two-dimensional video on the trait that the actor had portrayed. To make the database easy to navigate, each video's filename encodes the gender of the actor, the action executed, the trait conveyed, and the rating of its perceived intensity. All videos can be downloaded free of charge from http://www-users.york.ac.uk/~neb506/databases.html. We discuss potential uses for the database in the analysis of the perception of whole-body actions.
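Because each filename encodes the actor's gender, the action, the trait, and the perceived-intensity rating, downloaded videos can be filtered programmatically. The sketch below assumes a hypothetical underscore-delimited naming scheme (`GENDER_ACTION_TRAIT_RATING.mp4`); the database's actual naming convention may differ, so the field order and delimiter should be checked against the downloaded files.

```python
from dataclasses import dataclass
from pathlib import Path

@dataclass
class VideoInfo:
    gender: str    # e.g. "F" or "M" (assumed coding)
    action: str    # e.g. "walk", "jump"
    trait: str     # e.g. "anger", "neutral"
    rating: float  # mean perceived-intensity rating

def parse_video_name(name: str) -> VideoInfo:
    """Parse a hypothetical filename such as 'F_walk_anger_4.2.mp4'."""
    gender, action, trait, rating = Path(name).stem.split("_")
    return VideoInfo(gender, action, trait, float(rating))

# Example: select high-intensity anger videos from a file listing
videos = ["F_walk_anger_4.2.mp4", "M_jump_happiness_3.1.mp4"]
selected = [v for v in videos
            if (info := parse_video_name(v)).trait == "anger"
            and info.rating >= 4.0]
```

A parser of this kind makes it straightforward to build condition-balanced stimulus sets (e.g. equal numbers of male and female actors per trait) without renaming the files.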
