What do your footsteps sound like? An investigation on interactive footstep sounds adjustment

This paper presents an experiment in which participants, while walking, adjusted the spectral content and the amplitude of synthetic footstep sounds so as to match the sounds of their own footsteps. The sounds were generated interactively by a shoe-based system capable of tracking footfalls and delivering real-time auditory feedback via headphones. The results allowed identification of the mean value and range of variation of the spectral centroid and peak level of footstep sounds simulating various combinations of shoe type and ground material. They showed that the effect of ground material on centroid and peak level depended on the type of shoe and, likewise, that the effect of shoe type on the two variables depended on the type of ground material. In particular, participants produced greater amplitudes for hard-sole shoes than for soft-sole shoes on solid surfaces, while similar amplitudes for both shoe types were found for aggregate, hybrid, and liquid surfaces. No significant correlations were found between either acoustic feature and participants' body size, a result that might be explained by the fact that participants did not primarily focus on the acoustic rendering of their own body while adjusting the sounds. In addition, no significant differences were found between the values of the two acoustic features selected by the experimenters and those adjusted by participants, which can be taken as a measure of how well the design choices behind the synthesized footstep sounds suit a generic walker. More importantly, this study showed that the relationships between the ground-shoe combinations are unchanged when participants are actively walking: this is the first active-listening confirmation of a result that had previously been shown only in passive-listening studies. The findings can be used to design ecologically valid auditory rendering of foot-floor interactions in virtual environments.
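The two acoustic features at the centre of the study, spectral centroid and peak level, are standard audio descriptors. The following is a minimal illustrative sketch of how they are commonly computed, not the authors' analysis pipeline; the function names and the synthetic test signal are assumptions introduced here for illustration, and the code assumes a mono PCM signal held in a NumPy array.

```python
import numpy as np

def spectral_centroid(signal, sample_rate):
    """Magnitude-weighted mean frequency (Hz) of the signal's spectrum."""
    magnitudes = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return np.sum(freqs * magnitudes) / np.sum(magnitudes)

def peak_level_db(signal):
    """Peak amplitude in dB relative to full scale (dBFS)."""
    return 20.0 * np.log10(np.max(np.abs(signal)))

# Hypothetical usage with a synthetic stand-in for a footstep:
# a white-noise burst with an exponential decay.
fs = 44100
t = np.arange(int(0.3 * fs)) / fs
footstep = 0.5 * np.random.randn(t.size) * np.exp(-20.0 * t)
print(f"centroid:   {spectral_centroid(footstep, fs):.1f} Hz")
print(f"peak level: {peak_level_db(footstep):.1f} dBFS")
```

In terms of the experiment described above, the centroid summarizes the spectral content participants adjusted (higher centroid sounding "brighter") while the peak level summarizes the amplitude adjustment; these are the two quantities for which mean values and ranges of variation were identified per shoe-ground combination.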
