Evaluating the Effects of Material Sonification in Tactile Devices

With the integration of Internet of Things technologies into our daily lives, there has been an increasing demand in e-commerce for new ways to interact with commodities that go beyond passive product visualization. For materials from retail stores, audio cues have proven to be a simple but effective way to enhance the estimation not only of well-established physical qualities (e.g., roughness, flexibility), but also of affective properties (e.g., pleasantness, value), which strongly influence the user's decision for or against a product. In this paper, we investigate augmenting visual representations of leather and fabric materials with touch-related audio feedback generated when rubbing a fingertip against the samples. For this purpose, we developed an interactive material sonification system for tactile devices that allows us to evaluate the impact of such audio cues on the human perception of materials by means of a psychophysical study based on rating scales for a set of relevant physical and affective material qualities. Our experimental results indicate that the evaluated touch-related audio cues do not contribute significantly to the perception of these attributes for the considered materials. In light of these findings, we suggest complementary research directions in interactive material sonification that may prove more fruitful.
