Effects of robotic companionship on music enjoyment and agent perception

We evaluate the effects of robotic listening companionship on people's enjoyment of music and on their perception of the robot. We present a robotic speaker device designed for joint listening and embodied performance of the music played on it. The robot generates smoothed, real-time, beat-synchronized dance moves, uses nonverbal gestures to establish common ground, and can make and maintain eye contact. In a between-subjects experimental study (n=67), participants listened to songs played on the speaker device while the robot either moved in sync with the beat, moved off-beat, or did not move at all. We found that although participants did not consciously detect the robot's beat precision, an on-beat robot positively affected song liking; there was no effect on enjoyment of the overall experience. In addition, the robot's movement response led participants to attribute more positive human-like traits to the robot and to rate it as more similar to themselves. Notably, personal listening habits (solitary vs. social) affected these agent attributions. This work points to a larger question: how might a robot's perceived response to an event affect a human's perception of the same event?
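To make the smoothed beat-synchronized motion concrete, below is a minimal, hypothetical sketch; it is not the paper's implementation. It uses offline beat tracking via librosa (the paper's system tracks beats in real time) and a first-order low-pass filter for smoothing. All names and parameters (AUDIO_PATH, amplitude_deg, alpha, the servo call) are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' method): beat-synchronized,
# smoothed head-bob motion for a robotic speaker. Uses offline beat
# tracking; a real-time system would substitute a streaming beat tracker.
import math
import librosa

AUDIO_PATH = "song.wav"  # hypothetical input file

# Estimate beat times from the audio.
y, sr = librosa.load(AUDIO_PATH)
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

def target_angle(t, beats, amplitude_deg=20.0):
    """Raw bob angle: peaks on each beat, decays until the next one."""
    prev = max((b for b in beats if b <= t), default=None)
    if prev is None:
        return 0.0
    return amplitude_deg * math.exp(-4.0 * (t - prev))

def smoothed_trajectory(beats, duration, rate_hz=50.0, alpha=0.15):
    """Exponentially smooth the raw angle so the motion stays fluid."""
    angle = 0.0
    for i in range(int(duration * rate_hz)):
        t = i / rate_hz
        raw = target_angle(t, beats)
        angle += alpha * (raw - angle)  # first-order low-pass filter
        yield t, angle

# Drive a (hypothetical) motor with the smoothed trajectory.
for t, angle in smoothed_trajectory(beat_times, duration=10.0):
    pass  # e.g., servo.set_angle(angle)
```

The low-pass filter embodies the trade-off the word "smoothed" implies: a higher alpha tracks the beat more tightly but yields jerkier motion, while a lower alpha gives fluid movement that slightly lags the beat.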
