Modeling the Timing and Duration of Grip Behavior to Express Emotions for a Social Robot

This letter addresses the effect of grip timing on expressing a robot's emotions to people while they watch a video stimulus together. Past studies on human-robot touch interaction have focused on the types of touch behaviors used to express a robot's emotions, but timing factors have received less attention. We conducted a data-collection study to investigate appropriate grip timing for expressing heartwarming and horror emotions. Participants indicated touch (grip) timings and durations using a robot while watching video stimuli. They typically preferred a grip timing before the climax for horror videos but after the climax for heartwarming videos. We modeled the timings and durations by fitting probability distributions to the collected data in order to reproduce human-like touch behaviors, and we implemented the models on an android robot.
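To make the fitting approach concrete, the following is a minimal sketch, not the authors' released code: it fits probability distributions to grip-onset times (measured relative to the video's climax) and grip durations, then samples from the fitted models to generate human-like grip behavior. The distribution families (normal for onset, log-normal for duration), the toy data, and all names are illustrative assumptions.

    # Minimal sketch of fitting and sampling grip timing/duration models.
    # Data values, distribution choices, and names are assumptions for
    # illustration, not the paper's actual dataset or model.
    import numpy as np
    from scipy import stats

    # Hypothetical collected data: grip onsets in seconds relative to the
    # climax (negative = before the climax) and grip durations in seconds.
    horror_onsets = np.array([-4.2, -3.1, -5.0, -2.8, -3.6])   # before climax
    warm_onsets = np.array([1.8, 2.5, 3.1, 2.0, 2.7])          # after climax
    durations = np.array([2.1, 3.4, 2.8, 4.0, 3.2])

    # Fit a normal distribution to the onset times for each emotion.
    horror_mu, horror_sigma = stats.norm.fit(horror_onsets)
    warm_mu, warm_sigma = stats.norm.fit(warm_onsets)

    # Fit a log-normal distribution to the durations (strictly positive).
    dur_shape, dur_loc, dur_scale = stats.lognorm.fit(durations, floc=0)

    def sample_grip(emotion, rng):
        """Sample an (onset, duration) pair for the given emotion."""
        if emotion == "horror":
            onset = rng.normal(horror_mu, horror_sigma)
        else:  # "heartwarming"
            onset = rng.normal(warm_mu, warm_sigma)
        duration = stats.lognorm.rvs(dur_shape, loc=dur_loc,
                                     scale=dur_scale, random_state=rng)
        return onset, duration

    rng = np.random.default_rng(0)
    print(sample_grip("horror", rng))        # onset tends to precede climax
    print(sample_grip("heartwarming", rng))  # onset tends to follow climax

In a robot controller, each sampled (onset, duration) pair would be scheduled against the known climax time of the stimulus so the grip starts and releases at human-like moments.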
