Differences in Haptic and Visual Perception of Expressive 1DoF Motion

Humans can perceive motion through a variety of modalities. Vision is a well-explored modality; however, haptics can greatly increase the richness of information provided to the user. The detailed differences in motion perception between these two modalities are not well studied, and understanding them can provide an additional avenue for communication between humans and haptic devices or robots. We analyze these differences in the context of users' interactions with a non-anthropomorphic haptic device. In this study, participants experienced different levels and combinations of stiffness, jitter, and acceleration curves via a one-degree-of-freedom linear motion display. These conditions were presented both with and without the opportunity for users to touch the setup. Participants rated the experiences in terms of emotion, anthropomorphism, likeability, and safety using the SAM scale and HRI metrics, as well as qualitative feedback. A positive correlation between stiffness and dominance, attributable specifically to the haptic condition, was found; additionally, the introduction of jitter decreased perceived arousal and likeability. Trends relating acceleration curves to perceived dominance, and stiffness and jitter to valence, arousal, dominance, likeability, and safety, were also found. These results highlight the importance of considering which sensory modalities are more actively engaged during interactions and, accordingly, which behaviors designers should employ when creating non-anthropomorphic interactive haptic devices to convey a particular interpreted affective state.
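To make the motion conditions concrete, the sketch below generates position setpoints for a 1-DoF linear display under two acceleration profiles (a smooth minimum-jerk S-curve versus constant velocity) with optional jitter. This is an illustrative reconstruction only, not the authors' implementation; all function and parameter names (`trajectory`, `jitter_amp_m`, the sample rate, and the travel distance) are hypothetical.

```python
import random

def trajectory(duration_s, distance_m, rate_hz=100,
               profile="min_jerk", jitter_amp_m=0.0, seed=0):
    """Position setpoints (meters) for a hypothetical 1-DoF linear display.

    profile:      "min_jerk" (smooth S-curve) or "linear" (constant velocity)
    jitter_amp_m: amplitude of uniform positional noise added per sample
    """
    rng = random.Random(seed)
    n = int(duration_s * rate_hz)
    setpoints = []
    for i in range(n + 1):
        t = i / n  # normalized time in [0, 1]
        if profile == "min_jerk":
            # Minimum-jerk polynomial: zero velocity and acceleration
            # at both endpoints, giving a smooth ease-in/ease-out motion.
            s = 10 * t**3 - 15 * t**4 + 6 * t**5
        else:
            # Constant-velocity ramp: abrupt starts and stops.
            s = t
        noise = rng.uniform(-jitter_amp_m, jitter_amp_m) if jitter_amp_m else 0.0
        setpoints.append(distance_m * s + noise)
    return setpoints

# A smooth 2 s, 0.3 m stroke versus the same stroke with 5 mm jitter.
smooth = trajectory(2.0, 0.3)
jittery = trajectory(2.0, 0.3, jitter_amp_m=0.005)
```

Varying the profile and `jitter_amp_m` in this way mirrors how the study's acceleration-curve and jitter conditions could be parameterized independently of the stiffness rendered at the point of contact.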
