The Effect of Robot Attentional Behaviors on User Perceptions and Behaviors in a Simulated Health Care Interaction: Randomized Controlled Trial

Background: For robots to be used effectively in health applications, they need to display appropriate social behaviors. A fundamental requirement in all social interactions is the ability to engage, maintain, and demonstrate attention. Attentional behaviors include leaning forward, self-disclosure, and changes in voice pitch.

Objective: This study aimed to examine the effect of robot attentional behaviors on user perceptions and behaviors in a simulated health care interaction.

Methods: A parallel randomized controlled trial with a 1:1:1:1 allocation ratio was conducted. Participants were randomized to 1 of 4 experimental conditions before engaging in a scripted face-to-face interaction with a fully automated medical receptionist robot: a self-disclosure condition, a voice pitch change condition, a forward lean condition, and a neutral condition. Participants completed paper-based postinteraction measures relating to engagement, perceived robot attention, and perceived robot empathy. Interactions were video recorded and coded for participant attentional behaviors.

Results: A total of 181 participants were recruited from the University of Auckland. Participants who interacted with the robot in the forward lean and self-disclosure conditions found the robot significantly more stimulating than those in the voice pitch or neutral conditions (P=.03). Participants in the forward lean, self-disclosure, and neutral conditions found the robot significantly more interesting than those in the voice pitch condition (P<.001). Participants in the forward lean and self-disclosure conditions spent significantly more time looking at the robot than those in the neutral condition (P<.001). Significantly more participants in the self-disclosure condition laughed during the interaction (P=.01), and significantly more participants in the forward lean condition leaned toward the robot (P<.001).

Conclusions: The use of self-disclosure and forward lean by a health care robot can increase human engagement and attentional behaviors, whereas voice pitch changes did not increase attention or engagement. The small effects on participant perceptions may reflect the limitations of self-report measures or the absence of a point of comparison, as most participants had never interacted with a robot before. Further research could explore self-disclosure and forward lean using a within-subjects design and in real health care settings.
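The abstract does not report the randomization procedure or the statistical tests behind the P values above. As a minimal sketch only, the following Python code illustrates how a balanced 1:1:1:1 allocation of 181 participants to the four conditions, and the kinds of between-group comparisons described in the Results (a continuous outcome such as gaze time and a categorical outcome such as laughter), could be set up. The data, the function names, the seed values, and the choice of one-way ANOVA and chi-square tests are illustrative assumptions, not the methods used in the trial.

import random

import numpy as np
from scipy import stats

CONDITIONS = ["self-disclosure", "voice pitch", "forward lean", "neutral"]

def allocate(n_participants, seed=0):
    # Balanced 1:1:1:1 allocation: repeat the four conditions until the
    # sample size is covered, then shuffle the resulting list.
    rng = random.Random(seed)
    allocation = (CONDITIONS * (n_participants // len(CONDITIONS) + 1))[:n_participants]
    rng.shuffle(allocation)
    return allocation

# Reported sample size of 181 participants.
groups = allocate(181)
print("Allocation counts:", {c: groups.count(c) for c in CONDITIONS})

# Hypothetical gaze times (seconds spent looking at the robot) per condition;
# a one-way ANOVA is one plausible way to compare the four groups.
rng_np = np.random.default_rng(1)
gaze = {c: rng_np.normal(60, 10, size=45) for c in CONDITIONS}
f_stat, p_gaze = stats.f_oneway(*gaze.values())

# Hypothetical counts of participants who laughed vs. did not, per condition;
# a chi-square test of independence is one plausible test for such counts.
laugh_counts = np.array([[20, 25],   # self-disclosure: laughed, did not
                         [10, 35],   # voice pitch
                         [12, 33],   # forward lean
                         [8, 37]])   # neutral
chi2, p_laugh, dof, _ = stats.chi2_contingency(laugh_counts)

print(f"ANOVA on gaze time: F={f_stat:.2f}, P={p_gaze:.3f}")
print(f"Chi-square on laughter: chi2={chi2:.2f}, P={p_laugh:.3f}")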
