Impact of Deception Information on Negotiation Dialog Management: A Case Study on Doctor-Patient Conversations

Almost all existing negotiation systems assume that their interlocutors (the users) are telling the truth. In real negotiations, however, participants may lie to gain an advantage. In this research, we propose a negotiation dialog management system that detects the user's lies and we design dialog behaviors specifying how the system should react to them. As a representative case, we built a dialog model of doctor-patient conversations in the living-habits domain. We show that this conversation can be modeled as a partially observable Markov decision process (POMDP) and that the system's policy can be trained with reinforcement learning.
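To make the POMDP-plus-reinforcement-learning framing concrete, the following is a minimal illustrative sketch, not the authors' implementation: the user's honesty is a hidden state, a noisy deception detector provides observations, the system maintains a belief over "the user is lying," and a tabular Q-learning agent learns a reaction policy over a discretized belief. All action names, rewards, and probabilities are hypothetical placeholders chosen only for illustration.

```python
# Minimal sketch of a deception-aware dialog POMDP trained with tabular
# Q-learning over a discretized belief state. Everything below (actions,
# rewards, observation model) is a hypothetical placeholder, not the
# authors' actual model.
import random

ACTIONS = ["accept", "challenge", "ask_more"]   # hypothetical system actions
N_BINS = 10                                     # bins for belief "user is lying"

# Hypothetical observation model: P(detector outputs "lie" | hidden state)
P_OBS_LIE = {"lying": 0.8, "truthful": 0.2}


def belief_update(b_lie, obs):
    """Bayesian update of the belief that the user is lying,
    given a binary deception-detector observation ('lie' or 'truth')."""
    p_lie = P_OBS_LIE["lying"] if obs == "lie" else 1 - P_OBS_LIE["lying"]
    p_truth = P_OBS_LIE["truthful"] if obs == "lie" else 1 - P_OBS_LIE["truthful"]
    num = p_lie * b_lie
    den = num + p_truth * (1 - b_lie)
    return num / den if den > 0 else b_lie


def bin_of(b_lie):
    return min(int(b_lie * N_BINS), N_BINS - 1)


def step(hidden, action):
    """Hypothetical reward model: challenging a liar or accepting a truthful
    statement ends the exchange with a positive reward; asking a follow-up
    question gathers more evidence at a small cost."""
    if action == "challenge":
        return (1.0 if hidden == "lying" else -0.5), True
    if action == "accept":
        return (1.0 if hidden == "truthful" else -1.0), True
    return -0.1, False  # "ask_more": continue the dialog


def train(episodes=5000, max_turns=10, alpha=0.1, gamma=0.95, eps=0.1):
    Q = [[0.0] * len(ACTIONS) for _ in range(N_BINS)]
    for _ in range(episodes):
        hidden = random.choice(["lying", "truthful"])   # hidden user state
        b_lie = 0.5                                     # uninformative prior
        obs = "lie" if random.random() < P_OBS_LIE[hidden] else "truth"
        b_lie = belief_update(b_lie, obs)
        s = bin_of(b_lie)
        for _ in range(max_turns):
            # epsilon-greedy action selection over the discretized belief
            a = random.randrange(len(ACTIONS)) if random.random() < eps \
                else max(range(len(ACTIONS)), key=lambda i: Q[s][i])
            reward, done = step(hidden, ACTIONS[a])
            if done:
                Q[s][a] += alpha * (reward - Q[s][a])
                break
            obs = "lie" if random.random() < P_OBS_LIE[hidden] else "truth"
            b_lie = belief_update(b_lie, obs)
            s2 = bin_of(b_lie)
            Q[s][a] += alpha * (reward + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q


if __name__ == "__main__":
    Q = train()
    for s in range(N_BINS):
        best = ACTIONS[max(range(len(ACTIONS)), key=lambda i: Q[s][i])]
        print(f"belief(lying) bin {s}: best action = {best}")
```

Under these placeholder rewards, the learned policy typically asks follow-up questions while the belief is uncertain and commits to accepting or challenging once the belief is sharp, which is the kind of information-gathering behavior a POMDP formulation is meant to capture.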
