Probabilistic Explanation Dialog Augmentation

Human-computer trust (HCT) is an important factor influencing the complexity and frequency of interaction with technical systems. In particular, incomprehensible situations in human-computer interaction (HCI) may decrease the user's trust and, as a consequence, change the way the user interacts with the system. However, analogous to human-human interaction (HHI), providing explanations in such situations can help to remedy these negative effects. In this paper, we present our approach to augmenting task-oriented dialogs with selected explanation dialogs in order to stabilize the HCT relationship. We conducted a study comparing the effects of different explanations on HCT, and the results were used in a probabilistic trust handling architecture to augment pre-defined task-oriented dialogs.