Facial Orientation During Multi-party Interaction with Information Kiosks

We hypothesize that the performance of multimodal perceptive user interfaces in multi-party interaction can be improved by using the facial orientation of users as a cue for identifying the addressee of a user utterance. Multi-party interactions were collected in a user test in which one participant both interacted with an information kiosk and negotiated with another person about the information to be obtained. Users were indeed found to look at the system when speaking to it, but they also looked at the system most of the time while negotiating with the other person. It is concluded that facial orientation by itself does not fully identify the addressee of a user utterance, but there are promising results for a combination of facial orientation and
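To make the hypothesized cue combination concrete, the sketch below shows one way addressee classification could combine facial orientation with a second cue. Everything here is an illustrative assumption: the function name, the weighted-sum combination, the identity of the second cue, and all weights and thresholds are not taken from the study, which only reports that facial orientation alone is insufficient.

```python
def classify_addressee(gaze_fraction: float,
                       second_cue: float,
                       w_gaze: float = 0.5,
                       w_cue: float = 0.5,
                       threshold: float = 0.6) -> str:
    """Label the addressee of an utterance as "system" or "partner".

    gaze_fraction: fraction of the utterance during which the speaker
        faced the kiosk (the facial-orientation cue).
    second_cue: score in [0, 1] from a hypothetical second cue
        (e.g. an acoustic or lexical feature); purely illustrative.

    The weights are chosen so that a high gaze_fraction alone stays
    below the threshold, reflecting the finding that users also face
    the kiosk while negotiating with their human partner.
    """
    score = w_gaze * gaze_fraction + w_cue * second_cue
    return "system" if score >= threshold else "partner"

# Facing the kiosk while negotiating with the partner:
print(classify_addressee(gaze_fraction=0.9, second_cue=0.1))  # partner
# Same gaze behaviour, but the second cue indicates a system request:
print(classify_addressee(gaze_fraction=0.9, second_cue=0.8))  # system
```

The point of the weighting is that facial orientation contributes evidence but cannot decide on its own, which is exactly the situation the study describes.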