An Open Dataset for Impression Recognition from Multimodal Bodily Responses

We present IMPRESSION, a dataset for multimodal recognition of impressions formed by individuals and dyads. Unlike other databases, we not only elicited impressions using video stimuli but also recorded natural impression formation between strangers meeting for the first time through video calls. The database enables machine learning studies on impression recognition using multimodal signals of individuals, both in relation to their emotion expressivity and with respect to the interlocutor’s reactions. The experimental setup comprised synchronized recordings of 62 participants: face videos, audio signals, eye-gaze data, and peripheral nervous system physiological signals (electrocardiogram, ECG; blood volume pulse, BVP; and galvanic skin response, GSR) captured with wearable sensors. Participants reported their formed impressions along the warmth and competence (W & C) dimensions in real time. We describe the database in detail and present baseline methods and results for impression recognition in W & C.