Temporal context and the recognition of emotion from facial expression

Facial displays are an important channel for the expression of emotion and are often regarded as projections, or a "read-out," of a person's mental state. While it is generally believed that emotion recognition from facial expression improves with context, little published work quantifies this improvement. This paper describes an experiment that measures these effects in a way directly applicable to the design of affective user interfaces. The results are being used to inform the design of emotion spectacles, an affective user interface based on the analysis of facial expressions.