An affective evaluation tool using brain signals

We propose a new interface evaluation tool that incorporates affective metrics derived from the electroencephalography (EEG) signals of the Emotiv EPOC neuro-headset. The tool captures and analyzes information in real time from multiple sources: the raw EEG signal, affective metrics such as frustration, engagement, and excitement, and facial expressions. We used the tool to obtain detailed affective information about users interacting with a multimodal (touch and speech) iPhone application, for which we investigated the effect of speech recognition errors and modality usage patterns.
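As a rough illustration of how such a real-time affective logging loop might be structured (this is not the authors' implementation; the EmotivHeadset class, its methods, the sampling rate, and the CSV schema below are all hypothetical placeholders rather than the actual Emotiv SDK), consider the following Python sketch that polls affective metrics and a facial-expression label at a fixed rate and writes timestamped rows for later alignment with interaction events:

```python
# Hypothetical sketch only: EmotivHeadset and its methods are placeholders,
# not the real Emotiv SDK; the paper does not specify the implementation.
import csv
import time


class EmotivHeadset:
    """Placeholder wrapper around a neuro-headset SDK (hypothetical API)."""

    def read_affective_metrics(self):
        # A real integration would query the headset's affective suite here.
        return {"frustration": 0.0, "engagement": 0.0, "excitement": 0.0}

    def read_facial_expression(self):
        # A real integration would return the currently detected expression.
        return "neutral"


def log_session(duration_s=60.0, sample_hz=4.0, path="affective_log.csv"):
    """Poll affective metrics at a fixed rate and write timestamped rows."""
    headset = EmotivHeadset()
    period = 1.0 / sample_hz
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "frustration", "engagement",
                         "excitement", "expression"])
        start = time.time()
        while time.time() - start < duration_s:
            metrics = headset.read_affective_metrics()
            expression = headset.read_facial_expression()
            writer.writerow([time.time(), metrics["frustration"],
                             metrics["engagement"], metrics["excitement"],
                             expression])
            time.sleep(period)


if __name__ == "__main__":
    log_session(duration_s=10.0)
```

The timestamped log could then be joined with UI event logs (e.g. touch or speech-input events) to study how affective metrics change around speech recognition errors.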
