Detection of human emotions has many potential applications. One application is to quantify audience attentiveness in order to evaluate acoustic quality in a concert hall. Subjective audio preference reported by the audience is commonly used for this purpose. To obtain a fair evaluation of acoustic quality, this research proposes a system for multimodal emotion detection: one modality is based on brain signals measured using electroencephalography (EEG), and the second modality is sequences of facial images. In the experiment, a customized audio signal consisting of normal and disordered sounds was played in order to stimulate positive/negative emotional feedback in volunteers. EEG signals from the temporal lobes (electrodes T3 and T4) were used to measure the brain response, and sequences of facial images were used to monitor facial expression while the volunteers listened to the audio signal. From the EEG signal, features were extracted from changes in the brain waves, particularly in the alpha and beta bands. Features of facial expression were extracted by analyzing motion in the image sequences. We implemented an advanced optical flow method to detect the most active facial muscles as the face moves from a neutral to another emotional expression, represented as vector flow maps. To reduce the difficulty of detecting the emotional state, the vector flow maps are transformed into a compass mapping that represents the major directions and velocities of facial movement. The results showed that the power of the beta wave increases when the disordered sound stimulus is given, although each volunteer gave different emotional feedback. Based on features derived from the facial images, optical flow compass mapping is promising as additional information for making decisions about emotional feedback.

Keywords— Multimodal Emotion Detection, EEG, Facial Image, Optical Flow, Compass Mapping, Brain Wave
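The transformation from a vector flow map to a compass mapping can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the optical flow has already been computed as a list of per-pixel `(dx, dy)` displacement vectors (e.g. from a dense optical flow routine), and it bins them into eight compass sectors, keeping the mean speed and vector count per sector as the "major directions and velocities" summary.

```python
import math
from collections import defaultdict

def compass_mapping(flow_vectors, n_sectors=8):
    """Bin optical-flow vectors into compass sectors.

    flow_vectors: iterable of (dx, dy) displacements in pixels/frame
    (assumed to come from a dense optical flow computation).
    Returns {sector: (mean_speed, count)} for non-empty sectors,
    where sector 0 covers angles [0, 360/n_sectors) degrees.
    """
    sector_width = 360.0 / n_sectors
    speed_sums = defaultdict(float)
    counts = defaultdict(int)
    for dx, dy in flow_vectors:
        speed = math.hypot(dx, dy)
        if speed == 0.0:
            continue  # static pixels carry no direction information
        angle = math.degrees(math.atan2(dy, dx)) % 360.0
        sector = int(angle // sector_width) % n_sectors
        speed_sums[sector] += speed
        counts[sector] += 1
    return {s: (speed_sums[s] / counts[s], counts[s]) for s in counts}

# Toy flow field: mostly rightward motion plus one upward
# and one diagonal vector.
flow = [(1.0, 0.0), (2.0, 0.0), (0.0, 1.0), (-1.0, -1.0)]
mapping = compass_mapping(flow)
```

With eight sectors each sector spans 45 degrees, so the dominant facial movement direction is simply the sector with the largest count (or largest summed speed), which reduces the dense flow map to a handful of numbers for the emotion-state decision.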
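The EEG feature described above, power in the alpha (roughly 8–13 Hz) and beta (roughly 13–30 Hz) bands, can be sketched with a naive discrete Fourier transform. This is an illustrative assumption, not the paper's pipeline: the sampling rate, epoch length, and synthetic test signal below are invented for the example, and a real system would use an FFT-based spectral estimator.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power of `signal` in the frequency band [f_lo, f_hi) Hz.

    Uses a naive one-sided DFT (O(n^2)); adequate for short EEG
    epochs in a demonstration. `fs` is the sampling rate in Hz.
    """
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                      for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

# Synthetic 2-second "epoch" at 128 Hz: a 10 Hz (alpha) component
# plus a stronger 20 Hz (beta) component, mimicking increased beta
# activity under the disordered-sound stimulus.
fs, n = 128, 256
epoch = [math.sin(2 * math.pi * 10 * t / fs)
         + 2.0 * math.sin(2 * math.pi * 20 * t / fs)
         for t in range(n)]
alpha_power = band_power(epoch, fs, 8, 13)
beta_power = band_power(epoch, fs, 13, 30)
```

Comparing `beta_power` against `alpha_power` (or against a baseline epoch) is one simple way to turn the band-power change into the per-stimulus feature the abstract describes.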