User Interest Estimation Using Cross-modal Computation

This paper describes a new user–machine interaction scheme that estimates a user's interest through cross-modal computation. The scheme builds on our previous study, which used eye-gaze detection alone to extract users' visual preferences. That interaction scheme was insufficient, however, because it could not detect the intensity of a user's interest. The proposed scheme therefore shows how different sensor modalities can be used to extract the emotional intensity of interest, separated from the novelty of the given visual stimuli. In addition, we expect that repeated estimation is needed to obtain accurate interest estimates; hence the user's habituation, as distinguished from boredom and/or aversion, must also be detected and taken into account. Our computational results show that the proposed scheme achieves accurate interest estimation through cross-modal computation.
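The idea of fusing modalities while discounting habituation can be illustrated with a minimal sketch. This is not the authors' implementation; the modalities (gaze dwell time and a skin-conductance response), the fusion weights, and the exponential habituation decay are all illustrative assumptions:

```python
# Hedged sketch, not the paper's method: fuse two hypothetical modalities
# -- gaze dwell time and a skin-conductance response (SCR) -- into one
# interest score, discounted by habituation to repeated stimuli.

def interest_score(dwell_s, scr, exposures, w_gaze=0.6, w_scr=0.4, decay=0.3):
    """Estimate interest in [0, 1].

    dwell_s   -- gaze dwell time in seconds (saturates at 5 s here)
    scr       -- skin-conductance response, pre-normalized to [0, 1]
    exposures -- number of prior exposures to the same stimulus
    """
    # Cross-modal fusion: weighted sum of the two normalized signals.
    raw = w_gaze * min(dwell_s / 5.0, 1.0) + w_scr * max(0.0, min(scr, 1.0))
    # Habituation: response strength decays exponentially with repetition,
    # separating waning novelty from the raw intensity of the response.
    habituation = (1.0 - decay) ** exposures
    return raw * habituation
```

In this toy form, a first-time stimulus with maximal dwell and SCR scores 1.0, while the same stimulus seen repeatedly scores progressively lower even if the raw signals are unchanged.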
