Our previous work developed an online-learning Bayesian framework (dynamic tree) for data organization and clustering. To continuously adapt the system during operation, we concurrently perform outlier detection to prevent outliers from incorrectly modifying the model. We propose a new Bayesian surprise metric to differentiate outliers from training data and thereby selectively adapt the model parameters. The metric is computed from the difference between the prior and posterior distributions over the model when a new sample is introduced. A good training datum changes the model sufficiently but not excessively; consequently, the difference between the prior and posterior distributions is commensurate with the amount of new information the datum carries. An outlier, in contrast, carries an element of surprise that would significantly change the model: the posterior distribution differs greatly from the prior, producing a large value of the surprise metric. We categorize such a datum as an outlier, and other means (e.g., a human operator) must be used to handle it. Because the surprise metric is computed from the model distribution, it adapts with the model, so the surprise assigned to a datum depends on the current state of the system. This speeds up the learning process by considering only the relevant new data. Both the model parameters and the structure of the dynamic tree can be updated under this approach.
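The abstract does not give the metric's exact form; Bayesian surprise is commonly defined as the KL divergence between the posterior and the prior, and the sketch below uses that reading. It is a minimal illustration of the surprise-gated update for a one-dimensional conjugate Gaussian model with a placeholder threshold; the model, threshold value, and function names are illustrative assumptions, not the paper's dynamic-tree implementation.

```python
import math

def kl_gaussian(mu_q, var_q, mu_p, var_p):
    """KL divergence KL(q || p) between two univariate Gaussians."""
    return 0.5 * (math.log(var_p / var_q)
                  + (var_q + (mu_q - mu_p) ** 2) / var_p
                  - 1.0)

def posterior_update(mu0, var0, x, obs_var):
    """Conjugate update of a Gaussian prior N(mu0, var0) given one datum x
    observed with noise variance obs_var."""
    var_n = 1.0 / (1.0 / var0 + 1.0 / obs_var)
    mu_n = var_n * (mu0 / var0 + x / obs_var)
    return mu_n, var_n

def process_datum(mu0, var0, x, obs_var=1.0, threshold=2.0):
    """Compute the surprise of x and adapt the model only if the datum is
    not flagged as an outlier. The threshold is a placeholder value."""
    mu_n, var_n = posterior_update(mu0, var0, x, obs_var)
    surprise = kl_gaussian(mu_n, var_n, mu0, var0)  # posterior vs. prior
    if surprise > threshold:
        # Outlier: keep the model unchanged and defer to other means
        # (e.g., a human operator), as described in the abstract.
        return (mu0, var0), surprise, True
    return (mu_n, var_n), surprise, False

if __name__ == "__main__":
    state = (0.0, 1.0)                 # prior mean and variance
    for x in [0.2, -0.1, 0.4, 8.0]:    # the last value acts as an outlier
        state, s, is_outlier = process_datum(*state, x)
        print(f"x={x:5.1f}  surprise={s:6.3f}  outlier={is_outlier}")
```

In the dynamic-tree setting the same gate would apply to the posterior over tree structure and node parameters rather than to a single Gaussian, but the decision rule (large prior-to-posterior change implies outlier, small change implies a safe update) is the same.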