DBN-based structural learning and optimisation for automated handwritten character recognition

Pattern recognition using Dynamic Bayesian Networks (DBNs) is a growing area of study. Classification performance depends strongly on choosing a DBN model that best describes the dependencies within each class of data. In this paper, we present DBN models trained for the classification of handwritten digits. Two approaches to improving the suitability of the models are presented. The first uses a fixed DBN structure and applies an Evolutionary Algorithm to optimise the selection and layout of the observations for each class of data. The second learns part of the model structure from the training set of each class. Parameter learning is then performed for each DBN. Classification results are reported for the described models and compared with previously published results; both approaches improve the recognition rate over those earlier results.
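To make the classification scheme concrete, the sketch below illustrates the general class-conditional setup the abstract describes: one generative sequence model is trained per digit class, and a test image is assigned to the class whose model gives the highest log-likelihood. This is a minimal illustration, not the authors' implementation; a Gaussian HMM (the simplest DBN) stands in for the paper's models, the column-wise observation layout is an assumption inspired by prior column-based character models, and the hmmlearn library is assumed to be available. The Evolutionary Algorithm and structure-learning steps described above would replace the fixed choices made here.

```python
# Minimal sketch of class-conditional DBN classification for handwritten digits.
# Assumptions (not from the paper): 28x28 images, pixel columns as observation
# frames, a Gaussian HMM per class via hmmlearn, diagonal covariances.
import numpy as np
from hmmlearn import hmm


def image_to_sequence(image_28x28):
    """Treat each pixel column of a 28x28 image as one observation frame."""
    return image_28x28.T.astype(float)  # shape: (28 frames, 28 features)


def train_class_models(images_by_class, n_states=6, seed=0):
    """Fit one HMM (a simple DBN) per digit class on that class's images."""
    models = {}
    for label, images in images_by_class.items():
        seqs = [image_to_sequence(img) for img in images]
        X = np.concatenate(seqs)          # all frames stacked
        lengths = [len(s) for s in seqs]  # frames per image
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=50, random_state=seed)
        m.fit(X, lengths)
        models[label] = m
    return models


def classify(image_28x28, models):
    """Assign the class whose model yields the highest log-likelihood."""
    seq = image_to_sequence(image_28x28)
    return max(models, key=lambda label: models[label].score(seq))
```

Under this setup, improving a class's model, whether by evolving which observations are used and how they are laid out, or by learning part of the graph structure from that class's training data, directly changes the per-class likelihoods and hence the decision boundaries.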
