Constructing Extensions of Bayesian Classifiers with Use of Normalizing Neural Networks

We introduce a new neural network model that generalizes the principles of the Naive Bayes classification method. It is trained with a backpropagation-like algorithm in order to obtain an optimal combination of several classifiers. Experimental results are presented.
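To make the general idea concrete, the sketch below is a minimal, hypothetical illustration (not the paper's exact architecture): the per-class log-posteriors of several base classifiers are combined with trainable weights, normalized by a softmax, and the weights are learned by gradient descent on the cross-entropy loss, mirroring a backpropagation-style, weighted extension of Naive Bayes. All names and parameters here are assumptions introduced for illustration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_combiner(log_posteriors, labels, n_classes, lr=0.1, epochs=200):
    """Learn weights that combine several classifiers (illustrative sketch).

    log_posteriors: array (n_samples, n_classifiers, n_classes) holding each
                    base classifier's log P(c|x).
    labels:         integer class labels of shape (n_samples,).
    """
    n_samples, n_clf, _ = log_posteriors.shape
    w = np.ones(n_clf) / n_clf            # one weight per base classifier
    b = np.zeros(n_classes)               # class bias (log-prior analogue)
    y = np.eye(n_classes)[labels]         # one-hot targets
    for _ in range(epochs):
        # Weighted sum of log-posteriors, then softmax normalization.
        scores = np.tensordot(log_posteriors, w, axes=([1], [0])) + b
        p = softmax(scores)
        # Gradient of mean cross-entropy w.r.t. the scores is (p - y)/n.
        err = (p - y) / n_samples
        grad_w = np.einsum('nkc,nc->k', log_posteriors, err)
        grad_b = err.sum(axis=0)
        w -= lr * grad_w                  # gradient-descent updates
        b -= lr * grad_b
    return w, b
```

With unit weights and log-priors in the bias, this reduces to the usual Naive Bayes decision rule applied to the base classifiers' outputs; training the weights is what turns the fixed combination into a learned one.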