Possibility and necessity pattern classification using neural networks
Abstract We propose two learning algorithms of neural networks for two-group discriminant problems from the viewpoint of possibility and necessity. One algorithm corresponds to the possibility analysis and the other to the necessity analysis. The proposed algorithms are similar to the back-propagation algorithm; the difference stems from the formulation of the cost function minimized by each algorithm. Each cost function is a weighted sum of squared errors, that is, a sum of squared errors with different penalties for the two groups. When we discuss the possibility of Group 1, the penalty on the squared errors for patterns in Group 1 is greater than that for Group 2. This means that, in the possibility analysis of Group 1, we attach greater importance to the patterns in Group 1 than to those in Group 2. Conversely, when we discuss the necessity of Group 1, the penalty on the squared errors for patterns in Group 2 is greater than that for Group 1.
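The weighted cost function described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `weighted_sse`, the penalty parameters `h1` and `h2`, and the encoding of group membership are all assumptions made for the example.

```python
import numpy as np

def weighted_sse(outputs, targets, groups, h1=10.0, h2=1.0):
    """Weighted sum of squared errors over all training patterns.

    outputs, targets : network outputs and desired outputs per pattern
    groups           : 1 for patterns in Group 1, 2 for patterns in Group 2
    h1, h2           : penalty weights (illustrative values); h1 > h2 gives
                       the possibility analysis of Group 1, while h1 < h2
                       gives the necessity analysis of Group 1.
    """
    outputs = np.asarray(outputs, dtype=float)
    targets = np.asarray(targets, dtype=float)
    # Assign each pattern's squared error the penalty of its group.
    penalties = np.where(np.asarray(groups) == 1, h1, h2)
    return float(np.sum(penalties * (outputs - targets) ** 2))
```

Since the cost is still a sum of per-pattern squared errors, the resulting learning rule is ordinary back-propagation with each pattern's error signal scaled by its group penalty.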