Conventional neural network training methods attempt to find a set of values for the network weights by minimizing an error function using some gradient-descent-based technique. To achieve good generalization performance, it is usually necessary to introduce a regularization term into the error function to prevent the weights from becoming overly large. In the conventional approach, the regularization coefficient, which controls the degree to which large weights are penalized, must be optimized outside of the weight training procedure, usually by means of cross-validation, in which some training examples are held out, thereby reducing the number of examples available for weight optimization. Bayesian methods provide a means of optimizing these coefficients within the weight optimization procedure itself. This paper reports on the application of Bayesian MLP techniques to the task of predicting mineralization potential from geoscientific data. Results demonstrate that the Bayesian approach produces maps similar to those of the conventional MLP approach, while avoiding the complex cross-validation procedure required by the latter.
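To make the Bayesian alternative concrete, the sketch below illustrates one evidence-framework re-estimation step for the regularization coefficient alpha, in the style of MacKay's practical Bayesian framework. This is a minimal illustration rather than the paper's exact implementation: it assumes a classification-style error S(w) = E_D(w) + alpha * E_W(w) with E_W = ½‖w‖², a quadratic (Gaussian) approximation around a weight minimum, and a single weight-decay class; the function and variable names are illustrative.

```python
import numpy as np

def reestimate_alpha(hessian_ed, e_w, alpha):
    """One evidence-framework update of the weight-decay coefficient.

    hessian_ed -- Hessian of the data error E_D at the current weight
                  minimum (k x k, symmetric)
    e_w        -- weight penalty E_W = 0.5 * ||w||^2 at that minimum
    alpha      -- current regularization coefficient
    """
    # Eigenvalues of the data-error Hessian (symmetric solver).
    lam = np.linalg.eigvalsh(hessian_ed)
    # gamma: the effective number of well-determined weights,
    # i.e. directions in weight space constrained by the data.
    gamma = np.sum(lam / (lam + alpha))
    # MacKay's re-estimation formula: alpha_new = gamma / (2 * E_W).
    alpha_new = gamma / (2.0 * e_w)
    return alpha_new, gamma
```

In practice, training alternates between minimizing S(w) with alpha held fixed (for instance by a conjugate-gradient method) and re-estimating alpha from the converged weights with an update of this form; because alpha is set from the training data alone, no examples need to be held out for cross-validation.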