Modelling conditional probabilities with network committees: how overfitting can be useful
Training neural networks for predicting conditional probability densities can be accelerated considerably by adopting the random vector functional link net (RVFL) approach.
In this way, a whole ensemble of models can be trained at the same computational cost as a single conventional network. The inherent
stochasticity of the RVFL method increases the diversity in this ensemble, which leads
to a significant reduction of the generalisation error. The application of this scheme to
a synthetic multimodal stochastic time series and a real-world benchmark problem was
found to achieve performance better than, or comparable to, the best results obtained so far. Moreover, the simulations support a recent theoretical study and show that, when making predictions with network committees, it can be advantageous to employ underregularised models that overfit the training data.
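
The sketch below illustrates the general RVFL-committee idea referred to in the abstract: each committee member draws random, fixed hidden-layer weights and fits only its output weights in closed form, so an ensemble costs little more than a single conventionally trained network, and the randomness supplies diversity across members. This is a minimal, assumed implementation for point regression rather than the paper's conditional probability density models; all class, function, and parameter names (RVFLRegressor, committee_predict, n_hidden, ridge, etc.) are illustrative and not taken from the paper.

```python
import numpy as np


class RVFLRegressor:
    """Random vector functional link net: random, fixed hidden weights;
    only the output weights are fitted, here by ridge regression."""

    def __init__(self, n_hidden=100, ridge=1e-8, seed=None):
        self.n_hidden = n_hidden
        self.ridge = ridge          # a small value gives an underregularised member
        self.rng = np.random.default_rng(seed)

    def _features(self, X):
        H = np.tanh(X @ self.W + self.b)                  # random hidden layer
        return np.hstack([X, H, np.ones((len(X), 1))])    # direct links + bias

    def fit(self, X, y):
        d = X.shape[1]
        # Hidden weights are drawn once and never trained.
        self.W = self.rng.normal(size=(d, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        Phi = self._features(X)
        # Closed-form ridge solution for the output weights only.
        A = Phi.T @ Phi + self.ridge * np.eye(Phi.shape[1])
        self.beta = np.linalg.solve(A, Phi.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta


def committee_predict(models, X):
    """Average the predictions of the committee members."""
    return np.mean([m.predict(X) for m in models], axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

    # Each member sees the same data but draws different random hidden
    # weights; this stochasticity is the source of committee diversity.
    committee = [RVFLRegressor(seed=k).fit(X, y) for k in range(10)]

    X_test = np.linspace(-3, 3, 200).reshape(-1, 1)
    y_hat = committee_predict(committee, X_test)
```

In line with the abstract's conclusion, the ridge penalty is kept deliberately small so that individual members may overfit; averaging over the committee is what reduces the resulting variance.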