Towards a generic artificial neural network model for dynamic predictions of algal abundance in freshwater lakes

A generalised architecture of a feedforward ANN for the prediction of algal abundance is suggested. It simplifies practical model applications, rationalises data collection and preprocessing, improves model validity, and enables meaningful comparison of ANN models between lakes. The generic ANN model takes the key driving variables of algal growth, such as phosphorus, nitrogen, underwater light and water temperature, as input nodes and predicts algal species abundance or biomass as output. Two model structures were used: one for same-day and one for 30-day-ahead predictions of algal abundance. ANN models with and without hidden layers were compared to determine how the addition of non-linear processing capability affects model performance. A bootstrap aggregation method was found to reduce test set prediction error and to mitigate the effects of overfitting. The model was validated using time-series data from six different freshwater lakes.
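For illustration, the sketch below mirrors the structure described above: a feedforward net with the four driving variables (phosphorus, nitrogen, underwater light, water temperature) as inputs, a single hidden layer, algal abundance as the output, and bootstrap aggregation over resampled training sets. This is a minimal sketch on synthetic data; the hidden-layer size, learning rate, number of ensemble members and the data itself are illustrative assumptions, not the configuration used in the study.

```python
# Minimal sketch of the generic ANN: four driving variables as inputs,
# algal abundance as output, one hidden layer, and bootstrap aggregation
# (bagging) over resampled training sets. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in=4, n_hidden=5):
    """Random weights for a one-hidden-layer feedforward net."""
    return {
        "W1": rng.normal(0, 0.5, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.5, (n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(net, X):
    """Sigmoid hidden layer, linear output node (predicted abundance)."""
    h = 1.0 / (1.0 + np.exp(-(X @ net["W1"] + net["b1"])))
    return h, (h @ net["W2"] + net["b2"]).ravel()

def train(net, X, y, lr=0.05, epochs=2000):
    """Plain batch gradient descent on mean squared error."""
    n = len(y)
    for _ in range(epochs):
        h, yhat = forward(net, X)
        err = (yhat - y)[:, None]               # (n, 1) output error
        grad_W2 = h.T @ err / n
        grad_b2 = err.mean(axis=0)
        dh = err @ net["W2"].T * h * (1 - h)    # back-propagate through sigmoid
        grad_W1 = X.T @ dh / n
        grad_b1 = dh.mean(axis=0)
        net["W1"] -= lr * grad_W1; net["b1"] -= lr * grad_b1
        net["W2"] -= lr * grad_W2; net["b2"] -= lr * grad_b2
    return net

def bagged_predict(nets, X):
    """Bootstrap aggregation: average the predictions of all member nets."""
    return np.mean([forward(net, X)[1] for net in nets], axis=0)

# Synthetic stand-in for standardised lake data:
# columns = phosphorus, nitrogen, underwater light, water temperature.
X = rng.normal(size=(200, 4))
y = 0.8 * X[:, 0] + 0.3 * X[:, 1] * X[:, 3] + rng.normal(0, 0.1, 200)
X_train, y_train, X_test, y_test = X[:150], y[:150], X[150:], y[150:]

# Train an ensemble, each member on a bootstrap resample of the training set.
ensemble = []
for _ in range(10):
    idx = rng.integers(0, len(y_train), len(y_train))  # sample with replacement
    ensemble.append(train(init_net(), X_train[idx], y_train[idx]))

single_mse = np.mean((forward(ensemble[0], X_test)[1] - y_test) ** 2)
bagged_mse = np.mean((bagged_predict(ensemble, X_test) - y_test) ** 2)
print(f"single net test MSE: {single_mse:.3f}  bagged test MSE: {bagged_mse:.3f}")
```

Averaging the ensemble's outputs reduces the variance contribution of any single over-fitted member, which is the mechanism by which bagging lowers test set prediction error in the setting the abstract describes.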
