Numerical optimisation of the learning process in multilayer perceptron type neural networks

This paper investigates the effects of changes in learning parameters on the learning process of multilayer perceptron networks. Some observations have been recorded which, though apparently quite general, have nevertheless led to a numerical optimisation of the learning process. The empirical technique investigated for optimising the learning process is based on a self-adaptive algorithm, which essentially gives the network the added capability of exploiting its own learning experience while still accomplishing the task of learning. Using this added expertise, the network monitors the system error and adjusts the learning parameters accordingly during training. As a result, learning times have been reduced significantly without any penalty in terms of local minima or system oscillations.
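The error-monitoring scheme described above can be sketched as follows. This is a minimal illustration of one common self-adaptive heuristic (a "bold driver" style rule), not the paper's exact algorithm: after each epoch the network compares the new system error with the previous one and raises or lowers the learning rate accordingly. The task (XOR), network size, initial learning rate, and adaptation factors are all assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training set (illustrative task, not from the paper)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 sigmoid units; sizes are arbitrary choices
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros(1)

lr = 0.5              # initial learning rate (assumed value)
up, down = 1.05, 0.7  # adaptation factors (assumed values)
prev_error = np.inf

for epoch in range(5000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    error = 0.5 * np.sum((Y - T) ** 2)  # system error

    # self-adaptation: raise the learning rate while the error is
    # falling, cut it back sharply when the error rises
    lr *= up if error < prev_error else down
    prev_error = error

    # backward pass (standard delta rule for sigmoid units)
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY
    b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH
    b1 -= lr * dH.sum(axis=0)

print(f"final system error: {prev_error:.4f}")
```

The asymmetric factors (a gentle increase, a sharp decrease) are what damp the oscillations that a fixed large learning rate would otherwise cause.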