Feedforward networks with monotone constraints

In many practical applications of artificial neural networks (ANNs), natural constraints on the model, such as monotonic relations between inputs and outputs, are known in advance, and it is advantageous to incorporate them into the network structure. We propose a modified feedforward network structure that enforces monotonic relations between designated input variables and the output. The backpropagation formulas for the gradients in the new structure are derived, leading to a family of learning algorithms. The monotonicity properties and the backpropagation formulas are proven mathematically and verified with numerical examples. A computer program implementing the new network structure and a learning algorithm is used to test the system, and experimental results are reported on both simulated and real data sets.
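To make the idea concrete, the sketch below shows one common way to enforce monotonicity in a one-hidden-layer feedforward network: weights attached to the designated monotone inputs, and the hidden-to-output weights, are re-parameterized through a non-negative function (softplus here), so that together with an increasing activation the output is non-decreasing in those inputs. This is only an illustrative assumption about the construction, not the paper's exact network structure or its derived backpropagation formulas; the class name `MonotoneNet`, the softplus re-parameterization, and the learning rate are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(z):          # smooth non-negative re-parameterization
    return np.log1p(np.exp(z))

def dsoftplus(z):         # derivative of softplus (the logistic function)
    return 1.0 / (1.0 + np.exp(-z))

class MonotoneNet:
    """One-hidden-layer network, non-decreasing in the inputs listed in mono_idx.

    Weights tied to monotone inputs and all hidden-to-output weights pass
    through softplus, so they stay non-negative; combined with the increasing
    tanh activation this makes the output monotone in those inputs.
    (Illustrative construction only, not the paper's specific structure.)
    """
    def __init__(self, n_in, n_hidden, mono_idx):
        self.mono = np.zeros(n_in, dtype=bool)
        self.mono[mono_idx] = True
        self.W = rng.normal(scale=0.5, size=(n_hidden, n_in))   # raw input weights
        self.b = np.zeros(n_hidden)
        self.V = rng.normal(scale=0.5, size=n_hidden)           # raw output weights
        self.c = 0.0

    def _weights(self):
        # constrain monotone input columns and all output weights to be non-negative
        W = np.where(self.mono, softplus(self.W), self.W)
        V = softplus(self.V)
        return W, V

    def forward(self, x):
        W, V = self._weights()
        h = np.tanh(W @ x + self.b)
        return V @ h + self.c

    def grad_step(self, x, y, lr=0.05):
        """One backpropagation / gradient-descent step on squared error."""
        W, V = self._weights()
        h = np.tanh(W @ x + self.b)
        yhat = V @ h + self.c
        err = yhat - y
        # gradients with respect to the effective (constrained) weights
        gV_eff = err * h
        gc = err
        gz = (err * V) * (1.0 - h ** 2)
        gW_eff = np.outer(gz, x)
        gb = gz
        # chain through the softplus re-parameterization where it applies
        gV = gV_eff * dsoftplus(self.V)
        gW = np.where(self.mono, gW_eff * dsoftplus(self.W), gW_eff)
        self.W -= lr * gW
        self.b -= lr * gb
        self.V -= lr * gV
        self.c -= lr * gc
        return 0.5 * err ** 2

# sanity check: the output is non-decreasing in input 0 with input 1 held fixed
net = MonotoneNet(n_in=2, n_hidden=8, mono_idx=[0])
xs = np.linspace(-3.0, 3.0, 50)
ys = [net.forward(np.array([x, 0.7])) for x in xs]
assert all(a <= b + 1e-12 for a, b in zip(ys, ys[1:]))
```

Because the constraint is built into the parameterization, the monotone property holds by construction at every step of training rather than being encouraged by a penalty term, which mirrors the motivation stated in the abstract.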