A controlled learning environment of enhanced perceptron

Some basic features of multi-layer perceptrons are sketched, some of their shortcomings are pointed out, and ways of retaining their strengths while overcoming those shortcomings are proposed. A class of networks called inference networks is introduced to demonstrate that logical reasoning capability can also be modeled with networks. The class of multi-layer perceptrons and the class of inference networks are then unified into a single class of networks, called enhanced perceptrons. One important theorem obtained is that for any given pair of pattern-sets there always exists an enhanced perceptron with only one hidden layer that matches the given patterns; the patterns can be image patterns, attribute patterns, or logical patterns. The proof of this theorem is by a constructive algorithm. Once a solution is obtained, other solutions with a controlled degree of error tolerance can be generated through learning algorithms.
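As a minimal illustration of the flavor of such a constructive existence argument (not the paper's actual algorithm, which is not given in this abstract), the sketch below builds a one-hidden-layer network over binary patterns by allocating one hidden unit per stored input pattern: with weights w = 2x - 1 and threshold sum(x) - 0.5, each hidden unit fires only on an exact match with its stored pattern, and the output layer then emits the associated target pattern. All function names and the choice of binary patterns are assumptions made for this sketch.

```python
def step(v):
    # Hard-limiting (Heaviside) activation used throughout.
    return 1 if v > 0 else 0

def build_network(inputs, outputs):
    """Construct a one-hidden-layer network that maps each binary input
    pattern in `inputs` to the corresponding pattern in `outputs`.
    One hidden unit is allocated per stored pattern."""
    hidden = []
    for x in inputs:
        w = [2 * xi - 1 for xi in x]   # +1 where x has a 1, -1 where it has a 0
        theta = sum(x) - 0.5           # unit fires iff the input equals x exactly
        hidden.append((w, theta))

    def forward(inp):
        h = [step(sum(wi * ii for wi, ii in zip(w, inp)) - theta)
             for w, theta in hidden]
        # Output layer: target patterns gated by the hidden activations.
        return [step(sum(hj * yj[k] for hj, yj in zip(h, outputs)) - 0.5)
                for k in range(len(outputs[0]))]

    return forward

# Match two image-like binary patterns to two attribute-like codes.
patterns_in  = [[1, 0, 1], [0, 1, 1]]
patterns_out = [[1, 0], [0, 1]]
net = build_network(patterns_in, patterns_out)
assert net([1, 0, 1]) == [1, 0]
assert net([0, 1, 1]) == [0, 1]
```

This "one unit per pattern" construction shows why a single hidden layer always suffices for exact matching; learning algorithms of the kind the abstract mentions would then relax the thresholds to trade exactness for a controlled degree of error tolerance.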