A geometric learning algorithm for elementary perceptron and its convergence analysis

In this paper, the geometric learning algorithm (GLA) is proposed for an elementary perceptron, which consists of a single output neuron. The GLA is a modified version of the affine projection algorithm (APA) for adaptive filters. The weight update vector is determined geometrically, toward the intersection of k hyperplanes perpendicular to the patterns to be classified, where k is the order of the GLA. In the case of the APA, the target of the coefficient update is a single point, which corresponds to the best identification of the unknown system. In the case of the GLA, by contrast, the target of the weight update is a region in which all the given patterns are classified correctly. Thus, their convergence conditions are different. In this paper, the convergence condition of the first-order GLA for two patterns is theoretically derived. A new concept, "the angle of the solution area," is introduced. Computer simulation results confirm that this new concept provides a good estimate of the convergence properties.
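To make the geometric idea concrete, the following is a minimal sketch of what a first-order (k = 1) update step could look like: when a pattern is misclassified, the weight vector is moved along the pattern direction toward the hyperplane perpendicular to that pattern, in an NLMS-style normalized projection. The function name, step-size parameter `mu`, and regularizer `eps` are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def gla_step_order1(w, x, y, mu=1.0, eps=1e-12):
    """Hypothetical first-order GLA update (sketch, not the paper's exact rule).

    w : current weight vector
    x : input pattern
    y : desired class label, +1 or -1

    If x is misclassified (y * w.x < 0), project w toward the
    hyperplane {v : v . x = 0}, which is perpendicular to x.
    With mu = 1 the update lands exactly on that hyperplane;
    mu > 1 steps past it into the correctly classifying half-space.
    """
    margin = y * np.dot(w, x)
    if margin >= 0:
        # Pattern already classified correctly: no update,
        # since w already lies in the target region for x.
        return w
    # Normalized projection step along x (order-1, APA-like).
    return w - mu * np.dot(w, x) / (np.dot(x, x) + eps) * x
```

With mu = 1, a single update places the weight vector exactly on the hyperplane perpendicular to the misclassified pattern; repeated passes over all patterns drive the weights toward the solution region where every pattern is classified correctly.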