An Extension of the Back-Propagation Algorithm to Complex Numbers

This paper presents a complex-valued version of the back-propagation algorithm (called 'Complex-BP'), which can be applied to multi-layered neural networks whose weights, threshold values, and input and output signals are all complex numbers. Some inherent properties of this new algorithm are studied. The results may be summarized as follows. The updating rule of the Complex-BP is such that the probability of a "standstill in learning" is reduced. The average convergence speed is superior to that of the real-valued back-propagation, whereas the generalization performance remains unchanged. In addition, the number of weights and thresholds needed is only about half that of the real-valued back-propagation, where a complex-valued parameter z = x + iy (with i = √(−1)) is counted as two because it consists of a real part x and an imaginary part y. The Complex-BP can learn transformations of geometric figures, e.g. rotation, similarity transformation, and parallel displacement of straight lines, circles, etc., whereas the real-valued back-propagation cannot. Mathematical analysis indicates that a Complex-BP network which has learned a transformation has the ability to generalize that transformation, with an error that is represented by a sine function. It is interesting that these characteristics appear only when neural networks are extended to complex numbers.
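To make the idea concrete, the sketch below shows what one Complex-BP update might look like for a single complex-valued neuron with a complex weight vector and a complex threshold. The split activation (a real sigmoid applied separately to the real and imaginary parts of the weighted sum) and the learning rate `eta` are illustrative assumptions; the abstract does not spell out the activation function or the exact update rule.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def complex_bp_step(w, theta, x, t, eta=0.5):
    """One assumed Complex-BP update for a single complex-valued neuron.

    w     : complex weight vector
    theta : complex threshold
    x     : complex input signal vector
    t     : complex target output
    """
    # Complex weighted sum plus complex threshold.
    u = np.dot(w, x) + theta
    # Split activation: real sigmoid on the real and imaginary parts.
    y = sigmoid(u.real) + 1j * sigmoid(u.imag)
    # Componentwise error term for the loss 0.5 * |t - y|**2,
    # using sigmoid'(u) = y * (1 - y) on each part.
    delta = ((t.real - y.real) * y.real * (1 - y.real)
             + 1j * (t.imag - y.imag) * y.imag * (1 - y.imag))
    # Delta rule; note the complex conjugate of the input signal.
    w_new = w + eta * delta * np.conj(x)
    theta_new = theta + eta * delta
    return w_new, theta_new, y
```

Note that each complex weight multiplies its input as a complex number, i.e. it rotates and scales the input in the plane, which is why such a network can naturally represent transformations like rotation and similarity transformation of planar figures.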