On the capability of neural networks with complex neurons in complex-valued function approximation

The capability of neural networks with complex neurons to approximate complex-valued functions is investigated. A density theorem is proved for complex multilayer perceptrons (MLPs) with one hidden layer and a non-analytic activation function. The backpropagation algorithms for MLPs with real, complex analytic, and complex non-analytic activation functions are compared on a numerical example.
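The abstract's setting, a one-hidden-layer complex MLP with a non-analytic activation trained by complex backpropagation, can be illustrated with a short sketch. The snippet below is an assumption-laden toy, not the paper's exact algorithm: it uses the common "split" activation (a real logistic sigmoid applied separately to the real and imaginary parts, which is non-analytic), a linear complex output layer, and gradient updates of the form w ← w − η·δ·conj(x) derived by differentiating the real-valued loss with respect to the real and imaginary parts of each weight. All sizes, the learning rate, and the toy target map `A` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = lambda s: 1.0 / (1.0 + np.exp(-s))      # real logistic sigmoid
dsigma = lambda s: sigma(s) * (1.0 - sigma(s))  # its derivative

def split_act(u):
    # Non-analytic "split" activation: sigmoid applied to Re and Im separately.
    return sigma(u.real) + 1j * sigma(u.imag)

# Illustrative sizes (assumption): 2 complex inputs, 8 hidden neurons, 1 output.
W1 = 0.5 * (rng.standard_normal((8, 2)) + 1j * rng.standard_normal((8, 2)))
b1 = np.zeros(8, complex)
W2 = 0.5 * (rng.standard_normal((1, 8)) + 1j * rng.standard_normal((1, 8)))
b2 = np.zeros(1, complex)

def forward(x):
    u = W1 @ x + b1        # complex pre-activations
    h = split_act(u)       # non-analytic hidden activations
    y = W2 @ h + b2        # linear complex output layer
    return u, h, y

def train_step(x, t, lr=0.1):
    """One SGD step on E = 0.5 * |y - t|^2; returns the loss before the update."""
    global W1, b1, W2, b2
    u, h, y = forward(x)
    e = y - t                                # complex output error
    gW2 = np.outer(e, h.conj())              # dE/dW2 = e * conj(h)^T
    g = W2.conj().T @ e                      # error backpropagated to h
    # Split activation: the real and imaginary channels are differentiated
    # separately, since the activation is not complex-differentiable.
    delta = dsigma(u.real) * g.real + 1j * dsigma(u.imag) * g.imag
    gW1 = np.outer(delta, x.conj())          # dE/dW1 = delta * conj(x)^T
    W2 -= lr * gW2; b2 -= lr * e
    W1 -= lr * gW1; b1 -= lr * delta
    return 0.5 * np.sum(np.abs(e) ** 2)

# Hypothetical toy target: the split sigmoid of a fixed complex linear map.
A = np.array([[0.3 - 0.2j, 0.1 + 0.4j]])
data = [rng.standard_normal(2) + 1j * rng.standard_normal(2) for _ in range(50)]
targets = [split_act(A @ x) for x in data]

losses = []
for epoch in range(200):
    losses.append(sum(train_step(x, t) for x, t in zip(data, targets)))
```

Because the split activation is non-analytic, the chain rule is applied to the real and imaginary parts independently (the `delta` line above); an analytic activation would instead admit a single complex derivative, which is one of the algorithmic differences the paper's numerical comparison examines.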
