Classifier neural net with complex-valued weights and square-law nonlinearities

A new pattern recognition classifier neural net (NN) is described that uses complex-valued weights and square-law nonlinearities. We show that these weights and nonlinearities inherently produce higher-order decision surfaces, and thus we expect better classification performance (PC). We refer to this as the piecewise hyperquadratic neural net (PQNN), since each hidden-layer neuron inherently provides a hyperquadratic decision surface and the combination of neurons provides piecewise hyperquadratic decision surfaces. We detail the learning algorithm for the PQNN and provide initial results on synthetic data showing its advantages over the backpropagation NN and other NNs. We also note a new technique that provides improved classification results when the classes have significantly different numbers of samples.
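
For concreteness, a brief sketch of why a square-law neuron with complex weights yields a hyperquadratic surface (the symbols $w_j$, $\theta_j$, $x$, $y_j$, and $T$ are our illustrative notation, not taken from the paper): writing the complex weight vector as $w_j = a_j + i b_j$ and a complex offset as $\theta_j = \alpha_j + i \beta_j$, with a real input vector $x$, a hidden unit of the assumed form computes
\[
y_j = \left| w_j^{T} x + \theta_j \right|^{2}
    = \left( a_j^{T} x + \alpha_j \right)^{2} + \left( b_j^{T} x + \beta_j \right)^{2},
\]
which is quadratic in $x$; the locus $y_j = T$ for any threshold $T$ is therefore a hyperquadric, and combining several such hidden units gives piecewise hyperquadratic decision boundaries.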