Twisted quaternary neural networks

The quaternary neural network (QNN) proposed by Nitta is a high-dimensional neural network. Through computer simulations, Nitta showed that it learns faster than ordinary neural networks and requires roughly one-third as many parameters as a real-valued neural network. In this paper, we propose the twisted quaternary neural network (TQNN), which modifies the direction of the multiplications in the QNN. Because quaternion multiplication is noncommutative, this yields a genuinely different neural network. When the activation function is linear, a multilayered neural network can ordinarily be expressed as a single-layered one; the TQNN, however, cannot be expressed as a single-layered QNN even when the activation function is linear. The TQNN is therefore expected to produce a wider variety of signal-processing systems. We performed computer simulations comparing the QNN and the TQNN and found that the TQNN learns slightly faster. Moreover, the simulations showed that the QNN tended to become trapped in local minima or plateaus, whereas the TQNN did not. Since reducibility is known to cause local minima and plateaus, we also discuss the differences in reducibility between the QNN and the TQNN. © 2012 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
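The abstract's key observation is that quaternion multiplication is noncommutative, so reversing the side from which a weight multiplies an input changes the result. The following sketch illustrates this with the Hamilton product; the `qmul` helper and the weight/input values are our own illustrative code, not taken from the paper, and the side-swapped product is only a schematic stand-in for the TQNN's actual formulation.

```python
# A quaternion q = (a, b, c, d) represents a + b*i + c*j + d*k.
def qmul(p, q):
    """Hamilton product of two quaternions (noncommutative)."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (
        a1*a2 - b1*b2 - c1*c2 - d1*d2,   # real part
        a1*b2 + b1*a2 + c1*d2 - d1*c2,   # i part
        a1*c2 - b1*d2 + c1*a2 + d1*b2,   # j part
        a1*d2 + b1*c2 - c1*b2 + d1*a2,   # k part
    )

# Basis units: i*j = k but j*i = -k.
i = (0.0, 1.0, 0.0, 0.0)
j = (0.0, 0.0, 1.0, 0.0)
print(qmul(i, j))  # -> (0.0, 0.0, 0.0, 1.0), i.e. k
print(qmul(j, i))  # -> (0.0, 0.0, 0.0, -1.0), i.e. -k

# For a quaternionic neuron, multiplying the weight from the other
# side of the input therefore gives a different weighted term
# (hypothetical values for illustration).
w = (0.5, 1.0, -0.5, 0.25)   # weight
x = (1.0, 0.5, 2.0, -1.0)    # input
assert qmul(w, x) != qmul(x, w)
```

This is why the "twist" is not a cosmetic change: over the real numbers (or even the complex numbers) the two orderings would coincide, but over the quaternions they define distinct networks.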

[1] Tohru Nitta et al., "The uniqueness theorem for complex-valued neural networks and the redundancy of the parameters," Systems and Computers in Japan, 2003.

[2] T. Nitta et al., "A quaternary version of the back-propagation algorithm," Proceedings of ICNN'95 - International Conference on Neural Networks, 1995.

[3] Francesco Piazza et al., "On the complex backpropagation algorithm," IEEE Transactions on Signal Processing, 1992.

[4] Tohru Nitta et al., "On the critical points of the complex-valued neural network," Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002.

[5] Masaki Kobayashi et al., "Exceptional Reducibility of Complex-Valued Neural Networks," IEEE Transactions on Neural Networks, 2010.

[6] Tohru Nitta et al., "Reducibility of the Complex-valued Neural Network," 2004.

[7] Tohru Nitta et al., "An Extension of the Back-Propagation Algorithm to Complex Numbers," Neural Networks, 1997.

[8] Héctor J. Sussmann et al., "Uniqueness of the weights for minimal feedforward nets with a given input-output map," Neural Networks, 1992.

[9] Masaki Kobayashi et al., "Construction of high-dimensional neural networks by linear connections of matrices," 2003.

[10] Tohru Nitta et al., "Redundancy of the parameters of the complex-valued neural network," Neurocomputing, 2002.

[11] Cris Koutsougeras et al., "Complex domain backpropagation," 1992.

[12] Kenji Fukumizu et al., "Local minima and plateaus in hierarchical structures of multilayer perceptrons," Neural Networks, 2000.

[13] T. Nitta, "An Extension of the Back-propagation Algorithm to Quaternions," 1996.

[14] P. Arena et al., "Neural networks in multidimensional domains," 1998.