Extension of Binary Neural Networks for Multi-class Output and Finite Automata

Neural networks that implement Boolean functions are known as Binary Neural Networks (BNNs). Various methods for constructing BNNs have been proposed over the last decade, and many applications require BNNs to handle more than two classes. In this paper, we first review some basic BNN construction methods from this period and summarize the main approach underlying them, observing that a neuron can be visualized in terms of its equivalent hypersphere. Next, we present approaches for adapting a BNN construction process to classification problems that must separate data into multiple (more than two) classes, and we illustrate these approaches with examples. From a theoretical viewpoint, the limited applicability of BNNs does not prevent a Finite Automaton (FA) from being expressed in terms of recurrent BNNs: we prove that recurrent BNNs can simulate any deterministic as well as any non-deterministic finite automaton. The proof is constructive, and the construction process is illustrated with suitable examples.

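To make the hypersphere observation concrete, here is a minimal Python sketch (not taken from the paper; names such as to_neuron are illustrative). On binary inputs x_i in {0,1}, the identity x_i^2 = x_i collapses the quadratic sphere test |x - c|^2 <= r^2 into the linear threshold test sum_i (2*c_i - 1) x_i >= |c|^2 - r^2, which is exactly a threshold neuron. The sketch converts a sphere into the equivalent neuron and verifies the equivalence on every vertex of the 3-cube.

```python
from itertools import product

def to_neuron(center, radius):
    """Convert a hypersphere (center, radius) into the weights and
    threshold of an equivalent binary threshold neuron.  On binary
    inputs, |x - c|^2 <= r^2 expands to
    sum_i (2*c_i - 1) * x_i >= |c|^2 - r^2, since x_i^2 = x_i."""
    weights = [2 * c - 1 for c in center]
    threshold = sum(c * c for c in center) - radius ** 2
    return weights, threshold

def neuron_fires(weights, threshold, x):
    # Standard threshold unit: fires iff the weighted sum reaches the threshold.
    return sum(w * xi for w, xi in zip(weights, x)) >= threshold

def in_sphere(center, radius, x):
    return sum((xi - c) ** 2 for xi, c in zip(x, center)) <= radius ** 2

# Verify the equivalence on all vertices of the 3-dimensional hypercube.
center, radius = (1.0, 0.0, 1.0), 1.0
weights, threshold = to_neuron(center, radius)
for x in product((0, 1), repeat=3):
    assert neuron_fires(weights, threshold, x) == in_sphere(center, radius, x)
```
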
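For the multi-class extension, one common adaptation of a two-class BNN construction is a one-vs-rest readout: build one binary network per class and report the class whose network fires. This is a hedged sketch of that general idea under our own naming (multiclass_output, discriminants); the paper's specific schemes may differ, and the tie-breaking rule here (fixed priority order) is one simple choice among several.

```python
def multiclass_output(discriminants, x):
    """One-vs-rest readout: `discriminants` is a list of K binary
    classifiers, each mapping an input vector to 0 or 1.  Returns the
    index of the first class whose network fires, or None if none does."""
    for k, fires in enumerate(discriminants):
        if fires(x):
            return k
    return None

# Toy example on 2-bit inputs: class 0 = "both bits set",
# class 1 = "exactly one bit set", class 2 = "no bit set".
discriminants = [
    lambda x: x[0] and x[1],
    lambda x: x[0] != x[1],
    lambda x: not (x[0] or x[1]),
]
assert multiclass_output(discriminants, (1, 1)) == 0
assert multiclass_output(discriminants, (0, 1)) == 1
assert multiclass_output(discriminants, (0, 0)) == 2
```
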
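The constructive proof strategy for simulating an FA with a recurrent network of threshold units follows the classical one-hot-state idea; the sketch below is our illustration of that idea, not the paper's exact construction. States are one-hot bit vectors; each transition delta(q, a) = q' gets an AND neuron that fires iff the state bit for q and the input line for a are both active, and each next-state bit is the OR of its incoming AND neurons. Because several state bits may simply be active at once, the identical network also simulates a non-deterministic FA.

```python
def step(delta, n_states, state_bits, symbol):
    """One synchronous update of the recurrent threshold network.
    `delta` maps (state, symbol) pairs to sets of next states, so both
    DFAs and NFAs fit.  AND neurons fire when their two inputs sum to
    at least 2; OR neurons fire when any input is active."""
    nxt = [0] * n_states
    for (q, a), targets in delta.items():
        input_line = 1 if symbol == a else 0   # one-hot input encoding
        and_fire = 1 if state_bits[q] + input_line >= 2 else 0
        for q2 in targets:
            nxt[q2] = max(nxt[q2], and_fire)   # OR accumulation
    return nxt

def accepts(delta, n_states, start, accepting, word):
    state = [1 if q == start else 0 for q in range(n_states)]  # one-hot start
    for symbol in word:
        state = step(delta, n_states, state, symbol)
    return any(state[q] for q in accepting)

# DFA over {0,1} accepting words with an even number of 1s.
delta = {(0, 0): {0}, (0, 1): {1}, (1, 0): {1}, (1, 1): {0}}
assert accepts(delta, 2, 0, {0}, [1, 0, 1])         # two 1s: accept
assert not accepts(delta, 2, 0, {0}, [1, 1, 0, 1])  # three 1s: reject
```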