K-Separability

Neural networks use their hidden layers to transform input data into linearly separable clusters, with a linear or perceptron-type output layer making the final projection onto the line perpendicular to the discriminating hyperplane. For complex data with multimodal distributions this transformation is difficult to learn. Projection onto k ≥ 2 line segments is the simplest extension of linear separability, defining a much easier goal for the learning process. The difficulty of learning non-linear data distributions is shifted to the separation of line intervals, making the main part of the transformation much simpler. For classification of difficult Boolean problems, such as the parity problem, a linear projection combined with k-separability is sufficient.
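The parity case can be illustrated concretely. The following sketch (hypothetical code, not from the paper; the function name `parity_projection` is invented for illustration) shows that the simple linear projection y = Σxᵢ, with all weights equal to one, maps n-bit parity data onto n + 1 integer positions on a line, and every position contains vectors of a single class, so k = n + 1 alternating intervals separate the classes:

```python
from itertools import product

def parity_projection(n):
    """Project all n-bit vectors onto y = sum(x) (weights all equal to 1)
    and record which parity labels land on each projected value."""
    intervals = {}
    for bits in product([0, 1], repeat=n):
        y = sum(bits)          # linear projection with w = (1, ..., 1)
        label = y % 2          # parity class of the vector
        intervals.setdefault(y, set()).add(label)
    return intervals

# For 4-bit parity: 5 projected values, each holding exactly one class,
# with labels alternating 0, 1, 0, 1, 0 along the line.
iv = parity_projection(4)
assert all(len(labels) == 1 for labels in iv.values())
```

Although parity is not linearly separable for n ≥ 2, it is k-separable under this trivial projection, which is why shifting the learning goal from separating hyperplanes to separating line intervals makes such problems easy.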
