On-chip learning of FPGA-inspired neural nets

Neural networks are usually regarded as naturally parallel computing models, but the number of operators and the complex connection graphs of standard neural models cannot be handled directly by digital hardware devices. A new theoretical and practical framework reconciles simple hardware topologies with complex neural architectures: field programmable neural arrays (FPNAs) lead to powerful neural architectures that are easy to map onto digital hardware, thanks to a simplified topology and an original data exchange scheme. This paper focuses on a class of synchronous FPNAs, for which an efficient implementation with on-chip learning is described. Application and implementation results are discussed.
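The core FPNA idea sketched above is that a neuron is decomposed into independent, freely connectable resources: affine communication links and activation units, each forwarding its output to all of its successors. A toy sketch of this decomposition follows; the class names, the sigmoid choice, and the wiring are illustrative assumptions, not the paper's actual definitions or hardware mapping.

```python
import math

class Link:
    """Communication-link resource: applies an affine map w*x + t
    and forwards the result to every successor resource."""
    def __init__(self, w, t=0.0):
        self.w, self.t = w, t
        self.successors = []  # links or activators fed by this link

    def receive(self, x):
        y = self.w * x + self.t
        for s in self.successors:
            s.receive(y)

class Activator:
    """Activation resource (neuron body): accumulates incoming
    values, then applies a sigmoid and forwards the result."""
    def __init__(self):
        self.acc = 0.0
        self.successors = []

    def receive(self, x):
        self.acc += x

    def fire(self):
        y = 1.0 / (1.0 + math.exp(-self.acc))
        for s in self.successors:
            s.receive(y)
        return y

# Tiny example: two links share one activator. Because links are
# resources of the graph rather than private neuron weights, a
# sparse hardware-friendly topology can emulate denser connectivity.
l1, l2 = Link(0.5), Link(-0.5)
a = Activator()
l1.successors.append(a)
l2.successors.append(a)
l1.receive(1.0)
l2.receive(1.0)
out = a.fire()  # sigmoid(0.5 - 0.5) = sigmoid(0) = 0.5
```

This sketch only models the forward data exchange; the synchronous scheduling and on-chip learning the paper describes are not represented here.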
