Inductive inference from noisy examples using the hybrid finite state filter
[1] Mikel L. Forcada, et al. Second-Order Recurrent Neural Networks Can Learn Regular Grammars from Noisy Strings, 1995, IWANN.
[2] Mike Casey, et al. The Dynamics of Discrete-Time Computation, with Application to Recurrent Neural Networks and Finite State Machine Extraction, 1996, Neural Computation.
[3] Hava T. Siegelmann, et al. On the Computational Power of Neural Nets, 1995, J. Comput. Syst. Sci.
[4] C. Lee Giles, et al. Rule Revision With Recurrent Neural Networks, 1996, IEEE Trans. Knowl. Data Eng.
[5] C. Lee Giles, et al. Extraction, Insertion and Refinement of Symbolic Rules in Dynamically Driven Recurrent Neural Networks, 1993.
[6] Raymond L. Watrous, et al. Induction of Finite-State Languages Using Second-Order Recurrent Networks, 1992, Neural Computation.
[7] Jing Peng, et al. An Efficient Gradient-Based Algorithm for On-Line Training of Recurrent Network Trajectories, 1990, Neural Computation.
[8] C. Lee Giles, et al. Experimental Comparison of the Effect of Order in Recurrent Neural Networks, 1993, Int. J. Pattern Recognit. Artif. Intell.