Recurrent neural network classifier for Three Layer Conceptual Network and performance evaluation

Contextual analysis in dialog is a hard problem. In this paper, a three-layer memory structure, which we refer to as the Three Layer Conceptual Network (TLCN), is adopted to address this challenge. This efficient network simulates the human brain through episodic memory, discourse memory, and ground memory. Knowledge is represented using an extended case-structure framework. The knowledge database is constructed from collected target-system information and utterances, and it is updated after every dialog conversation. A recurrent neural network (RNN) classifier is also introduced to classify this knowledge for the target system. The system prototype is based on doctor-patient dialogs and achieves 78% disease classification accuracy. Identification accuracy depends on the number of diseases and the number of utterances; this performance evaluation is discussed in detail.
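To make the classification step concrete, the following is a minimal sketch of an Elman-style recurrent classifier that maps a tokenized utterance to a probability distribution over disease labels. The vocabulary, disease names, weight shapes, and random initialization are all illustrative assumptions for the sketch, not the paper's actual configuration, and training (e.g. backpropagation through time over the knowledge database) is omitted.

```python
import math
import random

random.seed(0)

# Hypothetical symptom vocabulary and disease labels (assumptions, not from the paper).
VOCAB = ["fever", "cough", "headache", "rash", "fatigue"]
DISEASES = ["influenza", "measles", "migraine"]
V, H, C = len(VOCAB), 6, len(DISEASES)

def mat(rows, cols):
    """Small random weight matrix as a list of lists."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

W_xh = mat(H, V)  # input -> hidden
W_hh = mat(H, H)  # hidden -> hidden (the recurrence)
W_hy = mat(C, H)  # hidden -> output

def classify(utterance):
    """Run the Elman recurrence over the utterance tokens, then
    softmax the final hidden state into a distribution over diseases."""
    h = [0.0] * H
    for token in utterance:
        x = [1.0 if w == token else 0.0 for w in VOCAB]  # one-hot input
        h = [math.tanh(sum(W_xh[i][j] * x[j] for j in range(V))
                       + sum(W_hh[i][j] * h[j] for j in range(H)))
             for i in range(H)]
    logits = [sum(W_hy[c][i] * h[i] for i in range(H)) for c in range(C)]
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

probs = classify(["fever", "cough", "fatigue"])
label = DISEASES[probs.index(max(probs))]
```

With untrained random weights the output distribution is near-uniform; in the paper's setting the weights would be learned from the doctor-patient utterance corpus before classification.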
