Use of an automatic content analysis tool: A technique for seeing both local and global scope

This paper examines what can be learned about bodies of literature using a concept mapping tool, Leximancer. Statistical content analysis and concept mapping were used to analyse bodies of literature from different domains in three case studies. In the first case study, concept maps were generated and analysed for two closely related document sets: a thesis on language games and the background literature for that thesis. The aim of this case study was to show how concept maps might be used to analyse related document collections for coverage. The two maps overlapped on the concept of "language"; however, the thesis showed a stronger focus on "simulations" and "agents," while other concepts were not as prominent in the thesis map as expected. The study showed how concept maps can help to establish the coverage of the background literature in a thesis. In the second case study, three sets of documents from the domain of conceptual and spatial navigation were collected, each discussing a separate topic: navigational strategies, the brain's role in navigation, and concept mapping. The aim was to explore emergent patterns in a set of related concept maps that may not be apparent from reading the literature alone. Separate concept maps were generated for each topic and also for the combined set of literature. It was expected that each topic would be situated in a different part of the combined map, with the concept of "navigation" central to the map. Instead, the concept of "spatial" was centrally situated, and the areas of the map for the brain and for navigational strategies overlaid the same region. This unexpected structure provided a new perspective on the coverage of the documents. In the third and final case study, a set of documents on sponges, a domain unfamiliar to the reader, was collected from the Internet and then analysed with a concept map. The aim of this case study was to show how a concept map can aid in quickly understanding a new, technically intensive domain. By using the concept map to identify significant concepts and the Internet to look up their definitions, a basic understanding of key terms in the domain was obtained relatively quickly. It was concluded that concept maps are effective for identifying trends within documents and document collections, for performing differential analysis on documents, and as an aid for rapidly gaining an understanding of a new domain by exploring the local detail within the global scope of the textual corpus.
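Concept maps of the kind described above are built from term co-occurrence statistics: concepts that frequently appear together in the same text segments are drawn close together on the map. The sketch below illustrates that underlying idea in a minimal form. It is not Leximancer's actual algorithm (which learns concept thesauri and applies unsupervised semantic mapping); it simply counts sentence-level co-occurrences for a hand-picked seed-term list, and the documents and terms are invented for illustration.

```python
import re
from collections import Counter
from itertools import combinations

def cooccurrence_map(documents, terms):
    """Count how often pairs of seed terms appear in the same sentence.

    A toy stand-in for co-occurrence-based concept mapping: strongly
    co-occurring term pairs would be placed near each other on a map.
    """
    counts = Counter()
    for doc in documents:
        # Crude sentence segmentation; adequate for this illustration.
        for sentence in re.split(r"[.!?]+", doc.lower()):
            present = sorted({t for t in terms if t in sentence})
            for a, b in combinations(present, 2):
                counts[(a, b)] += 1
    return counts

# Hypothetical miniature corpus echoing the case-study themes.
docs = [
    "Agents play language games. Language games ground symbols in agents.",
    "Spatial navigation relies on the brain. The brain builds a cognitive map.",
]
terms = ["language", "agents", "spatial", "brain", "navigation"]

for pair, n in cooccurrence_map(docs, terms).most_common():
    print(pair, n)
```

On this toy corpus, "agents" and "language" co-occur most often, so they would anchor one region of the map, while "brain", "navigation", and "spatial" would cluster in another, mirroring the kind of regional structure the case studies describe.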
