In this paper, an approach to developing representations for embodied ontologies used by other agents is proposed. Agents are assumed to possess their own systems of meaning, captured in the form of a strictly private ontology. Therefore, in order to communicate successfully, an agent should be able to circumscribe the meaning assigned by other agents to incoming symbols of the communication language. Simple correlation analysis between symbols incoming from other agents and the associated states of the external world allows an agent to develop an internal reflection of the meaning that senders assign to uttered language symbols. Referring to this representation of other agents' embodied ontologies, the agent may determine the level of consistency between the language-symbol semantics assigned by particular agents. The novelty of this work lies in the fact that the language symbols used by communicating agents are assumed to be grounded in agent experience. As a direct consequence, the introduced process of determining representations of other agents' embodied ontologies can be formalized.
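The correlation analysis described above can be illustrated with a minimal sketch. The class below is a hypothetical illustration (the names `OntologyMirror`, `observe`, `meaning`, and `consistency` are not from the paper): an observing agent tallies co-occurrences between each sender's uttered symbols and the world states it perceives at utterance time, treats the resulting empirical distribution as its reflection of that sender's symbol meaning, and measures cross-sender consistency as the overlap between two such distributions.

```python
from collections import Counter, defaultdict


class OntologyMirror:
    """Hypothetical sketch: approximate other agents' private symbol
    meanings by correlating uttered symbols with observed world states."""

    def __init__(self):
        # counts[sender][symbol] -> Counter over co-occurring world states
        self.counts = defaultdict(lambda: defaultdict(Counter))

    def observe(self, sender, symbol, world_state):
        """Record one utterance of `symbol` by `sender` alongside the
        world state perceived by the observing agent."""
        self.counts[sender][symbol][world_state] += 1

    def meaning(self, sender, symbol):
        """Empirical distribution over world states co-occurring with
        `symbol` as used by `sender` -- the observer's internal
        reflection of that sender's grounded meaning."""
        c = self.counts[sender][symbol]
        total = sum(c.values())
        return {s: n / total for s, n in c.items()} if total else {}

    def consistency(self, sender_a, sender_b, symbol):
        """Overlap coefficient (sum of pointwise minima) between two
        senders' inferred meanings for `symbol`; 1.0 means the symbol
        is used over identical state distributions."""
        pa = self.meaning(sender_a, symbol)
        pb = self.meaning(sender_b, symbol)
        return sum(min(pa.get(s, 0.0), pb.get(s, 0.0))
                   for s in set(pa) | set(pb))
```

For example, if agent A utters "red" twice while an apple is perceived and once near fire, while agent B utters "red" only near apples, the overlap of their inferred meanings is 2/3, quantifying the level of semantic consistency between the two senders.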