Communication management using abstraction in distributed Bayesian networks

In previous work, techniques were developed for managing communication in a controlled, satisficing manner in two-layer distributed Bayesian networks, using DEC-MDPs to sequence the transferred information so as to guarantee a required confidence level. In this paper, we introduce multiple abstraction layers into the distributed Bayesian network so that transmitted data carries more useful information, further reducing the number of messages that must be sent. We develop an algorithm that automatically generates appropriate abstraction data, and we introduce techniques for effectively incorporating this abstraction data into the DEC-MDP framework. We show that the appropriate addition of abstraction-data actions simplifies the DEC-MDP while reducing the expected communication cost. This work provides a formal view of the use of abstraction in agent cooperation and begins to give an understanding of when the less abstract data needs to be transmitted.
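The core trade-off the abstract describes can be illustrated with a small sketch: an agent decides which data to transmit, preferring cheap abstract summaries and falling back to more detailed (and more expensive) data only when the abstraction alone cannot reach the required confidence level. This is not the paper's algorithm; the greedy selection rule, the action names, and all cost and confidence numbers below are hypothetical stand-ins for the DEC-MDP policy.

```python
# Illustrative sketch (not the paper's algorithm): an agent chooses among
# messages at different abstraction levels, preferring cheaper abstract
# data and adding detail only until the confidence threshold is met.
# All names, costs, and confidence gains are hypothetical.

def choose_messages(actions, required_confidence):
    """Greedily pick the cheapest actions first until the accumulated
    confidence meets the threshold. Each action is a
    (name, cost, confidence_gain) tuple."""
    plan, confidence, cost = [], 0.0, 0
    # Sort so cheaper (more abstract) messages are considered first.
    for name, c, gain in sorted(actions, key=lambda a: a[1]):
        if confidence >= required_confidence:
            break
        plan.append(name)
        confidence += gain
        cost += c
    return plan, confidence, cost

actions = [
    ("abstract_summary", 1, 0.60),  # coarse, cheap to send
    ("partial_detail",   3, 0.25),  # finer-grained evidence
    ("full_detail",      8, 0.15),  # raw data, most expensive
]

plan, conf, cost = choose_messages(actions, required_confidence=0.8)
# With these hypothetical numbers, the abstract summary alone is not
# enough (0.6 < 0.8), so one level of additional detail is sent, and
# the most expensive raw data is never transmitted.
```

The point of the sketch is the one the abstract makes: when abstraction carries enough information, the most detailed (and costliest) messages can be avoided entirely, lowering expected communication cost.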
