Communication in networks for coordinating behavior

In this work, we develop elements of a theory of coordination in networks using tools from information theory. We ask questions of this nature: If three tasks are to be performed in a shared effort by three people, but one of them is randomly assigned their responsibility, how much must that person tell the others about the assignment? If two players in a multiplayer game wish to collaborate, how should they best use communication to generate their actions? More generally, we ask for the set of all joint distributions p(x1, ..., xm) of actions at the nodes of a network when rate-limited communication is allowed between the nodes. We solve several networks, including arbitrarily large cascade networks. Distributed coordination can be the solution to many problems, such as distributed games and distributed control, and can be used to establish mutual-information bounds on the physical influence of one part of a system on another.
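As a concrete baseline for the first question, here is a minimal sketch of the obvious scheme: the randomly assigned node announces its task, and the other two nodes take the remaining tasks by a fixed deterministic rule. The function name and the sorted tie-breaking rule are illustrative choices, not taken from the paper; whether this announcement rate is actually necessary is the kind of question the theory addresses.

```python
import math
import random

def assign_tasks(seed=None):
    """Naive coordination scheme for the three-task example: node X is
    randomly assigned one of three tasks and announces its assignment;
    nodes Y and Z then take the remaining two tasks in a fixed (sorted)
    order, so no further communication is needed."""
    rng = random.Random(seed)
    x = rng.randrange(3)                  # X's random assignment
    remaining = sorted({0, 1, 2} - {x})
    y, z = remaining                      # deterministic rule at Y and Z
    return x, y, z

# Announcing one of three equally likely tasks costs log2(3) bits
# per assignment under this naive scheme.
rate = math.log2(3)

# Every outcome is a valid division of labor: a permutation of the tasks.
for seed in range(20):
    x, y, z = assign_tasks(seed)
    assert sorted((x, y, z)) == [0, 1, 2]
```

The point of the sketch is only to exhibit an achievable scheme; characterizing the full set of achievable joint action distributions under a given communication rate is the harder problem treated in the work.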
