Contention Resolution with Predictions

In this paper, we consider contention resolution algorithms that are augmented with predictions about the network. We begin by studying the natural setup in which the algorithm is provided a distribution over the possible network sizes that predicts the likelihood of each size occurring. The goal is to leverage the predictive power of this distribution to improve on worst-case time complexity bounds. Using a novel connection between contention resolution and information theory, we prove lower bounds on the expected time complexity with respect to the Shannon entropy of the corresponding network size random variable, both with and without collision detection. We then analyze upper bounds for these settings, now assuming that the distribution provided as input might differ from the actual distribution generating network sizes; we express these bounds with respect to both the entropy and the statistical divergence between the two distributions, allowing us to quantify the cost of poor predictions. Finally, we turn our attention to the related perfect advice setting, parameterized by a length b ≥ 0, in which all active processes in a given execution are provided the best possible b bits of information about their network. We provide tight bounds on the speed-up possible with respect to b for deterministic and randomized algorithms, with and without collision detection. These bounds establish a fundamental limit on the maximum power of any predictive model with a bounded output size.
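
For reference, the abstract leans on two standard information-theoretic quantities; a minimal statement of the definitions assumed here, with N the network size random variable, p its true distribution, and q the predicted distribution given to the algorithm (the abstract does not name a specific divergence, so the Kullback-Leibler divergence is used purely as an illustration):

\[
H(N) \;=\; -\sum_{n} \Pr[N = n]\,\log_2 \Pr[N = n],
\qquad
D_{\mathrm{KL}}(p \,\|\, q) \;=\; \sum_{n} p(n)\,\log_2 \frac{p(n)}{q(n)}.
\]

Intuitively, H(N) measures how much uncertainty about the network size an algorithm must resolve before it can succeed, while the divergence term charges the algorithm for the gap between the predicted and actual size distributions.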
