Critical database size for effective caching

Replicating or caching popular content in memories distributed across the network is a well-known technique for reducing peak network loads. Conventionally, the gain from caching was attributed to making part of the requested data available closer to the end users. It has recently been shown that, by carefully designing the cache placement and coding across the delivered data streams, a much larger reduction in network load can be achieved. Inner and outer bounds on the network-load versus cache-memory tradeoff were obtained in [1]. In this work we give an improved outer bound on this tradeoff. We also address the question of how effective caching remains in reducing the server load when the number of files grows large compared to the number of users, and we show that the gain from caching becomes small once the number of files is comparable to the square of the number of users.
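
For illustration only (this sketch is not part of the paper and does not implement the improved outer bound derived here), the following minimal Python example evaluates the centralized coded-caching rate R(M) = K(1 - M/N)/(1 + KM/N) achieved in [1], the conventional uncoded caching rate K(1 - M/N), and the cut-set lower bound of [1], for hypothetical values of the number of files N, number of users K, and cache size M:

```python
# Minimal numerical sketch of the memory-rate tradeoff discussed above.
# N = number of files, K = number of users, M = cache size (in units of files).
# The formulas below are the achievable rate and cut-set bound of [1];
# the values of N, K, M are illustrative only.

def coded_rate(N, K, M):
    """Centralized coded-caching rate of [1]: R(M) = K*(1 - M/N) / (1 + K*M/N).
    Valid at M in {0, N/K, 2N/K, ..., N}; memory sharing gives the
    lower convex envelope at intermediate cache sizes."""
    return K * (1 - M / N) / (1 + K * M / N)

def uncoded_rate(N, K, M):
    """Conventional (uncoded) caching: each user fetches the uncached
    fraction of its requested file, R(M) = K*(1 - M/N)."""
    return K * (1 - M / N)

def cutset_bound(N, K, M):
    """Cut-set lower bound of [1]:
    R(M) >= max over s in {1, ..., min(N, K)} of  s - s*M / floor(N/s)."""
    return max(s - s * M / (N // s) for s in range(1, min(N, K) + 1))

if __name__ == "__main__":
    N, K = 100, 10  # hypothetical example values
    for M in [0, 10, 20, 50]:
        print(f"M={M:3d}: uncoded={uncoded_rate(N, K, M):6.2f}, "
              f"coded={coded_rate(N, K, M):6.2f}, "
              f"cut-set bound={cutset_bound(N, K, M):6.2f}")
```

The gap between the coded rate and the cut-set bound in such examples is what motivates the search for tighter outer bounds on the tradeoff.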

[1] U. Niesen et al., "Fundamental limits of caching," in Proc. 2013 IEEE International Symposium on Information Theory.

[2] S. N. Diggavi et al., "Hierarchical coded caching," in Proc. 2014 IEEE International Symposium on Information Theory.

[3] R. Ahlswede et al., "Network information flow," IEEE Transactions on Information Theory, 2000.

[4] U. Niesen et al., "Online Coded Caching," IEEE/ACM Transactions on Networking, 2013.

[5] U. Niesen et al., "Decentralized coded caching attains order-optimal memory-rate tradeoff," in Proc. 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[6] U. Niesen et al., "Coded Caching With Nonuniform Demands," IEEE Transactions on Information Theory, 2017.