Polynomial-time Algorithm for Distributed Server Allocation Problem

This paper proposes an algorithm for the distributed server allocation problem, named Minimizing the Maximum Delay (MMD), which obtains an optimal solution when all server-to-server delays take the same constant value. We prove that MMD obtains an optimal solution with polynomial time complexity.
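Since the abstract only states the problem setting, the following Python sketch illustrates the objective rather than the paper's algorithm: it evaluates the maximum user-to-user delay of a candidate user-to-server assignment under the stated assumption that every server-to-server delay equals the same constant c, and checks small instances by brute force. The delay model (user to own server, server to server, server to peer user) and all identifiers (max_pair_delay, d_us, c) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the paper's MMD algorithm): it illustrates the
# max-delay objective under the assumption from the abstract that every
# server-to-server delay equals the same constant c. The delay model and
# all names below are illustrative assumptions.
from itertools import product


def max_pair_delay(assignment, d_us, c):
    """Maximum user-to-user delay for a given user->server assignment.

    assignment[u] is the server index of user u, d_us[u][s] is the
    user-to-server delay, and c is the constant server-to-server delay.
    """
    users = range(len(assignment))
    worst = 0.0
    for u, v in product(users, users):
        if u >= v:
            continue
        su, sv = assignment[u], assignment[v]
        inter = c if su != sv else 0.0  # no server-server hop on the same server
        delay = d_us[u][su] + inter + d_us[v][sv]
        worst = max(worst, delay)
    return worst


def brute_force_optimum(d_us, c):
    """Exhaustive search over all assignments (exponential; for checking only)."""
    n_users, n_servers = len(d_us), len(d_us[0])
    best_val, best_asg = float("inf"), None
    for asg in product(range(n_servers), repeat=n_users):
        val = max_pair_delay(asg, d_us, c)
        if val < best_val:
            best_val, best_asg = val, asg
    return best_val, best_asg


if __name__ == "__main__":
    # Two servers, three users; delay values are illustrative.
    d_us = [[1.0, 4.0],
            [5.0, 2.0],
            [3.0, 3.0]]
    c = 2.0  # constant server-to-server delay
    print(brute_force_optimum(d_us, c))
```

The brute-force search is exponential in the number of users and serves only to make the objective concrete; the paper's contribution is an algorithm that reaches an optimal solution in polynomial time under the constant server-to-server delay assumption.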
