Research on the Stability of Load Balancing Algorithm for Scalable Parallel Computing

In parallel cluster computing, an unscalable or unstable load balancing algorithm can severely degrade computing performance. To address this problem, this paper proposes a linear dynamic load balancing model and analyzes its stability in the presence of time delays. Based on the analysis, a load balancing gain is used to keep the model stable as the system scale increases. Finally, a more practical nonlinear model is proposed, and simulation results are compared against both the analytical results and other load balancing methods.
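The abstract does not reproduce the model itself, but linear time-delay load balancing models of this kind are typically written as delay differential equations for the node queue lengths. The sketch below is illustrative only: the symbols x_i, y_i, K, p_{ji}, tau_{ij}, and eta_{ij} are assumed notation, not the paper's exact formulation.

    % Illustrative linear time-delay load-balancing model (assumed notation,
    % not the paper's exact equations): x_i(t) is the queue length at node i.
    \dot{x}_i(t) = \lambda_i - \mu_i - K\, y_i(t)
                   + \sum_{j \neq i} p_{ji}\, K\, y_j(t - \tau_{ij}),
    \qquad
    y_i(t) = x_i(t) - \frac{1}{n} \sum_{j=1}^{n} x_j(t - \eta_{ij}).

Here lambda_i and mu_i are the task arrival and processing rates at node i, y_i is node i's delay-skewed estimate of its excess load over the network average, p_{ji} is the fraction of node j's excess load sent to node i, and tau_{ij}, eta_{ij} are load-transfer and measurement delays. In models of this form the stability margin shrinks as the delays and the node count n grow, which is why a load balancing gain K chosen to decrease with system scale can keep the closed loop stable.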
