Analysis and simulation of multicast join delay

Multicast join delay is an important metric for evaluating the performance of multicast services, yet it has received little analytic study. In this paper, we convert the multicast join delay problem into the average node-to-tree distance problem in a graph and propose an area-overlay method to analyze multicast join delay across different topology sizes and multicast densities. We set up a simulation platform to evaluate the analytic result, and the simulations show that it accurately represents multicast join delay at lower multicast densities and larger topology sizes. The analytic result can be used to determine the number of servers (such as multicast routers) in a network and thereby achieve a good trade-off between delay and load.
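The node-to-tree distance view can be made concrete with a small sketch. The following is an illustrative computation, not the paper's area-overlay method: given a hypothetical topology as an adjacency map and a set of nodes already on the multicast tree, a multi-source BFS yields each off-tree node's hop distance to the nearest on-tree node, and averaging those distances gives the quantity the paper analyzes. The toy graph and node labels below are invented for illustration.

```python
from collections import deque

def avg_node_to_tree_distance(adj, tree_nodes):
    """Multi-source BFS seeded at all on-tree nodes; returns the
    average hop distance from each off-tree node to the nearest
    on-tree node (a proxy for multicast join delay in hops)."""
    dist = {v: 0 for v in tree_nodes}
    q = deque(tree_nodes)
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:          # first visit = shortest hop count
                dist[w] = dist[u] + 1
                q.append(w)
    off_tree = [v for v in adj if v not in tree_nodes]
    return sum(dist[v] for v in off_tree) / len(off_tree)

# Hypothetical 6-node topology; the multicast tree spans nodes {0, 1}.
adj = {
    0: [1, 2], 1: [0, 3], 2: [0, 4],
    3: [1, 5], 4: [2], 5: [3],
}
print(avg_node_to_tree_distance(adj, {0, 1}))  # nodes 2,3 at 1 hop; 4,5 at 2 hops -> 1.5
```

Growing the tree set shrinks this average, which is the delay-versus-server-load trade-off the abstract refers to.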
