Stochastic behavior of the inter-drop time in an M-frame-buffer video decoding scenario

We study the stochastic behavior of the inter-drop time in an M-frame-buffer scheme. The problem arises in digital television signal decoding. We model the buffer occupancy as a random walk with a lower bound; a drop occurs when the walk exceeds a predetermined upper bound. Our main result is that, under certain conditions, the tail distribution of the inter-drop time is nearly geometric. Based on this result, we develop numerical approximations of two important quantities: the mean inter-drop time and the mean drop frequency. Simulation results verify our conclusions.
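The model described above can be illustrated with a minimal Monte Carlo sketch. The step distribution (a ±1 walk with up-probability `p_up`), the bound values, and all function names below are illustrative assumptions, not the paper's exact parameterization:

```python
import random

def simulate_interdrop_times(p_up=0.45, upper=8, n_drops=2000, seed=1):
    """Simulate a random walk with a reflecting lower bound at 0.

    Assumed dynamics (for illustration only): each step the walk moves
    +1 with probability p_up, else -1, but is truncated at the lower
    bound 0. A "drop" occurs when the walk exceeds `upper`; the walk
    then resets to 0 and the elapsed inter-drop time is recorded.
    """
    random.seed(seed)
    times = []
    pos, t = 0, 0
    while len(times) < n_drops:
        t += 1
        step = 1 if random.random() < p_up else -1
        pos = max(0, pos + step)  # enforce the lower bound
        if pos > upper:           # drop event: record time and reset
            times.append(t)
            pos, t = 0, 0
    return times

times = simulate_interdrop_times()
mean_T = sum(times) / len(times)  # empirical mean inter-drop time
drop_freq = 1.0 / mean_T          # empirical mean drop frequency
```

A histogram of `times` can then be inspected for the nearly geometric tail: for a geometric tail, the ratio of successive tail probabilities P(T > t+1)/P(T > t) is approximately constant for large t.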