An Analysis of a Memory Allocation Scheme for Implementing Stacks

Consider the implementation of two stacks that grow toward each other in a table of size $m$. Suppose a random sequence of insertions and deletions is executed, with each operation having a fixed probability $p$ ($0 < p < 1/2$) of being a deletion. Let $A_p(m)$ denote the expected value of $\max\{x, y\}$, where $x$ and $y$ are the two stack heights when the table first becomes full. We shall prove that, as $m \rightarrow \infty$, $A_p(m) = m/2 + \sqrt{m/(2\pi(1-2p))} + O((\log m)/\sqrt{m})$. This solves an open problem posed in D. E. Knuth, The Art of Computer Programming, Vol. 1, exercise 2.2.2-13.
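
To make the model concrete, the following Monte Carlo sketch (not part of the paper; a minimal illustration under stated assumptions) simulates the process and compares the sample mean of $\max\{x, y\}$ with the asymptotic formula. It assumes that each operation targets one of the two stacks uniformly at random and that a deletion aimed at an empty stack is a no-op; the paper's exact conventions on these points may differ. Note that $x + y = m$ when the table first becomes full, so $\max\{x, y\} = m/2 + |x - y|/2$, which is why the leading term is $m/2$.

    import math
    import random

    def simulate_max_height(m, p, rng):
        """One trial of the two-stack model in a table of size m.

        Modeling assumptions (not fixed by the abstract): each operation
        targets one of the two stacks uniformly at random, and is a
        deletion with probability p (a no-op if that stack is empty),
        an insertion otherwise.  Returns max(x, y) at the first moment
        the table becomes full, i.e. when x + y == m.
        """
        x = y = 0
        while x + y < m:
            first = rng.random() < 0.5      # which stack this operation targets
            if rng.random() < p:            # deletion
                if first and x > 0:
                    x -= 1
                elif not first and y > 0:
                    y -= 1
            else:                           # insertion
                if first:
                    x += 1
                else:
                    y += 1
        return max(x, y)

    def estimate_A(m, p, trials=1000, seed=1):
        """Sample mean of max(x, y) over independent trials."""
        rng = random.Random(seed)
        return sum(simulate_max_height(m, p, rng) for _ in range(trials)) / trials

    if __name__ == "__main__":
        m, p = 2000, 0.25
        predicted = m / 2 + math.sqrt(m / (2 * math.pi * (1 - 2 * p)))
        print(f"simulated A_p(m) ~ {estimate_A(m, p):.1f}")
        print(f"asymptotic formula {predicted:.1f}")

For $m = 2000$ and $p = 0.25$ the formula predicts roughly $1000 + \sqrt{2000/\pi} \approx 1025.2$, and the simulated mean should land within a few units of that value.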