In this article we study the amortized efficiency of the “move-to-front” and similar rules for dynamically maintaining a linear list. Under the assumption that accessing the ith element from the front of the list takes Θ(i) time, we show that move-to-front is within a constant factor of optimum among a wide class of list maintenance rules. Other natural heuristics, such as the transpose and frequency count rules, do not share this property. We generalize our results to show that move-to-front is within a constant factor of optimum as long as the access cost is a convex function. We also study paging, a setting in which the access cost is not convex. The paging rule corresponding to move-to-front is the “least recently used” (LRU) replacement rule. We analyze the amortized complexity of LRU, showing that its efficiency differs from that of the off-line paging rule (Belady's MIN algorithm) by a factor that depends on the size of fast memory. No on-line paging algorithm has better amortized performance.
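The two rules analyzed above are simple to state operationally. As a minimal illustrative sketch (not the paper's formalism), move-to-front charges cost i for an access to the ith list element and then moves that element to the front, while LRU evicts the page whose last use is furthest in the past; the function names and the Python cost/fault counters below are our own for illustration:

```python
from collections import OrderedDict

def access_cost_mtf(items, requests):
    """Serve `requests` on a list maintained by move-to-front;
    return the total access cost under the Theta(i) cost model."""
    lst = list(items)
    total = 0
    for r in requests:
        i = lst.index(r)           # 0-based position of the requested item
        total += i + 1             # accessing the (i+1)-th element costs i+1
        lst.insert(0, lst.pop(i))  # move the accessed item to the front
    return total

def lru_faults(capacity, requests):
    """Count page faults under LRU replacement with `capacity` frames."""
    cache = OrderedDict()  # keys ordered from least- to most-recently used
    faults = 0
    for p in requests:
        if p in cache:
            cache.move_to_end(p)           # mark p as most recently used
        else:
            faults += 1                    # p is not in fast memory
            if len(cache) == capacity:
                cache.popitem(last=False)  # evict the least recently used page
            cache[p] = None
    return faults

# An item requested repeatedly becomes cheap once moved to the front:
print(access_cost_mtf("abc", "ccc"))   # 3 + 1 + 1 = 5
print(lru_faults(2, [1, 2, 1, 3, 2]))  # 4 faults: 1, 2, 3, and 2 again
```

The first call shows why move-to-front adapts well to locality of reference: after the initial cost-3 access, every repeat access to the same item costs 1.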
[1] Laszlo A. Belady, et al., "A Study of Replacement Algorithms for a Virtual-Storage Computer," IBM Syst. J., 1966.
[2] Peter J. Denning, et al., Operating Systems Theory, 1973.
[3] Donald E. Knuth, et al., The Art of Computer Programming, Vol. 3: Sorting and Searching, 1974.
[4] Peter A. Franaszek, et al., "Some Distribution-Free Aspects of Paging Algorithm Performance," JACM, 1974.
[5] Ronald L. Rivest, et al., "On self-organizing sequential search heuristics," CACM, 1976.
[6] Jeffrey R. Spirn, et al., Program Behavior: Models and Measurements, 1977.
[7] James R. Bitner, et al., "Heuristics That Dynamically Organize Data Structures," SIAM J. Comput., 1979.
[8] R. Weber, et al., "A counterexample to a conjecture on optimal list ordering," 1982.
[9] Jon Louis Bentley, et al., "Worst-Case Analyses of Self-Organizing Sequential Search Heuristics," 1983.