The Effect of Page Allocation on Caches

Medium-to-large, physically indexed, low-associativity caches, in which physical page-number bits index the cache, present two problems. First, the cache miss rate varies between runs of the same program, because the location of data in the cache depends on where virtual pages are placed in physical memory. Second, the virtual-to-physical address translation must precede cache indexing, increasing latency. This paper summarizes simulation results for instruction, data, and unified caches under conventional page allocation, and explores improving the mean miss rate by controlling (coloring) page allocation. A stricter page-coloring algorithm also reduces latency by allowing cache indexing to precede address translation.

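The following is a minimal sketch of cache-conscious page coloring, illustrating the idea summarized above; the page size, cache size, toy frame pool, and helper names are assumptions made for the example, not the paper's exact algorithm. A page's color is taken from the page-number bits that also index a physically indexed cache, and the allocator hands out a physical frame whose color matches the faulting virtual page, so the page maps to the same cache sets whether indexed by its virtual or its physical address.

```c
/*
 * Illustrative page-coloring sketch (assumed parameters, not from the paper).
 * Color = the page-number bits that also select the cache set in a
 * physically indexed cache; the allocator returns a frame of matching color.
 */
#include <stdint.h>
#include <stdio.h>

#define PAGE_SHIFT  12u                          /* 4 KiB pages (assumption)      */
#define CACHE_BYTES (1u << 20)                   /* 1 MiB direct-mapped cache     */
#define NUM_COLORS  (CACHE_BYTES >> PAGE_SHIFT)  /* cache pages = number of colors */
#define NUM_FRAMES  4096u                        /* toy physical memory           */

/* Round-robin cursor per color over a toy frame pool (hypothetical allocator). */
static uint32_t next_free[NUM_COLORS];

static unsigned page_color(uintptr_t addr)
{
    return (unsigned)((addr >> PAGE_SHIFT) & (NUM_COLORS - 1));
}

/* Pick a physical frame whose color matches the faulting virtual page. */
static uint32_t alloc_colored_frame(uintptr_t vaddr)
{
    unsigned color = page_color(vaddr);
    /* Frame numbers congruent to `color` modulo NUM_COLORS index the same
       cache region, so stepping by NUM_COLORS stays within one color. */
    uint32_t frame = color + next_free[color] * NUM_COLORS;
    next_free[color] = (next_free[color] + 1) % (NUM_FRAMES / NUM_COLORS);
    return frame;
}

int main(void)
{
    uintptr_t vaddr = 0x00403000u;               /* example virtual address      */
    uint32_t  frame = alloc_colored_frame(vaddr);
    printf("virtual color %u -> frame %u (frame color %u)\n",
           page_color(vaddr), (unsigned)frame, (unsigned)(frame % NUM_COLORS));
    return 0;
}
```

Because a strictly colored frame shares its color bits with the virtual page, those index bits are already known before translation completes, which is how the stricter coloring described in the abstract lets cache indexing precede address translation.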