CHAPTER 3 – Management of Cache Contents
A cache's content-management solution has several responsibilities, which are addressed by actions taken by three different components: partitioning heuristics, prefetching heuristics, and locality optimizations.

This chapter opens with a discussion of on-line or run-time techniques, those in which the program's semantics and/or structure are unknown to the heuristic. To optimize program execution, such a heuristic must either make blanket choices that are good in the average case or tailor its choices to the observed behavior of the program. Locality optimizations modify the structure of code and/or data to enhance the forms of locality most easily exploited by the cache system, namely spatial and temporal locality. The chapter illustrates the range of techniques that have been explored in the literature.

It then presents design-time techniques, i.e., those in which the programmer and/or compiler perform some amount of analysis and embed within the application itself mechanisms that manage the memory system directly at run time. Schemes in which hardware and application collaborate, and in which there is no clear on-line/off-line division, are also addressed. Much work in off-line partitioning focuses on embedded systems, because the typical microcontroller or DSP used in such systems has an exposed memory system in which the various storage units are directly addressable by software. Off-line locality optimizations rely only on the (static) information known about an application at compile time. Finally, there are numerous examples of schemes that fashion a synergy between on-line and off-line mechanisms.
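As a concrete illustration of the kind of locality optimization discussed above, the following sketch shows a classic loop-tiling (blocking) transformation applied to matrix multiplication. It is an illustrative example only, not a technique specific to this chapter; the matrix dimension N, the tile size B, and the function name matmul_tiled are assumptions chosen so that a few B-by-B tiles fit in cache and each fetched element is reused many times while still resident.

    /* Illustrative sketch: loop tiling (blocking) as an off-line locality
     * optimization.  N and B are hypothetical sizes; B is chosen so that
     * three B x B tiles fit comfortably in the cache. */
    #include <stddef.h>

    #define N 1024   /* matrix dimension (assumed for illustration) */
    #define B 64     /* tile size (assumed for illustration) */

    /* C is assumed to be zero-initialized by the caller. */
    void matmul_tiled(const double A[N][N], const double Bm[N][N], double C[N][N])
    {
        for (size_t ii = 0; ii < N; ii += B)
            for (size_t kk = 0; kk < N; kk += B)
                for (size_t jj = 0; jj < N; jj += B)
                    /* Operate on B x B tiles so that elements of A, Bm, and C
                     * are reused while they are still resident in cache. */
                    for (size_t i = ii; i < ii + B; i++)
                        for (size_t k = kk; k < kk + B; k++) {
                            double a = A[i][k];
                            for (size_t j = jj; j < jj + B; j++)
                                C[i][j] += a * Bm[k][j];
                        }
    }

The transformation changes only the order in which iterations are performed, not the values computed; its sole purpose is to raise the amount of temporal and spatial reuse visible to the cache.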
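The off-line partitioning style common in embedded systems can be sketched in a similar spirit: because the on-chip storage is directly addressable, the programmer or compiler decides at design time which data lives in the scratchpad, and the application itself performs the data movement. The base address, scratchpad size, and function names below are hypothetical and would come from a real part's datasheet or linker script.

    /* Illustrative sketch: software-managed placement on an exposed memory
     * system.  SCRATCHPAD_BASE and SCRATCHPAD_WORDS are assumed values. */
    #include <stdint.h>
    #include <string.h>

    #define SCRATCHPAD_BASE  0x10000000u   /* hypothetical on-chip SRAM address */
    #define SCRATCHPAD_WORDS 1024u

    /* Hot coefficient table placed in the scratchpad by an off-line decision. */
    static int32_t *const coeff_spm = (int32_t *)SCRATCHPAD_BASE;

    void load_coefficients(const int32_t *coeff_in_dram, uint32_t n)
    {
        if (n > SCRATCHPAD_WORDS)
            n = SCRATCHPAD_WORDS;
        /* The application, not the hardware, moves the data: the storage
         * units are software-visible, so placement is under program control. */
        memcpy(coeff_spm, coeff_in_dram, n * sizeof coeff_spm[0]);
    }

    int32_t dot_product(const int32_t *x, uint32_t n)
    {
        int32_t acc = 0;
        for (uint32_t i = 0; i < n && i < SCRATCHPAD_WORDS; i++)
            acc += coeff_spm[i] * x[i];   /* accesses hit fast on-chip SRAM */
        return acc;
    }

Here the partitioning decision (which data deserves the fast, directly addressable storage) is made entirely at design time, in contrast to the on-line heuristics discussed earlier in the chapter.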