
Book Description

This book summarizes the landscape of cache replacement policies for CPU data caches.

The emphasis is on algorithmic issues, so the authors begin by defining a taxonomy that places prior policies into two broad categories: coarse-grained and fine-grained policies. Each category is then divided into three subcategories that describe different approaches to the cache replacement problem, with summaries of significant work in each. The book then explores richer considerations, including solutions that optimize for metrics beyond cache miss rate, that are tailored to multi-core settings, that account for interactions with prefetchers, and that target new memory technologies. It concludes by discussing trends and challenges for future work. The book assumes a basic understanding of computer architecture and caches, and will be useful to academics and practitioners alike.
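As an illustrative sketch (not taken from the book), the classic recency-based, coarse-grained policy — LRU, the starting point of Chapter 5 — can be modeled for a single cache set with an ordered map; the class and method names here are hypothetical:

```python
from collections import OrderedDict

class LRUCacheSet:
    """Minimal model of one set of a set-associative cache under LRU
    replacement (a recency-based, coarse-grained policy)."""

    def __init__(self, ways: int):
        self.ways = ways            # associativity of the set
        self.lines = OrderedDict()  # tag -> data; insertion order tracks recency

    def access(self, tag) -> bool:
        """Return True on a hit, False on a miss; update recency state."""
        if tag in self.lines:
            self.lines.move_to_end(tag)     # promote to the MRU position
            return True
        if len(self.lines) >= self.ways:
            self.lines.popitem(last=False)  # evict the LRU line
        self.lines[tag] = None              # insert the new line at MRU
        return False
```

For a 2-way set, the access sequence A, B, A, C evicts B (the least recently used line), so a later access to A hits while one to B misses. The insertion- and promotion-policy variants surveyed in the book generalize exactly these two decisions: where a new line enters the recency order, and how far a hit promotes it.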

Table of Contents

  1. Preface
  2. Acknowledgments
  3. Introduction
  4. A Taxonomy of Cache Replacement Policies
    1. Coarse-Grained Policies
    2. Fine-Grained Policies
    3. Design Considerations
  5. Coarse-Grained Replacement Policies
    1. Recency-Based Policies
      1. Variants of LRU
      2. Beyond LRU: Insertion and Promotion Policies
      3. Extended Lifetime Recency-Based Policies
    2. Frequency-Based Policies
    3. Hybrid Policies
      1. Adaptive Replacement Cache (ARC)
      2. Set Dueling
  6. Fine-Grained Replacement Policies
    1. Reuse Distance Prediction Policies
      1. Expiration-Based Dead Block Predictors
      2. Reuse Distance Ordering
    2. Classification-Based Policies
      1. Sampling Based Dead Block Prediction (SDBP)
      2. Signature Based Hit Prediction (SHiP)
      3. Hawkeye
      4. Perceptron-Based Prediction
      5. Evicted Address Filter (EAF)
    3. Other Prediction Metrics
      1. Economic Value Added (EVA)
  7. Richer Considerations
    1. Cost-Aware Cache Replacement
      1. Memory-Level Parallelism (MLP)
    2. Criticality-Driven Cache Optimizations
      1. Critical Cache
      2. Criticality-Aware Multi-Level Cache Hierarchy
    3. Multi-Core-Aware Cache Management
      1. Cache Partitioning
      2. Shared-Cache-Aware Cache Replacement
    4. Prefetch-Aware Cache Replacement
      1. Cache Pollution
      2. Deprioritizing Prefetchable Lines
    5. Cache Architecture-Aware Cache Replacement
      1. Inclusion-Aware Cache Replacement
      2. Compression-Aware Cache Replacement
    6. New Technology Considerations
      1. NVM Caches
      2. DRAM Caches
  8. Conclusions
  9. Bibliography
  10. Authors' Biographies