Associative Cache – Set-Associative Mapping

Set-associative cache architecture is a hybrid of two mapping schemes:

  • Direct-mapped cache (1 location per block)
  • Fully associative cache (any location per block)

In set-associative cache, the cache is divided into multiple sets, and each set contains multiple lines (called ways).

  • A block of main memory can map to any line (way) within a set.
  • This improves flexibility and reduces thrashing.

Example: 4-way Set Associative Cache

  • Total cache size: 4 KB
  • Cache line size: 4 words
  • Total lines: 256
  • Number of ways: 4
  • Lines per way: 64 ➝ so there are 64 sets
    Each set therefore contains 4 lines (one from each way).

Instead of a single candidate cache line for each main memory block (as in a direct-mapped cache), there are now 4 candidate lines (one per way) within the set it maps to.
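
As a rough sketch of how those numbers fit together (assuming 4-byte words and 32-bit byte addresses, which the example above does not state), the field widths work out like this:

```c
#include <stdio.h>

/* Assumed geometry for the 4 KB, 4-way example above.
 * 4-byte words and 32-bit byte addresses are assumptions. */
#define CACHE_SIZE  4096u                      /* 4 KB                     */
#define LINE_SIZE   (4u * 4u)                  /* 4 words x 4 bytes = 16 B */
#define NUM_WAYS    4u
#define NUM_LINES   (CACHE_SIZE / LINE_SIZE)   /* 256 lines                */
#define NUM_SETS    (NUM_LINES / NUM_WAYS)     /* 64 sets                  */

static unsigned log2u(unsigned x)              /* x must be a power of two */
{
    unsigned n = 0;
    while (x > 1) { x >>= 1; n++; }
    return n;
}

int main(void)
{
    unsigned offset_bits = log2u(LINE_SIZE);               /* 4  */
    unsigned index_bits  = log2u(NUM_SETS);                /* 6  */
    unsigned tag_bits    = 32 - index_bits - offset_bits;  /* 22 */

    printf("lines=%u sets=%u offset=%u index=%u tag=%u\n",
           NUM_LINES, NUM_SETS, offset_bits, index_bits, tag_bits);
    return 0;
}
```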


Main Memory Mapping (as in Fig. 5.8)

  • A memory address is divided into:
    • Tag field
    • Set index
    • Block offset / Data index

All memory blocks with the same Set Index go to the same Set.

But inside that set, they can go into any of the N ways, improving flexibility and avoiding frequent evictions.
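
Under the same assumed geometry (16-byte lines, 64 sets, 32-bit byte addresses), the three address fields can be extracted like this; the function names are purely illustrative:

```c
#include <stdio.h>
#include <stdint.h>

/* Assumed geometry taken from the 4-way example: 16-byte lines and
 * 64 sets; 32-bit byte addresses and 4-byte words are assumptions.  */
#define OFFSET_BITS 4   /* log2(16 bytes per line) */
#define INDEX_BITS  6   /* log2(64 sets)           */

static uint32_t block_offset(uint32_t addr) { return addr & ((1u << OFFSET_BITS) - 1); }
static uint32_t set_index(uint32_t addr)    { return (addr >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1); }
static uint32_t tag_field(uint32_t addr)    { return addr >> (OFFSET_BITS + INDEX_BITS); }

int main(void)
{
    uint32_t addr = 0x00008040;   /* arbitrary illustrative address */
    printf("tag=0x%x set=%u offset=%u\n",
           tag_field(addr), set_index(addr), block_offset(addr));
    return 0;
}
```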

Working:

  1. Set Index: Selects one set among all sets.
  2. Tag Comparison: Compares input tag with all 4 tags inside the set (Way 0–3).
  3. Cache Hit:
    • If a match is found in any tag ➝ Data is served from that line.
  4. Cache Miss:
    • If no match ➝ Cache controller loads data from main memory into one of the 4 ways.
    • A Least Recently Used (LRU) or similar replacement policy selects which way to evict (a minimal sketch of this flow follows below).
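
A minimal software sketch of this lookup-and-refill flow, assuming the same geometry (64 sets, 4 ways, 16-byte lines) and a simple counter-based LRU; a real cache controller does all of this in hardware:

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_SETS     64
#define NUM_WAYS     4
#define OFFSET_BITS  4
#define INDEX_BITS   6

typedef struct {
    bool     valid;
    uint32_t tag;
    uint32_t lru;     /* higher value = used longer ago */
} cache_line_t;

static cache_line_t cache[NUM_SETS][NUM_WAYS];

/* Returns true on a hit; on a miss, the least recently used way in the
 * selected set is (notionally) refilled from main memory.              */
static bool cache_access(uint32_t addr)
{
    uint32_t set = (addr >> OFFSET_BITS) & (NUM_SETS - 1);
    uint32_t tag = addr >> (OFFSET_BITS + INDEX_BITS);
    int way = -1, victim = 0;

    /* Compare the incoming tag with all 4 ways of the selected set. */
    for (int w = 0; w < NUM_WAYS; w++) {
        if (cache[set][w].valid && cache[set][w].tag == tag)
            way = w;                              /* tag match ➝ hit     */
        if (cache[set][w].lru > cache[set][victim].lru)
            victim = w;                           /* track LRU candidate */
    }

    bool hit = (way >= 0);
    if (!hit) {                                   /* miss: refill LRU way */
        way = victim;
        cache[set][way].valid = true;
        cache[set][way].tag   = tag;
        /* ...a real controller fetches the 16-byte line here...          */
    }

    /* Simple LRU bookkeeping: age the set, mark the used way as newest.  */
    for (int w = 0; w < NUM_WAYS; w++)
        cache[set][w].lru++;
    cache[set][way].lru = 0;

    return hit;
}

int main(void)
{
    uint32_t addr = 0x00008040;          /* arbitrary illustrative address */
    bool first  = cache_access(addr);    /* miss: line gets filled          */
    bool second = cache_access(addr);    /* hit: tag now matches in its way */
    return (!first && second) ? 0 : 1;
}
```

The counter-based aging used here is just one way to approximate LRU; real hardware typically keeps a small number of status bits per set instead.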

Thrashing Reduction

  • In direct-mapped cache, multiple memory blocks compete for the same line ➝ leads to thrashing.
  • In set-associative cache, memory blocks share a whole set rather than a single line ➝ fewer evictions ➝ reduced thrashing.

Example:
If routines A and B map to the same set:

  • Direct-mapped ➝ they evict each other repeatedly.
  • 4-way set-associative ➝ A in Way 0, B in Way 1 ➝ no eviction (see the sketch below).
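
Concretely, under the assumed geometry, two hypothetical start addresses for A and B that differ only in their tag bits resolve to the same set index:

```c
#include <stdio.h>
#include <stdint.h>

#define OFFSET_BITS 4
#define INDEX_BITS  6

int main(void)
{
    /* Hypothetical start addresses of routines A and B: they differ only
     * above bit 9, so their set index (bits 4..9) is identical.          */
    uint32_t a = 0x00008040;   /* set index = 4, tag = 0x20 */
    uint32_t b = 0x0000C040;   /* set index = 4, tag = 0x30 */

    uint32_t set_a = (a >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1);
    uint32_t set_b = (b >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1);

    /* Direct-mapped: same index ➝ same single line ➝ mutual eviction.
     * 4-way set-associative: same set, but A and B can sit in different
     * ways of that set and stay resident together.                       */
    printf("set(A)=%u set(B)=%u\n", set_a, set_b);
    return 0;
}
```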

Higher Associativity – CAM-based Cache (Fig. 5.9)

  • More associativity ➝ more flexibility in placement ➝ fewer cache misses
  • Example: 64-way set-associative cache
  • Uses Content Addressable Memory (CAM) to match tags in parallel
