 

Can a lower level cache have higher associativity and still hold inclusion?


Suppose we have two levels of cache (L1 being nearest to the CPU and L2 being nearest to main memory). The L1 cache is 2-way set associative with 4 sets, and the L2 cache is direct mapped with 16 cache lines; assume both caches have the same block size. Then I think the hierarchy will follow the inclusion property even though L1 (the lower level) has higher associativity than L2 (the upper level).

As per my understanding, the lower level cache can have higher associativity (and still hold inclusion). This only changes the number of tag bits (as seen in the physical address at each level) and the number of comparators and MUXes used. Please let me know if this is correct.
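To make the bit arithmetic concrete, here is a small sketch (assuming 32-bit physical addresses and 64-byte blocks, which are not specified above) that splits an address into tag and set index for this hypothetical hierarchy, and shows two addresses that can coexist in one 2-way L1 set yet collide on the same direct-mapped L2 line:

```python
# Illustrative sketch only. Assumed parameters: 32-bit physical addresses,
# 64-byte blocks. L1: 2-way set associative, 4 sets. L2: direct mapped, 16 lines.

BLOCK_BITS = 6        # 64-byte blocks -> 6 offset bits (assumption)
L1_SETS = 4           # 2-way, so 2 index bits and 24 tag bits
L2_SETS = 16          # direct mapped, so 4 index bits and 22 tag bits

def split(addr, num_sets):
    """Return (tag, set_index) for a simple modulo-indexed cache."""
    index_bits = num_sets.bit_length() - 1        # log2 for a power of two
    set_index = (addr >> BLOCK_BITS) & (num_sets - 1)
    tag = addr >> (BLOCK_BITS + index_bits)
    return tag, set_index

a, b = 0x0000_0000, 0x0001_0000   # differ only in the high (tag) bits

for addr in (a, b):
    print(hex(addr), "L1 ->", split(addr, L1_SETS), "L2 ->", split(addr, L2_SETS))

# Both addresses map to L1 set 0, where a 2-way set can hold them together,
# but they also map to the same direct-mapped L2 line, so caching the second
# evicts the first from L2; an inclusive hierarchy must then evict it from L1 too.
```

So the tag width does differ between the levels (24 bits in L1 versus 22 in L2 under these assumptions), but whether inclusion holds depends on what happens on collisions like this one.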

asked Dec 18 '19 by rohith


People also ask

How does cache associativity affect the cache performance?

Set sizes range from 1 (direct mapped) to 2^k (fully associative). Larger sets and higher associativity lead to fewer cache conflicts and lower miss rates, but they also increase the hardware cost. In practice, 2-way through 16-way set-associative caches strike a good balance between lower miss rates and higher costs.

How does cache increase associativity?

One method used by hardware designers to increase the set associativity of a cache includes a content addressable memory (CAM). A CAM uses a set of comparators to compare the input tag address with a cache-tag stored in each valid cache line. A CAM works in the opposite way a RAM works.

What is inclusion property of cache memory?

Inclusion property: it implies that all information items are originally stored in the outermost level Mn. During processing, subsets of Mn are copied into Mn-1; similarly, subsets of Mn-1 are copied into Mn-2, and so on.

What is the associativity of a cache?

In a set associative cache, there are a fixed number of locations (called a set) that a given address may be stored in. The number of locations in each set is the associativity of the cache.
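As a concrete relation (standard cache arithmetic, with example parameters picked for illustration rather than taken from the snippet): the number of sets is the total number of lines divided by the associativity.

```python
# Assumed example: a 32 KiB cache with 64-byte lines, 8-way set associative.
CACHE_BYTES, LINE_BYTES, WAYS = 32 * 1024, 64, 8

lines = CACHE_BYTES // LINE_BYTES   # 512 lines in total
sets = lines // WAYS                # 64 sets, i.e. 6 set-index bits
print(lines, sets)
```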


2 Answers

Inclusion is a property that is enforced on the contents of the included cache, and is independent of the associativity of the caches. Inclusion provides the benefit of eliminating most snoops of the included cache, which allows an implementation to get away with reduced tag bandwidth, and may also result in reduced snoop latency.

Intuitively, it makes sense that when the enclosing cache has more associativity than the included cache, the contents of the included cache should always "fit" into the enclosing cache. This "static" view is an inappropriate oversimplification. Differences in replacement policy and access patterns will almost always generate cases in which lines are chosen as victim in the enclosing cache before being chosen as victim in the included cache. The inclusion policy requires that such lines be evicted from the included cache -- independent of associativity.

The case that is intuitively more problematic occurs when the enclosing cache has less associativity than the included cache. In this case it is clear that associativity conflicts in the enclosing cache will force evictions from the included cache.
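As a toy sketch of that enforcement (my own illustration, assuming LRU within each set, which the answer does not specify): whenever the enclosing cache chooses a victim, the included cache is back-invalidated, regardless of how recently it touched the line.

```python
# Toy model of inclusion enforced by back-invalidation (illustrative only).
# Lines are identified by their block address; each level uses LRU within a set.

class ToyCache:
    def __init__(self, num_sets, ways):
        self.num_sets, self.ways = num_sets, ways
        self.sets = [[] for _ in range(num_sets)]   # most recently used first

    def _set_of(self, block):
        return block % self.num_sets

    def access(self, block):
        """Touch a block; return an evicted block address, or None."""
        lines = self.sets[self._set_of(block)]
        if block in lines:
            lines.remove(block)
        lines.insert(0, block)
        return lines.pop() if len(lines) > self.ways else None

    def invalidate(self, block):
        lines = self.sets[self._set_of(block)]
        if block in lines:
            lines.remove(block)

def inclusive_access(l1, l2, block):
    l1.access(block)
    victim = l2.access(block)
    if victim is not None:
        # Inclusion policy: a block evicted from the enclosing (L2) cache must
        # also leave the included (L1) cache, however recently L1 touched it.
        l1.invalidate(victim)

# The question's configuration: L1 = 2-way x 4 sets, L2 = direct mapped x 16 lines.
l1, l2 = ToyCache(4, 2), ToyCache(16, 1)
inclusive_access(l1, l2, 0)    # block 0 now resident in both levels
inclusive_access(l1, l2, 16)   # same L2 line as block 0, same L1 set
assert 0 not in l1.sets[0]     # block 0 was back-invalidated although L1 had room
```

Giving the two levels different replacement policies, or letting the enclosing cache age out a line the included cache keeps hitting, produces exactly the forced evictions described above.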

In either case, judging whether the additional evictions from the included cache outweigh the benefits of inclusion is multi-dimensional. The performance impact will depend on the specific sizes, associativities, and indexing of the caches, as well as on the application access patterns. The importance of the performance impact depends on the application characteristics -- e.g., tightly-coupled parallel applications typically show throughput proportional to the worst-case performance of the participating processors, while independent applications typically show throughput proportional to the average performance of the participating processors.

answered Sep 28 '22 by John D McCalpin


Yes, but conflict evictions in the outer cache may force eviction from the inner cache to maintain inclusivity.

(I think if both caches use simple indexing, you wouldn't have those forced evictions with an outer cache that's larger and at least as associative, because aliasing in the outer cache would only happen when the lines also alias in the inner one.)

With the outer cache being larger you don't necessarily get aliasing in it for lines that would alias in L1, so it's not useless.
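A quick check of the indexing part of that argument (my own sketch, assuming power-of-two set counts and simple modulo indexing): colliding in an outer cache with more sets implies colliding in the inner cache, but not the other way around.

```python
# Assumed set counts for illustration; both powers of two, simple modulo indexing.
INNER_SETS, OUTER_SETS = 4, 16

def collide(a, b, num_sets):
    """True if block addresses a and b map to the same set."""
    return a % num_sets == b % num_sets

# Exhaustive check over a range of block addresses: a collision in the larger
# outer cache always means a collision in the smaller inner cache...
assert all(collide(a, b, INNER_SETS)
           for a in range(256) for b in range(256)
           if collide(a, b, OUTER_SETS))

# ...but not the converse: blocks 0 and 4 share an inner set
# while landing in different outer sets.
assert collide(0, 4, INNER_SETS) and not collide(0, 4, OUTER_SETS)
print("outer-set collision implies inner-set collision, not vice versa")
```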

But it is unusual: usually outer caches are larger and more associative than inner caches because they don't have to be as fast, and high hit rate is more valuable.

If you were going to make the outer cache less associative than an inner cache, making it NINE (non-inclusive non-exclusive) might be a better idea. But you only asked if it was possible.

answered Sep 28 '22 by Peter Cordes