Direct mapping
Each block of main memory can be placed in exactly one location in the cache.
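A minimal sketch of how a direct-mapped cache picks that one location (the block size and line count here are illustrative, not from the text):

```python
# Hypothetical parameters: 64-byte blocks, 256 cache lines.
BLOCK_SIZE = 64
NUM_LINES = 256

def direct_mapped_line(address):
    """Each memory block maps to exactly one cache line."""
    block_number = address // BLOCK_SIZE
    return block_number % NUM_LINES
```

Note that two addresses whose blocks are a multiple of `NUM_LINES` apart map to the same line, so they evict each other even if the rest of the cache is empty.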

Fully Associative mapping
The replacement policy is free to choose any entry in the cache to hold the copy. This mapping is rarely practical for large caches because it is expensive to implement: on every access, all cache entries must be searched in parallel for the matching tag, which demands considerable hardware complexity.
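In hardware the tag comparison happens in parallel across all entries; a software model can only approximate this with a search over the whole cache, as in this illustrative sketch:

```python
def fully_associative_lookup(cache_tags, tag):
    """Model of a fully associative lookup: the requested tag may
    reside in ANY entry, so every entry must be checked."""
    for i, t in enumerate(cache_tags):
        if t == tag:
            return i   # hit at entry i
    return None        # miss; the replacement policy may then pick any entry
```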
Set Associative mapping
Each block of main memory can go in any one of N entries within a single set of the cache (an N-way set associative cache). This is a compromise between direct mapping and fully associative mapping.
Associativity is a trade-off. If there are ten places the replacement policy can put a new cache entry, then when the cache is checked for a hit, all ten places must be searched. Checking more places takes more power, area, and potentially time. On the other hand, caches with more associativity suffer fewer misses (see conflict misses, below), so that the CPU spends less time servicing those misses. The rule of thumb is that doubling the associativity, from direct mapped to 2-way, or from 2-way to 4-way, has about the same effect on hit rate as doubling the cache size. Associativity increases beyond 4-way have much less effect on the hit rate, and are generally done for other reasons (see virtual aliasing, below).
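The lookup described above, where all N places in a set are checked on each access, can be sketched with a small model. The class, parameters, and LRU replacement policy here are illustrative assumptions, not from the text:

```python
BLOCK_SIZE = 64  # assumed block size

class SetAssociativeCache:
    """Toy N-way set-associative cache with LRU replacement."""

    def __init__(self, num_sets, ways):
        self.num_sets = num_sets
        self.ways = ways
        # Each set holds up to `ways` tags, ordered most- to least-recently used.
        self.sets = [[] for _ in range(num_sets)]

    def access(self, address):
        """Return True on a hit, False on a miss (and fill the entry)."""
        block = address // BLOCK_SIZE
        index = block % self.num_sets   # selects one set
        tag = block // self.num_sets
        entries = self.sets[index]
        if tag in entries:              # all `ways` entries are checked
            entries.remove(tag)
            entries.insert(0, tag)      # move to most-recently-used position
            return True
        if len(entries) == self.ways:
            entries.pop()               # evict the least-recently-used entry
        entries.insert(0, tag)
        return False
```

With `ways=1` this degenerates to a direct-mapped cache; with `num_sets=1` it becomes fully associative, which shows why set associativity sits between the two.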

Figure: Miss rate against cache size