Caching Concepts

Understand CPU Caching Concepts

Transcript of Caching Concepts

Page 1: Caching Concepts


Understand CPU Caching Concepts

Page 2: Caching Concepts


The need for a cache has come about for two reasons:

The concept of Locality of reference.

-> 5 percent of the data is accessed 95 percent of the time, so it makes sense to cache that 5 percent of the data.

The gap between CPU and main memory speeds.

-> By analogy with the producer-consumer problem, the CPU is the consumer and RAM and hard disks act as producers. Slow producers limit the performance of the consumer.

Concept of Caching

Page 3: Caching Concepts


CPU Cache and its operation

A CPU cache is a smaller, faster memory which stores copies of the data from the most frequently used main memory locations. The concept of locality of reference drives caching: we cache the most frequently used data and instructions for faster access.

A CPU cache can be a data cache or an instruction cache. Unlike RAM, cache is not expandable.

The CPU first checks the L1 cache for the data. If it does not find it at L1, it moves on to L2 and finally L3. If the data is not found at L3, it's a cache miss and RAM is searched next, followed by the hard drive.

If the CPU finds the requested data in the cache, it's a cache hit, and if not, it's a cache miss.
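As a concrete illustration of this lookup order, here is a minimal Python sketch that models each level as a dictionary and searches L1, L2, L3, RAM, and finally the disk. The level contents and the read_from_disk helper are illustrative assumptions, not a model of real hardware.

# Minimal sketch of the lookup order described above: L1 -> L2 -> L3 -> RAM -> disk.
# Each level is modelled as a plain dict keyed by address; this is an illustration,
# not a simulation of real hardware.

def lookup(address, l1, l2, l3, ram, read_from_disk):
    """Return (value, level_found) for the first level that holds the address."""
    for name, level in (("L1", l1), ("L2", l2), ("L3", l3)):
        if address in level:
            return level[address], name           # cache hit at this level
    if address in ram:
        return ram[address], "RAM"                # cache miss, served from main memory
    return read_from_disk(address), "disk"        # missed everywhere, go to the hard drive

# Tiny usage example with toy contents:
l1, l2, l3 = {}, {0x10: "a"}, {}
ram = {0x20: "b"}
print(lookup(0x10, l1, l2, l3, ram, read_from_disk=lambda addr: "?"))   # ('a', 'L2')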

Page 4: Caching Concepts


Levels of caching and speed, size comparisons

Level                      Access Time                  Typical Size    Technology   Managed By
Level 1 Cache (on-chip)    2-8 ns                       8 KB - 128 KB   SRAM         Hardware
Level 2 Cache (off-chip)   5-12 ns                      0.5 MB - 8 MB   SRAM         Hardware
Main Memory                10-60 ns                     64 MB - 2 GB    DRAM         Operating System
Hard Disk                  3,000,000 - 10,000,000 ns    100 GB - 2 TB   Magnetic     Operating System
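The latency gap in this table is what makes even modest hit rates so valuable. As a rough back-of-the-envelope example, the sketch below estimates an average memory access time from latencies picked out of the ranges above; the hit rates are assumed purely for illustration.

# Rough average-memory-access-time estimate using latencies from the table above.
# The hit rates (and the simplification to an L1 -> L2 -> RAM hierarchy) are
# illustrative assumptions, not measured values.

l1_time, l2_time, ram_time = 5, 10, 50        # ns, chosen from the ranges in the table
l1_hit, l2_hit = 0.90, 0.95                   # assumed hit rates

# average time = L1 time + L1 miss rate * (L2 time + L2 miss rate * RAM time)
avg = l1_time + (1 - l1_hit) * (l2_time + (1 - l2_hit) * ram_time)
print(f"average access time ~ {avg:.1f} ns")  # roughly 6 ns, far below the 50 ns RAM latency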

Page 5: Caching Concepts


Cache organization

When the processor needs to read or write a location in main memory, it first checks whether that memory location is in the cache. This is accomplished by comparing the address of the memory location to all tags in the cache that might contain that address.

If the processor finds that the memory location is in the cache, we say that a cache hit has occurred; otherwise, we speak of a cache miss.
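To make the tag comparison concrete, the sketch below splits an address into offset, index, and tag fields for a hypothetical direct-mapped cache; the 64-byte line size and 256 sets are assumptions chosen only for illustration.

# Sketch of how an address can be split into tag / index / offset fields for a
# hypothetical direct-mapped cache (64-byte lines, 256 sets; both values assumed).

LINE_SIZE = 64                       # bytes per cache line -> 6 offset bits
NUM_SETS = 256                       # number of lines      -> 8 index bits
OFFSET_BITS = LINE_SIZE.bit_length() - 1
INDEX_BITS = NUM_SETS.bit_length() - 1

def split_address(addr):
    """Return (tag, index, offset) for a memory address."""
    offset = addr & (LINE_SIZE - 1)
    index = (addr >> OFFSET_BITS) & (NUM_SETS - 1)
    tag = addr >> (OFFSET_BITS + INDEX_BITS)
    return tag, index, offset

# The cache keeps the tag next to each stored line; a lookup goes to the set named
# by `index` and compares the stored tag with the tag of the requested address.
print(split_address(0x12345678))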

Page 6: Caching Concepts


Cache Performance

Cache Size

Cache Handling

Replacement Strategy

Automatic prefetching

Page 7: Caching Concepts


Handling Cache Miss

In order to make room for the new entry on a cache miss, the cache has to evict one of the existing entries.

The heuristic that it uses to choose the entry to evict is called the replacement policy. The fundamental problem with any replacement policy is that it must predict which existing cache entry is least likely to be used in the future.

One popular replacement policy, LRU, replaces the least recently used entry.
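A minimal sketch of LRU bookkeeping, using Python's collections.OrderedDict as the tracking structure; the capacity of three entries is an arbitrary illustration, not how hardware implements the policy.

# Minimal LRU cache sketch: the least recently used entry is evicted when the
# cache is full. Capacity and keys are arbitrary illustrations.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()           # oldest (least recently used) entry first

    def get(self, key):
        if key not in self.entries:
            return None                        # cache miss
        self.entries.move_to_end(key)          # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)   # evict the least recently used entry

cache = LRUCache(3)
for k in "abc":
    cache.put(k, k.upper())
cache.get("a")                 # touching "a" makes it the most recently used
cache.put("d", "D")            # "b" is now the least recently used, so it is evicted
print(list(cache.entries))     # ['c', 'a', 'd']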

Page 8: Caching Concepts


Mirroring Cache to Main memory

If data are written to the cache, they must at some point be written to main memory as well. The timing of this write is controlled by what is known as the write policy.

In a write-through cache, every write to the cache causes a write to main memory.

In a write-back (or copy-back) cache, writes are not immediately mirrored to main memory. Instead, the cache tracks which locations have been written over (marked dirty) and writes them back to main memory later, for example when the line is evicted.
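The sketch below contrasts the two write policies in a toy model: a write-through cache mirrors every write to the backing store immediately, while a write-back cache only marks the line dirty and flushes it on eviction. The classes and the dict standing in for main memory are simplified assumptions.

# Toy contrast of write-through vs. write-back. The dict `memory` stands in for
# main memory; everything here is a simplified illustration.

class SimpleCache:
    def __init__(self, memory, write_through):
        self.memory = memory                  # backing store: addr -> value
        self.lines = {}                       # cached addr -> value
        self.dirty = set()                    # written lines not yet mirrored to memory
        self.write_through = write_through

    def write(self, addr, value):
        self.lines[addr] = value
        if self.write_through:
            self.memory[addr] = value         # write-through: mirror immediately
        else:
            self.dirty.add(addr)              # write-back: defer the memory update

    def evict(self, addr):
        if addr in self.dirty:                # write-back: flush dirty data on eviction
            self.memory[addr] = self.lines[addr]
            self.dirty.discard(addr)
        self.lines.pop(addr, None)

memory = {}
wb = SimpleCache(memory, write_through=False)
wb.write(0x10, 42)
print(memory.get(0x10))        # None: main memory is stale until the line is flushed
wb.evict(0x10)
print(memory.get(0x10))        # 42: the dirty line was written back on eviction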

Page 9: Caching Concepts


Stale data in cache

The data in main memory being cached may be changed by other entities (e.g. peripherals using direct memory access or a multi-core processor), in which case the copy in the cache may become out of date, or stale.

Alternatively, when one core of a multi-core processor updates the data in its cache, the copies of that data in the caches of the other cores become stale.

Communication protocols between the cache managers that keep the data consistent are known as cache coherence protocols. Another possibility is to keep shared data in non-cacheable memory.
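As a highly simplified illustration of the invalidate-on-write idea behind such protocols (real protocols such as MESI track per-line states and use the interconnect), the sketch below lets a write by one core invalidate the stale copies held by other cores; the classes and message flow are assumptions for illustration only.

# Highly simplified invalidate-on-write sketch. Real coherence protocols track
# per-line states; this toy version only shows stale copies being invalidated
# when another core writes.

class CoreCache:
    def __init__(self, all_caches):
        self.all_caches = all_caches          # shared list of every core's cache
        self.lines = {}                       # addr -> value

    def write(self, addr, value):
        for other in self.all_caches:
            if other is not self:
                other.lines.pop(addr, None)   # invalidate stale copies on other cores
        self.lines[addr] = value

caches = []
core0, core1 = CoreCache(caches), CoreCache(caches)
caches.extend([core0, core1])

core0.write(0x40, 7)
core1.lines[0x40] = 7          # core 1 caches the same value
core0.write(0x40, 8)           # core 0 updates it; core 1's copy is invalidated
print(core1.lines.get(0x40))   # None: core 1 must re-fetch the up-to-date value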

Page 10: Caching Concepts


References

Wikipedia: http://en.wikipedia.org/wiki/CPU_cache

Ars Technica: http://arstechnica.com/

http://software.intel.com

What Every Programmer Should Know About Memory - Ulrich Drepper, Red Hat, Inc.

Page 11: Caching Concepts


Q/A