Cache memory is a small, high-speed memory component that temporarily stores copies of frequently accessed data from main memory. It sits closer to the CPU than main memory, so data can be retrieved more quickly and with lower latency.

Benefits of Cache:

  1. Faster retrieval: Data in the cache can be accessed much faster than data in main memory, because the cache resides closer to the CPU.
  2. Reduced memory latency: Memory access latency is the time the processor waits to retrieve data from memory. Caches reduce it by exploiting the principle of locality: recently used data (temporal locality) and data adjacent to it (spatial locality) are likely to be accessed again soon (see the sketch after this list).
  3. Lower bus traffic: Requests that hit in the cache never reach the memory bus, so the bus carries less traffic and is less congested.
  4. Higher CPU utilization: By lowering data retrieval time, the processor spends more time executing instructions and less time waiting for data.
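To make the locality point concrete, here is a minimal C sketch (not from the original text; the matrix size and the use of clock() are illustrative choices). It sums the same matrix twice: once row by row, touching consecutive addresses that share cache lines, and once column by column, striding far enough that most accesses miss the cache. On most machines the second pass is noticeably slower.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 4096                               /* 4096 x 4096 doubles = 128 MiB */

static double sum_rows(const double *m)      /* walks memory sequentially      */
{
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i * N + j];
    return s;
}

static double sum_cols(const double *m)      /* jumps N * 8 bytes every step   */
{
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i * N + j];
    return s;
}

int main(void)
{
    double *m = malloc((size_t)N * N * sizeof *m);
    if (!m) return 1;
    for (size_t k = 0; k < (size_t)N * N; k++)
        m[k] = 1.0;

    clock_t t0 = clock();
    double a = sum_rows(m);                  /* cache-friendly traversal       */
    clock_t t1 = clock();
    double b = sum_cols(m);                  /* cache-unfriendly traversal     */
    clock_t t2 = clock();

    printf("row-major sum    %.0f  in %.3f s\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("column-major sum %.0f  in %.3f s\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
    free(m);
    return 0;
}
```

Both loops do exactly the same arithmetic; only the memory access pattern differs, which is why the timing gap can be attributed to the cache.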

Cache levels

Level 1 Cache - The smallest and fastest cache, located inside each CPU core. It is often split into separate instruction and data caches.

Level 2 Cache - Larger and slower than L1, but still much faster than RAM. Depending on the architecture, L2 is either private to each core or located outside the cores and shared between them, connected to the cores by a high-speed bus.

Level 3 Cache - Larger and slower than L2, typically shared across all cores of a CPU.

Level 4 Cache - Not present in all architectures. It exists in certain high-end processors and can be even larger than L3.
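The effect of this hierarchy can be observed with a rough C sketch like the one below (an illustrative assumption, not part of the original text). It chases a randomly shuffled chain of indices over working sets of increasing size, so each load depends on the previous one and hardware prefetching cannot hide the latency. The average time per load typically steps up each time the working set outgrows L1, then L2, then L3, and finally spills into main memory.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Build a random cyclic chain over n slots so every load depends on the
 * previous one. rand() is crude but good enough for a demonstration. */
static size_t *make_chain(size_t n)
{
    size_t *next = malloc(n * sizeof *next);
    size_t *perm = malloc(n * sizeof *perm);
    if (!next || !perm) exit(1);
    for (size_t i = 0; i < n; i++) perm[i] = i;
    for (size_t i = n - 1; i > 0; i--) {          /* Fisher-Yates shuffle */
        size_t j = (size_t)rand() % (i + 1);
        size_t t = perm[i]; perm[i] = perm[j]; perm[j] = t;
    }
    for (size_t i = 0; i < n; i++)
        next[perm[i]] = perm[(i + 1) % n];
    free(perm);
    return next;
}

int main(void)
{
    srand(1);
    for (size_t kb = 16; kb <= 64 * 1024; kb *= 2) {   /* 16 KiB .. 64 MiB */
        size_t n = kb * 1024 / sizeof(size_t);
        size_t *chain = make_chain(n);
        size_t idx = 0;
        const long iters = 10 * 1000 * 1000;

        clock_t t0 = clock();
        for (long i = 0; i < iters; i++)
            idx = chain[idx];                     /* dependent load chain  */
        clock_t t1 = clock();

        double ns = (double)(t1 - t0) / CLOCKS_PER_SEC * 1e9 / iters;
        printf("%8zu KiB  %6.2f ns/load  (idx=%zu)\n", kb, ns, idx);
        free(chain);
    }
    return 0;
}
```

The exact sizes at which the per-load time jumps depend on the machine's actual L1/L2/L3 capacities, so the printed table is only a rough way to see the hierarchy, not a precise measurement.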

[Figure: Simulated multicore systems showing two different cache memory organizations]