CACHE MEMORY | COMPUTER ARCHITECTURE
Understanding Cache Memory
What is Cache Memory?
- Cache memory is a small, fast memory located between the CPU and main memory (RAM); it is faster than both primary and secondary storage but far smaller.
- It holds copies of frequently accessed data from main memory, giving the CPU quicker access to that data.
- The main memory stores instruction words and data words, which are sequences of bits required by the CPU.
Functionality of Cache Memory
- Cache acts as an intermediary that speeds up data transfer between the CPU and main memory by moving data in fixed-size blocks rather than individual words.
- Accessing cache is significantly faster than accessing main memory; however, there are multiple levels of cache that vary in speed and size.
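The block organization above can be sketched in a few lines of Python. This is a minimal illustration, not any particular hardware's scheme; the 64-byte block size is an assumed (though typical) value, and `split_address` is a hypothetical helper name.

```python
# Sketch: how a byte address maps to a block number and an offset
# within that block. BLOCK_SIZE = 64 is an assumed, typical value.
BLOCK_SIZE = 64  # bytes per block

def split_address(addr: int):
    """Return (block_number, offset_within_block) for a byte address."""
    block = addr // BLOCK_SIZE
    offset = addr % BLOCK_SIZE
    return block, offset

# Two nearby addresses fall in the same block, so one block
# transfer from main memory serves both accesses.
print(split_address(130))  # (2, 2)
print(split_address(140))  # (2, 12)
```

Because whole blocks are transferred, an access to one word also brings its neighbors into cache, which is why programs with good spatial locality benefit most.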
Levels of Cache
- Modern CPUs often have multiple cache levels (e.g., L1, L2, L3), where closer proximity to the CPU results in higher speeds but smaller sizes.
- Each level stores recently used blocks in its cache lines, so frequently accessed data can usually be served by a nearby level without reaching main memory.
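The effect of multiple levels is often summarized as average memory access time (AMAT): each level's hit time plus its miss rate times the cost of going one level further. A minimal sketch, where all latencies and miss rates are illustrative assumptions rather than measured values:

```python
# Sketch: average memory access time (AMAT) for a two-level cache.
# AMAT = L1 hit time + L1 miss rate * (L2 hit time + L2 miss rate * memory latency)
def amat(l1_hit, l1_miss_rate, l2_hit, l2_miss_rate, mem_latency):
    return l1_hit + l1_miss_rate * (l2_hit + l2_miss_rate * mem_latency)

# Illustrative numbers: 1-cycle L1, 10-cycle L2, 100-cycle main memory,
# 5% L1 miss rate, 20% L2 miss rate -> about 2.5 cycles on average.
print(amat(1, 0.05, 10, 0.20, 100))
```

The formula makes the design intuition concrete: even a small, fast L1 with a modest hit rate keeps the average access time close to the L1 hit time.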
Cache Hits and Misses
- A "cache hit" occurs when the CPU finds requested data in cache; a "cache miss" requires fetching from slower main memory.
- A cache miss leads to increased latency as it necessitates loading a block from main memory into cache before accessing it.
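Hits and misses can be demonstrated with a toy direct-mapped cache simulator. This is a sketch under assumed sizes (8 lines of 64-byte blocks, chosen for illustration), not a model of any real CPU:

```python
# Sketch: a direct-mapped cache simulator counting hits and misses.
# NUM_LINES and BLOCK_SIZE are illustrative assumptions.
NUM_LINES = 8
BLOCK_SIZE = 64

def simulate(addresses):
    lines = [None] * NUM_LINES  # each line holds the tag of the cached block
    hits = misses = 0
    for addr in addresses:
        block = addr // BLOCK_SIZE
        index = block % NUM_LINES   # which cache line the block maps to
        tag = block // NUM_LINES    # identifies which block occupies that line
        if lines[index] == tag:
            hits += 1               # cache hit: data already present
        else:
            misses += 1             # cache miss: fetch block from main memory
            lines[index] = tag
    return hits, misses

# Sequential accesses within a block hit after the first (compulsory) miss.
print(simulate([0, 8, 16, 64, 72]))  # (3, 2): one miss per distinct block
```

Running access patterns through such a model shows why the first touch of each block always misses, and why later accesses to the same block are fast.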
Design Considerations for Cache Memory
- A key design trade-off is size versus cost: the fast SRAM used for cache is expensive, so designers balance a larger cache (higher hit rate) against the added cost, settling on sizes that deliver most of the performance benefit of faster data exchange with the CPU.