Cache Memory

Cache memory provides faster data storage and access by storing instances of programs and data routinely accessed by the processor.

Level 3 or Main Memory — This is the memory the computer works on directly. It is small in size, and its contents are lost once the power is turned off. Level 4 or Secondary Memory — This is external memory that is not as fast as main memory, but data stays in it permanently.

Cache Performance: When the processor needs to read or write a location in main memory, it first checks for a corresponding entry in the cache. If the processor finds that the memory location is in the cache, a cache hit has occurred and the data is read from the cache. If the processor does not find the memory location in the cache, a cache miss has occurred.

For a cache miss, the cache allocates a new entry and copies in data from main memory, then the request is fulfilled from the contents of the cache.

The performance of cache memory is frequently measured in terms of a quantity called the hit ratio: the number of hits divided by the total number of memory accesses (hits plus misses). Cache Mapping: There are three different types of mapping used for cache memory, which are as follows: direct mapping, associative mapping, and set-associative mapping.
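As a quick illustration of the hit ratio, the following Python snippet computes it from a trace of accesses (the trace itself is made up for illustration):

```python
# Hypothetical access trace: True = cache hit, False = cache miss.
accesses = [True, True, False, True, False, True, True, True]

hits = sum(accesses)
misses = len(accesses) - hits

# Hit ratio = hits / (hits + misses)
hit_ratio = hits / (hits + misses)
print(f"hit ratio = {hit_ratio:.2f}")  # 6 hits out of 8 accesses -> 0.75
```

A higher hit ratio means more requests are served from the fast cache rather than from slower main memory.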

These are explained below. Direct Mapping — The simplest technique, known as direct mapping, maps each block of main memory into only one possible cache line. If a line is already occupied by a memory block when a new block needs to be loaded, the old block is evicted.

A memory address is split into two parts: an index field and a tag field. The tag field is stored in the cache along with each line's data. The least significant w bits identify a unique word or byte within a block of main memory; in most contemporary machines, the address is at the byte level. The remaining s bits specify one of the 2^s blocks of main memory. The cache logic interprets these s bits as a tag of s-r bits (the most significant portion) and a line field of r bits.
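The address split described above can be sketched in Python. The concrete parameters here (16-byte blocks, so w = 4, and 256 cache lines, so r = 8) are assumptions chosen for illustration, not values from the article:

```python
# Direct-mapped address decomposition (parameters are illustrative assumptions):
# 16-byte blocks -> w = 4 offset bits; 256 cache lines -> r = 8 index bits.
W_BITS = 4   # byte offset within a block
R_BITS = 8   # line (index) field

def split_address(addr: int):
    offset = addr & ((1 << W_BITS) - 1)              # least significant w bits
    line = (addr >> W_BITS) & ((1 << R_BITS) - 1)    # next r bits select the cache line
    tag = addr >> (W_BITS + R_BITS)                  # remaining s-r bits form the tag
    return tag, line, offset

tag, line, offset = split_address(0x12345)
print(hex(tag), hex(line), hex(offset))  # -> 0x12 0x34 0x5
```

Because the line field is fixed by the address, two blocks whose addresses share the same r index bits always compete for the same cache line, which is the source of the thrashing problem mentioned later.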

Associative Mapping — In this type of mapping, associative memory is used to store both the content and the address of the memory word. Any block can go into any line of the cache.

This means that the word ID bits are used to identify which word in the block is needed, while the tag becomes all of the remaining address bits. This enables the placement of any block at any place in the cache memory.

It is considered the fastest and most flexible form of mapping. Set-Associative Mapping — This form of mapping is an enhanced form of direct mapping that removes its drawbacks. Set-associative mapping addresses the problem of possible thrashing in the direct mapping method: instead of having exactly one line that a block can map to in the cache, a few lines are grouped together to create a set. A block then maps to exactly one set but may occupy any line within that set.
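As a minimal sketch of the set-associative idea (the set count, the 2-way associativity, and the LRU replacement within a set are all assumptions for illustration), a lookup might look like:

```python
# 2-way set-associative lookup sketch (all parameters are illustrative).
NUM_SETS = 4
WAYS = 2

cache = [[] for _ in range(NUM_SETS)]  # each set is a list of resident tags

def access(block_number: int) -> bool:
    """Return True on a hit; on a miss, install the block (evicting the LRU tag)."""
    set_index = block_number % NUM_SETS
    tag = block_number // NUM_SETS
    ways = cache[set_index]
    if tag in ways:
        ways.remove(tag)
        ways.append(tag)        # move to most-recently-used position
        return True
    if len(ways) == WAYS:
        ways.pop(0)             # evict the least recently used tag in this set
    ways.append(tag)
    return False

# Blocks 0 and 4 both map to set 0, yet can coexist (unlike direct mapping):
print(access(0), access(4), access(0))  # -> False False True
```

Under direct mapping, the third access would have missed, since block 4 would have evicted block 0 from their shared line; grouping lines into sets avoids that conflict.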

The need for cache memory is due to the mismatch between the speeds of the main memory and the CPU. The CPU clock is very fast, whereas main memory access time is comparatively slow.

Hence, no matter how fast the processor is, the processing speed depends largely on the speed of the main memory: a chain is only as strong as its weakest link.

It is for this reason that a cache memory with an access time closer to the processor speed is introduced. The cache memory stores the program, or the part of it, currently being executed or likely to be executed within a short period of time. The cache memory also stores temporary data that the CPU may frequently require for manipulation. The cache memory works according to various algorithms, which decide what information it has to store.

These algorithms estimate, on the basis of past observations, which data is most likely to be needed again. The cache acts as a high-speed buffer between the CPU and main memory and is used to temporarily store very active data and instructions during processing. Since the cache memory is faster than main memory, the processing speed is increased by making the data and instructions needed in the current processing available in the cache.
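One widely used algorithm of this kind is least recently used (LRU) replacement, which keeps recently accessed data on the assumption it will soon be needed again. Below is a minimal Python sketch (the capacity and keys are invented for illustration; this is one possible policy, not the only one caches use):

```python
from collections import OrderedDict

# Minimal LRU replacement sketch: evict the entry untouched for the longest time.
class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None                      # miss: caller must fetch from main memory
        self.data.move_to_end(key)           # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        elif len(self.data) == self.capacity:
            self.data.popitem(last=False)    # evict the least recently used entry
        self.data[key] = value

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # "a" becomes most recently used
cache.put("c", 3)       # evicts "b", the least recently used entry
print(cache.get("b"))   # -> None: "b" was evicted
```

Real hardware caches use approximations of LRU (exact LRU is expensive in silicon), but the principle of predicting future use from past accesses is the same.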


