Cache Memory Mapping Techniques
The clock speed of the CPU is much faster than that of main memory, so the CPU requires a fast memory. Such a small, fast memory is referred to as a ‘cache memory’. The cache memory is the intermediate memory between the CPU and the main memory.
Basic Operation of Cache Memory:
When the CPU needs to access memory, the cache is examined first. If the word is found in the cache, it is read from the cache memory. If the word is not found in the cache, the CPU reads the word from the main memory, and the same word is also copied into the cache memory.
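The read path described above can be sketched as follows; this is a minimal software model (the names `read_word`, `cache`, and `main_memory` are illustrative, not any specific hardware design):

```python
def read_word(address, cache, main_memory):
    """Model of a cache read: check the cache first; on a miss,
    fetch from main memory and copy the word into the cache."""
    if address in cache:
        return cache[address], "hit"    # word served from the cache
    word = main_memory[address]          # miss: go to main memory
    cache[address] = word                # copy the word into the cache
    return word, "miss"
```

A second access to the same address is then served from the cache, which is exactly what makes the cache effective for repeated references.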
Cache Hit Ratio:
The performance of cache memory is measured in terms of a quantity called the “hit ratio”. If the word is found in the cache memory, it counts as a ‘hit’; if it is not found in the cache and must be fetched from main memory, it counts as a ‘miss’. The hit ratio is the number of hits divided by the total number of CPU references to memory (hits plus misses).
Hit Ratio = number of hits / (number of hits + number of misses)
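Expressed directly as a small helper (the function name is illustrative):

```python
def hit_ratio(hits, misses):
    """Hit ratio = hits / (hits + misses), per the formula above."""
    return hits / (hits + misses)
```

For example, 9 hits out of 10 total references gives a hit ratio of 0.9.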
Cache Memory Mapping:
The basic characteristic of cache memory is its fast access time. So, very little or no time should be wasted when searching for words in the cache. The transformation of data from main memory to cache memory is referred to as a ‘mapping’ process. There are three types of mapping procedures for cache memory:
1. Associative Mapping
2. Direct Mapping
3. Set-Associative Mapping
In associative mapping, the associative memory stores both the address and the content (data) of the memory word. This permits any location in the cache to store any word from the main memory.
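A sketch of an associative lookup, assuming entries are (address, data) pairs: every entry stores the full address alongside the data, and a lookup compares the requested address against every entry (which the hardware does in parallel, though this model searches sequentially):

```python
def associative_lookup(cache_entries, address):
    """Search a fully associative cache: any entry may hold any address."""
    for stored_address, data in cache_entries:
        if stored_address == address:
            return data      # address matched: hit
    return None              # no entry matched: miss
```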
In direct mapping, the CPU address of 15 bits is divided into two fields: an index field of 9 bits and a tag field of 6 bits. The number of bits in the index field is equal to the number of address bits required to access the cache memory.
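The split of the 15-bit address into a 9-bit index and a 6-bit tag can be sketched with bit operations (a minimal illustration of the field widths given above):

```python
INDEX_BITS = 9   # low 9 bits select the cache location
TAG_BITS = 6     # high 6 bits are stored in the cache as the tag

def split_address(address):
    """Split a 15-bit CPU address into (tag, index) fields."""
    index = address & ((1 << INDEX_BITS) - 1)  # low 9 bits
    tag = address >> INDEX_BITS                # remaining 6 bits
    return tag, index
```

On a cache access, the index selects one cache word and the stored tag is compared with the tag field of the address to decide hit or miss.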
The third type of cache organization is set-associative mapping. In this mapping, each word of the cache can store two or more words of main memory under the same index address. Each data word is stored together with its tag, and the number of tag-data items in one word of the cache is said to form a set.
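A set-associative lookup combines the two previous ideas: the index selects one set, and the tag is compared against each tag-data item within that set. A sketch, assuming `sets` maps an index to its list of (tag, data) pairs (illustrative names, with the same 9-bit index as above):

```python
SET_BITS = 9  # index width, as in the direct-mapping example

def set_associative_lookup(sets, address):
    """Select a set by index, then compare tags within the set."""
    index = address & ((1 << SET_BITS) - 1)
    tag = address >> SET_BITS
    for stored_tag, data in sets.get(index, []):
        if stored_tag == tag:
            return data      # tag matched within the set: hit
    return None              # miss
```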
The design of a cache depends on five factors. These are:
i. Cache Size: Even small caches can have a significant impact on performance. However, as the size increases, the cache becomes slower and more expensive, so performance does not keep improving indefinitely.
ii. Block Size: The block size is the unit of data exchanged between the cache and main memory. As the block size increases, the hit ratio at first improves, but beyond a point more data that is unlikely to be used is brought into the cache, and the hit ratio begins to decrease.
iii. Mapping Function: The transformation of data from main memory to cache memory is referred to as a mapping process. The mapping function determines which location the block will occupy.
iv. Replacement Algorithm: This factor determines which block is replaced when the cache is full. We would like to replace the block that is least likely to be needed again in the near future.
v. Write Policy: The write policy dictates when the memory write operation takes place.
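As an illustration of factor iv, a common replacement algorithm is least recently used (LRU): when the cache is full, evict the block that has gone the longest without being accessed. A minimal sketch (the class name `LRUCache` is illustrative):

```python
from collections import OrderedDict

class LRUCache:
    """Toy cache that evicts the least recently used block when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()   # insertion order tracks recency

    def access(self, address, word):
        if address in self.entries:
            self.entries.move_to_end(address)     # mark most recently used
        else:
            if len(self.entries) >= self.capacity:
                self.entries.popitem(last=False)  # evict least recently used
            self.entries[address] = word
```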