The story of cache — Car, Truck and Superbike

Harsimar Singh
3 min read · Aug 28, 2022


The buzzword here is memory. Your brain fires little synapses to trigger a "fetch and retrieve" mechanism for actions and responses, whether they come from a new stimulus or are inspired by previous activity. A feedback mechanism then stores this behaviour and reads it back when something similar happens. Similarly, our computers use memory to store information, which is written and retrieved as electrical signals. This information is stored in the form of 0s and 1s. Why is it stored that way, and how does it move around? The answers are in this blog.

Let’s draw some comparisons: think of the hard disk as a truck (huge capacity, but slow), RAM as a car (smaller and quicker), and the cache as a superbike (tiny, but blazingly fast).

These analogies make certain trade-offs clear.

When we talk about memory, visualise a long tape with a needle that reads and writes. The quicker the read and write, the more efficient the memory, but "with great power comes great responsibility": speed costs money. We can define a memory hierarchy based on fetch time and cost.

(figure: the memory hierarchy)

Mechanical drives are cheap, have large capacity, and are slow. Cache is fast but far smaller than a hard disk: think 4 MB of cache versus 4 TB of disk. So what is each used for?

When we switch on a computer, a bunch of operations loads the OS and we see an interactive Graphical User Interface that lets us run software, blog, tweet and so on. Say your favourite TV show has released a new season and you are eager to watch it. When you click on the web browser, here is roughly what the OS is doing:

Because the hard disk is too slow, the initial instructions are transferred to RAM, which is faster and quickly moves our data from source to target while manipulating files. The truck's job here is to hold files for the long term, like depot storage, while the car quickly shuttles goods to a local warehouse. Then comes the superbike, which connects the local warehouse to individual houses. At first the superbike rider reads the name on a package and has to search for the house. But once he has learned where it actually is, the next time he sees that name he launches straight there, saving us crucial time. This is what a cache does every time.
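The superbike's trick can be sketched as a tiny lookup cache: the first request for a name is slow, but the answer is remembered, so repeats are instant. The names, addresses and timing below are purely illustrative, not from any real system.

```python
import time

# The "city directory": finding an address here is slow work.
address_book = {"Alice": "12 Oak Lane", "Bob": "7 Hill Road"}

# The superbike's memory of houses already found.
cache = {}

def find_house(name):
    if name in cache:              # cache hit: no searching needed
        return cache[name]
    time.sleep(0.1)                # simulate the slow first search
    address = address_book[name]
    cache[name] = address          # remember it for next time
    return address
```

The first `find_house("Alice")` pays the slow lookup; every later call for the same name is answered straight from the dictionary.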

Cache is high-performance memory close to the CPU. Before calling RAM, the CPU asks the cache: "Do you have this word I am looking for?" If yes, we have a hit; otherwise it is a miss. We are optimising search time, because the less time a fetch takes, the faster we can execute an instruction. A cache can hold only a few words, so what happens when a new one arrives? Replacement policies decide which old word to evict to make room for it. There are also multiple levels of cache, L1, L2 and so on, each a trade-off between cost and performance.

Remember to flush the cache when its data gets out of sync with the source!
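Why does flushing matter? If the backing store changes, the cache keeps serving the old value until it is cleared and re-read. A toy illustration (the `store`/`price` names are hypothetical):

```python
# Backing store: the authoritative data.
store = {"price": 100}

# Cached copy, taken earlier.
cache = dict(store)

# The backing data changes...
store["price"] = 120

# ...but the cache is now stale and still says 100.
stale_value = cache["price"]

# Flush and re-read from the source to get back in sync.
cache.clear()
cache.update(store)
fresh_value = cache["price"]
```

Here `stale_value` is 100 while `fresh_value` is 120: the flush is what restores consistency.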



Written by Harsimar Singh

Co-Founder VAAR Lab, Alumni IIT Ropar ( 2018-2020 ) M.Tech CSE. Loves breaking complex things into granular objects like Rutherford did.
