
LRU Cache - Everything You Need To Know

Updated: Jun 3, 2022

In our earlier article, we gave an overview of caching and a brief introduction to various cache eviction policies. In this article, we will take a detailed look at the LRU, or Least Recently Used, cache eviction policy.


Introduction


LRU cache, or least recently used cache, is one of the most popular cache eviction strategies. It is also a very commonly asked interview question.


An LRU cache keeps track of the order in which items in the cache were accessed. It stores items in the order in which they were requested, so the most recently used item sits at the top of the cache and the least recently used item sits at the end. When the cache is full, the item that hasn't been used for the longest time (the least recently used item) is evicted from the cache. An LRU cache also provides quick, constant-time access to the items it holds.


The idea behind an LRU cache is simple: if the cache has a capacity of n items, it keeps the n most recently used items. When the cache is full and a new item needs to be added, it makes room for the new item by evicting the least recently used one.
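To make this concrete, here is a minimal sketch of an LRU cache in Python. It is not the only way to implement one; this version leans on collections.OrderedDict, which remembers insertion order and lets us move or pop entries in constant time. A hash map combined with a doubly linked list achieves the same behavior.

```python
from collections import OrderedDict

class LRUCache:
    """A minimal LRU cache sketch built on OrderedDict."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()   # keys ordered from least to most recently used

    def get(self, key):
        if key not in self.items:
            return None                      # cache miss
        self.items.move_to_end(key)          # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)      # refresh recency on update
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict the least recently used item
```

Both get and put run in constant time, which is where the "quick constant time access" mentioned above comes from.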


Example


Let's understand the LRU cache with an example. Consider an application where users can read from a collection of books. For ease of understanding, assume the application has 4 books in its database: book1, book2, book3 and book4, and that our cache limit is 3 (at most 3 items can be stored in the cache).


Initially the cache is empty since no requests have been made for any book.

Let's look at how the LRU cache stores and evicts items as we try to access data from it.


To begin with, let's say a user requests book1 from our application. The application checks whether the data is available in the cache; since book1 is not in the cache, it fetches it from the database, and an entry for "book1" is now stored in the cache, as shown below.
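The flow for this first request might look roughly like the following sketch, reusing the LRUCache class from earlier. The names request_book and fetch_from_database are hypothetical stand-ins for the application's own lookup and database code.

```python
def fetch_from_database(title):
    # Hypothetical stand-in for a real database query.
    return f"contents of {title}"

cache = LRUCache(capacity=3)

def request_book(title):
    book = cache.get(title)
    if book is None:                      # cache miss
        book = fetch_from_database(title) # fall back to the database
        cache.put(title, book)            # store the entry in the cache
    return book

request_book("book1")
print(list(cache.items))                  # ['book1']
```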