Design LRU Cache

Last Updated : 03 Jan, 2025

Design a data structure for LRU Cache. It should support the following operations: get and put.

  • get(key) – Returns the value of the given key if it exists in the cache; otherwise, returns -1.
  • put(key, value) – Inserts or updates the key-value pair in the cache. If the cache reaches capacity, remove the least recently used item before adding the new item.
[Figure: Working of an LRU Cache]


How to design your own LRU Cache?

Input: [LRUCache cache = new LRUCache(2), put(1, 1), put(2, 2), get(1), put(3, 3), get(2), put(4, 4), get(1), get(3), get(4)]
Output: [1, -1, -1, 3, 4]
Explanation: The values mentioned in the output are the values returned by get operations.

  • Initialize the LRUCache class with capacity = 2.
  • cache.put(1, 1): (key, value) pair (1, 1) inserted and has the highest priority.
  • cache.put(2, 2): (2, 2) inserted and has the highest priority.
  • cache.get(1): key 1 maps to value 1, so 1 is returned and (1, 1) is moved to the highest priority.
  • cache.put(3, 3): The cache is full, so the least recently used pair (2, 2) is removed and (3, 3) is inserted with the highest priority.
  • cache.get(2): returns -1 (key 2 not found).
  • cache.put(4, 4): The cache is full, so the least recently used pair (1, 1) is removed and (4, 4) is inserted with the highest priority.
  • cache.get(1): returns -1 (not found).
  • cache.get(3): returns 3, and (3, 3) is moved to the highest priority.
  • cache.get(4): returns 4, and (4, 4) is moved to the highest priority.

[Naive Approach – 1] LRU Cache using an Array of Nodes – O(n) Time and O(n) Space:

The idea is to use an array of nodes, where each node holds a key-value pair along with a timestamp of its last access. Both primary operations, get and put, run in O(n) time because they may need to scan the entire array. The size of the array equals the given capacity of the cache.

  • put(int key, int value)
    • If the cache is full, find the node with the oldest timestamp (the least recently used) and replace it with the new key and value.
    • Otherwise, append the new node to the end of the array, stamped with the time of insertion.
    • Time Complexity: O(n), since finding the oldest node may require scanning the whole array.
  • get(int key)
    • Search the array for the node with the matching key.
    • If found, update its timestamp and return its value; otherwise return -1.
    • Time Complexity: O(n), since every node may need to be checked.

We initialize an array whose size equals the capacity of the cache. Each element stores, in addition to the key and value, an access timestamp recording when the entry was last used. This timestamp is what identifies the least recently used element when an eviction is needed.

C++
// C++ class to represent a node in an LRU Cache
class Node {
  public:
    int key;
    int value;
    int timeStamp;

    Node(int k, int data, int time) {
        key = k;
        value = data;
        timeStamp = time;
    }
};
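Building on the node class above, a complete array-based cache can be sketched as follows. This is an illustrative implementation, not the article's reference code: a logical clock (`clockTime`) stands in for real timestamps, and both operations scan the array linearly, which is what makes them O(n).

```cpp
#include <vector>

// Sketch of an array-based LRU cache. Each entry carries the time of
// its last access; the entry with the smallest timestamp is the LRU.
class LRUCache {
    struct Node {
        int key, value, timeStamp;
        Node(int k, int v, int t) : key(k), value(v), timeStamp(t) {}
    };

    std::vector<Node> arr;   // holds at most `capacity` nodes
    int capacity;
    int clockTime;           // logical clock, incremented on every access

public:
    LRUCache(int cap) : capacity(cap), clockTime(0) {}

    // Return the value for key, marking it most recently used; -1 if absent.
    int get(int key) {
        for (auto &node : arr) {
            if (node.key == key) {
                node.timeStamp = ++clockTime;
                return node.value;
            }
        }
        return -1;
    }

    // Insert or update key; evict the oldest timestamp when full.
    void put(int key, int value) {
        // If the key already exists, update it in place.
        for (auto &node : arr) {
            if (node.key == key) {
                node.value = value;
                node.timeStamp = ++clockTime;
                return;
            }
        }
        if ((int)arr.size() < capacity) {
            arr.emplace_back(key, value, ++clockTime);
            return;
        }
        // Cache full: overwrite the least recently used node.
        int lruIdx = 0;
        for (int i = 1; i < (int)arr.size(); i++)
            if (arr[i].timeStamp < arr[lruIdx].timeStamp)
                lruIdx = i;
        arr[lruIdx] = Node(key, value, ++clockTime);
    }
};
```

Running the trace from the example above (capacity 2) reproduces the output [1, -1, -1, 3, 4].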

[Naive Approach – 2] LRU Cache using Singly Linked List – O(n) Time and O(n) Space:

The approach to implement an LRU (Least Recently Used) cache involves using a singly linked list to maintain the order of cache entries.

  • get operation: The cache traverses the list from the head looking for the node with the requested key. If the key is found, the node is moved to the head of the list to mark it as the most recently used, and its value is returned; otherwise -1 is returned. This operation takes O(n) time because it may scan the entire list.
  • put operation: If the key already exists, the corresponding node is found, updated, and moved to the head. Otherwise a new key-value pair is inserted at the head; if the cache has reached its capacity, the least recently used element, located at the tail of the list, is removed first. This operation also takes O(n) time due to the traversal and reordering involved.
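The two operations above can be sketched as follows. This is an illustrative implementation (the helper `detach` is our own naming, not part of the article): the head of the list is the most recently used entry and the tail is the least recently used.

```cpp
// Sketch of a singly-linked-list LRU cache.
// Head = most recently used; tail = least recently used.
class LRUCache {
    struct Node {
        int key, value;
        Node *next;
        Node(int k, int v) : key(k), value(v), next(nullptr) {}
    };

    Node *head;
    int capacity, size;

    // Unlink and return the node with the given key, or nullptr.
    Node* detach(int key) {
        Node *prev = nullptr, *cur = head;
        while (cur && cur->key != key) { prev = cur; cur = cur->next; }
        if (!cur) return nullptr;
        if (prev) prev->next = cur->next; else head = cur->next;
        size--;
        return cur;
    }

public:
    LRUCache(int cap) : head(nullptr), capacity(cap), size(0) {}

    int get(int key) {
        Node *node = detach(key);
        if (!node) return -1;
        node->next = head;   // move to front: most recently used
        head = node;
        size++;
        return node->value;
    }

    void put(int key, int value) {
        Node *node = detach(key);
        if (node) {
            node->value = value;         // existing key: update in place
        } else {
            if (size == capacity) {
                // Walk to the tail and remove the least recently used node.
                Node *prev = nullptr, *cur = head;
                while (cur->next) { prev = cur; cur = cur->next; }
                if (prev) prev->next = nullptr; else head = nullptr;
                delete cur;
                size--;
            }
            node = new Node(key, value);
        }
        node->next = head;               // insert at the head
        head = node;
        size++;
    }
};
```

Both operations walk the list, so each is O(n); the next approach removes that traversal.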

[Expected Approach] LRU Cache using Doubly Linked List and Hashing – O(1) Time and O(n) Space:

The basic idea behind implementing an LRU (Least Recently Used) cache using a key-value pair approach is to manage element access and removal efficiently through a combination of a doubly linked list and a hash map.

  • When adding a new key-value pair, insert it as a new node at the head of the doubly linked list. This marks the newly added pair as the most recently used.
  • If the key is already present in the cache, locate its node through the hash map, update its value, and move the node to the head of the list.
  • When the cache is at capacity and a new key-value pair must be added, remove the tail node of the doubly linked list and delete its entry from the hash map as well.
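One way to sketch this design in C++ is with `std::list` (a doubly linked list) holding the key-value pairs and `std::unordered_map` mapping each key to its list iterator. `std::list::splice` moves a node to the front in O(1) without invalidating iterators, which is exactly the "move to head" step described above.

```cpp
#include <list>
#include <unordered_map>
#include <utility>

// Sketch of the doubly-linked-list + hash-map LRU cache.
// Front of the list = most recently used; back = least recently used.
class LRUCache {
    int capacity;
    std::list<std::pair<int, int>> items;  // (key, value) pairs
    std::unordered_map<int, std::list<std::pair<int, int>>::iterator> pos;

public:
    LRUCache(int cap) : capacity(cap) {}

    int get(int key) {
        auto it = pos.find(key);
        if (it == pos.end()) return -1;
        // Move the accessed node to the front in O(1).
        items.splice(items.begin(), items, it->second);
        return it->second->second;
    }

    void put(int key, int value) {
        auto it = pos.find(key);
        if (it != pos.end()) {
            // Existing key: update the value and mark as most recently used.
            it->second->second = value;
            items.splice(items.begin(), items, it->second);
            return;
        }
        if ((int)items.size() == capacity) {
            // Evict the least recently used entry from both structures.
            pos.erase(items.back().first);
            items.pop_back();
        }
        items.emplace_front(key, value);
        pos[key] = items.begin();
    }
};
```

Every operation is a hash lookup plus a constant number of list operations, giving O(1) time per get/put and O(n) space for the n stored entries.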


Please refer to LRU Cache Implementation using Doubly Linked List for the complete implementation.


