{"id":2651,"date":"2021-10-13T19:20:00","date_gmt":"2021-10-13T13:50:00","guid":{"rendered":"https:\/\/www.interviewbit.com\/blog\/?p=2651"},"modified":"2021-10-13T18:14:54","modified_gmt":"2021-10-13T12:44:54","slug":"lru-cache","status":"publish","type":"post","link":"https:\/\/www.interviewbit.com\/blog\/lru-cache\/","title":{"rendered":"LRU Cache"},"content":{"rendered":"\n<div class=\"gutentoc tocactive nostyle\" style=\"max-width:400px\"><div class=\"gutentoc-toc-wrap\"><div class=\"gutentoc-toc-title-wrap\"><div class=\"gutentoc-toc-title\">Table Of Contents<\/div><div id=\"open\" class=\"toggletwo\">show<\/div><\/div><div id=\"toclist\"><div class=\"gutentoc-toc__list-wrap\"><ul class=\"gutentoc-toc__list\"><li><a href=\"#problem-statement\">Problem Statement<\/a><\/li><ul class=\"gutentoc-toc__list\"><li><a href=\"#c-implementation\">C++ Implementation<\/a><\/li><li><a href=\"#java--implementation\">Java <meta charset=\"utf-8\">Implementation<\/a><\/li><li><a href=\"#python--implementation\">Python <meta charset=\"utf-8\">Implementation<\/a><\/li><\/ul><li><a href=\"#practice-questions\">Practice Questions:<\/a><\/li><li><a href=\"#frequently-asked-questions\">Frequently Asked Questions<\/a><\/li><\/ul><\/div><\/div><\/div><\/div>\n\n\n\n<h2 id=\"problem-statement\">Problem Statement<\/h2>\n\n\n\n<p>Design and implement a data structure for LRU (Least Recently Used) cache. It should support the following operations: get and set.<\/p>\n\n\n\n<ul><li><strong>get(key)<\/strong> &#8211; Get the value (will always be positive) of the key if the key exists in the cache, otherwise return <strong>-1<\/strong>.<\/li><li><strong>set(key, value)<\/strong> &#8211; Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting the new item.<\/li><\/ul>\n\n\n\n<p>The LRU Cache will be initialized with an integer corresponding to its capacity. 
Capacity indicates the maximum number of unique keys it can hold at a time.<br><br>Definition of \u201cleast recently used\u201d: An access to an item is defined as a get or a set operation on the item. The \u201cleast recently used\u201d item is the one with the oldest access time.<\/p>\n\n\n\n<p><strong>Examples:<\/strong><\/p>\n\n\n\n<p><strong>Input:<\/strong> [&#8220;LRUCache&#8221;, &#8220;set&#8221;, &#8220;set&#8221;, &#8220;get&#8221;, &#8220;set&#8221;, &#8220;get&#8221;, &#8220;set&#8221;, &#8220;get&#8221;, &#8220;get&#8221;, &#8220;get&#8221;]<br>           [[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]<br><br><strong>Output:<\/strong> [null, null, null, 1, null, -1, null, -1, 3, 4]<br><br><strong>Explanation:<\/strong> <br>lRUCache.set(1, 1); \/\/ cache is {1=1} <br>lRUCache.set(2, 2); \/\/ cache is {1=1, 2=2}<br>lRUCache.get(1);&nbsp; &nbsp; \/\/ returns 1<br>lRUCache.set(3, 3); \/\/ LRU key was 2, evicts key 2, cache is {1=1, 3=3}<br>lRUCache.get(2);&nbsp; &nbsp; \/\/ returns -1 (not found)<br>lRUCache.set(4, 4); \/\/ LRU key was 1, evicts key 1, cache is {4=4, 3=3}<br>lRUCache.get(1);&nbsp; &nbsp; \/\/ returns -1 (not found)<br>lRUCache.get(3);&nbsp; &nbsp; \/\/ returns 3<br>lRUCache.get(4);&nbsp; &nbsp; \/\/ returns 4<\/p>\n\n\n\n<p id=\"approach\"><strong>Approach<\/strong><\/p>\n\n\n\n<p>Before directly jumping into the algorithm, let us first analyse the properties an <strong>LRU Cache<\/strong> must have.<\/p>\n\n\n\n<ul><li><strong>Ordered: <\/strong>The entries have to be kept in order of access, so that the most recently used and least recently used items can be distinguished.<\/li><li><strong>Fast Search: <\/strong>Quickly identify whether a key exists in the cache<strong>.<\/strong><\/li><li><strong>Fast Deletion: <\/strong>If the cache is full, quickly delete the least recently used entry.<\/li><li><strong>Fast Insertion: <\/strong>Move the accessed data to the head on each visit.<\/li><\/ul>\n\n\n\n<p>From the above, it can be concluded that a <strong>hashmap <\/strong>can be used for searching since searching in a 
<strong>hashmap<\/strong> is O(1). But a <strong>hashmap<\/strong> keeps its data unordered, so it cannot tell us which entry is the least recently used. Another data structure we can think of is a <strong>linked list<\/strong>. In a linked list, the data is ordered, so deletion and insertion of a known node can be done in <strong>O(1)<\/strong> or constant time, but searching in a <strong>linked list<\/strong> takes linear time.<\/p>\n\n\n\n<p>So, the idea is to come up with a new data structure combining both the ideas above, i.e. a <strong>hashed linked list.<\/strong><\/p>\n\n\n\n<p>The data structure <strong>hashed linked list <\/strong>consists of a <strong>hash map<\/strong> and a <strong>doubly linked list<\/strong>. This ensures that all the operations can be performed in <strong>O(1)<\/strong>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img  loading=\"lazy\"  width=\"854\"  height=\"500\"  src=\"data:image\/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABAQMAAAAl21bKAAAAA1BMVEUAAP+KeNJXAAAAAXRSTlMAQObYZgAAAAlwSFlzAAAOxAAADsQBlSsOGwAAAApJREFUCNdjYAAAAAIAAeIhvDMAAAAASUVORK5CYII=\"  alt=\"LRU Cache 1\"  class=\"wp-image-2716 pk-lazyload\"  data-pk-sizes=\"auto\"  data-ls-sizes=\"(max-width: 854px) 100vw, 854px\"  data-pk-src=\"https:\/\/www.interviewbit.com\/blog\/wp-content\/uploads\/2021\/10\/LRU-Cache-1.png\"  data-pk-srcset=\"https:\/\/www.interviewbit.com\/blog\/wp-content\/uploads\/2021\/10\/LRU-Cache-1.png 854w, https:\/\/www.interviewbit.com\/blog\/wp-content\/uploads\/2021\/10\/LRU-Cache-1-300x176.png 300w, https:\/\/www.interviewbit.com\/blog\/wp-content\/uploads\/2021\/10\/LRU-Cache-1-768x450.png 768w, https:\/\/www.interviewbit.com\/blog\/wp-content\/uploads\/2021\/10\/LRU-Cache-1-380x222.png 380w, https:\/\/www.interviewbit.com\/blog\/wp-content\/uploads\/2021\/10\/LRU-Cache-1-550x322.png 550w, https:\/\/www.interviewbit.com\/blog\/wp-content\/uploads\/2021\/10\/LRU-Cache-1-800x468.png 800w\" ><\/figure>\n\n\n\n<p>The question remains, why do 
we need a <strong>doubly linked list<\/strong> and not a <strong>singly linked list<\/strong>?<br>The answer is that whenever we need to delete a node from the list, we can do so by updating the pointers of the nodes before and after it, both of which are reachable directly from the node itself. In a singly linked list, reaching the previous node would take linear time, so only a doubly linked list lets us delete in <strong>constant time.<\/strong><\/p>\n\n\n\n<p><strong>Algorithm:<\/strong><\/p>\n\n\n\n<ul><li>For the <strong>set <\/strong>function, if a node with the given key already exists in the hashed linked list, update its value and move it to the front of the list. Otherwise, insert a new node at the front and record it in the hashmap.<\/li><li>In case the capacity of the <strong>cache <\/strong>is <strong>full<\/strong>, delete the last node of the linked list (the least recently used entry) and its hashmap entry before inserting.<\/li><li>For the <strong>get<\/strong> function, look up the node via the hashmap, return its value, and move the node to the front of the list.<\/li><\/ul>\n\n\n\n<h3 id=\"c-implementation\">C++ Implementation<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"cpp\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">#include &lt;bits\/stdc++.h>\nusing namespace std;\n \nclass Node {\n  public:\n  int key, value;\n  Node *prev, *next;\n  Node(int k, int v): key(k), value(v), prev(NULL), next(NULL) {}\n};\n \nclass DoublyLinkedList {\n  Node *front, *rear;\n  \n  bool isEmpty() {\n      return rear == NULL;\n  }\n \n  public:\n  DoublyLinkedList(): front(NULL), rear(NULL) {}\n  \n  Node* add_page_to_head(int key, int value) {\n      Node *page = new Node(key, value);\n      if(!front &amp;&amp; !rear) {\n          front = rear = page;\n      }\n      else {\n          page->next = front;\n          front->prev = page;\n          front = page;\n      }\n      return page;\n  }\n \n  void move_page_to_head(Node *page) {\n      if(page==front) {\n         return;\n      }\n      if(page == rear) {\n          rear = rear->prev;\n          rear->next = NULL;\n      
}\n      else {\n          page->prev->next = page->next;\n          page->next->prev = page->prev;\n      }\n \n      page->next = front;\n      page->prev = NULL;\n      front->prev = page;\n      front = page;\n  }\n \n  void remove_rear_page() {\n      if(isEmpty()) {\n          return;\n      }\n      if(front == rear) {\n          delete rear;\n          front = rear = NULL;\n      }\n      else {\n          Node *temp = rear;\n          rear = rear->prev;\n          rear->next = NULL;\n          delete temp;\n      }\n  }\n \n  Node* get_rear_page() {\n      return rear;\n  }\n};\n \nclass LRUCache {\n  int capacity, size;\n  DoublyLinkedList *pageList;\n  unordered_map&lt;int, Node*> pageMap;\n \n  public:\n    LRUCache(int capacity) {\n      this->capacity = capacity;\n      size = 0;\n      pageList = new DoublyLinkedList();\n    }\n \n    int get(int key) {\n        if(pageMap.find(key)==pageMap.end()) {\n          return -1;\n        }\n        int val = pageMap[key]->value;\n \n        \/\/ move the page to front\n        pageList->move_page_to_head(pageMap[key]);\n        return val;\n    }\n \n    void set(int key, int value) {\n      if(pageMap.find(key)!=pageMap.end()) {\n          \/\/ if key already present, update value and move page to head\n          pageMap[key]->value = value;\n          pageList->move_page_to_head(pageMap[key]);\n          return;\n      }\n \n        if(size == capacity) {\n          \/\/ remove rear page\n          int k = pageList->get_rear_page()->key;\n          pageMap.erase(k);\n          pageList->remove_rear_page();\n          size--;\n        }\n \n        \/\/ add new page to the head of the list\n        Node *page = pageList->add_page_to_head(key, value);\n        size++;\n        pageMap[key] = page;\n    }\n \n    ~LRUCache() {\n      unordered_map&lt;int, Node*>::iterator i1;\n      for(i1=pageMap.begin();i1!=pageMap.end();i1++) {\n          delete i1->second;\n      }\n      delete pageList;\n    
}\n};<\/pre>\n\n\n\n<h3 id=\"java--implementation\"><span id=\"java-implementation\">Java Implementation<\/span><\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"java\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import java.util.HashMap;\nimport java.util.Map;\n \npublic class LRUCache {\n \n  class DLinkedNode {\n    int key;\n    int value;\n    DLinkedNode prev;\n    DLinkedNode next;\n  }\n \n  private void addNode(DLinkedNode node) {\n    \/**\n     * Always add the new node right after head.\n     *\/\n    node.prev = head;\n    node.next = head.next;\n \n    head.next.prev = node;\n    head.next = node;\n  }\n \n  private void removeNode(DLinkedNode node){\n    \/**\n     * Remove an existing node from the linked list.\n     *\/\n    DLinkedNode prev = node.prev;\n    DLinkedNode next = node.next;\n \n    prev.next = next;\n    next.prev = prev;\n  }\n \n  private void moveToHead(DLinkedNode node){\n    \/**\n     * Move a node from anywhere in the list to the head.\n     *\/\n    removeNode(node);\n    addNode(node);\n  }\n \n  private DLinkedNode popTail() {\n    \/**\n     * Pop the current tail.\n     *\/\n    DLinkedNode res = tail.prev;\n    removeNode(res);\n    return res;\n  }\n \n  private Map&lt;Integer, DLinkedNode> cache = new HashMap&lt;>();\n  private int size;\n  private int capacity;\n  private DLinkedNode head, tail;\n \n  public LRUCache(int capacity) {\n    this.size = 0;\n    this.capacity = capacity;\n \n    \/\/ head and tail are sentinel nodes; head.prev and tail.next stay null\n    head = new DLinkedNode();\n    tail = new DLinkedNode();\n \n    head.next = tail;\n    tail.prev = head;\n  }\n \n  public int get(int key) {\n    DLinkedNode node = cache.get(key);\n    if (node == null) return -1;\n \n    \/\/ move the accessed node to the head;\n    moveToHead(node);\n \n    return node.value;\n  }\n \n  public void set(int key, int value) {\n    
DLinkedNode node = cache.get(key);\n \n    if(node == null) {\n      DLinkedNode newNode = new DLinkedNode();\n      newNode.key = key;\n      newNode.value = value;\n \n      cache.put(key, newNode);\n      addNode(newNode);\n \n      ++size;\n \n      if(size > capacity) {\n        \/\/ pop the tail\n        DLinkedNode tail = popTail();\n        cache.remove(tail.key);\n        --size;\n      }\n    } else {\n      \/\/ update the value.\n      node.value = value;\n      moveToHead(node);\n    }\n  }\n}<\/pre>\n\n\n\n<h3 id=\"python--implementation\"><span id=\"python-implementation\">Python Implementation<\/span><\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">class DLinkedNode():\n    def __init__(self):\n        self.key = 0\n        self.value = 0\n        self.prev = None\n        self.next = None\n \nclass LRUCache():\n    def _add_node(self, node):\n        \"\"\"\n        Always add the new node right after head.\n        \"\"\"\n        node.prev = self.head\n        node.next = self.head.next\n \n        self.head.next.prev = node\n        self.head.next = node\n \n    def _remove_node(self, node):\n        \"\"\"\n        Remove an existing node from the linked list.\n        \"\"\"\n        prev = node.prev\n        new = node.next\n \n        prev.next = new\n        new.prev = prev\n \n    def _move_to_head(self, node):\n        \"\"\"\n        Move a node from anywhere in the list to the head.\n        \"\"\"\n        self._remove_node(node)\n        self._add_node(node)\n \n    def _pop_tail(self):\n        \"\"\"\n        Pop the current tail.\n        \"\"\"\n        res = self.tail.prev\n        self._remove_node(res)\n        return res\n \n    def __init__(self, capacity):\n        \"\"\"\n        :type capacity: int\n   
     \"\"\"\n        self.cache = {}\n        self.size = 0\n        self.capacity = capacity\n        self.head, self.tail = DLinkedNode(), DLinkedNode()\n \n        self.head.next = self.tail\n        self.tail.prev = self.head\n        \n \n    def get(self, key):\n        \"\"\"\n        :type key: int\n        :rtype: int\n        \"\"\"\n        node = self.cache.get(key, None)\n        if not node:\n            return -1\n \n        # move the accessed node to the head;\n        self._move_to_head(node)\n \n        return node.value\n \n    def set(self, key, value):\n        \"\"\"\n  :type key: int\n        :type value: int\n        :rtype: void\n        \"\"\"\n        node = self.cache.get(key)\n \n        if not node: \n            newNode = DLinkedNode()\n            newNode.key = key\n            newNode.value = value\n \n            self.cache[key] = newNode\n            self._add_node(newNode)\n \n            self.size += 1\n \n            if self.size > self.capacity:\n                # pop the tail\n                tail = self._pop_tail()\n                del self.cache[tail.key]\n                self.size -= 1\n        else:\n            # update the value.\n            node.value = value\n            self._move_to_head(node)<\/pre>\n\n\n\n<p><strong>Time Complexity:<\/strong> O(1) for all operations<br><strong>Space Complexity:<\/strong> O(N), as extra space, is used.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-text-color has-background has-black-background-color has-black-color is-style-wide\"\/>\n\n\n\n<h2 id=\"practice-questions\">Practice Questions:<\/h2>\n\n\n\n<p><a href=\"https:\/\/www.interviewbit.com\/problems\/rearrange-array\/\" target=\"_blank\" rel=\"noreferrer noopener\">Rearrange Array<\/a><\/p>\n\n\n\n<hr class=\"wp-block-separator has-text-color has-background has-black-background-color has-black-color is-style-wide\"\/>\n\n\n\n<h2 id=\"frequently-asked-questions\">Frequently Asked Questions<\/h2>\n\n\n\n<p><strong>What is a 
doubly linked list used instead of a singly linked list for LRU Cache?<br><\/strong>Since we need to perform the <strong>set <\/strong>and <strong>get <\/strong>functions in <strong>O(1)<\/strong> time, a doubly linked list allows a node to be deleted in O(1) by updating the prev and next pointers of its neighbouring nodes. In a singly linked list, reaching the previous node would take linear time.<\/p>\n\n\n\n<p><strong>Is there some other approach to solve the LRU cache?<br><\/strong>Yes, there can be several approaches to the problem, but a hashed linked list is the most suitable data structure since it supports all the operations in O(1). Built-in structures such as Java&#8217;s LinkedHashMap and Python&#8217;s OrderedDict implement the same hashmap-plus-linked-list pattern.<\/p>\n","protected":false},"excerpt":{"rendered":"Problem Statement Design and implement a data structure for LRU (Least Recently Used) cache. It should support the&hellip;\n","protected":false},"author":4,"featured_media":2715,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_daextam_enable_autolinks":"","csco_singular_sidebar":"","csco_page_header_type":"","csco_appearance_grid":"","csco_page_load_nextpost":"","csco_post_video_location":[],"csco_post_video_location_hash":"","csco_post_video_url":"","csco_post_video_bg_start_time":0,"csco_post_video_bg_end_time":0},"categories":[145],"tags":[452],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/posts\/2651"}],"collection":[{"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/comments?post=2651"}],"version-history":[{"count":5,"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/posts\/2651\/revisions"}],"predecessor-version":[{"id":2785,"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/posts\/2651\/revisions\/2785"}],"
wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/media\/2715"}],"wp:attachment":[{"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/media?parent=2651"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/categories?post=2651"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.interviewbit.com\/blog\/wp-json\/wp\/v2\/tags?post=2651"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}