Solving Leetcode Interviews in Seconds with AI: LRU Cache
Introduction
In this blog post, we will explore how to solve LeetCode problem 146, "LRU Cache", using AI. LeetCode is a popular platform for preparing for coding interviews, and with the help of AI tools like Chatmagic, we can generate solutions quickly and efficiently, helping you pass interviews and land the job offer without studying for months.
Problem Statement
Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.

Implement the LRUCache class:
- LRUCache(int capacity) Initialize the LRU cache with positive size capacity.
- int get(int key) Return the value of the key if the key exists, otherwise return -1.
- void put(int key, int value) Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.

Example 1:

Input
["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]
Output
[null, null, null, 1, null, -1, null, -1, 3, 4]

Explanation
LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1);    // return 1
lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2);    // returns -1 (not found)
lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1);    // return -1 (not found)
lRUCache.get(3);    // return 3
lRUCache.get(4);    // return 4

Constraints:
- 1 <= capacity <= 3000
- 0 <= key <= 10^4
- 0 <= value <= 10^5
- At most 2 * 10^5 calls will be made to get and put.
Explanation
Here's an efficient solution to the LRU cache problem:
- Hash Map for O(1) Access: Use a hash map (dictionary in Python) to store key-value pairs, enabling O(1) average time complexity for get and put operations.
- Doubly Linked List for LRU Tracking: Employ a doubly linked list to maintain the order of keys based on their recent usage. The most recently used key will be at the head, and the least recently used at the tail.
- Combined Operation: When a key is accessed or updated, move it to the head of the linked list. When the capacity is exceeded during a put operation, remove the tail node (the LRU entry) from both the linked list and the hash map.

Runtime Complexity: O(1) for get and put operations (average case). Storage Complexity: O(capacity).
Code
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.next = None
        self.prev = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}    # key: Node
        self.head = None   # Most recently used
        self.tail = None   # Least recently used

    def _remove_node(self, node):
        # Unlink node from the doubly linked list.
        if node.prev:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next:
            node.next.prev = node.prev
        else:
            self.tail = node.prev

    def _add_to_head(self, node):
        # Insert node at the head (most recently used position).
        node.next = self.head
        node.prev = None
        if self.head:
            self.head.prev = node
        self.head = node
        if not self.tail:
            self.tail = node

    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            self._remove_node(node)
            self._add_to_head(node)
            return node.value
        return -1

    def put(self, key, value):
        if key in self.cache:
            node = self.cache[key]
            node.value = value
            self._remove_node(node)
            self._add_to_head(node)
        else:
            node = Node(key, value)
            self.cache[key] = node
            self._add_to_head(node)
            if len(self.cache) > self.capacity:
                # Evict the least recently used entry (tail).
                lru_node = self.tail
                del self.cache[lru_node.key]
                self._remove_node(lru_node)
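For comparison, Python's standard library can express the same idea far more compactly: collections.OrderedDict keeps entries in insertion order and supports move_to_end and popitem(last=False) in O(1), so it can stand in for the hand-rolled doubly linked list. Below is a minimal sketch (the class name LRUCacheOD is chosen here just to avoid clashing with the class above), exercised against Example 1 from the problem statement:

```python
from collections import OrderedDict

class LRUCacheOD:
    def __init__(self, capacity):
        self.capacity = capacity
        self.od = OrderedDict()  # oldest (LRU) entry first

    def get(self, key):
        if key not in self.od:
            return -1
        self.od.move_to_end(key)  # mark as most recently used
        return self.od[key]

    def put(self, key, value):
        if key in self.od:
            self.od.move_to_end(key)
        self.od[key] = value
        if len(self.od) > self.capacity:
            self.od.popitem(last=False)  # evict least recently used

# Example 1 from the problem statement
cache = LRUCacheOD(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))  # 1
cache.put(3, 3)      # evicts key 2
print(cache.get(2))  # -1
cache.put(4, 4)      # evicts key 1
print(cache.get(1))  # -1
print(cache.get(3))  # 3
print(cache.get(4))  # 4
```

In an interview, the linked-list version shows you understand the mechanics; the OrderedDict version is a handy cross-check that your outputs match the expected sequence.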