In this article, we will walk through the implementation of an LRU (Least Recently Used) cache in Java. LRU caches are widely used in scenarios where data needs to be cached, but the available storage is limited. When the cache reaches its limit, it evicts the least recently used item to free up space for new entries. This is a common problem faced in technical interviews and is a valuable pattern to understand for optimizing memory management.
Key Concepts
To build an efficient LRU cache, we need to utilize data structures that allow for quick retrieval and modification of the cache elements. The most suitable data structures for this task are:
- HashMap: This provides constant-time (O(1)) access to cache entries by key. A HashMap lets us quickly check whether a key exists in the cache and retrieve its associated value.
- Doubly Linked List: This maintains the order in which cache entries were last accessed. A doubly linked list supports efficient removal and insertion at both ends, which is crucial for the eviction process.
With these two data structures in place, we can efficiently implement both the get and put operations in constant time (O(1)).
LRU Cache Design
1. Data Structures
To implement the LRU cache, we'll use:
- HashMap: This will store key-value pairs for fast lookup.
- Doubly Linked List: This will maintain the order of items by their recent usage. The most recently used item will be at the front of the list, and the least recently used item will be at the end of the list.
2. Cache Operations
- get(key): This operation checks whether the key exists in the cache. If it does, we move the corresponding node to the front of the list to mark it as most recently used and return its value. If the key does not exist, we return -1.
- put(key, value): This operation adds a key-value pair to the cache. If the key already exists, we update the value and move the node to the front of the list. If adding a new entry would exceed the capacity, we first evict the least recently used item, which sits just before the tail of the doubly linked list. A short trace after this list illustrates the behavior.
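For example, with a capacity of 2, a sequence of operations plays out as follows (LRUCache refers to the implementation in the next section; the list states in the comments show recency order, most recent first):

LRUCache cache = new LRUCache(2);
cache.put(1, 1); // list: head <-> (1) <-> tail
cache.put(2, 2); // list: head <-> (2) <-> (1) <-> tail
cache.get(1);    // returns 1; list: head <-> (1) <-> (2) <-> tail
cache.put(3, 3); // cache full, evicts (2); list: head <-> (3) <-> (1) <-> tail
cache.get(2);    // returns -1, key 2 was evicted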
Code Implementation
import java.util.HashMap;

public class LRUCache {

    // Doubly linked list node holding one cache entry
    class Node {
        int key, value;
        Node prev, next;

        public Node(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    private HashMap<Integer, Node> cache;
    private int capacity;
    private Node head, tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.cache = new HashMap<>();
        // Create dummy head and tail nodes to avoid null checks at the ends
        head = new Node(0, 0);
        tail = new Node(0, 0);
        // Connect head and tail
        head.next = tail;
        tail.prev = head;
    }

    // Move a node to the front (most recently used position)
    private void moveToFront(Node node) {
        removeNode(node);
        addToFront(node);
    }

    // Add a node right after the dummy head
    private void addToFront(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    // Unlink a node from the list in O(1)
    private void removeNode(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    // Get the value of a key from the cache, or -1 if absent
    public int get(int key) {
        if (!cache.containsKey(key)) {
            return -1; // Key not found
        }
        Node node = cache.get(key);
        moveToFront(node); // Accessing the node makes it most recently used
        return node.value;
    }

    // Put a key-value pair into the cache
    public void put(int key, int value) {
        if (cache.containsKey(key)) {
            // Update the existing node and mark it most recently used
            Node node = cache.get(key);
            node.value = value;
            moveToFront(node);
        } else {
            // If the cache is full, evict the least recently used node,
            // which sits just before the dummy tail
            if (cache.size() >= capacity) {
                Node lru = tail.prev;
                cache.remove(lru.key);
                removeNode(lru);
            }
            // Insert the new node at the front of the list
            Node newNode = new Node(key, value);
            cache.put(key, newNode);
            addToFront(newNode);
        }
    }
}
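As a quick sanity check, here is a minimal sketch of a driver for the class above; the class name LRUCacheDemo is just for illustration, and the expected outputs are noted in the comments.

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);
        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1)); // prints 1
        cache.put(3, 3);                  // evicts key 2 (least recently used)
        System.out.println(cache.get(2)); // prints -1 (evicted)
        cache.put(4, 4);                  // evicts key 1
        System.out.println(cache.get(1)); // prints -1 (evicted)
        System.out.println(cache.get(3)); // prints 3
        System.out.println(cache.get(4)); // prints 4
    }
}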
Explanation of the Code
- Doubly Linked List Node (Node): Each node holds a key, a value, and pointers to the previous and next nodes, so the list can be traversed and spliced efficiently in both directions.
- HashMap (cache): This stores key-to-node mappings for quick access to cache entries. The key identifies the entry, and the node holds the associated value.
- Head and Tail Nodes: These are dummy sentinel nodes that simplify the code by removing null checks at the ends of the list. The node right after head is always the most recently used item, and the node right before tail is the least recently used item.
- Operations:
  - get(key): Looks up a key in the cache. If found, it moves the corresponding node to the front to mark it as the most recently used.
  - put(key, value): Adds a new key-value pair. If the key already exists, it updates the value and moves the node to the front. If the cache is at capacity, it first evicts the least recently used item (the node just before tail).
Time Complexity
- get(key): O(1) – the HashMap lookup and the node move each take constant time.
- put(key, value): O(1) – insertion, update, and eviction all take constant time using the doubly linked list and the HashMap.
Cache Eviction Strategy
The eviction strategy in this implementation follows the LRU principle. When the cache reaches its capacity, the node just before the dummy tail (the least recently used item) is removed. This ensures that recently used items stay in the cache and the least recently used ones are evicted when space is needed.
Summary
An LRU Cache is a powerful technique for managing limited memory in systems where data access patterns follow the "use it or lose it" principle. By combining a HashMap with a doubly linked list, we can ensure both time and space efficiency. This implementation guarantees that both the get and put operations execute in constant time, O(1), which is essential for high-performance applications.
This approach provides a great example of combining multiple data structures to solve a real-world problem efficiently, and it's an important concept to master for both technical interviews and system design.
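As a closing aside, the HashMap-plus-doubly-linked-list combination described here is exactly what the JDK's LinkedHashMap packages up internally: constructed with accessOrder set to true, it reorders entries on access, and overriding removeEldestEntry turns it into an LRU cache. Below is a minimal sketch (the class name LinkedHashMapLru is illustrative); in an interview, though, you will usually be expected to hand-roll the structure as above.

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: an LRU cache built on LinkedHashMap's access-order mode.
class LinkedHashMapLru<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashMapLru(int capacity) {
        super(capacity, 0.75f, true); // true = iterate in access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry
    }
}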
Please stay tuned; I will update Point 4 of the FAANG Interview series soon. Please check the top 10 interview questions here.