Least Recently Used (LRU) cache in Java

What Is an LRU Cache?

Least Recently Used (LRU) is a caching strategy that evicts the least recently accessed item when the cache exceeds its capacity. It keeps recently used data in memory while discarding the entries that have gone unused for the longest time.


Real-World Use Case

  • Android image loading: Cache recently viewed images to avoid reloading from the network.

  • Database query results: Store recently used queries for faster results.

  • Web browsers: Maintain a small history of visited pages.

  • Memory-constrained systems: Manage resource usage by limiting active items in memory.


Best Way to Implement LRU in Java

The simplest and most effective way to implement an LRU Cache in Java is by using LinkedHashMap with access-order enabled.


LRU Cache Using LinkedHashMap (Best Practice)

import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LRUCache(int capacity) {
        // initialCapacity, loadFactor, accessOrder
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(3);

        cache.put(1, "One");
        cache.put(2, "Two");
        cache.put(3, "Three");

        // Access key 2 (makes 2 most recently used)
        cache.get(2);

        // Add key 4 - should evict key 1 (least recently used)
        cache.put(4, "Four");

        System.out.println(cache);
        // Output: {3=Three, 2=Two, 4=Four}
    }
}

 Output

{3=Three, 2=Two, 4=Four}

Explanation:

  • Keys 1, 2, 3 are inserted, so the usage order (least → most recently used) is [1, 2, 3].

  • Accessing key 2 moves it to the most recently used position: [1, 3, 2].

  • Adding key 4 exceeds the capacity of 3, so key 1, the least recently used, is evicted.
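One detail worth knowing: with accessOrder = true, it is not only get() that refreshes an entry's recency. Per the LinkedHashMap documentation, put() on an existing key counts as an access too. The short sketch below (class name AccessOrderDemo is just for illustration) shows an update moving a key to the most recently used position:

import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: get() AND put() on an existing key refresh recency
        Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put(1, "One");
        map.put(2, "Two");
        map.put(3, "Three");

        // Updating key 1 also marks it as most recently used
        map.put(1, "ONE");

        System.out.println(map); // {2=Two, 3=Three, 1=ONE}
    }
}

This matters in practice: overwriting a cached value protects that key from eviction just as reading it would.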


Custom LRU Without LinkedHashMap (for interviews)

Here’s how you can manually implement an LRU Cache using a doubly linked list and a HashMap in Java. This is the classic approach commonly asked in interviews because it guarantees O(1) time complexity for both get() and put() operations without relying on LinkedHashMap.


Key Concepts

  • HashMap gives O(1) lookup for keys.

  • Doubly Linked List maintains access order (most recent at the front, least at the end).

  • Each node contains key & value and links to its previous and next node.

import java.util.HashMap;

public class LRUCache<K, V> {
    private final int capacity;
    private final HashMap<K, Node> map;
    private final Node head, tail;

    private class Node {
        K key;
        V value;
        Node prev, next;

        Node(K key, V value) {
            this.key = key;
            this.value = value;
        }
    }

    public LRUCache(int capacity) {
        this.capacity = capacity;
        map = new HashMap<>();

        // Dummy head and tail nodes to avoid null checks
        head = new Node(null, null);
        tail = new Node(null, null);
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node node = map.get(key);
        if (node == null) return null; // key not present
        moveToHead(node); // Mark as most recently used
        return node.value;
    }

    public void put(K key, V value) {
        if (map.containsKey(key)) {
            Node node = map.get(key);
            node.value = value;
            moveToHead(node);
        } else {
            Node newNode = new Node(key, value);
            map.put(key, newNode);
            addToHead(newNode);

            if (map.size() > capacity) {
                Node lru = tail.prev;
                removeNode(lru);
                map.remove(lru.key);
            }
        }
    }

    // Helper methods for doubly linked list operations
    private void addToHead(Node node) {
        node.prev = head;
        node.next = head.next;

        head.next.prev = node;
        head.next = node;
    }

    private void removeNode(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void moveToHead(Node node) {
        removeNode(node);
        addToHead(node);
    }

    public void printCache() {
        Node current = head.next;
        while (current != tail) {
            System.out.print(current.key + "=" + current.value + " ");
            current = current.next;
        }
        System.out.println();
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(3);
        cache.put(1, "One");
        cache.put(2, "Two");
        cache.put(3, "Three");

        cache.printCache(); // 3=Three 2=Two 1=One

        cache.get(2); // Access 2 to move it to front
        cache.printCache(); // 2=Two 3=Three 1=One

        cache.put(4, "Four"); // Evicts 1 (least recently used)
        cache.printCache(); // 4=Four 2=Two 3=Three
    }
}

Output

3=Three 2=Two 1=One
2=Two 3=Three 1=One
4=Four 2=Two 3=Three

Benefits of This Approach

  • Time complexity: O(1) for both get() and put().

  • Space complexity: O(capacity).

  • No built-in Java tricks: ideal for coding interviews.

  • Full control: easy to customize or extend.
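Note that neither implementation above is thread-safe. If multiple threads share the cache, one minimal option (a sketch, not the only approach; the class name SynchronizedLRUDemo is illustrative) is to wrap the LinkedHashMap-based version with Collections.synchronizedMap:

import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLRUDemo {
    public static void main(String[] args) {
        // Wrap an access-ordered LinkedHashMap so each individual call is synchronized.
        // Compound operations (check-then-put) still need external locking.
        Map<Integer, String> cache = Collections.synchronizedMap(
            new LinkedHashMap<Integer, String>(3, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<Integer, String> eldest) {
                    return size() > 3;
                }
            });

        cache.put(1, "One");
        cache.put(2, "Two");
        cache.put(3, "Three");
        cache.put(4, "Four"); // evicts key 1

        System.out.println(cache.containsKey(1)); // false
    }
}

Synchronizing get() matters here because, with accessOrder = true, even reads mutate the internal ordering. For heavily concurrent workloads, a purpose-built concurrent cache library is usually a better fit than a synchronized wrapper.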

Key Takeaways

  • Use LinkedHashMap with accessOrder = true for easy and efficient LRU.

  • Override removeEldestEntry() to control eviction.

  • Prefer it for in-memory caches in performance-sensitive apps.

  • For interviews, know the manual implementation using a doubly linked list + HashMap.


📢 Feedback: Did you find this article helpful? Let me know your thoughts or suggestions for improvements! 😊 please leave a comment below. I’d love to hear from you! 👇

Happy coding! 💻✨
