Amazon Questions

Design an LRU (Least Recently Used) cache with O(1) get and put operations.

Medium · Technical · Data Structures · 20-25 minutes

Model Answer

An LRU cache evicts the least recently accessed item when the cache is full. To achieve O(1) for both get and put, we combine a hash map for fast lookups with a doubly linked list for maintaining access order. The hash map maps keys to linked-list nodes; the doubly linked list keeps items ordered by access, with the most recently used at the head and the least recently used at the tail.

On get: look up the key in the hash map, move the node to the head of the list, and return its value. On put: if the key already exists, update its value and move its node to the head. If the key is new and the cache is at capacity, remove the tail node and its hash map entry, then insert the new node at the head.

Approaches

Hash Map + Doubly Linked List

Combine O(1) hash map lookups with O(1) linked list insertion/deletion to achieve O(1) for both get and put. The list maintains recency order; the map enables fast key lookups.

class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
    // Sentinel nodes simplify edge cases
    this.head = { key: null, val: null, prev: null, next: null };
    this.tail = { key: null, val: null, prev: null, next: null };
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  _remove(node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
  }

  _addToHead(node) {
    node.next = this.head.next;
    node.prev = this.head;
    this.head.next.prev = node;
    this.head.next = node;
  }

  get(key) {
    if (!this.map.has(key)) return -1;
    const node = this.map.get(key);
    this._remove(node);
    this._addToHead(node);
    return node.val;
  }

  put(key, value) {
    if (this.map.has(key)) {
      const node = this.map.get(key);
      node.val = value;
      this._remove(node);
      this._addToHead(node);
    } else {
      if (this.map.size >= this.capacity) {
        const lru = this.tail.prev;
        this._remove(lru);
        this.map.delete(lru.key);
      }
      const node = { key, val: value, prev: null, next: null };
      this._addToHead(node);
      this.map.set(key, node);
    }
  }
}
Time: O(1) for both get and put. Space: O(capacity).
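The sentinel (dummy) head and tail nodes mentioned in the constructor comment are what keep `_remove` and `_addToHead` branch-free: every real node always has non-null `prev` and `next` neighbors, so no null checks are needed even when removing the only element. A minimal standalone sketch of that technique:

```javascript
// Sentinel-node demo: dummy head/tail mean remove/addToHead never null-check.
const head = { key: null, val: null, prev: null, next: null };
const tail = { key: null, val: null, prev: null, next: null };
head.next = tail;
tail.prev = head;

function remove(node) {
  node.prev.next = node.next;
  node.next.prev = node.prev;
}

function addToHead(node) {
  node.next = head.next;
  node.prev = head;
  head.next.prev = node;
  head.next = node;
}

const a = { key: 'a', val: 1, prev: null, next: null };
const b = { key: 'b', val: 2, prev: null, next: null };
addToHead(a);                // list: head <-> a <-> tail
addToHead(b);                // list: head <-> b <-> a <-> tail
remove(a);                   // no special case, even near the ends
console.log(head.next.key);  // 'b' (most recently used)
console.log(tail.prev.key);  // 'b' (also the LRU — only one node left)
```

Without the sentinels, each of the four pointer updates would need a guard for "node is the head" or "node is the tail", which is exactly where off-by-one bugs creep in during an interview.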

Using JavaScript Map (Insertion Order)

JavaScript's Map maintains insertion order. By deleting and re-inserting on access, we get LRU behavior. Simpler to implement but relies on language-specific behavior.

class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.cache = new Map();
  }

  get(key) {
    if (!this.cache.has(key)) return -1;
    const val = this.cache.get(key);
    // Move to end (most recent)
    this.cache.delete(key);
    this.cache.set(key, val);
    return val;
  }

  put(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.capacity) {
      // Delete the first (oldest) entry
      const firstKey = this.cache.keys().next().value;
      this.cache.delete(firstKey);
    }
    this.cache.set(key, value);
  }
}
Time: O(1) amortized (Map internals). Space: O(capacity).
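To make the eviction behavior concrete, here is the Map-based class with a short trace; the keys, values, and access pattern are illustrative:

```javascript
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.cache = new Map();
  }

  get(key) {
    if (!this.cache.has(key)) return -1;
    const val = this.cache.get(key);
    this.cache.delete(key); // re-insert to mark as most recent
    this.cache.set(key, val);
    return val;
  }

  put(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.capacity) {
      this.cache.delete(this.cache.keys().next().value); // evict oldest
    }
    this.cache.set(key, value);
  }
}

const cache = new LRUCache(2);
cache.put(1, 'a');
cache.put(2, 'b');
cache.get(1);               // touches key 1, so key 2 is now the LRU
cache.put(3, 'c');          // at capacity: evicts key 2
console.log(cache.get(2));  // -1 (evicted)
console.log(cache.get(1));  // 'a'
console.log(cache.get(3));  // 'c'
```

Note that the `get(1)` call before `put(3)` is what changes which key gets evicted; without it, key 1 would have been the oldest entry.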

Common Mistakes

  1. Using a singly linked list instead of a doubly linked list, making removal O(n)
  2. Forgetting to update the hash map when evicting the tail node
  3. Not handling the edge case where get is called on a non-existent key
  4. Implementing put without checking whether the key already exists (leading to duplicates)

Follow-up Questions

  • How would you make this thread-safe?
  • What if you needed to support TTL (time-to-live) for cache entries?
  • How would you implement an LFU (Least Frequently Used) cache instead?
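One way to start on the TTL follow-up is to layer an expiry timestamp on the Map-based version and treat expired entries as misses on access. This is a hedged sketch, not the only design (a background sweep or an expiry heap are alternatives); the names `ttlMs` and `expiresAt` are illustrative:

```javascript
// Sketch: LRU with lazy TTL expiry. Entries store an expiry timestamp;
// expired entries are dropped on lookup rather than by a background timer.
class TTLLRUCache {
  constructor(capacity, ttlMs) {
    this.capacity = capacity;
    this.ttlMs = ttlMs;
    this.cache = new Map(); // key -> { val, expiresAt }
  }

  get(key) {
    const entry = this.cache.get(key);
    if (!entry) return -1;
    if (Date.now() > entry.expiresAt) {
      this.cache.delete(key); // lazily expire on access
      return -1;
    }
    this.cache.delete(key); // re-insert to refresh recency
    this.cache.set(key, entry);
    return entry.val;
  }

  put(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.capacity) {
      this.cache.delete(this.cache.keys().next().value); // evict LRU
    }
    this.cache.set(key, { val: value, expiresAt: Date.now() + this.ttlMs });
  }
}
```

The trade-off of lazy expiry is that dead entries can occupy capacity until they are next touched or become the LRU victim, which is worth raising explicitly in the interview.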