Introduction: Why Caching is a Superpower

Welcome back, aspiring software engineer! In our journey through Data Structures and Algorithms, we’ve explored many fundamental building blocks. Now, it’s time to put some of that knowledge into action by building a practical, real-world system: a caching mechanism.

Why caching? Imagine you have an application that frequently fetches the same data from a slow database or a remote API. Every time a user asks for that data, your app has to wait, leading to a sluggish experience. What if we could store a copy of that frequently accessed data in a faster, more accessible location, like in memory? That’s the magic of caching! It’s a fundamental technique used across almost all levels of computing, from your CPU’s cache to web browsers, databases, and large-scale distributed systems.

In this chapter, we’ll embark on a hands-on project to build a Least Recently Used (LRU) cache using TypeScript. You’ll learn not just what an LRU cache is, but why it’s designed the way it is, leveraging the power of Hash Maps and Doubly Linked Lists that we’ve covered in previous chapters. By the end, you’ll have a clear understanding of how to implement a common and highly effective caching strategy and appreciate its impact on application performance. Ready to make your applications faster? Let’s dive in!

Core Concepts: Understanding Caching and LRU

Before we write any code, let’s solidify our understanding of caching and, specifically, the Least Recently Used (LRU) strategy.

What is Caching?

At its heart, caching is about storing copies of data so that future requests for that data can be served faster. It’s a trade-off: we use more memory (to store the cache) to gain speed (by avoiding slower data sources).

Think of it like this: You’re studying for an exam. You have a massive textbook (the slow data source) and a notepad (your cache). When you encounter a crucial concept, you write it down in your notepad. The next time you need that concept, you check your notepad first. If it’s there (a cache hit), great! You get the information quickly. If not (a cache miss), you go back to the textbook, find the information, and then add it to your notepad for future reference.

The challenge with a notepad (or any cache) is that it has limited space. What happens when it’s full? You have to decide what to erase to make room for new information. This decision-making process is called the eviction policy.
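The notepad idea is easy to sketch in code. Below is a minimal, illustrative cache with no eviction at all (the `slowLookup` function is a hypothetical stand-in for a slow data source, not part of this chapter's project). It serves repeat lookups fast, but its memory use grows without bound, which is exactly the problem an eviction policy solves:

```typescript
// A naive cache with no eviction policy: repeat lookups are fast,
// but the Map grows forever. slowLookup is a hypothetical stand-in
// for a slow data source such as a database or remote API.
const naiveCache = new Map<string, string>();

function slowLookup(key: string): string {
    return `value-for-${key}`; // imagine a database round trip here
}

function cachedLookup(key: string): string {
    const hit = naiveCache.get(key);
    if (hit !== undefined) {
        return hit; // cache hit: no slow lookup needed
    }
    const value = slowLookup(key); // cache miss: fall back to the slow source
    naiveCache.set(key, value);    // store a copy for next time
    return value;
}

cachedLookup("user:42"); // miss: fetches and stores
cachedLookup("user:42"); // hit: served straight from the Map
```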

Why LRU? The Least Recently Used Eviction Policy

There are many eviction policies:

  • FIFO (First-In, First-Out): Evict the item that has been in the cache the longest.
  • LIFO (Last-In, First-Out): Evict the item that was most recently added.
  • LFU (Least Frequently Used): Evict the item that has been accessed the fewest times.
  • LRU (Least Recently Used): Evict the item that has not been accessed for the longest time.

The Least Recently Used (LRU) policy is one of the most popular and effective. It operates on the principle that if a piece of data has been used recently, it’s likely to be used again soon. Conversely, if it hasn’t been used for a long time, it’s less likely to be needed in the near future, making it a good candidate for eviction.

Imagine your web browser’s history. The pages you visited most recently are often the ones you might revisit. The pages you haven’t touched in weeks are probably safe to forget about. That’s LRU in action!
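To make the policy concrete, here is a tiny illustrative trace for a capacity-2 cache. It models recency with a plain array (O(N) per access, so it is only a sketch; the structure we build in this chapter achieves O(1)):

```typescript
// Illustrative LRU trace for a capacity-2 cache.
// Recency is modeled with an array: index 0 = most recently used.
// (O(N) per access; this is only to show the ordering, not a real design.)
let order: string[] = [];

function touch(key: string, capacity: number): string | null {
    order = [key, ...order.filter(k => k !== key)]; // move key to the front
    return order.length > capacity ? order.pop()! : null; // evict from the back
}

touch("A", 2);                 // order: [A]
touch("B", 2);                 // order: [B, A]
touch("A", 2);                 // order: [A, B]  (A is used again)
const evicted = touch("C", 2); // order: [C, A]; B went longest unused
console.log(evicted);          // "B"
```

Note how re-accessing "A" saved it from eviction: "B" became the least recently used item and was dropped instead.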

The Perfect Pair: Hash Map and Doubly Linked List for LRU

To implement an LRU cache efficiently, we need two key operations to be fast:

  1. Quick Lookup/Access: Given a key, we need to find its corresponding value (and potentially update its “recency”) in O(1) time.
  2. Quick Reordering/Eviction: When an item is accessed, we need to move it to the “most recently used” position. When the cache is full, we need to quickly identify and remove the “least recently used” item, both in O(1) time.

Can you think of data structures that excel at these tasks?

  • For quick lookup, a Hash Map (or Map in JavaScript/TypeScript) is perfect! It allows us to store key-value pairs and retrieve a value by its key in average O(1) time. We can store key -> Node mappings, where Node holds the actual data and links.

  • For quick reordering and eviction, a Doubly Linked List is ideal!

    • When an item is accessed, we can remove its node from its current position and move it to the front (most recently used) in O(1) time because we have direct pointers to the previous and next nodes.
    • When we need to evict an item, the least recently used item will always be at the end of our doubly linked list. Removing it is also an O(1) operation.

Let’s visualize this combined structure:

graph TD
    subgraph LRU_Cache_Structure["LRU Cache Structure"]
        A[Head of List: Most Recent] --> Node_1[KeyA: Value]
        Node_1 --> Node_2[KeyB: Value]
        Node_2 --> Node_3[KeyC: Value]
        Node_3 --> E[Tail of List: Least Recent]
        subgraph Hash_Map["Hash Map (Key → Node Reference)"]
            HM1[KeyA → Node 1]
            HM2[KeyB → Node 2]
            HM3[KeyC → Node 3]
        end
        HM1 --- Node_1
        HM2 --- Node_2
        HM3 --- Node_3
    end
    style A fill:#D4EDDA,stroke:#28A745,stroke-width:2px,color:#000
    style E fill:#F8D7DA,stroke:#DC3545,stroke-width:2px,color:#000
  • The doubly linked list maintains the order of usage. The head points to the most recently used item, and the tail points to the least recently used item.
  • The hash map provides direct access to any node in the linked list using its key. Without the hash map, finding a node to update its position would require traversing the linked list, making lookups O(N) instead of O(1).

This elegant combination allows both get and put operations to achieve an average time complexity of O(1). Pretty neat, right?

Step-by-Step Implementation: Building Our LRU Cache

Now that we understand the theory, let’s roll up our sleeves and build our LRU cache in TypeScript.

1. Project Setup

First, let’s create a new Node.js project and set up TypeScript.

Open your terminal and run:

mkdir lru-cache-project
cd lru-cache-project
npm init -y
npm install typescript # As of Feb 2026, 5.x is the latest stable major
npm install @types/node # For Node.js type definitions
npx tsc --init

A quick check on versions:

  • Node.js LTS: 24.13.0 (Krypton) as of 2026-01-13.
  • TypeScript: 5.x.x is the latest stable major version.

In your tsconfig.json, make sure target is set to something modern like es2020 or es2022, and module to commonjs or esnext.

// tsconfig.json
{
  "compilerOptions": {
    "target": "es2022", // Modern JavaScript features
    "module": "commonjs", // Or "esnext" for ES Modules
    "rootDir": "./src", // Source files will be in 'src' folder
    "outDir": "./dist", // Compiled JavaScript will go to 'dist'
    "strict": true, // Enable all strict type-checking options
    "esModuleInterop": true, // Enables better interoperability for CommonJS/ES Modules
    "skipLibCheck": true, // Skip type checking of declaration files
    "forceConsistentCasingInFileNames": true // Ensure consistent file casing
  },
  "include": ["src/**/*"] // Include all files in src
}

Create a src directory:

mkdir src

2. Defining the CacheNode

Our doubly linked list will consist of nodes. Each node needs to store a key, a value, and pointers to the previous and next nodes.

Create a file src/lruCache.ts and add the CacheNode class:

// src/lruCache.ts

/**
 * Represents a node in our doubly linked list.
 * Each node stores a key-value pair and pointers to the previous and next nodes.
 */
class CacheNode<K, V> {
    key: K;
    value: V;
    prev: CacheNode<K, V> | null;
    next: CacheNode<K, V> | null;

    constructor(key: K, value: V) {
        this.key = key;
        this.value = value;
        this.prev = null;
        this.next = null;
    }
}

Explanation:

  • We use generics <K, V> to make our cache flexible, allowing it to store any type of key and value.
  • key and value store the actual data.
  • prev and next are pointers to other CacheNode instances, forming the links in our list. They can be null for the head or tail of the list.
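A quick sanity check of how two nodes link up. The class is repeated here (using TypeScript's parameter-property shorthand for brevity) so the snippet stands alone:

```typescript
// CacheNode repeated with shorthand constructor so this snippet runs standalone.
class CacheNode<K, V> {
    prev: CacheNode<K, V> | null = null;
    next: CacheNode<K, V> | null = null;
    constructor(public key: K, public value: V) {}
}

const first = new CacheNode("A", 1);
const second = new CacheNode("B", 2);
first.next = second; // A <-> B: forward link...
second.prev = first; // ...and back link

console.log(first.next?.key);  // "B"
console.log(second.prev?.key); // "A"
```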

3. Defining the LRUCache Class

Now, let’s create the main LRUCache class. It will hold our capacity, the map for O(1) lookups, and head and tail pointers for the doubly linked list.

Add this to src/lruCache.ts, below the CacheNode class:

// src/lruCache.ts (continued)

// ... CacheNode class above ...

/**
 * Implements a Least Recently Used (LRU) cache.
 * Uses a combination of a Hash Map (Map) for O(1) lookups
 * and a Doubly Linked List for O(1) reordering and eviction.
 */
class LRUCache<K, V> {
    private capacity: number;
    private cache: Map<K, CacheNode<K, V>>; // Our Hash Map: key -> CacheNode
    private head: CacheNode<K, V> | null;  // Most recently used node
    private tail: CacheNode<K, V> | null;  // Least recently used node

    constructor(capacity: number) {
        if (capacity < 1) {
            throw new Error("Cache capacity must be at least 1.");
        }
        this.capacity = capacity;
        this.cache = new Map<K, CacheNode<K, V>>();
        this.head = null;
        this.tail = null;
    }

    // We'll add methods here
}

Explanation:

  • capacity: The maximum number of items our cache can hold.
  • cache: This is our Map (the TypeScript equivalent of a hash map). It stores key -> CacheNode pairs. This allows us to quickly find a CacheNode given its key.
  • head: Points to the CacheNode that was most recently accessed. This will always be at the front of our logical list.
  • tail: Points to the CacheNode that was least recently accessed. This will always be at the end of our logical list and is the first candidate for eviction when the cache is full.
  • The constructor initializes these properties and includes a basic validation for capacity.

4. Helper Methods for Linked List Manipulation

To keep our get and put methods clean, let’s create two helper methods for managing the doubly linked list:

  1. _removeNode(node): Removes a given node from its current position in the list.
  2. _addNodeToFront(node): Adds a given node to the front of the list (making it the new head).

Add these methods inside the LRUCache class:

// src/lruCache.ts (continued inside LRUCache class)

// ... constructor above ...

    /**
     * Removes a node from its current position in the doubly linked list.
     * @param node The node to remove.
     * @private
     */
    private _removeNode(node: CacheNode<K, V>): void {
        // If the node is the head, update head
        if (node === this.head) {
            this.head = node.next;
        }
        // If the node is the tail, update tail
        if (node === this.tail) {
            this.tail = node.prev;
        }

        // Link previous node to next node
        if (node.prev) {
            node.prev.next = node.next;
        }
        // Link next node to previous node
        if (node.next) {
            node.next.prev = node.prev;
        }

        // Clear node's links to ensure it's fully detached
        node.prev = null;
        node.next = null;
    }

    /**
     * Adds a node to the front (head) of the doubly linked list.
     * This makes it the most recently used item.
     * @param node The node to add to the front.
     * @private
     */
    private _addNodeToFront(node: CacheNode<K, V>): void {
        node.next = this.head; // New node points to current head
        node.prev = null;     // New node has no previous (it's the new head)

        // If there was an existing head, its 'prev' should now point to the new node
        if (this.head) {
            this.head.prev = node;
        }
        this.head = node; // Update head to be the new node

        // If the list was empty, the new node is also the tail
        if (!this.tail) {
            this.tail = node;
        }
    }

Explanation:

  • _removeNode: Handles all cases: node is head, node is tail, or node is in the middle. It carefully updates prev and next pointers of surrounding nodes and the head/tail pointers of the cache itself.
  • _addNodeToFront: Makes the given node the new head. It adjusts pointers accordingly. It also handles the special case where the list was previously empty, making the new node both head and tail.

5. Implementing get(key)

The get method retrieves a value from the cache. If the key exists (cache hit), it should return the value and mark the item as “most recently used” by moving its node to the front of the list. If the key doesn’t exist (cache miss), it returns undefined.

Add this method inside the LRUCache class:

// src/lruCache.ts (continued inside LRUCache class)

// ... _addNodeToFront method above ...

    /**
     * Retrieves the value associated with the given key from the cache.
     * If found, the item is marked as most recently used.
     * @param key The key to look up.
     * @returns The value if found, otherwise undefined.
     */
    get(key: K): V | undefined {
        // 1. Check if the key exists in our hash map
        const node = this.cache.get(key);

        if (!node) {
            // Cache Miss: Key not found
            return undefined;
        }

        // Cache Hit: Key found!
        // 2. Mark this node as most recently used by moving it to the front of the list
        this._removeNode(node);
        this._addNodeToFront(node);

        // 3. Return its value
        return node.value;
    }

Explanation:

  1. We use this.cache.get(key) to quickly find the CacheNode in O(1) time.
  2. If node is undefined, it’s a cache miss, so we return undefined.
  3. If node is found, we first _removeNode from its current position.
  4. Then, we _addNodeToFront to make it the new head, signifying it’s now the most recently used item.
  5. Finally, we return the node.value.

6. Implementing put(key, value)

The put method adds or updates an item in the cache. This is where the eviction policy comes into play.

Add this method inside the LRUCache class:

// src/lruCache.ts (continued inside LRUCache class)

// ... get method above ...

    /**
     * Adds a new key-value pair to the cache, or updates an existing one.
     * If the cache is full, the least recently used item is evicted.
     * @param key The key to add or update.
     * @param value The value to associate with the key.
     */
    put(key: K, value: V): void {
        let node = this.cache.get(key);

        if (node) {
            // 1. Key already exists: Update its value and mark as most recently used.
            node.value = value;
            this._removeNode(node);
            this._addNodeToFront(node);
        } else {
            // 2. Key does not exist: Add new item.
            // Check if cache is at capacity
            if (this.cache.size >= this.capacity) {
                // Cache is full, time to evict the LRU item (tail of the list)
                if (this.tail) { // Ensure tail exists
                    this.cache.delete(this.tail.key); // Remove from hash map
                    this._removeNode(this.tail);      // Remove from linked list
                }
            }

            // Create a new node and add it
            const newNode = new CacheNode(key, value);
            this.cache.set(key, newNode);     // Add to hash map
            this._addNodeToFront(newNode);    // Add to front of linked list (most recent)
        }
    }
} // End of LRUCache class

Explanation:

  1. We first check if the key already exists in this.cache.
  2. If it exists: This is an update. We update the node.value, then move it to the front of the list using _removeNode and _addNodeToFront. This correctly marks it as most recently used.
  3. If it does not exist: This is a new item.
    • We first check if this.cache.size (current number of items) is equal to or exceeds this.capacity.
    • If it is full, we must evict the LRU item. The this.tail node is always the LRU. We delete it from the this.cache map and _removeNode from the linked list.
    • After potential eviction (or if not full), we create a newNode.
    • We set this newNode into this.cache map.
    • Finally, we _addNodeToFront(newNode) to make it the new head of the list.

7. Testing Our LRU Cache

Let’s create a simple test file to see our cache in action.

Create src/index.ts:

// src/index.ts
import { LRUCache } from './lruCache'; // Assuming LRUCache is exported

// To export LRUCache, add 'export' keyword before the class definition in lruCache.ts:
// export class LRUCache<K, V> { ... }
// export class CacheNode<K, V> { ... }

function runCacheDemo() {
    console.log("--- LRU Cache Demo ---");

    const cache = new LRUCache<string, number>(3); // Capacity of 3

    console.log("1. Adding (A, 1)");
    cache.put("A", 1); // Cache: A
    console.log("Cache size:", cache.size()); // We need to add a size getter later!

    console.log("2. Adding (B, 2)");
    cache.put("B", 2); // Cache: B, A
    console.log("3. Adding (C, 3)");
    cache.put("C", 3); // Cache: C, B, A (Full)

    console.log("\nCache state after A, B, C added (capacity 3):");
    console.log(`Get A: ${cache.get("A")}`); // Access A. Cache: A, C, B
    console.log(`Get B: ${cache.get("B")}`); // Access B. Cache: B, A, C
    console.log(`Get C: ${cache.get("C")}`); // Access C. Cache: C, B, A

    console.log("\n4. Adding (D, 4) - Eviction expected!");
    cache.put("D", 4); // Cache: D, C, B (A should be evicted)
    console.log(`Get A: ${cache.get("A")}`); // Should be undefined
    console.log(`Get D: ${cache.get("D")}`); // Should be 4

    console.log("\n5. Updating B to 20");
    cache.put("B", 20); // Update B, move to front. Cache: B, D, C
    console.log(`Get B: ${cache.get("B")}`); // Should be 20

    console.log("\nFinal state check:");
    console.log(`Get C: ${cache.get("C")}`); // C was LRU; accessing it makes it MRU. Cache: C, B, D
    console.log(`Get D: ${cache.get("D")}`); // D is now MRU. Cache: D, C, B

    console.log("\n--- Demo Complete ---");
}

// Call the demo function
runCacheDemo();

Before running, we need to add a size() getter to our LRUCache class and export the classes.

Modify src/lruCache.ts to export the classes and add size():

// src/lruCache.ts

export class CacheNode<K, V> { // Added 'export'
    key: K;
    value: V;
    prev: CacheNode<K, V> | null;
    next: CacheNode<K, V> | null;

    constructor(key: K, value: V) {
        this.key = key;
        this.value = value;
        this.prev = null;
        this.next = null;
    }
}

export class LRUCache<K, V> { // Added 'export'
    private capacity: number;
    private cache: Map<K, CacheNode<K, V>>;
    private head: CacheNode<K, V> | null;
    private tail: CacheNode<K, V> | null;

    constructor(capacity: number) {
        if (capacity < 1) {
            throw new Error("Cache capacity must be at least 1.");
        }
        this.capacity = capacity;
        this.cache = new Map<K, CacheNode<K, V>>();
        this.head = null;
        this.tail = null;
    }

    // New getter for current size
    public size(): number {
        return this.cache.size;
    }

    // ... _removeNode, _addNodeToFront, get, put methods ...
}

Now, let’s compile and run!

npx tsc # Compiles src/lruCache.ts and src/index.ts into dist/
node dist/index.js

You should see output similar to this, demonstrating the LRU eviction:

--- LRU Cache Demo ---
1. Adding (A, 1)
Cache size: 1
2. Adding (B, 2)
3. Adding (C, 3)

Cache state after A, B, C added (capacity 3):
Get A: 1
Get B: 2
Get C: 3

4. Adding (D, 4) - Eviction expected!
Get A: undefined
Get D: 4

5. Updating B to 20
Get B: 20

Final state check:
Get C: 3
Get D: 4

--- Demo Complete ---

Notice how ‘A’ was evicted when ‘D’ was added: after A, B, and C were each accessed in turn, ‘A’ had gone the longest without being used, making it the least recently used item. This confirms our LRU logic is working!

Mini-Challenge: Enhancing the LRU Cache

Your challenge is to add a new method to our LRUCache called peek(key).

Challenge: Implement peek(key)

The peek method should retrieve the value associated with a key, just like get(key), but without marking the item as most recently used. This means it should NOT move the accessed node to the front of the linked list. If the key is not found, peek should return undefined.

Hint: Think about which parts of the get method you need to keep and which you need to remove.

What to observe/learn: This challenge helps you understand the distinction between merely reading data and interacting with the cache’s recency logic, which is crucial for certain use cases where you want to check an item without affecting its eviction priority.

Take a few minutes to try implementing peek yourself!

Hint

The peek method will only need to interact with the this.cache (the Map). It won’t need to call _removeNode or _addNodeToFront.

Solution
// src/lruCache.ts (inside LRUCache class)

    /**
     * Retrieves the value associated with the given key from the cache,
     * without marking the item as most recently used.
     * @param key The key to look up.
     * @returns The value if found, otherwise undefined.
     */
    peek(key: K): V | undefined {
        const node = this.cache.get(key);
        return node ? node.value : undefined;
    }
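As a contrast worth knowing: JavaScript's Map preserves insertion order, so an LRU cache can also be sketched directly on a single Map, with no explicit linked list. This compact variant (shown here mainly to highlight how peek differs from get; it is not the chapter's implementation) treats the front of the Map's iteration order as least recently used:

```typescript
// Compact LRU sketch built on Map's insertion order.
// get() promotes a key by deleting and re-inserting it; peek() does not.
class MapLRU<K, V> {
    private map = new Map<K, V>();
    constructor(private capacity: number) {}

    get(key: K): V | undefined {
        if (!this.map.has(key)) return undefined;
        const value = this.map.get(key)!;
        this.map.delete(key);     // re-insert to move the key to the
        this.map.set(key, value); // "most recently used" end
        return value;
    }

    peek(key: K): V | undefined {
        return this.map.get(key); // no reordering: recency is untouched
    }

    put(key: K, value: V): void {
        if (this.map.has(key)) {
            this.map.delete(key);
        } else if (this.map.size >= this.capacity) {
            // First key in iteration order is the least recently used.
            const lru = this.map.keys().next().value as K;
            this.map.delete(lru);
        }
        this.map.set(key, value);
    }
}

const c = new MapLRU<string, number>(2);
c.put("x", 1);
c.put("y", 2);
c.peek("x");             // reads x without promoting it
c.put("z", 3);           // evicts x: peek did not protect it
console.log(c.get("x")); // undefined
console.log(c.get("y")); // 2
```

The delete-and-re-insert trick keeps operations O(1) on average, but the explicit linked-list version in this chapter makes the recency mechanics visible, which is the point of the exercise.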

Common Pitfalls & Troubleshooting

Even with a relatively simple data structure like the LRU cache, it’s easy to run into issues. Here are some common pitfalls and how to troubleshoot them:

  1. Incorrect Head/Tail Pointer Updates:

    • Symptom: Items are not being evicted correctly, or the list breaks (e.g., this.head.prev is not null, or this.tail.next is not null).
    • Troubleshooting: Carefully review your _removeNode and _addNodeToFront methods.
      • Did you handle the edge cases where the list becomes empty (after removing the last node) or where a node becomes the only node (after adding to an empty list)?
      • Step through your code with a debugger, paying close attention to head, tail, and node.prev/node.next values at each step.
      • Ensure null checks are in place before accessing node.prev.next or similar.
  2. Forgetting to Update the Map:

    • Symptom: You evict a node from the linked list, but a get call for its key still returns a value, or the cache size is incorrect.
    • Troubleshooting: Remember that the Map (this.cache) is the source of truth for key-to-node mappings. When you evict a node, you must also call this.cache.delete(node.key). Similarly, when adding a new node, this.cache.set(key, newNode) is essential.
  3. Capacity Off-by-One Errors:

    • Symptom: The cache holds one more or one less item than its capacity.
    • Troubleshooting: Double-check your eviction logic: if (this.cache.size >= this.capacity). Should it be > or >=? For an LRU cache, when size equals capacity and a new item needs to be added, an eviction must occur first. So, this.cache.size >= this.capacity is correct for triggering eviction when adding a new item.
  4. Generic Type Issues in TypeScript:

    • Symptom: Compiler errors related to K or V not being assignable.
    • Troubleshooting: Ensure you consistently use your generic types <K, V> throughout the CacheNode and LRUCache classes. When instantiating LRUCache, provide the specific types, e.g., new LRUCache<string, number>(...).

Debugging a doubly linked list can be tricky due to pointer manipulation. Drawing diagrams on paper can be incredibly helpful to visualize how prev and next pointers change with each operation.
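Alongside paper diagrams, a small invariant checker can catch pointer bugs early. The helper below is hypothetical (not part of the chapter's LRUCache API); its node shape mirrors CacheNode, and it verifies the properties listed in pitfall 1:

```typescript
// Hypothetical debug helper: walks a doubly linked list and reports
// violated invariants. The node shape mirrors CacheNode.
interface ListNode<K> {
    key: K;
    prev: ListNode<K> | null;
    next: ListNode<K> | null;
}

function checkListInvariants<K>(
    head: ListNode<K> | null,
    tail: ListNode<K> | null,
    expectedSize: number
): string[] {
    const problems: string[] = [];
    if (head && head.prev !== null) problems.push("head.prev is not null");
    if (tail && tail.next !== null) problems.push("tail.next is not null");

    let count = 0;
    for (let n = head; n !== null; n = n.next) {
        if (n.next && n.next.prev !== n) {
            problems.push(`broken back-link after key ${String(n.key)}`);
        }
        count++;
        if (count > expectedSize + 1) { // guard against accidental cycles
            problems.push("cycle detected or size far too large");
            break;
        }
    }
    if (count !== expectedSize) {
        problems.push(`walked ${count} nodes, expected ${expectedSize}`);
    }
    return problems;
}

// A healthy two-node list reports no problems:
const a: ListNode<string> = { key: "a", prev: null, next: null };
const b: ListNode<string> = { key: "b", prev: null, next: null };
a.next = b;
b.prev = a;
console.log(checkListInvariants(a, b, 2)); // []
```

Calling such a checker after every get and put while developing (and removing it afterwards) turns silent pointer corruption into an immediate, readable error.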

Summary

Congratulations! You’ve successfully built a functional LRU caching system using TypeScript. This project brought together several important concepts:

  • Caching Fundamentals: You learned why caching is essential for performance and the trade-offs involved.
  • LRU Eviction Policy: You understood the logic behind the Least Recently Used strategy and why it’s so common.
  • Data Structure Synergy: You saw a powerful example of how combining a Hash Map (Map in TypeScript) for O(1) lookups and a Doubly Linked List for O(1) reordering and eviction results in an efficient O(1) average time complexity for both get and put operations.
  • Practical Implementation: You gained hands-on experience implementing these concepts in TypeScript, including managing pointers and handling edge cases.

Real-World Applications

LRU caches are ubiquitous in software engineering:

  • Web Browsers: Caching web pages, images, and scripts.
  • Databases: Caching frequently accessed query results or data blocks.
  • Operating Systems: Caching disk blocks in memory.
  • Web Servers/APIs: Caching responses to popular API endpoints.
  • Content Delivery Networks (CDNs): Caching static assets closer to users.
  • Recommendation Engines: Caching personalized recommendations.

Understanding and being able to implement an LRU cache is a fantastic skill that demonstrates a solid grasp of both data structures and system design principles.

What’s Next?

In the next chapters, we’ll continue our exploration of advanced data structures and algorithmic paradigms. You’ll find that the principles of efficiency, trade-offs, and choosing the right tool for the job that you applied in this caching project are fundamental to mastering DSA. Keep practicing, keep building, and your mastery will grow!


References

  1. Node.js Official Website: https://nodejs.org/ (Refer to the v24.13.0 LTS or v25.6.1 current release documentation for specific API details.)
  2. TypeScript Handbook (Official Documentation): https://www.typescriptlang.org/docs/handbook/intro.html (For general TypeScript syntax and features.)
  3. MDN Web Docs - Map: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map (Details on JavaScript’s Map object, which serves as our hash map.)
  4. GeeksforGeeks - LRU Cache: https://www.geeksforgeeks.org/lru-cache-implementation/ (A classic resource for understanding LRU cache implementation details across various languages.)
