Introduction: Why Caching is a Superpower
Welcome back, aspiring software engineer! In our journey through Data Structures and Algorithms, we’ve explored many fundamental building blocks. Now, it’s time to put some of that knowledge into action by building a practical, real-world system: a caching mechanism.
Why caching? Imagine you have an application that frequently fetches the same data from a slow database or a remote API. Every time a user asks for that data, your app has to wait, leading to a sluggish experience. What if we could store a copy of that frequently accessed data in a faster, more accessible location, like in memory? That’s the magic of caching! It’s a fundamental technique used across almost all levels of computing, from your CPU’s cache to web browsers, databases, and large-scale distributed systems.
In this chapter, we’ll embark on a hands-on project to build a Least Recently Used (LRU) cache using TypeScript. You’ll learn not just what an LRU cache is, but why it’s designed the way it is, leveraging the power of Hash Maps and Doubly Linked Lists that we’ve covered in previous chapters. By the end, you’ll have a clear understanding of how to implement a common and highly effective caching strategy and appreciate its impact on application performance. Ready to make your applications faster? Let’s dive in!
Core Concepts: Understanding Caching and LRU
Before we write any code, let’s solidify our understanding of caching and, specifically, the Least Recently Used (LRU) strategy.
What is Caching?
At its heart, caching is about storing copies of data so that future requests for that data can be served faster. It’s a trade-off: we use more memory (to store the cache) to gain speed (by avoiding slower data sources).
Think of it like this: You’re studying for an exam. You have a massive textbook (the slow data source) and a notepad (your cache). When you encounter a crucial concept, you write it down in your notepad. The next time you need that concept, you check your notepad first. If it’s there (a cache hit), great! You get the information quickly. If not (a cache miss), you go back to the textbook, find the information, and then add it to your notepad for future reference.
The challenge with a notepad (or any cache) is that it has limited space. What happens when it’s full? You have to decide what to erase to make room for new information. This decision-making process is called the eviction policy.
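To make the hit/miss idea concrete, here is a minimal sketch of an unbounded in-memory cache in TypeScript. The slowFetch function is a made-up stand-in for a slow database or API call, not part of the project we build below:

```typescript
// A minimal, unbounded cache: fast on hits, falls back to the slow source on misses.
// Note: with no eviction policy, this map grows forever -- the problem LRU solves.
const store = new Map<string, number>();
let slowCalls = 0;

function slowFetch(key: string): number {
  slowCalls++;               // stands in for a database or API round trip
  return key.length * 100;   // fake "expensive" computation
}

function cachedFetch(key: string): number {
  const hit = store.get(key);
  if (hit !== undefined) return hit; // cache hit: served from memory
  const value = slowFetch(key);      // cache miss: go to the slow source...
  store.set(key, value);             // ...and remember the answer for next time
  return value;
}

cachedFetch("user:42"); // miss -> slowFetch runs
cachedFetch("user:42"); // hit  -> no slow call this time
```

Notice the trade-off in action: we spend memory on `store` to avoid repeated calls to the slow source.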
Why LRU? The Least Recently Used Eviction Policy
There are many eviction policies:
- FIFO (First-In, First-Out): Evict the item that has been in the cache the longest.
- LIFO (Last-In, First-Out): Evict the item that was most recently added.
- LFU (Least Frequently Used): Evict the item that has been accessed the fewest times.
- LRU (Least Recently Used): Evict the item that has not been accessed for the longest time.
The Least Recently Used (LRU) policy is one of the most popular and effective. It operates on the principle that if a piece of data has been used recently, it’s likely to be used again soon. Conversely, if it hasn’t been used for a long time, it’s less likely to be needed in the near future, making it a good candidate for eviction.
Imagine your web browser’s history. The pages you visited most recently are often the ones you might revisit. The pages you haven’t touched in weeks are probably safe to forget about. That’s LRU in action!
The Perfect Pair: Hash Map and Doubly Linked List for LRU
To implement an LRU cache efficiently, we need two key operations to be fast:
- Quick Lookup/Access: Given a key, we need to find its corresponding value (and potentially update its “recency”) in O(1) time.
- Quick Reordering/Eviction: When an item is accessed, we need to move it to the “most recently used” position. When the cache is full, we need to quickly identify and remove the “least recently used” item, both in O(1) time.
Can you think of data structures that excel at these tasks?
For quick lookup, a Hash Map (or Map in JavaScript/TypeScript) is perfect! It allows us to store key-value pairs and retrieve a value by its key in average O(1) time. We can store key -> Node mappings, where Node holds the actual data and links.
For quick reordering and eviction, a Doubly Linked List is ideal!
- When an item is accessed, we can remove its node from its current position and move it to the front (most recently used) in O(1) time because we have direct pointers to the previous and next nodes.
- When we need to evict an item, the least recently used item will always be at the end of our doubly linked list. Removing it is also an O(1) operation.
Let’s visualize this combined structure:
- The doubly linked list maintains the order of usage. The head points to the most recently used item, and the tail points to the least recently used item.
- The hash map provides direct access to any node in the linked list using its key. Without the hash map, finding a node to update its position would require traversing the linked list, making lookups O(N) instead of O(1).
This elegant combination allows both get and put operations to achieve an average time complexity of O(1). Pretty neat, right?
Step-by-Step Implementation: Building Our LRU Cache
Now that we understand the theory, let’s roll up our sleeves and build our LRU cache in TypeScript.
1. Project Setup
First, let’s create a new Node.js project and set up TypeScript.
Open your terminal and run:
mkdir lru-cache-project
cd lru-cache-project
npm init -y
npm install typescript # As of Feb 2026, 5.x is the latest stable major
npm install @types/node # For Node.js type definitions
npx tsc --init
A quick check on versions:
- Node.js LTS: 24.13.0 (Krypton) as of 2026-01-13.
- TypeScript: 5.x.x is the latest stable major version.
In your tsconfig.json, make sure target is set to something modern like es2020 or es2022, and module to commonjs or esnext.
// tsconfig.json
{
"compilerOptions": {
"target": "es2022", // Modern JavaScript features
"module": "commonjs", // Or "esnext" for ES Modules
"rootDir": "./src", // Source files will be in 'src' folder
"outDir": "./dist", // Compiled JavaScript will go to 'dist'
"strict": true, // Enable all strict type-checking options
"esModuleInterop": true, // Enables better interoperability for CommonJS/ES Modules
"skipLibCheck": true, // Skip type checking of declaration files
"forceConsistentCasingInFileNames": true // Ensure consistent file casing
},
"include": ["src/**/*"] // Include all files in src
}
Create a src directory:
mkdir src
2. Defining the CacheNode
Our doubly linked list will consist of nodes. Each node needs to store a key, a value, and pointers to the previous and next nodes.
Create a file src/lruCache.ts and add the CacheNode class:
// src/lruCache.ts
/**
* Represents a node in our doubly linked list.
* Each node stores a key-value pair and pointers to the previous and next nodes.
*/
class CacheNode<K, V> {
key: K;
value: V;
prev: CacheNode<K, V> | null;
next: CacheNode<K, V> | null;
constructor(key: K, value: V) {
this.key = key;
this.value = value;
this.prev = null;
this.next = null;
}
}
Explanation:
- We use generics <K, V> to make our cache flexible, allowing it to store any type of key and value.
- key and value store the actual data.
- prev and next are pointers to other CacheNode instances, forming the links in our list. They can be null for the head or tail of the list.
3. Defining the LRUCache Class
Now, let’s create the main LRUCache class. It will hold our capacity, the map for O(1) lookups, and head and tail pointers for the doubly linked list.
Add this to src/lruCache.ts, below the CacheNode class:
// src/lruCache.ts (continued)
// ... CacheNode class above ...
/**
* Implements a Least Recently Used (LRU) cache.
* Uses a combination of a Hash Map (Map) for O(1) lookups
* and a Doubly Linked List for O(1) reordering and eviction.
*/
class LRUCache<K, V> {
private capacity: number;
private cache: Map<K, CacheNode<K, V>>; // Our Hash Map: key -> CacheNode
private head: CacheNode<K, V> | null; // Most recently used node
private tail: CacheNode<K, V> | null; // Least recently used node
constructor(capacity: number) {
if (capacity < 1) {
throw new Error("Cache capacity must be at least 1.");
}
this.capacity = capacity;
this.cache = new Map<K, CacheNode<K, V>>();
this.head = null;
this.tail = null;
}
// We'll add methods here
}
Explanation:
- capacity: The maximum number of items our cache can hold.
- cache: This is our Map (the TypeScript equivalent of a hash map). It stores key -> CacheNode pairs. This allows us to quickly find a CacheNode given its key.
- head: Points to the CacheNode that was most recently accessed. This will always be at the front of our logical list.
- tail: Points to the CacheNode that was least recently accessed. This will always be at the end of our logical list and is the first candidate for eviction when the cache is full.
- The constructor initializes these properties and includes a basic validation for capacity.
4. Helper Methods for Linked List Manipulation
To keep our get and put methods clean, let’s create two helper methods for managing the doubly linked list:
- _removeNode(node): Removes a given node from its current position in the list.
- _addNodeToFront(node): Adds a given node to the front of the list (making it the new head).
Add these methods inside the LRUCache class:
// src/lruCache.ts (continued inside LRUCache class)
// ... constructor above ...
/**
* Removes a node from its current position in the doubly linked list.
* @param node The node to remove.
* @private
*/
private _removeNode(node: CacheNode<K, V>): void {
// If the node is the head, update head
if (node === this.head) {
this.head = node.next;
}
// If the node is the tail, update tail
if (node === this.tail) {
this.tail = node.prev;
}
// Link previous node to next node
if (node.prev) {
node.prev.next = node.next;
}
// Link next node to previous node
if (node.next) {
node.next.prev = node.prev;
}
// Clear node's links to ensure it's fully detached
node.prev = null;
node.next = null;
}
/**
* Adds a node to the front (head) of the doubly linked list.
* This makes it the most recently used item.
* @param node The node to add to the front.
* @private
*/
private _addNodeToFront(node: CacheNode<K, V>): void {
node.next = this.head; // New node points to current head
node.prev = null; // New node has no previous (it's the new head)
// If there was an existing head, its 'prev' should now point to the new node
if (this.head) {
this.head.prev = node;
}
this.head = node; // Update head to be the new node
// If the list was empty, the new node is also the tail
if (!this.tail) {
this.tail = node;
}
}
Explanation:
- _removeNode: Handles all cases: node is head, node is tail, or node is in the middle. It carefully updates the prev and next pointers of surrounding nodes and the head/tail pointers of the cache itself.
- _addNodeToFront: Makes the given node the new head. It adjusts pointers accordingly. It also handles the special case where the list was previously empty, making the new node both head and tail.
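If you want to watch the pointer updates in isolation, here is a standalone sketch that mirrors the two private helpers on a bare-bones node/list shape. The free functions and the List object are illustrative scaffolding; inside the class, the same logic operates on this.head and this.tail:

```typescript
// Standalone mirror of the two helpers, so the pointer surgery is easy to trace.
type Node = { key: string; prev: Node | null; next: Node | null };
type List = { head: Node | null; tail: Node | null };

function addNodeToFront(list: List, node: Node): void {
  node.next = list.head;              // new node points at the current head
  node.prev = null;                   // it becomes the new head, so no previous
  if (list.head) list.head.prev = node;
  list.head = node;
  if (!list.tail) list.tail = node;   // empty list: new node is also the tail
}

function removeNode(list: List, node: Node): void {
  if (node === list.head) list.head = node.next;
  if (node === list.tail) list.tail = node.prev;
  if (node.prev) node.prev.next = node.next; // bridge the gap on both sides
  if (node.next) node.next.prev = node.prev;
  node.prev = null;                   // fully detach the removed node
  node.next = null;
}

const list: List = { head: null, tail: null };
const a: Node = { key: "A", prev: null, next: null };
const b: Node = { key: "B", prev: null, next: null };
const c: Node = { key: "C", prev: null, next: null };
addNodeToFront(list, a); // list: A
addNodeToFront(list, b); // list: B <-> A
addNodeToFront(list, c); // list: C <-> B <-> A
removeNode(list, b);     // middle removal: C <-> A
```

Tracing the middle removal by hand (C's next jumps to A, A's prev jumps to C) is the same exercise the class performs on every get and put.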
5. Implementing get(key)
The get method retrieves a value from the cache. If the key exists (cache hit), it should return the value and mark the item as “most recently used” by moving its node to the front of the list. If the key doesn’t exist (cache miss), it returns undefined.
Add this method inside the LRUCache class:
// src/lruCache.ts (continued inside LRUCache class)
// ... _addNodeToFront method above ...
/**
* Retrieves the value associated with the given key from the cache.
* If found, the item is marked as most recently used.
* @param key The key to look up.
* @returns The value if found, otherwise undefined.
*/
get(key: K): V | undefined {
// 1. Check if the key exists in our hash map
const node = this.cache.get(key);
if (!node) {
// Cache Miss: Key not found
return undefined;
}
// Cache Hit: Key found!
// 2. Mark this node as most recently used by moving it to the front of the list
this._removeNode(node);
this._addNodeToFront(node);
// 3. Return its value
return node.value;
}
Explanation:
- We use this.cache.get(key) to quickly find the CacheNode in O(1) time.
- If node is undefined, it’s a cache miss, so we return undefined.
- If node is found, we first _removeNode it from its current position.
- Then, we _addNodeToFront to make it the new head, signifying it’s now the most recently used item.
- Finally, we return the node.value.
6. Implementing put(key, value)
The put method adds or updates an item in the cache. This is where the eviction policy comes into play.
Add this method inside the LRUCache class:
// src/lruCache.ts (continued inside LRUCache class)
// ... get method above ...
/**
* Adds a new key-value pair to the cache, or updates an existing one.
* If the cache is full, the least recently used item is evicted.
* @param key The key to add or update.
* @param value The value to associate with the key.
*/
put(key: K, value: V): void {
let node = this.cache.get(key);
if (node) {
// 1. Key already exists: Update its value and mark as most recently used.
node.value = value;
this._removeNode(node);
this._addNodeToFront(node);
} else {
// 2. Key does not exist: Add new item.
// Check if cache is at capacity
if (this.cache.size >= this.capacity) {
// Cache is full, time to evict the LRU item (tail of the list)
if (this.tail) { // Ensure tail exists
this.cache.delete(this.tail.key); // Remove from hash map
this._removeNode(this.tail); // Remove from linked list
}
}
// Create a new node and add it
const newNode = new CacheNode(key, value);
this.cache.set(key, newNode); // Add to hash map
this._addNodeToFront(newNode); // Add to front of linked list (most recent)
}
}
} // End of LRUCache class
Explanation:
- We first check if the key already exists in this.cache.
- If it exists: This is an update. We update the node.value, then move it to the front of the list using _removeNode and _addNodeToFront. This correctly marks it as most recently used.
- If it does not exist: This is a new item.
  - We first check if this.cache.size (the current number of items) equals or exceeds this.capacity.
  - If it is full, we must evict the LRU item. The this.tail node is always the LRU. We delete it from the this.cache map and _removeNode it from the linked list.
  - After the potential eviction (or if not full), we create a newNode.
  - We set this newNode in the this.cache map.
  - Finally, we _addNodeToFront(newNode) to make it the new head of the list.
7. Testing Our LRU Cache
Let’s create a simple test file to see our cache in action.
Create src/index.ts:
// src/index.ts
import { LRUCache } from './lruCache'; // Assuming LRUCache is exported
// To export LRUCache, add 'export' keyword before the class definition in lruCache.ts:
// export class LRUCache<K, V> { ... }
// export class CacheNode<K, V> { ... }
function runCacheDemo() {
console.log("--- LRU Cache Demo ---");
const cache = new LRUCache<string, number>(3); // Capacity of 3
console.log("1. Adding (A, 1)");
cache.put("A", 1); // Cache: A
console.log("Cache size:", cache.size()); // We need to add a size getter later!
console.log("2. Adding (B, 2)");
cache.put("B", 2); // Cache: B, A
console.log("3. Adding (C, 3)");
cache.put("C", 3); // Cache: C, B, A (Full)
console.log("\nCache state after A, B, C added (capacity 3):");
console.log(`Get A: ${cache.get("A")}`); // Access A. Cache: A, C, B
console.log(`Get B: ${cache.get("B")}`); // Access B. Cache: B, A, C
console.log(`Get C: ${cache.get("C")}`); // Access C. Cache: C, B, A
console.log("\n4. Adding (D, 4) - Eviction expected!");
cache.put("D", 4); // Cache: D, C, B (A should be evicted)
console.log(`Get A: ${cache.get("A")}`); // Should be undefined
console.log(`Get D: ${cache.get("D")}`); // Should be 4
console.log("\n5. Updating B to 20");
cache.put("B", 20); // Update B, move to front. Cache: B, D, C
console.log(`Get B: ${cache.get("B")}`); // Should be 20
console.log("\nFinal state check:");
console.log(`Get C: ${cache.get("C")}`); // C was LRU; accessing it makes it MRU. Cache: C, B, D
console.log(`Get D: ${cache.get("D")}`); // D is now MRU. Cache: D, C, B
console.log("\n--- Demo Complete ---");
}
// Call the demo function
runCacheDemo();
Before running, we need to add a size() getter to our LRUCache class and export the classes.
Modify src/lruCache.ts to export the classes and add size():
// src/lruCache.ts
export class CacheNode<K, V> { // Added 'export'
key: K;
value: V;
prev: CacheNode<K, V> | null;
next: CacheNode<K, V> | null;
constructor(key: K, value: V) {
this.key = key;
this.value = value;
this.prev = null;
this.next = null;
}
}
export class LRUCache<K, V> { // Added 'export'
private capacity: number;
private cache: Map<K, CacheNode<K, V>>;
private head: CacheNode<K, V> | null;
private tail: CacheNode<K, V> | null;
constructor(capacity: number) {
if (capacity < 1) {
throw new Error("Cache capacity must be at least 1.");
}
this.capacity = capacity;
this.cache = new Map<K, CacheNode<K, V>>();
this.head = null;
this.tail = null;
}
// New getter for current size
public size(): number {
return this.cache.size;
}
// ... _removeNode, _addNodeToFront, get, put methods ...
}
Now, let’s compile and run!
npx tsc # Compiles src/lruCache.ts and src/index.ts into dist/
node dist/index.js
You should see output similar to this, demonstrating the LRU eviction:
--- LRU Cache Demo ---
1. Adding (A, 1)
Cache size: 1
2. Adding (B, 2)
3. Adding (C, 3)
Cache state after A, B, C added (capacity 3):
Get A: 1
Get B: 2
Get C: 3
4. Adding (D, 4) - Eviction expected!
Get A: undefined
Get D: 4
5. Updating B to 20
Get B: 20
Final state check:
Get C: 3
Get D: 4
--- Demo Complete ---
Notice how ‘A’ was evicted when ‘D’ was added: after the three gets ran in the order A, B, C, ‘A’ was the least recently accessed item in the cache. This confirms our LRU logic is working!
Mini-Challenge: Enhancing the LRU Cache
Your challenge is to add a new method to our LRUCache called peek(key).
Challenge: Implement peek(key)
The peek method should retrieve the value associated with a key, just like get(key), but without marking the item as most recently used. This means it should NOT move the accessed node to the front of the linked list. If the key is not found, peek should return undefined.
Hint: Think about which parts of the get method you need to keep and which you need to remove.
What to observe/learn: This challenge helps you understand the distinction between merely reading data and interacting with the cache’s recency logic, which is crucial for certain use cases where you want to check an item without affecting its eviction priority.
Take a few minutes to try implementing peek yourself!
Hint
The peek method will only need to interact with the this.cache (the Map). It won’t need to call _removeNode or _addNodeToFront.
Solution
// src/lruCache.ts (inside LRUCache class)
/**
* Retrieves the value associated with the given key from the cache,
* without marking the item as most recently used.
* @param key The key to look up.
* @returns The value if found, otherwise undefined.
*/
peek(key: K): V | undefined {
const node = this.cache.get(key);
return node ? node.value : undefined;
}
Common Pitfalls & Troubleshooting
Even with a relatively simple data structure like the LRU cache, it’s easy to run into issues. Here are some common pitfalls and how to troubleshoot them:
Incorrect Head/Tail Pointer Updates:
- Symptom: Items are not being evicted correctly, or the list breaks (e.g., this.head.prev is not null, or this.tail.next is not null).
- Troubleshooting: Carefully review your _removeNode and _addNodeToFront methods.
  - Did you handle the edge cases where the list becomes empty (after removing the last node) or where a node becomes the only node (after adding to an empty list)?
  - Step through your code with a debugger, paying close attention to head, tail, and node.prev/node.next values at each step.
  - Ensure null checks are in place before accessing node.prev.next or similar.
Forgetting to Update the Map:
- Symptom: You evict a node from the linked list, but a get call for its key still returns a value, or the cache size is incorrect.
- Troubleshooting: Remember that the Map (this.cache) is the source of truth for key-to-node mappings. When you evict a node, you must also call this.cache.delete(node.key). Similarly, when adding a new node, this.cache.set(key, newNode) is essential.
Capacity Off-by-One Errors:
- Symptom: The cache holds one more or one fewer item than its capacity.
- Troubleshooting: Double-check your eviction logic: if (this.cache.size >= this.capacity). Should it be > or >=? For an LRU cache, when size equals capacity and a new item needs to be added, an eviction must occur first. So this.cache.size >= this.capacity is correct for triggering eviction when adding a new item.
Generic Type Issues in TypeScript:
- Symptom: Compiler errors related to K or V not being assignable.
- Troubleshooting: Ensure you consistently use your generic types <K, V> throughout the CacheNode and LRUCache classes. When instantiating LRUCache, provide the specific types, e.g., new LRUCache<string, number>(...).
Debugging a doubly linked list can be tricky due to pointer manipulation. Drawing diagrams on paper can be incredibly helpful to visualize how prev and next pointers change with each operation.
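For the pointer bugs described above, one practical aid is an invariant checker you can run after every operation while debugging. Here is a sketch (the node shape and function name are illustrative; adapt the field names to your own class):

```typescript
// Walks a doubly linked list and verifies the invariants the pitfalls describe:
// head.prev and tail.next must be null, every prev link must mirror a next link,
// and walking forward from head must end exactly at tail.
type DNode = { key: string; prev: DNode | null; next: DNode | null };

function checkListInvariants(head: DNode | null, tail: DNode | null): string[] {
  const errors: string[] = [];
  if (head && head.prev) errors.push("head.prev is not null");
  if (tail && tail.next) errors.push("tail.next is not null");
  let node = head;
  let last: DNode | null = null;
  while (node) {
    if (node.prev !== last) errors.push(`broken prev link at ${node.key}`);
    last = node;
    node = node.next;
  }
  if (last !== tail) errors.push("forward walk does not end at tail");
  return errors;
}

// A valid two-node list: A <-> B
const a: DNode = { key: "A", prev: null, next: null };
const b: DNode = { key: "B", prev: a, next: null };
a.next = b;
const ok = checkListInvariants(a, b);

// A broken list: forgot to set d.prev back to c
const c: DNode = { key: "C", prev: null, next: null };
const d: DNode = { key: "D", prev: null, next: null };
c.next = d; // d.prev was never set -- the checker catches this
const bad = checkListInvariants(c, d);
```

Calling a check like this from your tests after each put and get tends to catch a dangling pointer at the operation that caused it, rather than many operations later.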
Summary
Congratulations! You’ve successfully built a functional LRU caching system using TypeScript. This project brought together several important concepts:
- Caching Fundamentals: You learned why caching is essential for performance and the trade-offs involved.
- LRU Eviction Policy: You understood the logic behind the Least Recently Used strategy and why it’s so common.
- Data Structure Synergy: You saw a powerful example of how combining a Hash Map (Map in TypeScript) for O(1) lookups and a Doubly Linked List for O(1) reordering and eviction results in an efficient O(1) average time complexity for both get and put operations.
- Practical Implementation: You gained hands-on experience implementing these concepts in TypeScript, including managing pointers and handling edge cases.
Real-World Applications
LRU caches are ubiquitous in software engineering:
- Web Browsers: Caching web pages, images, and scripts.
- Databases: Caching frequently accessed query results or data blocks.
- Operating Systems: Caching disk blocks in memory.
- Web Servers/APIs: Caching responses to popular API endpoints.
- Content Delivery Networks (CDNs): Caching static assets closer to users.
- Recommendation Engines: Caching personalized recommendations.
Understanding and being able to implement an LRU cache is a fantastic skill that demonstrates a solid grasp of both data structures and system design principles.
What’s Next?
In the next chapters, we’ll continue our exploration of advanced data structures and algorithmic paradigms. You’ll find that the principles of efficiency, trade-offs, and choosing the right tool for the job that you applied in this caching project are fundamental to mastering DSA. Keep practicing, keep building, and your mastery will grow!
References
- Node.js Official Website: https://nodejs.org/ (Refer to the v24.13.0 LTS or v25.6.1 current release documentation for specific API details.)
- TypeScript Handbook (Official Documentation): https://www.typescriptlang.org/docs/handbook/intro.html (For general TypeScript syntax and features.)
- MDN Web Docs - Map: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map (Details on JavaScript’s Map object, which serves as our hash map.)
- GeeksforGeeks - LRU Cache: https://www.geeksforgeeks.org/lru-cache-implementation/ (A classic resource for understanding LRU cache implementation details across various languages.)