Add an example of the LRU Cache based on the Map.
trekhleb committed Jan 24, 2023
1 parent 69c3a16 commit 4b4d770
Showing 4 changed files with 216 additions and 3 deletions.
6 changes: 3 additions & 3 deletions src/data-structures/lru-cache/LRUCache.js
@@ -24,7 +24,7 @@ class LinkedListNode {
* Implementation of the LRU (Least Recently Used) Cache
* based on the HashMap and Doubly Linked List data-structures.
*
- * Current implementation allows to have fast (O(1)) read and write operations.
+ * Current implementation allows for fast O(1) (on average) read and write operations.
*
 * At any moment in time the LRU Cache holds no more than "capacity" items in it.
*/
@@ -43,7 +43,7 @@ class LRUCache {

/**
* Returns the cached value by its key.
- * Time complexity: O(1).
+ * Time complexity: O(1) on average.
* @param {string} key
* @returns {any}
*/
@@ -56,7 +56,7 @@ class LRUCache {

/**
* Sets the value to cache by its key.
- * Time complexity: O(1).
+ * Time complexity: O(1) on average.
* @param {string} key
* @param {any} val
*/
53 changes: 53 additions & 0 deletions src/data-structures/lru-cache/LRUCacheOnMap.js
@@ -0,0 +1,53 @@
/**
 * Implementation of the LRU (Least Recently Used) Cache
 * based on the (ordered) Map data-structure.
 *
 * Current implementation allows for fast O(1) (on average) read and write operations.
 *
 * At any moment in time the LRU Cache holds no more than "capacity" items in it.
 */
class LRUCacheOnMap {
  /**
   * Creates a cache instance of a specific capacity.
   * @param {number} capacity
   */
  constructor(capacity) {
    this.capacity = capacity; // How many items to store in cache at max.
    this.items = new Map(); // The ordered hash map of all cached items.
  }

  /**
   * Returns the cached value by its key.
   * Time complexity: O(1) on average.
   * @param {string} key
   * @returns {any}
   */
  get(key) {
    if (!this.items.has(key)) return undefined;
    // Delete and re-add the key so that it moves to the "end" of the Map.
    const val = this.items.get(key);
    this.items.delete(key);
    this.items.set(key, val);
    return val;
  }

  /**
   * Sets the value to cache by its key.
   * Time complexity: O(1) on average.
   * @param {string} key
   * @param {any} val
   */
  set(key, val) {
    // Delete and re-add the key so that it moves to the "end" of the Map.
    this.items.delete(key);
    this.items.set(key, val);
    if (this.items.size > this.capacity) {
      // The first key in iteration order is the least recently used one.
      const oldestKey = this.items.keys().next().value;
      this.items.delete(oldestKey);
    }
  }
}

export default LRUCacheOnMap;
12 changes: 12 additions & 0 deletions src/data-structures/lru-cache/README.md
@@ -16,6 +16,8 @@ The functions `get()` and `set()` must each run in `O(1)` average time complexity.

## Implementation

### Version 1: Doubly Linked List + Hash Map

See the `LRUCache` implementation example in [LRUCache.js](./LRUCache.js). The solution uses a `HashMap` for fast `O(1)` (on average) access to cache items, and a `DoublyLinkedList` for fast `O(1)` (on average) item promotion and eviction (to keep the cache within its maximum allowed capacity).

![Linked List](./images/lru-cache.jpg)
@@ -24,6 +26,16 @@

You may also find more test-case examples of how the LRU Cache works in the [LRUCache.test.js](./__test__/LRUCache.test.js) file.
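To make the promotion mechanics concrete, here is a minimal, self-contained sketch of moving a node to the tail of a doubly linked list (the `ListNode` and `PromotionList` names are illustrative, not the ones used in [LRUCache.js](./LRUCache.js)). Only a constant number of pointers change, which is why promotion is `O(1)` regardless of the cache size:

```javascript
// Illustrative sketch (not the repository's implementation): a doubly linked
// list where the head is the least recently used item and the tail is the
// most recently used one.
class ListNode {
  constructor(key) {
    this.key = key;
    this.prev = null;
    this.next = null;
  }
}

class PromotionList {
  constructor() {
    this.head = null; // Least recently used end (evicted first).
    this.tail = null; // Most recently used end.
  }

  append(node) {
    if (!this.tail) {
      this.head = node;
      this.tail = node;
      return;
    }
    node.prev = this.tail;
    this.tail.next = node;
    this.tail = node;
  }

  // Promotion: unlink the node and re-append it at the tail.
  // Only a handful of pointer assignments, no traversal.
  promote(node) {
    if (node === this.tail) return;
    if (node.prev) node.prev.next = node.next;
    if (node.next) node.next.prev = node.prev;
    if (node === this.head) this.head = node.next;
    node.prev = null;
    node.next = null;
    this.append(node);
  }
}

const list = new PromotionList();
const a = new ListNode('a');
const b = new ListNode('b');
const c = new ListNode('c');
list.append(a);
list.append(b);
list.append(c);
list.promote(a); // Order is now: b, c, a ('b' would be evicted first).
```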

### Version 2: Ordered Map

The first implementation, which uses a doubly linked list, is good for learning purposes and for understanding how the average `O(1)` time complexity of `set()` and `get()` is achievable.

However, a simpler approach might be to use a JavaScript [Map](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map) object. The `Map` object holds key-value pairs and **remembers the original insertion order** of the keys. We can use this fact to keep recently-used items at the "end" of the map by removing and re-adding them. The item at the beginning of the `Map` is the first one to be evicted when the cache capacity overflows. The order of the items may be checked by using an `IterableIterator`, such as `map.keys()`.
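The remove-and-re-add trick described above can be sketched with a plain `Map` (a standalone illustration; the `touch` helper and the capacity of `2` are arbitrary, not part of the repository code):

```javascript
const capacity = 2; // Arbitrary capacity for this illustration.
const items = new Map();

const touch = (key, val) => {
  items.delete(key); // Re-adding the key moves it to the "end" of the Map.
  items.set(key, val);
  if (items.size > capacity) {
    // The first key in iteration order is the least recently used one.
    const oldestKey = items.keys().next().value;
    items.delete(oldestKey);
  }
};

touch('a', 1);
touch('b', 2);
touch('a', 3); // Promotes 'a' to the most-recently-used position.
touch('c', 4); // Overflows the capacity, so 'b' (the LRU key) is evicted.

console.log([...items.keys()]); // → ['a', 'c']
```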

See the `LRUCacheOnMap` implementation example in [LRUCacheOnMap.js](./LRUCacheOnMap.js).

You may also find more test-case examples of how the LRU Cache works in the [LRUCacheOnMap.test.js](./__test__/LRUCacheOnMap.test.js) file.

## Complexities

| | Average |
148 changes: 148 additions & 0 deletions src/data-structures/lru-cache/__test__/LRUCacheOnMap.test.js
@@ -0,0 +1,148 @@
import LRUCache from '../LRUCacheOnMap';

describe('LRUCacheOnMap', () => {
it('should set and get values to and from the cache', () => {
const cache = new LRUCache(100);
expect(cache.get('key-1')).toBeUndefined();

cache.set('key-1', 15);
cache.set('key-2', 16);
cache.set('key-3', 17);
expect(cache.get('key-1')).toBe(15);
expect(cache.get('key-2')).toBe(16);
expect(cache.get('key-3')).toBe(17);
expect(cache.get('key-3')).toBe(17);
expect(cache.get('key-2')).toBe(16);
expect(cache.get('key-1')).toBe(15);

cache.set('key-1', 5);
cache.set('key-2', 6);
cache.set('key-3', 7);
expect(cache.get('key-1')).toBe(5);
expect(cache.get('key-2')).toBe(6);
expect(cache.get('key-3')).toBe(7);
});

it('should evict least recently used items from cache with cache size of 1', () => {
const cache = new LRUCache(1);
expect(cache.get('key-1')).toBeUndefined();

cache.set('key-1', 15);
expect(cache.get('key-1')).toBe(15);

cache.set('key-2', 16);
expect(cache.get('key-1')).toBeUndefined();
expect(cache.get('key-2')).toBe(16);

cache.set('key-2', 17);
expect(cache.get('key-2')).toBe(17);

cache.set('key-3', 18);
cache.set('key-4', 19);
expect(cache.get('key-2')).toBeUndefined();
expect(cache.get('key-3')).toBeUndefined();
expect(cache.get('key-4')).toBe(19);
});

it('should evict least recently used items from cache with cache size of 2', () => {
const cache = new LRUCache(2);
expect(cache.get('key-21')).toBeUndefined();

cache.set('key-21', 15);
expect(cache.get('key-21')).toBe(15);

cache.set('key-22', 16);
expect(cache.get('key-21')).toBe(15);
expect(cache.get('key-22')).toBe(16);

cache.set('key-22', 17);
expect(cache.get('key-22')).toBe(17);

cache.set('key-23', 18);
expect(cache.get('key-21')).toBeUndefined();
expect(cache.get('key-22')).toBe(17);
expect(cache.get('key-23')).toBe(18);

cache.set('key-24', 19);
expect(cache.get('key-21')).toBeUndefined();
expect(cache.get('key-22')).toBeUndefined();
expect(cache.get('key-23')).toBe(18);
expect(cache.get('key-24')).toBe(19);
});

it('should evict least recently used items from cache with cache size of 3', () => {
const cache = new LRUCache(3);

cache.set('key-1', 1);
cache.set('key-2', 2);
cache.set('key-3', 3);
expect(cache.get('key-1')).toBe(1);
expect(cache.get('key-2')).toBe(2);
expect(cache.get('key-3')).toBe(3);

cache.set('key-3', 4);
expect(cache.get('key-1')).toBe(1);
expect(cache.get('key-2')).toBe(2);
expect(cache.get('key-3')).toBe(4);

cache.set('key-4', 5);
expect(cache.get('key-1')).toBeUndefined();
expect(cache.get('key-2')).toBe(2);
expect(cache.get('key-3')).toBe(4);
expect(cache.get('key-4')).toBe(5);
});

it('should promote the node while calling set() method', () => {
const cache = new LRUCache(2);

cache.set('2', 1);
cache.set('1', 1);
cache.set('2', 3);
cache.set('4', 1);
expect(cache.get('1')).toBeUndefined();
expect(cache.get('2')).toBe(3);
});

it('should promote the recently accessed item with cache size of 3', () => {
const cache = new LRUCache(3);

cache.set('key-1', 1);
cache.set('key-2', 2);
cache.set('key-3', 3);
expect(cache.get('key-1')).toBe(1);

cache.set('key-4', 4);
expect(cache.get('key-1')).toBe(1);
expect(cache.get('key-3')).toBe(3);
expect(cache.get('key-4')).toBe(4);
expect(cache.get('key-2')).toBeUndefined();
});

it('should promote the recently accessed item with cache size of 4', () => {
const cache = new LRUCache(4);

cache.set('key-1', 1);
cache.set('key-2', 2);
cache.set('key-3', 3);
cache.set('key-4', 4);
expect(cache.get('key-4')).toBe(4);
expect(cache.get('key-3')).toBe(3);
expect(cache.get('key-2')).toBe(2);
expect(cache.get('key-1')).toBe(1);

cache.set('key-5', 5);
expect(cache.get('key-1')).toBe(1);
expect(cache.get('key-2')).toBe(2);
expect(cache.get('key-3')).toBe(3);
expect(cache.get('key-4')).toBeUndefined();
expect(cache.get('key-5')).toBe(5);

cache.set('key-6', 6);
expect(cache.get('key-1')).toBeUndefined();
expect(cache.get('key-2')).toBe(2);
expect(cache.get('key-3')).toBe(3);
expect(cache.get('key-4')).toBeUndefined();
expect(cache.get('key-5')).toBe(5);
expect(cache.get('key-6')).toBe(6);
});
});
