What is tiny-lru?
The tiny-lru npm package is a lightweight, in-memory Least Recently Used (LRU) cache. It is designed to be simple and efficient, making it suitable for use in environments where memory usage and performance are critical.
What are tiny-lru's main functionalities?
Creating an LRU Cache
This feature allows you to create an LRU cache with a specified maximum size. The cache will automatically evict the least recently used items when the size limit is reached.
const {lru} = require('tiny-lru');
const cache = lru(100); // Create a cache with a max size of 100 items
Setting and Getting Cache Items
You can store items in the cache using the `set` method and retrieve them using the `get` method. If the item is not found, `get` will return `undefined`.
cache.set('key', 'value');
const value = cache.get('key'); // 'value'
Deleting Cache Items
This feature allows you to delete specific items from the cache using the `delete` method.
cache.set('key', 'value');
cache.delete('key');
const value = cache.get('key'); // undefined
Clearing the Cache
You can clear all items from the cache using the `clear` method, which removes all entries.
cache.set('key1', 'value1');
cache.set('key2', 'value2');
cache.clear();
const value1 = cache.get('key1'); // undefined
const value2 = cache.get('key2'); // undefined
Cache Size and Item Count
This feature allows you to check the current number of items in the cache using the `size` property.
cache.set('key1', 'value1');
cache.set('key2', 'value2');
const size = cache.size; // 2
Other packages similar to tiny-lru
lru-cache
The lru-cache package is another popular LRU cache implementation for Node.js. It offers more features and configuration options compared to tiny-lru, such as time-to-live (TTL) settings for cache entries and more detailed cache statistics.
quick-lru
The quick-lru package is a minimalist LRU cache implementation that focuses on performance and simplicity. It is similar to tiny-lru in terms of its lightweight nature but offers a slightly different API.
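For comparison, here is a minimal sketch of quick-lru's options-object style (the QuickLRU import and the maxSize option belong to quick-lru's published API, shown here only to illustrate the difference):
import QuickLRU from 'quick-lru';
// quick-lru takes an options object rather than positional arguments
const cache = new QuickLRU({maxSize: 100});
cache.set('key', 'value');
cache.get('key'); // 'value'
cache.has('key'); // true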
node-cache
The node-cache package provides a simple and efficient in-memory caching solution with support for TTL and other advanced features. It is more feature-rich compared to tiny-lru, making it suitable for more complex caching needs.
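As a rough sketch of node-cache's TTL-oriented usage (the stdTTL and checkperiod options are node-cache's own and are expressed in seconds, per its documentation):
const NodeCache = require('node-cache');
// Entries default to a 60-second TTL; expired keys are swept every 120 seconds
const cache = new NodeCache({stdTTL: 60, checkperiod: 120});
cache.set('key', 'value');
cache.get('key'); // 'value'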
🚀 Tiny LRU

A high-performance, lightweight LRU cache for JavaScript, offering strong UPDATE performance, competitive SET/GET/DELETE throughput, and a compact bundle size. Built for developers who need fast caching without compromising on features.
📦 Installation
npm install tiny-lru
yarn add tiny-lru
pnpm add tiny-lru
Requirements: Node.js ≥12
⚡ Quick Start
import {lru} from "tiny-lru";

const cache = lru(100); // up to 100 items, no TTL
cache.set('user:123', {name: 'John', age: 30});
const user = cache.get('user:123'); // {name: 'John', age: 30}

// Cache with a 5-second TTL
const tempCache = lru(50, 5000);
tempCache.set('session', 'abc123');
📑 Table of Contents
- ✨ Features & Benefits
- 📊 Performance Deep Dive
- 🚀 Getting Started
- 💡 Real-World Examples
- 🔗 Interoperability
- 🛠️ Development
- 📖 API Reference
- 📄 License
✨ Features & Benefits
Why Choose Tiny LRU?
- 🔄 Strong Cache Updates - Excellent performance in update-heavy workloads
- 📦 Compact Bundle - Just ~2.2 KiB minified for a full-featured LRU library
- ⚖️ Balanced Performance - Competitive across all operations with O(1) complexity
- ⏱️ TTL Support - Optional time-to-live with automatic expiration
- 🔄 Method Chaining - Fluent API for better developer experience
- 🎯 TypeScript Ready - Full TypeScript support with complete type definitions
- 🌐 Universal Compatibility - Works seamlessly in Node.js and browsers
- 🛡️ Production Ready - Battle-tested and reliable
Benchmark Comparison (Mean of 5 runs, ops/sec)

| Library   | SET     | GET       | UPDATE    | DELETE  |
|-----------|---------|-----------|-----------|---------|
| tiny-lru  | 404,753 | 1,768,449 | 1,703,716 | 298,770 |
| lru-cache | 326,221 | 1,069,061 | 878,858   | 277,734 |
| quick-lru | 591,683 | 1,298,487 | 935,481   | 359,600 |
| mnemonist | 412,467 | 2,478,778 | 2,156,690 | 0       |
Notes:
- Mean values computed from the Performance Summary across 5 consecutive runs of `npm run benchmark:comparison`.
- mnemonist lacks a compatible delete method in this harness, so DELETE ops/sec is 0.
- Performance varies by hardware, Node.js version, and workload patterns; run the provided benchmarks locally to assess your specific use case.
- Environment: Node.js v24.5.0, macOS arm64.
📊 Performance Deep Dive
When to Choose Tiny LRU
✅ Perfect for:
- Frequent cache updates - Leading UPDATE performance
- Mixed read/write workloads - Balanced across all operations
- Bundle size constraints - Compact library with full features
- Production applications - Battle-tested with comprehensive testing
Running Your Own Benchmarks
npm run benchmark:all
npm run benchmark:modern
npm run benchmark:perf
npm run benchmark:comparison
🚀 Getting Started
Installation
npm install tiny-lru
yarn add tiny-lru
pnpm add tiny-lru
Quick Examples
import {lru} from "tiny-lru";

// Basic cache with method chaining
const cache = lru(100);
cache.set('key1', 'value1')
  .set('key2', 'value2')
  .set('key3', 'value3');

console.log(cache.get('key1')); // 'value1'
console.log(cache.size); // 3

// Cache with a 30-second TTL
const cacheWithTtl = lru(50, 30000);
cacheWithTtl.set('temp-data', {important: true});

// Cache that resets the TTL whenever an existing key is set() again
const resetCache = lru(25, 10000, true);
resetCache.set('session', 'user123');
CDN Usage (Browser)
<!-- ES module build via CDN -->
<script type="module">
  import {lru, LRU} from 'https://cdn.skypack.dev/tiny-lru';
  const cache = lru(100);
</script>

<!-- UMD build (exposes window.lru) -->
<script src="https://unpkg.com/tiny-lru/dist/tiny-lru.umd.js"></script>
<script>
  const {lru, LRU} = window.lru;
  const cache = lru(100);
</script>
TypeScript Usage
import {lru, LRU} from "tiny-lru";

const cache = lru<string>(100);
cache.set('user:123', 'John Doe');
const user: string | undefined = cache.get('user:123');

// Example value type (illustrative)
interface User {
  name: string;
  age: number;
}

class MyCache extends LRU<User> {
  constructor() {
    super(1000, 60000, true); // max 1000 items, 60-second TTL, resetTtl enabled
  }
}
Configuration Options
Factory Function
import {lru} from "tiny-lru";
const cache = lru(max, ttl = 0, resetTtl = false);
Parameters:
- `max` {Number} - Maximum number of items (0 = unlimited, default: 1000)
- `ttl` {Number} - Time-to-live in milliseconds (0 = no expiration, default: 0)
- `resetTtl` {Boolean} - Reset TTL when updating existing items via set() (default: false)
Class Constructor
import {LRU} from "tiny-lru";
const cache = new LRU(1000, 60000, true);
Best Practices
// Size the cache to your working set
const cache = lru(1000);

// Use consistent, prefixed string keys
cache.set(`user:${userId}:profile`, userProfile);
cache.set(`product:${productId}:details`, productDetails);

// Cache-aside pattern: check the cache before doing expensive work
function getData(key) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached;
  }
  const data = expensiveOperation(key);
  cache.set(key, data);
  return data;
}

// Release references on shutdown to help garbage collection
process.on('exit', () => {
  cache.clear();
});
Optimization Tips
- Cache Size: Keep cache size reasonable (1000-10000 items for most use cases)
- TTL Usage: Only use TTL when necessary; it adds overhead
- Key Types: String keys perform better than object keys
- Memory: Call `clear()` when done to help garbage collection
💡 Real-World Examples
API Response Caching
import {lru} from "tiny-lru";

class ApiClient {
  constructor() {
    this.cache = lru(100, 300000); // up to 100 responses, 5-minute TTL
  }

  async fetchUser(userId) {
    const cacheKey = `user:${userId}`;
    if (this.cache.has(cacheKey)) {
      return this.cache.get(cacheKey);
    }

    const response = await fetch(`/api/users/${userId}`);
    const user = await response.json();
    this.cache.set(cacheKey, user);
    return user;
  }
}
Function Memoization
import {lru} from "tiny-lru";

function memoize(fn, maxSize = 100) {
  const cache = lru(maxSize);

  return function(...args) {
    const key = JSON.stringify(args); // serialize arguments as the cache key
    if (cache.has(key)) {
      return cache.get(key);
    }

    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}

const expensiveCalculation = memoize((n) => {
  console.log(`Computing for ${n}`);
  return n * n * n;
}, 50);

console.log(expensiveCalculation(5)); // logs "Computing for 5", returns 125
console.log(expensiveCalculation(5)); // returns 125 from cache, no recomputation
Session Management
import {lru} from "tiny-lru";

class SessionManager {
  constructor() {
    // Up to 1000 sessions, 30-minute TTL; TTL resets when a session is set() again
    this.sessions = lru(1000, 1800000, true);
  }

  createSession(userId, data) {
    const sessionId = this.generateId();
    const session = {
      userId,
      data,
      createdAt: Date.now()
    };
    this.sessions.set(sessionId, session);
    return sessionId;
  }

  getSession(sessionId) {
    return this.sessions.get(sessionId);
  }

  endSession(sessionId) {
    this.sessions.delete(sessionId);
  }
}
🔗 Interoperability
Compatible with Lodash's `memoize` function cache interface:

import _ from "lodash";
import {lru} from "tiny-lru";

_.memoize.Cache = lru().constructor; // use tiny-lru as lodash's memoize cache
const memoized = _.memoize(myFunc);
memoized.cache.max = 10; // cap the memoization cache at 10 entries
🛠️ Development
Testing
Tiny LRU maintains 100% test coverage with comprehensive unit and integration tests.
npm test
npm run mocha
npm run lint
npm run build
Test Coverage: 100% coverage across all modules
----------|---------|----------|---------|---------|-------------------
File | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
----------|---------|----------|---------|---------|-------------------
All files | 100 | 100 | 100 | 100 |
lru.js | 100 | 100 | 100 | 100 |
----------|---------|----------|---------|---------|-------------------
Contributing
Quick Start for Contributors
git clone https://github.com/avoidwork/tiny-lru.git
cd tiny-lru
npm install
npm test
npm run lint
npm run benchmark:all
npm run build
Development Workflow
- Fork the repository on GitHub
- Clone your fork locally
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Develop your changes with tests
- Test thoroughly: `npm test && npm run lint`
- Commit using conventional commits: `git commit -m "feat: add amazing feature"`
- Push to your fork: `git push origin feature/amazing-feature`
- Submit a Pull Request
Contribution Guidelines
- Code Quality: Follow ESLint rules and existing code style
- Testing: Maintain 100% test coverage for all changes
- Documentation: Update README.md and JSDoc for API changes
- Performance: Benchmark changes that could impact performance
- Compatibility: Ensure Node.js ≥12 compatibility
- Commit Messages: Use Conventional Commits format
📖 API Reference
Factory Function
lru(max, ttl, resetTtl)
Creates a new LRU cache instance using the factory function.
Parameters:
- `max` {Number} - Maximum number of items to store (default: 1000; 0 = unlimited)
- `ttl` {Number} - Time-to-live in milliseconds (default: 0; 0 = no expiration)
- `resetTtl` {Boolean} - Reset TTL when updating existing items via set() (default: false)

Returns: {LRU} New LRU cache instance

Throws: {TypeError} When parameters are invalid
import {lru} from "tiny-lru";

const cache = lru(100); // 100 items, no TTL
const cacheWithTtl = lru(50, 30000); // 50 items, 30-second TTL
const resetCache = lru(25, 10000, true); // TTL resets on set() of existing keys

// Invalid arguments throw TypeError
lru(-1); // negative max
lru(100, -1); // negative ttl
lru(100, 0, "no"); // resetTtl must be a boolean
Properties
first
{Object|null} - Item in first (least recently used) position

const cache = lru();
cache.first; // null (empty cache)

last
{Object|null} - Item in last (most recently used) position

const cache = lru();
cache.last; // null (empty cache)

max
{Number} - Maximum number of items to hold in cache

const cache = lru(500);
cache.max; // 500

resetTtl
{Boolean} - Whether to reset TTL when updating existing items via set()

const cache = lru(500, 5*6e4, true);
cache.resetTtl; // true

size
{Number} - Current number of items in cache

const cache = lru();
cache.size; // 0

ttl
{Number} - TTL in milliseconds (0 = no expiration)

const cache = lru(100, 3e4);
cache.ttl; // 30000
Methods
clear()
Removes all items from cache.
Returns: {Object} LRU instance
cache.clear();
delete(key)
Removes specified item from cache.
Parameters:
- `key` {String} - Item key

Returns: {Object} LRU instance

cache.set('key1', 'value1');
cache.delete('key1');
console.log(cache.has('key1')); // false
entries([keys])
Returns array of cache items as [key, value] pairs.

Parameters:
- `keys` {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of [key, value] pairs
cache.set('a', 1).set('b', 2);
console.log(cache.entries()); // [['a', 1], ['b', 2]]
console.log(cache.entries(['a'])); // [['a', 1]]
evict()
Removes the least recently used item from cache.
Returns: {Object} LRU instance

cache.set('old', 'value').set('new', 'value');
cache.evict(); // removes 'old' (the least recently used item)
expiresAt(key)
Gets expiration timestamp for cached item.
Parameters:
- `key` {String} - Item key

Returns: {Number|undefined} Expiration time (epoch milliseconds) or undefined if key doesn't exist

const cache = new LRU(100, 5000);
cache.set('key1', 'value1');
console.log(cache.expiresAt('key1')); // epoch timestamp roughly 5 seconds from now
get(key)
Retrieves cached item and promotes it to most recently used position.
Parameters:
- `key` {String} - Item key

Returns: {*} Item value or undefined if not found/expired

Note: `get()` does not reset or extend TTL. TTL is only reset on `set()` when `resetTtl` is `true`.
cache.set('key1', 'value1');
console.log(cache.get('key1')); // 'value1'
console.log(cache.get('nonexistent')); // undefined
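To make the note above concrete, here is a minimal sketch (the "token" key and the 200 ms delay are illustrative) showing that a read leaves the expiration untouched, while a write with `resetTtl` enabled pushes it forward:

import {lru} from "tiny-lru";

const cache = lru(10, 1000, true); // 1-second TTL, resetTtl enabled

cache.set("token", "abc123");
const initialExpiry = cache.expiresAt("token");

setTimeout(() => {
  cache.get("token"); // reading does not touch the TTL
  console.log(cache.expiresAt("token") === initialExpiry); // true

  cache.set("token", "abc123"); // writing an existing key resets the TTL
  console.log(cache.expiresAt("token") > initialExpiry); // true
}, 200);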
has(key)
Checks if key exists in cache (without promoting it).
Parameters:
- `key` {String} - Item key

Returns: {Boolean} True if key exists and is not expired

cache.set('key1', 'value1');
console.log(cache.has('key1')); // true
console.log(cache.has('nonexistent')); // false
keys()
Returns array of all cache keys in LRU order (first = least recent).
Returns: {Array} Array of keys

cache.set('a', 1).set('b', 2);
cache.get('a'); // promotes 'a' to most recently used
console.log(cache.keys()); // ['b', 'a']
set(key, value)
Stores item in cache as most recently used.
Parameters:
- `key` {String} - Item key
- `value` {*} - Item value

Returns: {Object} LRU instance
cache.set('key1', 'value1')
.set('key2', 'value2')
.set('key3', 'value3');
setWithEvicted(key, value)
Stores item and returns evicted item if cache was full.
Parameters:
- `key` {String} - Item key
- `value` {*} - Item value

Returns: {Object|null} Evicted item {key, value, expiry, prev, next} or null
const cache = new LRU(2);
cache.set('a', 1).set('b', 2);
const evicted = cache.setWithEvicted('c', 3); // 'a' is evicted to make room
if (evicted) {
  console.log(`Evicted: ${evicted.key}`, evicted.value); // Evicted: a 1
}
values([keys])
Returns array of cache values.
Parameters:
- `keys` {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of values
cache.set('a', 1).set('b', 2);
console.log(cache.values()); // [1, 2]
console.log(cache.values(['a'])); // [1]
📄 License
Copyright (c) 2025 Jason Mulligan
Licensed under the BSD-3 license.