
A high-performance, lightweight LRU cache for JavaScript with strong UPDATE performance, competitive SET/GET/DELETE throughput, and a compact bundle size. Built for developers who need fast caching without compromising on features.
npm install tiny-lru
# or
yarn add tiny-lru
# or
pnpm add tiny-lru
Requirements: Node.js ≥12
import {lru} from "tiny-lru";
// Create cache and start using immediately
const cache = lru(100); // Max 100 items
cache.set('user:123', {name: 'John', age: 30});
const user = cache.get('user:123'); // {name: 'John', age: 30}
// With TTL (5 second expiration)
const tempCache = lru(50, 5000);
tempCache.set('session', 'abc123'); // Automatically expires after 5 seconds
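Continuing the snippet above: once the 5 second TTL elapses, the expired entry simply reads back as undefined (a minimal sketch; the 6 second delay is arbitrary).
setTimeout(() => {
  console.log(tempCache.get('session')); // undefined - the entry has expired
}, 6000);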
| Library | SET ops/sec | GET ops/sec | UPDATE ops/sec | DELETE ops/sec |
|---|---|---|---|---|
| tiny-lru | 404,753 | 1,768,449 | 1,703,716 | 298,770 |
| lru-cache | 326,221 | 1,069,061 | 878,858 | 277,734 |
| quick-lru | 591,683 | 1,298,487 | 935,481 | 359,600 |
| mnemonist | 412,467 | 2,478,778 | 2,156,690 | 0 |
Notes:
Results generated with npm run benchmark:comparison.
# Run all performance benchmarks
npm run benchmark:all
# Individual benchmark suites
npm run benchmark:modern # Comprehensive Tinybench suite
npm run benchmark:perf # Performance Observer measurements
npm run benchmark:comparison # Compare against other LRU libraries
npm install tiny-lru
# or
yarn add tiny-lru
# or
pnpm add tiny-lru
import {lru} from "tiny-lru";
// Basic cache
const cache = lru(100);
cache.set('key1', 'value1')
  .set('key2', 'value2')
  .set('key3', 'value3');
console.log(cache.get('key1')); // 'value1'
console.log(cache.size); // 3
// With TTL (time-to-live)
const cacheWithTtl = lru(50, 30000); // 30 second TTL
cacheWithTtl.set('temp-data', {important: true});
// Automatically expires after 30 seconds
const resetCache = lru(25, 10000, true);
resetCache.set('session', 'user123');
// Because resetTtl is true, TTL resets when you set() the same key again
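To observe the reset, compare expiration timestamps with expiresAt(), documented in the API section below. A minimal sketch, continuing the resetCache example above (the 1 second delay is arbitrary):
const firstExpiry = resetCache.expiresAt('session');

setTimeout(() => {
  resetCache.set('session', 'user123'); // resetTtl is true, so the 10 second TTL restarts
  console.log(resetCache.expiresAt('session') > firstExpiry); // true - expiration moved forward
}, 1000);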
<!-- ES Modules -->
<script type="module">
  import {lru, LRU} from 'https://cdn.skypack.dev/tiny-lru';
  const cache = lru(100);
</script>

<!-- UMD Bundle (global: window.lru) -->
<script src="https://unpkg.com/tiny-lru/dist/tiny-lru.umd.js"></script>
<script>
  const {lru, LRU} = window.lru;
  const cache = lru(100);
  // or: const cache = new LRU(100);
</script>
import {lru, LRU} from "tiny-lru";
// Type-safe cache
const cache = lru<string>(100);
// or: const cache: LRU<string> = lru<string>(100);
cache.set('user:123', 'John Doe');
const user: string | undefined = cache.get('user:123');
// Class inheritance
class MyCache extends LRU<User> {
  constructor() {
    super(1000, 60000, true); // 1000 items, 1 min TTL, reset TTL on set
  }
}
import {lru} from "tiny-lru";
const cache = lru(max, ttl = 0, resetTtl = false);
Parameters:
max {Number} - Maximum number of items (0 = unlimited, default: 1000)
ttl {Number} - Time-to-live in milliseconds (0 = no expiration, default: 0)
resetTtl {Boolean} - Reset TTL when updating existing items via set() (default: false)

import {LRU} from "tiny-lru";
const cache = new LRU(1000, 60000, true); // 1000 items, 1 min TTL, reset TTL on set
// 1. Size your cache appropriately
const cache = lru(1000); // Not too small, not too large
// 2. Use meaningful keys
cache.set(`user:${userId}:profile`, userProfile);
cache.set(`product:${productId}:details`, productDetails);
// 3. Handle cache misses gracefully
function getData(key) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached;
  }

  // Fallback to slower data source
  const data = expensiveOperation(key);
  cache.set(key, data);
  return data;
}

// 4. Clean up when needed
process.on('exit', () => {
  cache.clear(); // Help garbage collection
});
Call clear() when done to help garbage collection.

import {lru} from "tiny-lru";
class ApiClient {
  constructor() {
    this.cache = lru(100, 300000); // 5 minute cache
  }

  async fetchUser(userId) {
    const cacheKey = `user:${userId}`;

    // Return cached result if available
    if (this.cache.has(cacheKey)) {
      return this.cache.get(cacheKey);
    }

    // Fetch from API and cache
    const response = await fetch(`/api/users/${userId}`);
    const user = await response.json();
    this.cache.set(cacheKey, user);
    return user;
  }
}
import {lru} from "tiny-lru";
function memoize(fn, maxSize = 100) {
  const cache = lru(maxSize);

  return function(...args) {
    const key = JSON.stringify(args);

    if (cache.has(key)) {
      return cache.get(key);
    }

    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}

// Usage
const expensiveCalculation = memoize((n) => {
  console.log(`Computing for ${n}`);
  return n * n * n;
}, 50);
console.log(expensiveCalculation(5)); // Computing for 5 -> 125
console.log(expensiveCalculation(5)); // 125 (cached)
import {lru} from "tiny-lru";
class SessionManager {
  constructor() {
    // 30 minute TTL, with resetTtl enabled for set()
    this.sessions = lru(1000, 1800000, true);
  }

  createSession(userId, data) {
    const sessionId = this.generateId();
    const session = {
      userId,
      data,
      createdAt: Date.now()
    };

    this.sessions.set(sessionId, session);
    return sessionId;
  }

  getSession(sessionId) {
    // get() does not extend TTL; to extend, set the session again when resetTtl is true
    return this.sessions.get(sessionId);
  }

  endSession(sessionId) {
    this.sessions.delete(sessionId);
  }
}
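Because get() never extends the TTL, a session is kept alive by re-setting it. A minimal sketch of a hypothetical touchSession helper (not part of the class above) that relies on the cache having resetTtl enabled:
import {lru} from "tiny-lru";

// Same configuration as SessionManager: 30 minute TTL with resetTtl enabled
const sessions = lru(1000, 1800000, true);

// Hypothetical helper: refresh a session's TTL by re-setting it
function touchSession(sessionId) {
  const session = sessions.get(sessionId);
  if (session !== undefined) {
    sessions.set(sessionId, session); // resetTtl is true, so the 30 minute window restarts
  }
  return session;
}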
Compatible with Lodash's memoize function cache interface:
import _ from "lodash";
import {lru} from "tiny-lru";
_.memoize.Cache = lru().constructor;
const memoized = _.memoize(myFunc);
memoized.cache.max = 10;
Tiny LRU maintains 100% test coverage with comprehensive unit and integration tests.
# Run all tests with coverage
npm test
# Run tests with verbose output
npm run mocha
# Lint code
npm run lint
# Full build (lint + build)
npm run build
Test Coverage: 100% across all modules
----------|---------|----------|---------|---------|-------------------
File | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
----------|---------|----------|---------|---------|-------------------
All files | 100 | 100 | 100 | 100 |
lru.js | 100 | 100 | 100 | 100 |
----------|---------|----------|---------|---------|-------------------
# Clone and setup
git clone https://github.com/avoidwork/tiny-lru.git
cd tiny-lru
npm install
# Run tests
npm test
# Run linting
npm run lint
# Run benchmarks
npm run benchmark:all
# Build distribution files
npm run build
git checkout -b feature/amazing-feature
npm test && npm run lint
git commit -m "feat: add amazing feature"
git push origin feature/amazing-feature

Creates a new LRU cache instance using the factory function.
Parameters:
max {Number} - Maximum number of items to store (default: 1000; 0 = unlimited)
ttl {Number} - Time-to-live in milliseconds (default: 0; 0 = no expiration)
resetTtl {Boolean} - Reset TTL when updating existing items via set() (default: false)

Returns: {LRU} New LRU cache instance
Throws: {TypeError} When parameters are invalid
import {lru} from "tiny-lru";
// Basic cache
const cache = lru(100);
// With TTL
const cacheWithTtl = lru(50, 30000); // 30 second TTL
// With resetTtl enabled for set()
const resetCache = lru(25, 10000, true);
// Validation errors
lru(-1); // TypeError: Invalid max value
lru(100, -1); // TypeError: Invalid ttl value
lru(100, 0, "no"); // TypeError: Invalid resetTtl value
{Object|null} - Item in first (least recently used) position
const cache = lru();
cache.first; // null - empty cache
{Object|null} - Item in last (most recently used) position
const cache = lru();
cache.last; // null - empty cache
{Number} - Maximum number of items to hold in cache
const cache = lru(500);
cache.max; // 500
{Boolean} - Whether to reset TTL when updating existing items via set()
const cache = lru(500, 5*6e4, true);
cache.resetTtl; // true
{Number} - Current number of items in cache
const cache = lru();
cache.size; // 0 - empty cache
{Number} - TTL in milliseconds (0 = no expiration)
const cache = lru(100, 3e4);
cache.ttl; // 30000
Removes all items from cache.
Returns: {Object} LRU instance
cache.clear();
Removes specified item from cache.
Parameters:
key {String} - Item key

Returns: {Object} LRU instance
cache.set('key1', 'value1');
cache.delete('key1');
console.log(cache.has('key1')); // false
Returns array of cache items as [key, value] pairs.
Parameters:
keys {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of [key, value] pairs
cache.set('a', 1).set('b', 2);
console.log(cache.entries()); // [['a', 1], ['b', 2]]
console.log(cache.entries(['a'])); // [['a', 1]]
Removes the least recently used item from cache.
Returns: {Object} LRU instance
cache.set('old', 'value').set('new', 'value');
cache.evict(); // Removes 'old' item
Gets expiration timestamp for cached item.
Parameters:
key {String} - Item key

Returns: {Number|undefined} Expiration time (epoch milliseconds) or undefined if key doesn't exist
const cache = new LRU(100, 5000); // 5 second TTL
cache.set('key1', 'value1');
console.log(cache.expiresAt('key1')); // timestamp 5 seconds from now
Retrieves cached item and promotes it to most recently used position.
Parameters:
key {String} - Item key

Returns: {*} Item value or undefined if not found/expired
Note: get() does not reset or extend TTL. TTL is only reset on set() when resetTtl is true.
cache.set('key1', 'value1');
console.log(cache.get('key1')); // 'value1'
console.log(cache.get('nonexistent')); // undefined
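A minimal sketch of that note, assuming a TTL-enabled cache and using expiresAt() to read the stored expiration:
import {lru} from "tiny-lru";

const ttlCache = lru(100, 5000); // 5 second TTL
ttlCache.set('key1', 'value1');

const before = ttlCache.expiresAt('key1');
ttlCache.get('key1'); // promotes the item but leaves its expiration untouched
console.log(ttlCache.expiresAt('key1') === before); // true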
Checks if key exists in cache (without promoting it).
Parameters:
key {String} - Item key

Returns: {Boolean} True if key exists and is not expired
cache.set('key1', 'value1');
console.log(cache.has('key1')); // true
console.log(cache.has('nonexistent')); // false
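Because has() does not promote, it suits existence checks that should not disturb eviction order; a minimal sketch using keys() (documented next) to show the order is unchanged:
import {lru} from "tiny-lru";

const cache = lru(3);
cache.set('a', 1).set('b', 2).set('c', 3);

cache.has('a');            // existence check only - no promotion
console.log(cache.keys()); // ['a', 'b', 'c'] - LRU order unchanged

cache.get('a');            // get() does promote
console.log(cache.keys()); // ['b', 'c', 'a']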
Returns array of all cache keys in LRU order (first = least recent).
Returns: {Array} Array of keys
cache.set('a', 1).set('b', 2);
cache.get('a'); // Move 'a' to most recent
console.log(cache.keys()); // ['b', 'a']
Stores item in cache as most recently used.
Parameters:
key {String} - Item key
value {*} - Item value

Returns: {Object} LRU instance
cache.set('key1', 'value1')
  .set('key2', 'value2')
  .set('key3', 'value3');
Stores item and returns evicted item if cache was full.
Parameters:
key {String} - Item key
value {*} - Item value

Returns: {Object|null} Evicted item {key, value, expiry, prev, next} or null
const cache = new LRU(2);
cache.set('a', 1).set('b', 2);
const evicted = cache.setWithEvicted('c', 3); // evicted = {key: 'a', value: 1, ...}
if (evicted) {
  console.log(`Evicted: ${evicted.key}`, evicted.value);
}
Returns array of cache values.
Parameters:
keys {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of values
cache.set('a', 1).set('b', 2);
console.log(cache.values()); // [1, 2]
console.log(cache.values(['a'])); // [1]
Copyright (c) 2025 Jason Mulligan
Licensed under the BSD-3-Clause license.
The lru-cache package is another popular LRU cache implementation for Node.js. It offers more features and configuration options than tiny-lru, such as per-entry TTL overrides and more detailed cache statistics.
The quick-lru package is a minimalist LRU cache implementation that focuses on performance and simplicity. It is similar to tiny-lru in terms of its lightweight nature but offers a slightly different API.
The node-cache package provides a simple and efficient in-memory caching solution with support for TTL and other advanced features. It is more feature-rich compared to tiny-lru, making it suitable for more complex caching needs.
FAQs
A high-performance, lightweight LRU cache. Built for developers who need fast caching without compromising on features.
The npm package tiny-lru receives a total of 959,165 weekly downloads. As such, tiny-lru popularity was classified as popular.
We found that tiny-lru demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.