@bernierllc/cache-manager
Multi-tier caching with TTL support, cache invalidation, and multiple storage backends.
```shell
npm install @bernierllc/cache-manager
```

For Redis support:

```shell
npm install @bernierllc/cache-manager redis
```
```typescript
import { CacheManager } from '@bernierllc/cache-manager';

// Create cache with default memory backend
const cache = new CacheManager({
  strategy: 'lru',
  maxSize: 1000,
  defaultTtl: 60 * 1000 // 1 minute
});

// Set cache value
await cache.set('user:123', { name: 'John', email: 'john@example.com' });

// Get cache value
const user = await cache.get('user:123');
console.log(user); // { name: 'John', email: 'john@example.com' }

// Get-or-set pattern
const userData = await cache.getOrSet('user:456', async () => {
  return await fetchUserFromDatabase('456');
}, 5 * 60 * 1000); // Cache for 5 minutes
```
```typescript
const cache = new CacheManager(options: CacheOptions)
```

Options:

- `backend?: CacheBackend | CacheBackend[]` - Storage backend(s)
- `strategy?: 'lru' | 'lfu' | 'ttl'` - Eviction strategy (default: `'lru'`)
- `maxSize?: number` - Maximum cache size (default: `1000`)
- `defaultTtl?: number` - Default TTL in milliseconds
- `keyPrefix?: string` - Prefix for all cache keys
- `onEviction?: (key, value) => void` - Eviction callback
- `onExpiration?: (key, value) => void` - Expiration callback

```typescript
// Basic operations
await cache.set(key: string, value: any, ttl?: number, tags?: string[]): Promise<void>
await cache.get<T>(key: string): Promise<T | null>
await cache.delete(key: string): Promise<boolean>
await cache.clear(): Promise<void>
await cache.has(key: string): Promise<boolean>

// Batch operations
await cache.mget<T>(keys: string[]): Promise<(T | null)[]>
await cache.mset<T>(entries: Record<string, T>, ttl?: number): Promise<void>
await cache.getOrSet<T>(key: string, factory: () => Promise<T>, ttl?: number): Promise<T>

// Invalidation
await cache.invalidateByTag(tag: string): Promise<void>
await cache.invalidateByPattern(pattern: string): Promise<void>

// Utilities
await cache.keys(pattern?: string): Promise<string[]>
await cache.size(): Promise<number>
await cache.getStats(): Promise<CacheStats>
await cache.cleanup(): Promise<number>
```
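The `getOrSet` method listed above is the classic read-through pattern: return the cached value on a hit, and only invoke the factory on a miss. A minimal sketch of that behavior, using a plain `Map` rather than the package's backends (`MiniCache` here is purely illustrative, with no TTL or eviction):

```typescript
// Illustrative read-through cache: the factory runs at most once per key.
type Factory<T> = () => Promise<T>;

class MiniCache {
  private store = new Map<string, unknown>();

  async getOrSet<T>(key: string, factory: Factory<T>): Promise<T> {
    if (this.store.has(key)) {
      return this.store.get(key) as T; // hit: skip the factory entirely
    }
    const value = await factory();     // miss: compute once...
    this.store.set(key, value);        // ...and remember the result
    return value;
  }
}
```

The payoff is that callers never have to write the "check, compute, store" dance themselves; repeated lookups for the same key reuse the first computed value.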
```typescript
import { CacheManager, MemoryCacheBackend } from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: new MemoryCacheBackend({
    maxSize: 1000,
    strategy: 'lru'
  }),
  defaultTtl: 60 * 60 * 1000 // 1 hour
});

// Cache user data
await cache.set('user:123', {
  name: 'John Doe',
  email: 'john@example.com'
});

const user = await cache.get('user:123');
console.log(user.name); // 'John Doe'
```
```typescript
import { CacheManager, RedisCacheBackend } from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: new RedisCacheBackend({
    host: 'localhost',
    port: 6379,
    keyPrefix: 'myapp:'
  }),
  defaultTtl: 15 * 60 * 1000 // 15 minutes
});

// Works across multiple application instances
await cache.set('global:config', configObject);
```
```typescript
import {
  CacheManager,
  MemoryCacheBackend,
  RedisCacheBackend
} from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: [
    new MemoryCacheBackend({ maxSize: 100 }),    // L1 cache - fast, small
    new RedisCacheBackend({ host: 'localhost' }) // L2 cache - shared, persistent
  ]
});

// Automatically checks L1, then L2, and updates L1 on L2 hits
const data = await cache.get('expensive:computation');
```
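The "check L1, then L2, update L1 on L2 hits" behavior is the standard tier-promotion pattern. A simplified sketch of that lookup logic, with plain `Map`s standing in for the real async, TTL-aware backends (`TieredLookup` is not part of the package API):

```typescript
// Tiers are ordered fastest-first; a hit in a slower tier is copied
// ("promoted") into every faster tier that missed, so the next lookup
// is served from the fastest tier.
class TieredLookup {
  constructor(private tiers: Map<string, unknown>[]) {}

  get<T>(key: string): T | null {
    for (let i = 0; i < this.tiers.length; i++) {
      if (this.tiers[i].has(key)) {
        const value = this.tiers[i].get(key) as T;
        for (let j = 0; j < i; j++) {
          this.tiers[j].set(key, value); // promote into faster tiers
        }
        return value;
      }
    }
    return null; // miss in every tier
  }
}
```

This is why a multi-tier setup pays off: after the first L2 hit, subsequent reads of the same key are served from memory without touching Redis.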
```typescript
const cache = new CacheManager({
  backend: new RedisCacheBackend({ host: 'localhost' })
});

// Cache with tags
await cache.set('post:1', postData, 60 * 60 * 1000, ['user:123', 'category:tech']);
await cache.set('post:2', postData2, 60 * 60 * 1000, ['user:123', 'category:news']);

// Invalidate all posts by user
await cache.invalidateByTag('user:123');

// Invalidate all tech posts
await cache.invalidateByTag('category:tech');
```
```typescript
// Cache user-specific data
await cache.set('user:123:profile', profileData);
await cache.set('user:123:settings', settingsData);
await cache.set('user:456:profile', otherProfileData);

// Invalidate all data for user 123
await cache.invalidateByPattern('user:123:*');
```
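A pattern like `user:123:*` is typically matched by translating the glob into a regular expression. A sketch of that translation, assuming `*` matches any run of characters (the package's actual pattern syntax may differ):

```typescript
// Convert a glob-style key pattern to a RegExp and test a key against it.
function matchesPattern(pattern: string, key: string): boolean {
  // Escape regex metacharacters, but leave '*' as the wildcard.
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  const regex = new RegExp('^' + escaped.replace(/\*/g, '.*') + '$');
  return regex.test(key);
}
```

Escaping first matters: without it, a key segment containing `.` or `$` would be interpreted as regex syntax instead of a literal character.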
```typescript
// Warm cache with frequently accessed data
async function warmCache() {
  const popularUsers = await getPopularUsers();
  for (const user of popularUsers) {
    await cache.set(`user:${user.id}`, user, 60 * 60 * 1000);
  }
}

// Background cache refresh
setInterval(async () => {
  const keys = await cache.keys('user:*');
  for (const key of keys) {
    const userId = key.split(':')[1];
    const freshData = await fetchUserFromDatabase(userId);
    await cache.set(key, freshData, 60 * 60 * 1000);
  }
}, 30 * 60 * 1000); // Refresh every 30 minutes
```
```typescript
// Monitor cache performance
setInterval(async () => {
  const stats = await cache.getStats();
  console.log(`Cache Performance:
  Hit Rate: ${(stats.hitRate * 100).toFixed(1)}%
  Size: ${stats.size} entries
  Memory: ${Math.round(stats.memoryUsage / 1024)}KB
  Evictions: ${stats.evictions}
  `);
}, 60000);
```
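The hit rate logged above is conventionally hits divided by total lookups. A tiny sketch, assuming `CacheStats` derives it from hit and miss counters (an assumption about its internals, not documented API):

```typescript
// Hit rate = hits / (hits + misses), guarding against division by zero
// for a cache that has not been read yet.
function hitRate(hits: number, misses: number): number {
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}
```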
```typescript
// Automatic cleanup
cache.startCleanupInterval(5 * 60 * 1000); // Clean every 5 minutes
```
```typescript
new MemoryCacheBackend({
  maxSize: 1000,   // Maximum number of entries
  strategy: 'lru', // Eviction strategy
  onEviction: (key, value) => console.log('Evicted:', key),
  onExpiration: (key, value) => console.log('Expired:', key)
})
```

```typescript
new RedisCacheBackend({
  host: 'localhost',
  port: 6379,
  password: 'secret',  // Optional
  db: 0,               // Database number
  keyPrefix: 'cache:', // Key prefix
  connectionString: 'redis://localhost:6379' // Alternative to host/port
})
```

```typescript
new MultiTierCacheBackend({
  backends: [
    new MemoryCacheBackend({ maxSize: 100 }),
    new RedisCacheBackend({ host: 'localhost' })
  ]
})
```
```typescript
import { JSONSerializer, BinarySerializer } from '@bernierllc/cache-manager';

// JSON serialization (default)
const cache = new CacheManager({
  serializer: new JSONSerializer()
});
```

```typescript
// Binary serialization for better performance
const cache = new CacheManager({
  serializer: new BinarySerializer()
});
```
```typescript
try {
  await cache.set('key', 'value');
  const value = await cache.get('key');
} catch (error) {
  console.error('Cache operation failed:', error);
  // Fall back to the original data source
}
```
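That fallback idea generalizes into a small wrapper: if the cache read throws (for example, the Redis backend is unreachable), fall through to the original data source so the application keeps working, just without the speedup. `readThroughSafe` below is a sketch of this pattern, not part of the package API:

```typescript
// Try the cache first; on a miss OR a cache error, load from the source.
async function readThroughSafe<T>(
  getCached: (key: string) => Promise<T | null>,
  key: string,
  loadFromSource: () => Promise<T>
): Promise<T> {
  try {
    const cached = await getCached(key);
    if (cached !== null) return cached; // served from cache
  } catch {
    // Cache backend unavailable: degrade gracefully to the source.
  }
  return loadFromSource();
}
```

Swallowing the cache error (while logging it in real code) is deliberate: a broken cache should cost latency, never correctness.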
The cache manager is designed for high-performance scenarios.
The cache manager supports optional logger integration using @bernierllc/logger:
```typescript
import { CacheManager } from '@bernierllc/cache-manager';
import { detectLogger } from '@bernierllc/logger';

const cache = new CacheManager({
  strategy: 'lru',
  maxSize: 1000
});

// Auto-detect logger if available
const logger = await detectLogger();
if (logger) {
  // Enhanced logging for cache operations
  cache.on('hit', (key) => {
    logger.debug('Cache hit', { key });
  });
  cache.on('miss', (key) => {
    logger.debug('Cache miss', { key });
  });
  cache.on('evicted', (key, reason) => {
    logger.info('Cache eviction', { key, reason });
  });
}
```
The cache manager integrates with NeverHub when available for enhanced service discovery and monitoring:
```typescript
import { CacheManager } from '@bernierllc/cache-manager';
import { detectNeverHub } from '@bernierllc/neverhub-adapter';

async function initializeCacheManager() {
  const cache = new CacheManager({
    strategy: 'lru',
    maxSize: 1000
  });

  // Auto-detect NeverHub
  const neverhub = await detectNeverHub();
  if (neverhub) {
    // Register cache manager as a service
    await neverhub.register({
      type: 'cache-manager',
      name: '@bernierllc/cache-manager',
      version: '1.0.0',
      capabilities: [
        { type: 'cache', name: 'memory', version: '1.0.0' },
        { type: 'cache', name: 'redis', version: '1.0.0' }
      ]
    });

    // Publish cache events
    cache.on('hit', async (key) => {
      await neverhub.publishEvent({
        type: 'cache.hit',
        data: { key, timestamp: Date.now() }
      });
    });

    // Subscribe to cache invalidation events
    await neverhub.subscribe('cache.invalidate', async (event) => {
      if (event.data.pattern) {
        await cache.invalidateByPattern(event.data.pattern);
      } else if (event.data.key) {
        await cache.delete(event.data.key);
      }
    });
  }

  return cache;
}
```
The cache manager implements graceful degradation patterns:
- `redis` for Redis backend support
- `lz4` for compression support
- `@bernierllc/logger` for enhanced logging capabilities
- `@bernierllc/neverhub-adapter` for service discovery integration
- `@bernierllc/connection-parser` for connection string parsing

Copyright (c) 2025 Bernier LLC. All rights reserved.