mem
Memoize functions - An optimization used to speed up consecutive function calls by caching the result of calls with identical input
The 'mem' npm package is a utility for memoizing functions, which means it caches the result of function calls based on the arguments provided. This can significantly improve performance for expensive or frequently called functions by avoiding redundant computations.
Basic Memoization
This feature allows you to memoize a function so that it caches the result of function calls based on the arguments. Subsequent calls with the same arguments will return the cached result instead of recalculating.
import mem from 'mem';

const expensiveFunction = (input) => {
  console.log('Function called with', input);
  return input * 2;
};

const memoizedFunction = mem(expensiveFunction);

console.log(memoizedFunction(2)); // Logs 'Function called with 2', then 4
console.log(memoizedFunction(2)); // 4 (cached result)
Custom Cache Key
This feature allows you to define a custom cache key function, which determines how the cache key is generated based on the function arguments. This can be useful for more complex caching strategies.
import mem from 'mem';

const expensiveFunction = (input) => {
  console.log('Function called with', input);
  return input * 2;
};

const customCacheKey = (args) => args[0] % 2; // Cache based on whether the first argument is even or odd
const memoizedFunction = mem(expensiveFunction, {cacheKey: customCacheKey});

console.log(memoizedFunction(2)); // Logs 'Function called with 2', then 4
console.log(memoizedFunction(4)); // 4 (cached result, because 4 maps to the same key as 2)
console.log(memoizedFunction(3)); // Logs 'Function called with 3', then 6
Cache Expiration
This feature allows you to set a maximum age for cache entries. After the specified time, the cache entry will expire, and the function will be called again to recalculate the result.
import mem from 'mem';

const expensiveFunction = (input) => {
  console.log('Function called with', input);
  return input * 2;
};

const memoizedFunction = mem(expensiveFunction, {maxAge: 1000});

console.log(memoizedFunction(2)); // Logs 'Function called with 2', then 4

setTimeout(() => {
  console.log(memoizedFunction(2)); // Logs 'Function called with 2' again, then 4 (cache expired after 1 second)
}, 1500);
Lodash's memoize function provides similar functionality to 'mem' by caching the result of function calls. It is part of the larger Lodash utility library, which offers a wide range of utility functions for JavaScript. Compared to 'mem', lodash.memoize is more lightweight but lacks some advanced features like cache expiration.
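As a rough point of comparison, here is a small sketch assuming the standalone lodash.memoize package; its optional second argument is a resolver that plays the same role as mem's cacheKey option:
import memoize from 'lodash.memoize';

const double = (n) => n * 2;

// Like mem, lodash keys the cache on the first argument by default;
// passing a resolver lets you derive the key from all arguments instead.
const memoizedDouble = memoize(double, (...args) => JSON.stringify(args));

memoizedDouble(2); //=> 4 (computed)
memoizedDouble(2); //=> 4 (cached)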
Memoizee is a full-featured memoization library that offers a wide range of options, including cache expiration, custom cache keys, and more. It is more feature-rich compared to 'mem' but also comes with a larger footprint.
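For comparison, a minimal sketch of memoizee's option-driven API, assuming the memoizee package:
import memoizee from 'memoizee';

const add = (a, b) => a + b;

// memoizee resolves the cache on all declared arguments and supports expiration out of the box.
const memoizedAdd = memoizee(add, {maxAge: 1000});

memoizedAdd(1, 2); //=> 3 (computed)
memoizedAdd(1, 2); //=> 3 (cached until maxAge elapses)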
Moize is another memoization library that offers a balance between performance and features. It supports cache expiration, custom cache keys, and other advanced options. It is similar to 'mem' in terms of functionality but aims to provide better performance and more configuration options.
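And a similar sketch for moize (assuming the moize package), whose isDeepEqual option caches object arguments by value:
import moize from 'moize';

const getArea = ({width, height}) => width * height;

// moize can compare arguments deeply, so structurally equal objects share a cache entry,
// and maxAge gives expiration behavior comparable to mem's option of the same name.
const memoizedGetArea = moize(getArea, {isDeepEqual: true, maxAge: 1000});

memoizedGetArea({width: 2, height: 3}); //=> 6 (computed)
memoizedGetArea({width: 2, height: 3}); //=> 6 (cached)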
In mem, memory is automatically released when an item expires or the cache is cleared.
By default, only the first argument is considered and it only works with primitives. If you need to cache multiple arguments or cache objects by value, have a look at the alternative caching strategies below.
Install
$ npm install mem
Usage
import mem from 'mem';
let index = 0;
const counter = () => ++index;
const memoized = mem(counter);
memoized('foo');
//=> 1
// Cached as it's the same argument
memoized('foo');
//=> 1
// Not cached anymore as the argument changed
memoized('bar');
//=> 2
memoized('bar');
//=> 2
// Only the first argument is considered by default
memoized('bar', 'foo');
//=> 2
import mem from 'mem';
let index = 0;
const counter = async () => ++index;
const memoized = mem(counter);
console.log(await memoized());
//=> 1
// The return value didn't increase as it's cached
console.log(await memoized());
//=> 1
import mem from 'mem';
import got from 'got';
import delay from 'delay';
const memGot = mem(got, {maxAge: 1000});
await memGot('https://sindresorhus.com');
// This call is cached
await memGot('https://sindresorhus.com');
await delay(2000);
// This call is not cached as the cache has expired
await memGot('https://sindresorhus.com');
Caching strategy
By default, only the first argument is compared via exact equality (===) to determine whether a call is identical.
const power = mem((a, b) => Math.pow(a, b));
power(2, 2); // => 4, stored in cache with the key 2 (number)
power(2, 3); // => 4, retrieved from cache at key 2 (number), it's wrong
You will have to use the cache and cacheKey options appropriate to your function. In this specific case, the following could work:
const power = mem((a, b) => Math.pow(a, b), {
  cacheKey: arguments_ => arguments_.join(',')
});
power(2, 2); // => 4, stored in cache with the key '2,2' (both arguments as one string)
power(2, 3); // => 8, stored in cache with the key '2,3'
More advanced examples follow.
If your function accepts an object, it won't be memoized out of the box:
const heavyMemoizedOperation = mem(heavyOperation);
heavyMemoizedOperation({full: true}); // Stored in cache with the object as key
heavyMemoizedOperation({full: true}); // Stored in cache with the object as key, again
// The objects look the same but for JS they're two different objects
You might want to serialize or hash them, for example using JSON.stringify or something like serialize-javascript, which can also serialize RegExp, Date and so on.
const heavyMemoizedOperation = mem(heavyOperation, {cacheKey: JSON.stringify});
heavyMemoizedOperation({full: true}); // Stored in cache with the key '[{"full":true}]' (string)
heavyMemoizedOperation({full: true}); // Retrieved from cache
The same solution also works if it accepts multiple serializable objects:
const heavyMemoizedOperation = mem(heavyOperation, {cacheKey: JSON.stringify});
heavyMemoizedOperation('hello', {full: true}); // Stored in cache with the key '["hello",{"full":true}]' (string)
heavyMemoizedOperation('hello', {full: true}); // Retrieved from cache
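If the arguments contain values that JSON.stringify can't round-trip (such as Date or RegExp), a sketch along these lines, assuming the serialize-javascript package and a hypothetical heavyOperation, could serve as the cacheKey instead:
import mem from 'mem';
import serialize from 'serialize-javascript';

// Hypothetical stand-in for the heavyOperation used above
const heavyOperation = (options) => Object.keys(options).length;

const heavyMemoizedOperation = mem(heavyOperation, {cacheKey: arguments_ => serialize(arguments_)});

heavyMemoizedOperation({since: new Date(2020, 0, 1), pattern: /foo/}); // Stored under the serialized-string key
heavyMemoizedOperation({since: new Date(2020, 0, 1), pattern: /foo/}); // Retrieved from cache, since equal values serialize identically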
If your function accepts multiple arguments that aren't supported by JSON.stringify (e.g. DOM elements and functions), you can instead extend the initial exact equality (===) to work on multiple arguments using many-keys-map:
import mem from 'mem';
import ManyKeysMap from 'many-keys-map';

const addListener = (emitter, eventName, listener) => emitter.on(eventName, listener);

const addOneListener = mem(addListener, {
  cacheKey: arguments_ => arguments_, // Use *all* the arguments as key
  cache: new ManyKeysMap() // Correctly handles all the arguments for exact equality
});
addOneListener(header, 'click', console.log); // `addListener` is run, and it's cached with the `arguments` array as key
addOneListener(header, 'click', console.log); // `addListener` is not run again
addOneListener(mainContent, 'load', console.log); // `addListener` is run, and it's cached with the `arguments` array as key
Better yet, if your function’s arguments are compatible with WeakMap, you should use deep-weak-map instead of many-keys-map. This will help avoid memory leaks.
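For illustration, a rough sketch of that substitution, under the assumption that deep-weak-map default-exports a Map-compatible DeepWeakMap class (check the package's docs for its exact interface):
import mem from 'mem';
import DeepWeakMap from 'deep-weak-map'; // Assumed default export; verify against the package

const addListener = (emitter, eventName, listener) => emitter.on(eventName, listener);

const addOneListener = mem(addListener, {
  cacheKey: arguments_ => arguments_, // Use *all* the arguments as key
  cache: new DeepWeakMap() // Keys are held weakly, so cache entries don't prevent garbage collection
});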
API
mem(fn, options?)
fn
Type: Function
Function to be memoized.
options
Type: object
maxAge
Type: number
Default: Infinity
Milliseconds until the cache expires.
cacheKey
Type: Function
Default: arguments_ => arguments_[0]
Example: arguments_ => JSON.stringify(arguments_)
Determines the cache key for storing the result based on the function arguments. By default, only the first argument is considered.
A cacheKey function can return any type supported by Map (or whatever structure you use in the cache option).
Refer to the caching strategy section for more information.
cache
Type: object
Default: new Map()
Use a different cache storage. Must implement the following methods: .has(key), .get(key), .set(key, value), .delete(key), and optionally .clear(). You could for example use a WeakMap instead, or quick-lru for an LRU cache.
Refer to the caching strategy section for more information.
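For instance, a minimal sketch of plugging in quick-lru, mentioned above, as a bounded cache (assuming the quick-lru package, whose constructor takes a maxSize option):
import mem from 'mem';
import QuickLRU from 'quick-lru';

const compute = (n) => n * n;

// quick-lru implements has/get/set/delete/clear, so it can be dropped in as the cache.
// Once maxSize entries are stored, the least recently used entry is evicted.
const memoizedCompute = mem(compute, {
  cache: new QuickLRU({maxSize: 100})
});

memoizedCompute(3); //=> 9 (computed and stored)
memoizedCompute(3); //=> 9 (served from the LRU cache)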
memDecorator(options)
Returns a decorator to memoize class methods or static class methods.
Notes:
Decorators are an experimental TypeScript feature and need to be enabled with --experimentalDecorators; follow TypeScript’s docs.
options
Type: object
Same as options for mem().
import {memDecorator} from 'mem';

class Example {
  index = 0;

  @memDecorator()
  counter() {
    return ++this.index;
  }
}

class ExampleWithOptions {
  index = 0;

  @memDecorator({maxAge: 1000})
  counter() {
    return ++this.index;
  }
}
memClear(fn)
Clear all cached data of a memoized function.
fn
Type: Function
Memoized function.
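A minimal sketch of clearing the cache; this assumes memClear is a named export alongside the default export, in the same way memDecorator is imported below:
import mem, {memClear} from 'mem';

let index = 0;
const counter = () => ++index;
const memoized = mem(counter);

memoized(); //=> 1
memoized(); //=> 1 (cached)

memClear(memoized); // Drop every cached entry for this memoized function

memoized(); //=> 2 (recomputed because the cache was cleared)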
Cache statistics
If you want to know how many times your cache had a hit or a miss, you can make use of stats-map as a replacement for the default cache.
import mem from 'mem';
import StatsMap from 'stats-map';
import got from 'got';
const cache = new StatsMap();
const memGot = mem(got, {cache});
await memGot('https://sindresorhus.com');
await memGot('https://sindresorhus.com');
await memGot('https://sindresorhus.com');
console.log(cache.stats);
//=> {hits: 2, misses: 1}
FAQs
The npm package mem receives a total of 7,384,947 weekly downloads, which classifies it as a popular package.
We found that mem has an unhealthy version release cadence and project activity, as its last version was released a year ago. It has 2 open source maintainers collaborating on the project.