micro-memoize
micro-memoize is a tiny, performant memoization library for JavaScript. It is designed to cache the results of function calls to improve performance, especially for expensive or frequently called functions. The library is highly configurable and supports various options for customizing the memoization behavior.
Basic Memoization
This feature allows you to memoize a function so that its results are cached. Subsequent calls with the same arguments will return the cached result instead of recomputing.
const memoize = require('micro-memoize');
const expensiveFunction = (num) => {
console.log('Computing...');
return num * num;
};
const memoizedFunction = memoize(expensiveFunction);
console.log(memoizedFunction(5)); // Computing... 25
console.log(memoizedFunction(5)); // 25 (cached result)
Custom Cache Size
This feature allows you to set a custom cache size. Once the cache exceeds this size, the oldest cached results are discarded.
const memoize = require('micro-memoize');
const expensiveFunction = (num) => {
console.log('Computing...');
return num * num;
};
const memoizedFunction = memoize(expensiveFunction, { maxSize: 2 });
console.log(memoizedFunction(5)); // Computing... 25
console.log(memoizedFunction(6)); // Computing... 36
console.log(memoizedFunction(5)); // 25 (cached result)
console.log(memoizedFunction(7)); // Computing... 49
console.log(memoizedFunction(6)); // Computing... 36 (evicted when 7 was cached, recomputed)
console.log(memoizedFunction(5)); // Computing... 25 (evicted when 6 was recomputed, recomputed)
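The eviction order above can be sketched with a plain Map. This is a minimal illustration of LRU caching for single-argument functions, not micro-memoize's actual implementation:

```javascript
// Minimal LRU memoize sketch (illustration only, not micro-memoize's
// internals): holds at most maxSize results for a single-argument function,
// evicting the least recently used entry when the cache overflows.
const lruMemoize = (fn, maxSize) => {
  const cache = new Map(); // Map iterates in insertion order: oldest first
  return (arg) => {
    if (cache.has(arg)) {
      const value = cache.get(arg);
      cache.delete(arg); // re-insert to mark this entry as most recently used
      cache.set(arg, value);
      return value;
    }
    const value = fn(arg);
    cache.set(arg, value);
    if (cache.size > maxSize) {
      // the first key in iteration order is the least recently used
      cache.delete(cache.keys().next().value);
    }
    return value;
  };
};

const square = lruMemoize((n) => n * n, 2);
square(5); // computed
square(6); // computed
square(7); // computed, evicts 5
square(5); // computed again: 5 was evicted
```

Re-inserting an entry on every hit is what makes the Map's iteration order double as the recency order.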
Custom Equality Function
This feature allows you to provide a custom equality function to determine if two arguments are equivalent. This is useful for complex data structures where simple equality checks are insufficient.
const memoize = require('micro-memoize');
const expensiveFunction = (obj) => {
console.log('Computing...');
return obj.value * obj.value;
};
const isArgEqual = (a, b) => a.value === b.value;
const memoizedFunction = memoize(expensiveFunction, { isArgEqual });
console.log(memoizedFunction({ value: 5 })); // Computing... 25
console.log(memoizedFunction({ value: 5 })); // 25 (cached result)
lodash.memoize is a memoization function from the Lodash library. It provides basic memoization capabilities and is widely used due to Lodash's popularity. However, it may not be as performant or configurable as micro-memoize.
memoizee is a highly configurable memoization library that supports various features like cache expiration, custom resolvers, and more. It is more feature-rich compared to micro-memoize but also larger in size.
fast-memoize is a fast and efficient memoization library focused on performance. It is similar to micro-memoize in terms of speed but offers fewer configuration options.
A tiny, crazy fast memoization library for the 95% use-case

As the author of moize, I created a consistently fast memoization library, but moize has a lot of features to satisfy a large number of edge cases. micro-memoize is a simpler approach, focusing on the core feature set with a much smaller footprint (~1.41kB minified+gzipped). Stripping out these edge cases also allows micro-memoize to be faster across the board than moize.
const assembleToObject = (one: string, two: string) => ({ one, two });
const memoized = memoize(assembleToObject);
console.log(memoized('one', 'two')); // {one: 'one', two: 'two'}
console.log(memoized('one', 'two')); // pulled from cache, {one: 'one', two: 'two'}
If you need them, all types are available under the MicroMemoize namespace.
import { MicroMemoize } from 'micro-memoize';
Starting in 4.0.0, you can compose memoized functions if you want to have multiple types of memoized versions based on different options.

const simple = memoize(fn); // { maxSize: 1 }
const upToFive = memoize(simple, { maxSize: 5 }); // { maxSize: 5 }
const withCustomEquals = memoize(upToFive, { isArgEqual: deepEqual }); // { maxSize: 5, isArgEqual: deepEqual }

NOTE: The original function is the one used in the composition; the composition only applies to the options. In the example above, upToFive does not call simple, it calls fn.
async

boolean, defaults to false

Identifies the value returned from the method as a Promise, which will result in one of two possible scenarios: if the promise resolves, the onCacheHit and onCacheChange listeners fire as usual; if it rejects, the entry is automatically removed from the cache.

const fn = async (one: string, two: string) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
reject(new Error(JSON.stringify({ one, two })));
}, 500);
});
};
const memoized = memoize(fn, { async: true });
memoized('one', 'two');
console.log(memoized.cache.entries()); // [[['one', 'two'], Promise]]
setTimeout(() => {
console.log(memoized.cache.entries()); // []
}, 1000);
NOTE: If you don't want rejections to auto-remove the entry from cache, set async to false (or simply do not set it), but be aware this will also remove the cache listeners that fire on successful resolution.
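The auto-removal behavior can be sketched in plain JavaScript. This is an illustration of the pattern, not the library's actual code: cache the pending promise, and delete the entry when it rejects so a later call can retry.

```javascript
// Sketch of promise-aware memoization with rejection cleanup (illustration
// only): a rejected promise is evicted from cache; a resolved one stays.
const memoizeAsync = (fn) => {
  const cache = new Map();
  return (key) => {
    if (cache.has(key)) {
      return cache.get(key);
    }
    const promise = fn(key).catch((error) => {
      cache.delete(key); // rejection auto-removes the cached entry
      throw error; // re-throw so callers still observe the failure
    });
    cache.set(key, promise);
    return promise;
  };
};
```

Note that the pending promise itself is cached immediately, so concurrent calls with the same key share one in-flight request.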
isArgEqual

function(arg1: any, arg2: any): boolean, defaults to isSameValueZero

Custom method to compare equality of keys, determining whether to pull from cache or not, by comparing each argument in order.
Common use-cases:
import { deepEqual } from 'fast-equals';
type ContrivedObject = {
deep: string;
};
const deepObject = (object: {
foo: ContrivedObject;
bar: ContrivedObject;
}) => ({
foo: object.foo,
bar: object.bar,
});
const memoizedDeepObject = memoize(deepObject, { isArgEqual: deepEqual });
console.log(
memoizedDeepObject({
foo: {
deep: 'foo',
},
bar: {
deep: 'bar',
},
baz: {
deep: 'baz',
},
}),
); // {foo: {deep: 'foo'}, bar: {deep: 'bar'}}
console.log(
memoizedDeepObject({
foo: {
deep: 'foo',
},
bar: {
deep: 'bar',
},
baz: {
deep: 'baz',
},
}),
); // pulled from cache
NOTE: The default method tests for SameValueZero equality, which is summarized as strictly equal while also considering NaN equal to NaN.
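SameValueZero can be expressed in one line. This is a sketch of the semantics, not the library's source:

```javascript
// SameValueZero: strict equality, except that NaN equals NaN
// (and, unlike Object.is, +0 equals -0).
const isSameValueZero = (a, b) => a === b || (a !== a && b !== b);

isSameValueZero(NaN, NaN); // true (=== would say false)
isSameValueZero(0, -0); // true
isSameValueZero('1', 1); // false (no type coercion)
```

The `a !== a` check works because NaN is the only JavaScript value that is not equal to itself.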
isKeyEqual

function(existingKey: any[], passedKey: any[]): boolean

Custom method to compare equality of keys, determining whether to pull from cache or not, by comparing the entire key.
Common use-cases:
import { deepEqual } from 'fast-equals';
type ContrivedObject = { foo: string; bar: number };
const deepObject = (object: ContrivedObject) => ({
foo: object.foo,
bar: object.bar,
});
const memoizedShape = memoize(deepObject, {
// receives the full key in cache and the full key of the most recent call
isKeyEqual(key1, key2) {
const object1 = key1[0];
const object2 = key2[0];
return (
object1.hasOwnProperty('foo') &&
object2.hasOwnProperty('foo') &&
object1.bar === object2.bar
);
},
});
console.log(
memoizedShape({
foo: 'foo',
bar: 123,
baz: 'baz',
}),
); // {foo: 'foo', bar: 123}
console.log(
memoizedShape({
foo: 'not foo',
bar: 123,
baz: 'baz',
}),
); // pulled from cache
maxSize

number, defaults to 1

The number of values to store in cache, on a Least Recently Used basis. This operates the same as maxSize on moize.
const manyPossibleArgs = (one: string, two: string) => [one, two];
const memoized = memoize(manyPossibleArgs, { maxSize: 3 });
console.log(memoized('one', 'two')); // ['one', 'two']
console.log(memoized('two', 'three')); // ['two', 'three']
console.log(memoized('three', 'four')); // ['three', 'four']
console.log(memoized('one', 'two')); // pulled from cache
console.log(memoized('two', 'three')); // pulled from cache
console.log(memoized('three', 'four')); // pulled from cache
console.log(memoized('four', 'five')); // ['four', 'five'], drops ['one', 'two'] from cache
transformKey

function(Array<any>): Array<any>

A method that allows you to transform the key that is used for caching, if you want to use something other than the pure arguments.
const ignoreFunctionArgs = (one: string, two: () => void) => [one, two];
const memoized = memoize(ignoreFunctionArgs, {
transformKey: (args) => [JSON.stringify(args[0])],
});
console.log(memoized('one', () => {})); // ['one', () => {}]
console.log(memoized('one', () => {})); // pulled from cache, ['one', () => {}]
If your transformed keys require something other than SameValueZero equality, you can combine transformKey with isKeyEqual for completely custom key creation and comparison.
const ignoreFunctionArg = (one: string, two: () => void) => [one, two];
const memoized = memoize(ignoreFunctionArg, {
isKeyEqual: (key1, key2) => key1[0] === key2[0],
// Cache based on the serialized first parameter
transformKey: (args) => [JSON.stringify(args[0])],
});
console.log(memoized('one', () => {})); // ['one', () => {}]
console.log(memoized('one', () => {})); // pulled from cache, ['one', () => {}]
Access to the cache

The memoized function's internal cache is available at memoized.cache. It is not recommended to mutate the internals; however, there are some exposed methods that can be used for additional capabilities.
Listeners to cache change events can be dynamically added to and removed from the cache for the memoized function.
const fn = (one: string, two: string) => [one, two];
const memoized = memoize(fn);
memoized.cache.on('add', (event) => {
console.log('added to cache', event);
});
While more narrowly typed in the implementation, the general shape of all events is:
interface Event {
cache: Cache;
entry: { key: any[]; value: any };
reason?: string;
type: string;
}
The following events are available:

add: a new entry is added to the cache
delete: an existing entry is deleted from the cache
hit: an existing entry was found and was the most recently used entry
update: an existing entry was found but was not the most recent call, and was updated to be the most recently used

Code examples of each are as follows:
const fn = (one: string, two: string) => [one, two];
const memoized = memoize(fn, { maxSize: 2 });
memoized.cache.on('add', (event) => {
console.log('cache entry added: ', event);
});
memoized('foo', 'bar'); // cache entry added
memoized('foo', 'bar');
memoized('foo', 'bar');
memoized('bar', 'foo'); // cache entry added
memoized('bar', 'foo');
memoized('bar', 'foo');
memoized('foo', 'bar');
memoized('foo', 'bar');
memoized('foo', 'bar');
const fn = (one: string, two: string) => [one, two];
const memoized = memoize(fn, { maxSize: 2 });
memoized.cache.on('hit', (event) => {
console.log('cache was hit: ', event);
});
memoized('foo', 'bar');
memoized('foo', 'bar'); // cache was hit
memoized('foo', 'bar'); // cache was hit
memoized('bar', 'foo');
memoized('bar', 'foo'); // cache was hit
memoized('bar', 'foo'); // cache was hit
memoized('foo', 'bar');
memoized('foo', 'bar'); // cache was hit
memoized('foo', 'bar'); // cache was hit
const fn = (one: string, two: string) => [one, two];
const memoized = memoize(fn, { maxSize: 2 });
memoized.cache.on('update', (event) => {
console.log('cache entry updated: ', event);
});
memoized('foo', 'bar');
memoized('foo', 'bar');
memoized('foo', 'bar');
memoized('bar', 'foo');
memoized('bar', 'foo');
memoized('bar', 'foo');
memoized('foo', 'bar'); // cache entry updated
memoized('foo', 'bar');
memoized('foo', 'bar');
memoized('bar', 'foo'); // cache entry updated
memoized('bar', 'foo');
memoized('bar', 'foo');
clear
If you want to clear out the existing cache:
const fn = (one: string, two: string) => [one, two];
const memoized = memoize(fn);
memoized.cache.clear();
delete
If you want to delete an existing entry in the cache:
const fn = (one: string, two: string) => one + two;
const memoized = memoize(fn);
memoized('foo', 'bar');
console.log(memoized.cache.get(['foo', 'bar'])); // 'foobar'
memoized.cache.delete(['foo', 'bar']);
console.log(memoized.cache.get(['foo', 'bar'])); // undefined
get
If you want to get a value at an existing key in the cache:
const fn = (one: string, two: string) => one + two;
const memoized = memoize(fn);
memoized('foo', 'bar');
console.log(memoized.cache.get(['foo', 'bar'])); // 'foobar'
Returns undefined if the key is not found.
has
If you want to determine if a key exists in the cache:
const fn = (one: string, two: string) => one + two;
const memoized = memoize(fn);
memoized('foo', 'bar');
console.log(memoized.cache.has(['foo', 'bar'])); // true
console.log(memoized.cache.has(['bar', 'baz'])); // false
set
If you want to set a value at a key in the cache:
const fn = (one: string, two: string) => one + two;
const memoized = memoize(fn);
memoized.cache.set(['foo', 'bar'], 'foobar');
console.log(memoized.cache.get(['foo', 'bar'])); // 'foobar'
A snapshot of the entries in cache can be taken at any time, which provides a point-in-time reflection of the values in the cache. Since the values are mutated internally, this can be useful for debugging unexpected behavior based on call dynamics.
const fn = (one: string, two: string) => one + two;
const memoized = memoize(fn, { maxSize: 2 });
console.log(memoized.cache.entries());
// [];
memoized('foo', 'bar');
console.log(memoized.cache.entries());
// [[['foo', 'bar'], 'foobar']]
memoized('bar', 'baz');
console.log(memoized.cache.entries());
// [[['bar', 'baz'], 'barbaz'], [['foo', 'bar'], 'foobar']]
memoized('foo', 'bar');
console.log(memoized.cache.entries());
// [[['foo', 'bar'], 'foobar'], [['bar', 'baz'], 'barbaz']]
fn

The original function passed to be memoized.

isMemoized

Hard-coded to true when the function is memoized. This is useful for introspection, to identify if a method has been memoized or not.

options

The options passed when creating the memoized method.
Benchmarks

All values provided are the number of operations per second (ops/sec) calculated by the Benchmark suite. Note that underscore, lodash, and ramda do not support multiple-parameter memoization (which is where micro-memoize really shines), so they are not included in those benchmarks.

Benchmarks were performed on an i9 16-core Arch Linux laptop with 64GB of memory using NodeJS version 18.19.0. The default configuration of each library was tested with a fibonacci calculation based on the following parameters:
single primitive: 35
single object: { number: 35 }
multiple primitives: 35, true
multiple objects: { number: 35 }, { isComplete: true }
NOTE: Not all libraries tested support multiple parameters out of the box, but support the ability to pass a custom resolver. Because these often need to resolve to a string value, a common suggestion is to just JSON.stringify the arguments, so that is what is used when needed.
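That resolver approach can be sketched as follows. This is an illustration of the pattern, not any particular library's code:

```javascript
// Memoization via a serializing resolver (illustration only): all arguments
// are collapsed into a single JSON string that serves as the cache key.
const memoizeWithResolver = (fn) => {
  const cache = new Map();
  return (...args) => {
    const key = JSON.stringify(args); // serialization cost paid on every call
    if (!cache.has(key)) {
      cache.set(key, fn(...args));
    }
    return cache.get(key);
  };
};
```

The per-call serialization cost is a likely contributor to the slower multi-parameter results for the libraries that rely on it.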
Single parameter (primitive only)

This is usually what benchmarks target for ... it's the least-likely use-case, but the easiest to optimize, often at the expense of more common use-cases.
Library | Operations / second |
---|---|
micro-memoize | 121,740,174 |
fast-memoize | 110,892,952 |
mem | 104,487,931 |
lru-memoize | 103,134,780 |
Addy Osmani | 71,287,174 |
underscore | 60,782,053 |
lodash | 56,091,762 |
ramda | 49,892,367 |
memoizee | 47,142,953 |
memoizerific | 18,708,398 |
Single parameter (complex object)

This is what most memoization libraries target as the primary use-case, as it removes the complexities of multiple arguments but allows for usage with one to many values.
Library | Operations / second |
---|---|
micro-memoize | 59,189,580 |
lodash | 53,370,488 |
lru-memoize | 43,956,168 |
memoizee | 27,109,412 |
memoizerific | 16,657,400 |
ramda | 4,831,034 |
underscore | 4,596,231 |
mem | 4,159,677 |
Addy Osmani | 3,931,473 |
fast-memoize | 2,390,688 |
Multiple parameters (primitive only)

This is a very common use-case for function calls, but can be more difficult to optimize because you need to account for multiple possibilities ... did the number of arguments change, are there default arguments, etc.
Library | Operations / second |
---|---|
micro-memoize | 47,199,467 |
lru-memoize | 39,757,821 |
memoizee | 17,539,760 |
memoizerific | 11,917,711 |
Addy Osmani | 9,495,702 |
mem | 6,334,600 |
ramda | 2,408,712 |
underscore | 2,359,704 |
fast-memoize | 2,173,298 |
lodash | 2,148,902 |
Multiple parameters (complex object)

This is the most robust use-case, with the same complexities as multiple primitives but managing bulkier objects with additional edge scenarios (destructured with defaults, for example).
Library | Operations / second |
---|---|
micro-memoize | 44,516,544 |
lru-memoize | 40,112,503 |
memoizee | 17,190,427 |
memoizerific | 12,216,818 |
Addy Osmani | 2,163,630 |
mem | 2,136,207 |
underscore | 1,427,046 |
ramda | 1,414,979 |
lodash | 1,349,489 |
fast-memoize | 1,246,579 |
Development

Standard stuff: clone the repo and npm install dependencies. The npm scripts available:

build => run webpack to build development dist file with NODE_ENV=development
build:minified => run webpack to build production dist file with NODE_ENV=production
dev => run webpack dev server to run example app (playground!)
dist => runs build and build:minified
lint => run ESLint against all files in the src folder
prepublish => runs compile-for-publish
prepublish:compile => run lint, test, transpile:es, transpile:lib, dist
test => run AVA test functions with NODE_ENV=test
test:coverage => run test, but with nyc for coverage checker
test:watch => run test, but with persistent watcher
transpile:lib => run babel against all files in src to create files in lib
transpile:es => run babel against all files in src to create files in es, preserving ES2015 modules (for pkg.module)

FAQs
A tiny, crazy fast memoization library for the 95% use-case
The npm package micro-memoize receives a total of 320,467 weekly downloads. As such, micro-memoize popularity was classified as popular.
We found that micro-memoize demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.