dataloader - npm Package Compare versions

Comparing version 1.2.0 to 1.3.0

index.d.ts

package.json

@@ -1,55 +0,1 @@

{
  "name": "dataloader",
  "version": "1.2.0",
  "description": "A data loading utility to reduce requests to a backend via batching and caching.",
  "contributors": [
    "Lee Byron <lee@leebyron.com> (http://leebyron.com/)",
    "Daniel Schafer <dschafer@fb.com>",
    "Nicholas Schrock <schrockn@fb.com>"
  ],
  "license": "BSD-3-Clause",
  "homepage": "https://github.com/facebook/dataloader",
  "bugs": {
    "url": "https://github.com/facebook/dataloader/issues"
  },
  "repository": {
    "type": "git",
    "url": "http://github.com/facebook/dataloader.git"
  },
  "main": "dist/index.js",
  "options": {
    "mocha": "--require resources/mocha-bootload src/**/__tests__/**/*.js"
  },
  "scripts": {
    "test": "npm run lint && npm run check && npm run testonly",
    "testonly": "mocha $npm_package_options_mocha",
    "lint": "eslint src",
    "check": "flow check",
    "build": "babel src --ignore __tests__ --out-dir dist/",
    "watch": "babel --optional runtime resources/watch.js | node",
    "cover": "babel-node node_modules/.bin/isparta cover --root src --report html node_modules/.bin/_mocha -- $npm_package_options_mocha",
    "cover:lcov": "babel-node node_modules/.bin/isparta cover --root src --report lcovonly node_modules/.bin/_mocha -- $npm_package_options_mocha",
    "preversion": ". ./resources/checkgit.sh && npm test",
    "prepublish": ". ./resources/prepublish.sh"
  },
  "files": [
    "dist",
    "README.md",
    "LICENSE",
    "PATENTS"
  ],
  "devDependencies": {
    "babel": "5.8.21",
    "babel-runtime": "^5.8.x",
    "babel-core": "5.8.22",
    "babel-eslint": "4.1.8",
    "chai": "3.4.1",
    "coveralls": "2.11.6",
    "eslint": "1.10.3",
    "eslint-plugin-babel": "2.2.0",
    "flow-bin": "0.20.1",
    "isparta": "3.0.3",
    "mocha": "2.3.4",
    "sane": "1.3.0"
  }
}
{"name":"dataloader","version":"1.3.0","description":"A data loading utility to reduce requests to a backend via batching and caching.","contributors":["Lee Byron <lee@leebyron.com> (http://leebyron.com/)","Daniel Schafer <dschafer@fb.com>","Nicholas Schrock <schrockn@fb.com>"],"license":"BSD-3-Clause","homepage":"https://github.com/facebook/dataloader","bugs":{"url":"https://github.com/facebook/dataloader/issues"},"repository":{"type":"git","url":"http://github.com/facebook/dataloader.git"},"main":"index.js","files":["index.js","index.js.flow","index.d.ts","README.md","LICENSE","PATENTS"],"typings":"index.d.ts"}
README.md

# DataLoader
DataLoader is a generic utility to be used as part of your application's data
fetching layer to provide a consistent API over various key-value store backends
and reduce requests to those back-ends via batching and caching.
fetching layer to provide a simplified and consistent API over various remote
data sources such as databases or web services via batching and caching.

@@ -16,8 +16,20 @@ [![Build Status](https://travis-ci.org/facebook/dataloader.svg)](https://travis-ci.org/facebook/dataloader)

the underpinning for Facebook's GraphQL server implementation and type
definitions. DataLoader is presented in the hope that it may be useful to
produce a similar GraphQL underpinning for other systems which use
[graphql-js][] alongside key-value stores, or at the very least remain a
publicly available example of this abstraction.
definitions.
DataLoader is a simplified version of this original idea implemented in
JavaScript for Node.js services. DataLoader is often used when implementing a
[graphql-js][] service, though it is also broadly useful in other situations.
This mechanism of batching and caching data requests is certainly not unique to
Node.js or JavaScript; it is also the primary motivation for
[Haxl](https://github.com/facebook/Haxl), Facebook's data loading library
for Haskell. More about how Haxl works can be read in this [blog post](https://code.facebook.com/posts/302060973291128/open-sourcing-haxl-a-library-for-haskell/).
DataLoader is provided so that it may be useful not just to build GraphQL
services for Node.js but also as a publicly available reference implementation
of this concept in the hopes that it can be ported to other languages. If you
port DataLoader to another language, please open an issue to include a link from
this repository.
## Getting Started

@@ -31,17 +43,12 @@

DataLoader assumes a JavaScript environment with global ES6 `Promise` and `Map`
classes, available in the recent versions of node.js or when using
[babel/polyfill][]. If your environment does not have these, provide them before
using DataLoader.
To get started, create a `DataLoader`. Each `DataLoader` instance represents a
unique cache. Typically instances are created per request when used within a
web-server like [express][] if different users can see different things.
```js
global.Promise = require('es6-promise')
global.Map = require('es6-map')
```
> Note: DataLoader assumes a JavaScript environment with global ES6 `Promise`
and `Map` classes, available in all supported versions of Node.js.
To get started, create a `DataLoader`. Each `DataLoader` instance represents a
unique cache. You might create each loader once for your whole application, or
create new instances per request when used within a web-server like [express][]
if different users can see different things. It's up to you.
## Batching
Batching is not an advanced feature; it's DataLoader's primary feature.

@@ -57,3 +64,3 @@ Create loaders by providing a batch loading function.

A batch loading function accepts an Array of keys, and returns a Promise which
resolves to an Array of values.
resolves to an Array of values[<sup>*</sup>](#batch-function).
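For instance, a minimal sketch of creating a loader and batching loads (the `batchGetUsers` helper is an assumption standing in for your back-end call):

```js
var DataLoader = require('dataloader');

// The batch function receives every key requested during one frame of
// execution and must resolve to an array of values in the same order.
var userLoader = new DataLoader(keys => batchGetUsers(keys));

// These two loads are coalesced into a single call to batchGetUsers([1, 2]).
userLoader.load(1).then(user => console.log(user.name));
userLoader.load(2).then(user => console.log(user.name));
```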

@@ -86,14 +93,48 @@ Then load individual values from the loader. DataLoader will coalesce all

### Caching
#### Batch Function
After being loaded once, the resulting value is cached, eliminating
redundant requests.
A batch loading function accepts an Array of keys, and returns a Promise which
resolves to an Array of values. There are a few constraints that must be upheld:
In the example above, if User `1` was last invited by User `2`, only a single
round trip will occur.
* The Array of values must be the same length as the Array of keys.
* Each index in the Array of values must correspond to the same index in the Array of keys.
Caching results in creating fewer objects which may relieve memory pressure on
your application:
For example, if your batch function was provided the Array of keys: `[ 2, 9, 6, 1 ]`,
and loading from a back-end service returned the values:
```js
{ id: 9, name: 'Chicago' }
{ id: 1, name: 'New York' }
{ id: 2, name: 'San Francisco' }
```
Our back-end service returned results in a different order than we requested, likely
because it was more efficient for it to do so. Also, it omitted a result for key `6`,
which we can interpret as no value existing for that key.
To uphold the constraints of the batch function, it must return an Array of values
the same length as the Array of keys, and re-order them to ensure each index aligns
with the original keys `[ 2, 9, 6, 1 ]`:
```js
[
  { id: 2, name: 'San Francisco' },
  { id: 9, name: 'Chicago' },
  null,
  { id: 1, name: 'New York' }
]
```
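One way to satisfy both constraints is to index the returned rows by `id` and then map the original keys over that index; a sketch, assuming a hypothetical `fetchCities(keys)` back-end call:

```js
var cityLoader = new DataLoader(keys =>
  fetchCities(keys).then(rows => {
    // Index results by id so lookups are order-independent.
    var rowsById = new Map(rows.map(row => [row.id, row]));
    // Map the original keys over the index: same length, same order,
    // and null (or an Error) for keys with no matching row.
    return keys.map(key => rowsById.get(key) || null);
  })
);
```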
## Caching
DataLoader provides a memoization cache for all loads which occur in a single
request to your application. After `.load()` is called once with a given key,
the resulting value is cached to eliminate redundant loads.
In addition to relieving pressure on your data storage, caching results per-request
also creates fewer objects which may relieve memory pressure on your application:
```js
var userLoader = new DataLoader(...)
var promise1A = userLoader.load(1)

@@ -104,22 +145,75 @@ var promise1B = userLoader.load(1)

There are two common examples when clearing the loader's cache is necessary:
#### Caching per-Request
*Mutations:* after a mutation or update, a cached value may be out of date.
Future loads should not use any possibly cached value.
DataLoader caching *does not* replace Redis, Memcache, or any other shared
application-level cache. DataLoader is first and foremost a data loading mechanism,
and its cache only serves the purpose of not repeatedly loading the same data in
the context of a single request to your Application. To do this, it maintains a
simple in-memory memoization cache (more accurately: `.load()` is a memoized function).
Avoid sharing a single DataLoader instance across requests from different users, which
could result in cached data incorrectly appearing in each request. Typically,
DataLoader instances are created when a Request begins, and are not used once the
Request ends.
For example, when using with [express][]:
```js
function createLoaders(authToken) {
  return {
    users: new DataLoader(ids => genUsers(authToken, ids)),
  }
}

var app = express()

app.get('/', function(req, res) {
  var authToken = authenticateUser(req)
  var loaders = createLoaders(authToken)
  res.send(renderPage(req, loaders))
})
app.listen()
```
#### Clearing Cache
In certain uncommon cases, clearing the request cache may be necessary.
The most common case in which clearing the loader's cache is necessary is after
a mutation or update within the same request, when a cached value could be out of
date and future loads should not use any possibly cached value.
Here's a simple example using SQL UPDATE to illustrate.
```js
// Request begins...
var userLoader = new DataLoader(...)
// And a value happens to be loaded (and cached).
userLoader.load(4).then(...)
// A mutation occurs, invalidating what might be in cache.
sqlRun('UPDATE users SET username="zuck" WHERE id=4').then(
  () => userLoader.clear(4)
)
// Later the value is loaded again so the mutated data appears.
userLoader.load(4).then(...)
// Request completes.
```
*Transient Errors:* A load may fail because it simply can't be loaded
(a permanent issue) or it may fail because of a transient issue such as a down
database or network issue. For transient errors, clear the cache:
#### Caching Errors
If a batch load fails (that is, a batch function throws or returns a rejected
Promise), then the requested values will not be cached. However if a batch
function returns an `Error` instance for an individual value, that `Error` will
be cached to avoid frequently loading the same `Error`.
In some circumstances you may wish to clear the cache for these individual Errors:
```js
userLoader.load(1).catch(error => {
if (/* determine if error is transient */) {
if (/* determine if should clear error */) {
userLoader.clear(1);

@@ -131,3 +225,43 @@ }

#### Disabling Cache
In certain uncommon cases, a DataLoader which *does not* cache may be desirable.
Calling `new DataLoader(myBatchFn, { cache: false })` will ensure that every
call to `.load()` will produce a *new* Promise, and requested keys will not be
saved in memory.
However, when the memoization cache is disabled, your batch function will
receive an array of keys which may contain duplicates! Each key will be
associated with each call to `.load()`. Your batch loader should provide a value
for each instance of the requested key.
For example:
```js
var myLoader = new DataLoader(keys => {
  console.log(keys)
  return someBatchLoadFn(keys)
}, { cache: false })
myLoader.load('A')
myLoader.load('B')
myLoader.load('A')
// > [ 'A', 'B', 'A' ]
```
More complex cache behavior can be achieved by calling `.clear()` or `.clearAll()`
rather than disabling the cache completely. For example, this DataLoader will
provide unique keys to a batch function due to the memoization cache being
enabled, but will immediately clear its cache when the batch function is called
so later requests will load new values.
```js
var myLoader = new DataLoader(keys => {
  myLoader.clearAll()
  return someBatchLoadFn(keys)
})
```
## API

@@ -157,5 +291,9 @@

- *cache*: Default `true`. Set to `false` to disable caching, instead
creating a new Promise and new key in the `batchLoadFn` for every load.
- *maxBatchSize*: Default `Infinity`. Limits the number of items that get
passed in to the `batchLoadFn`.
- *cache*: Default `true`. Set to `false` to disable memoization caching,
instead creating a new Promise and new key in the `batchLoadFn` for every
load of the same key.
- *cacheKeyFn*: A function to produce a cache key for a given load key.
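Taken together, a loader combining these options might look like the sketch below (`batchLoadKeys` and the object-shaped keys are assumptions for illustration):

```js
var loader = new DataLoader(keys => batchLoadKeys(keys), {
  // Split large batches into chunks of at most 100 keys.
  maxBatchSize: 100,
  // Memoize .load() per key (the default behavior).
  cache: true,
  // Produce a stable cache key when keys are objects rather than primitives.
  cacheKeyFn: key => JSON.stringify(key)
});
```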

@@ -271,5 +409,7 @@ Defaults to `key => key`. Useful to provide when JavaScript objects are keys

### Creating a new DataLoader per request.
In many applications, a web server using DataLoader serves requests to many
different users with different access permissions. It may be dangerous to use
one cache across many users, and is encouraged to create a new cache
one cache across many users, and is encouraged to create a new DataLoader
per request:

@@ -286,3 +426,3 @@

// Later, in an web request handler:
// When handling an incoming web request:
var loaders = createLoaders(request.query.authToken);

@@ -295,132 +435,46 @@

Creating an object where each key is a `DataLoader` is also a common pattern.
This provides a single value to pass around to code which needs to perform
Creating an object where each key is a `DataLoader` is one common pattern which
provides a single value to pass around to code which needs to perform
data loading, such as part of the `rootValue` in a [graphql-js][] request.
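As a sketch, that object of loaders can be handed to graphql-js as the `rootValue`, so every top-level resolver reaches its loaders through one value (the `schema`, `genUsers`, and `authToken` names are assumptions):

```js
var graphql = require('graphql').graphql;
var DataLoader = require('dataloader');

var loaders = {
  users: new DataLoader(ids => genUsers(authToken, ids)),
};

// graphql-js passes rootValue to the top-level resolvers, which can then
// share the same per-request loaders for all of their data fetching.
graphql(schema, '{ me { name } }', loaders).then(result => {
  console.log(result);
});
```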
### Loading by alternative keys.
## Custom Caches
Occasionally, some kind of value can be accessed in multiple ways. For example,
perhaps a "User" type can be loaded not only by an "id" but also by a "username"
value. If the same user is loaded by both keys, then it may be useful to fill
both caches when a user is loaded from either source:
DataLoader can optionally be provided a custom Map instance to use as its
cache. More specifically, any object that implements the methods `get()`,
`set()`, `delete()` and `clear()` can be provided. This allows for custom Maps
which implement various [cache algorithms][] to be provided. By default,
DataLoader uses the standard [Map][] which simply grows until the DataLoader
is released.
## Common Back-ends
Looking to get started with a specific back-end? Try these example loaders:
#### Redis
Redis is a very simple key-value store which provides the batch load method
[MGET](http://redis.io/commands/mget). Here we build a Redis DataLoader
using [node_redis][].
```js
var DataLoader = require('dataloader');
var redis = require('redis');
var client = redis.createClient();
var redisLoader = new DataLoader(keys => new Promise((resolve, reject) => {
  client.mget(keys, (error, results) => {
    if (error) {
      return reject(error);
    }
    resolve(results.map((result, index) =>
      result !== null ? result : new Error(`No key: ${keys[index]}`)
    ));
  });
let userByIDLoader = new DataLoader(ids => genUsersByID(ids).then(users => {
  for (let user of users) {
    usernameLoader.prime(user.username, user);
  }
  return users;
}));
```
#### CouchDB
This example uses the [nano][] CouchDB client which offers a `fetch` method
implementing the [HTTP Bulk Document API](http://wiki.apache.org/couchdb/HTTP_Bulk_Document_API).
```js
var DataLoader = require('dataloader');
var nano = require('nano');
var couch = nano('http://localhost:5984');
var userDB = couch.use('users');
var userLoader = new DataLoader(keys => new Promise((resolve, reject) => {
  userDB.fetch({ keys: keys }, (error, docs) => {
    if (error) {
      return reject(error);
    }
    resolve(docs.rows.map(row => row.error ? new Error(row.error) : row.doc));
  });
let usernameLoader = new DataLoader(names => genUsernames(names).then(users => {
  for (let user of users) {
    userByIDLoader.prime(user.id, user);
  }
  return users;
}));
// Usage
var promise1 = userLoader.load('8fce1902834ac6458e9886fa7f89c0ef');
var promise2 = userLoader.load('00a271787f89c0ef2e10e88a0c00048b');
Promise.all([ promise1, promise2 ]).then(([ user1, user2 ]) => {
  console.log(user1, user2);
});
```
#### SQLite
## Custom Caches
SQL offers a natural batch mechanism with `SELECT * WHERE IN`. `DataLoader`
is designed to operate over key-value stores, so this example just requests
the entire row at a given `id`.
DataLoader can optionally be provided a custom Map instance to use as its
memoization cache. More specifically, any object that implements the methods `get()`,
`set()`, `delete()` and `clear()` can be provided. This allows for custom Maps
which implement various [cache algorithms][] to be provided. By default,
DataLoader uses the standard [Map][] which simply grows until the DataLoader
is released. The default is appropriate when requests to your application are
short-lived.
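As a sketch, any object with compatible `get()`, `set()`, `delete()` and `clear()` methods can stand in for the default Map; here a plain Map is wrapped to log lookups and passed in through the loader's `cacheMap` option (`batchLoadKeys` is an assumed batch function):

```js
// Wrap a standard Map so every cache interaction can be observed.
var inner = new Map();
var loggingCache = {
  get(key) {
    console.log('cache get:', key);
    return inner.get(key);
  },
  set(key, value) { return inner.set(key, value); },
  delete(key) { return inner.delete(key); },
  clear() { return inner.clear(); },
};

var loader = new DataLoader(keys => batchLoadKeys(keys), { cacheMap: loggingCache });
```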
This example uses the [sqlite3][] client which offers a `parallelize` method to
further batch queries together. Another non-caching `DataLoader` utilizes this
method to provide a similar API. `DataLoaders` can access other `DataLoaders`.
```js
var DataLoader = require('dataloader');
var sqlite3 = require('sqlite3');
## Common Back-ends
var db = new sqlite3.Database('./to/your/db.sql');
Looking to get started with a specific back-end? Try the [loaders in the examples directory](/examples).
// Dispatch a WHERE-IN query, ensuring response has rows in correct order.
var userLoader = new DataLoader(ids => {
  var params = ids.map(id => '?').join();
  var query = `SELECT * FROM users WHERE id IN (${params})`;
  return queryLoader.load([query, ids]).then(
    rows => ids.map(
      id => rows.find(row => row.id === id) || new Error(`Row not found: ${id}`)
    )
  );
});
// Parallelize all queries, but do not cache.
var queryLoader = new DataLoader(queries => new Promise(resolve => {
  var waitingOn = queries.length;
  var results = [];
  db.parallelize(() => {
    queries.forEach((query, index) => {
      db.all.apply(db, query.concat((error, result) => {
        results[index] = error || result;
        if (--waitingOn === 0) {
          resolve(results);
        }
      }));
    });
  });
}), { cache: false });
// Usage
var promise1 = userLoader.load('1234');
var promise2 = userLoader.load('5678');
Promise.all([ promise1, promise2 ]).then(([ user1, user2 ]) => {
  console.log(user1, user2);
});
```
## Video Source Code Walkthrough

@@ -439,4 +493,1 @@

[babel/polyfill]: https://babeljs.io/docs/usage/polyfill/
[node_redis]: https://github.com/NodeRedis/node_redis
[nano]: https://github.com/dscape/nano
[sqlite3]: https://github.com/mapbox/node-sqlite3