
@jrmdayn/googleapis-batcher
Node.js library for batching multiple requests made with the official Google APIs Node.js client
First, install the library using yarn/npm/pnpm:
yarn add @jrmdayn/googleapis-batcher
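Or, equivalently, with npm or pnpm:
npm install @jrmdayn/googleapis-batcher
pnpm add @jrmdayn/googleapis-batcher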
Then instantiate and use the batchFetchImplementation:
import { google } from 'googleapis'
import { batchFetchImplementation } from '@jrmdayn/googleapis-batcher'

const fetchImpl = batchFetchImplementation()

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl,
})

// The 3 requests below will be batched together into a single HTTP request
const [list, get, patch] = await Promise.all([
  calendarClient.events.list({ calendarId: 'john@gmail.com' }),
  calendarClient.events.get({
    calendarId: 'john@gmail.com',
    eventId: 'xyz123'
  }),
  calendarClient.events.patch({
    calendarId: 'john@gmail.com',
    eventId: 'xyz456',
    requestBody: { colorId: '1' }
  })
])
The maxBatchSize option controls the maximum number of requests to batch together in one HTTP request.
// limit the number of batched requests to 50
const fetchImpl = batchFetchImplementation({ maxBatchSize: 50 })
Note: Google limits the number of batched requests on a per API basis. For example, for the Calendar API it is 50 requests and for the People API it is 1000.
The batchWindowMs option controls the size of the time window (in milliseconds) used to batch requests together. By default, all requests made in the same tick are batched together. See the Dataloader documentation for more on this.
// batch all requests made in a 30ms window
const fetchImpl = batchFetchImplementation({ batchWindowMs: 30 })
The signal option defines a user-controlled signal that is used to manually trigger a batch request.
import { google } from 'googleapis'
import {
  batchFetchImplementation,
  makeBatchSchedulerSignal
} from '@jrmdayn/googleapis-batcher'

const signal = makeBatchSchedulerSignal()
const fetchImpl = batchFetchImplementation({ signal })

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl,
})

const pList = calendarClient.events.list({ calendarId: 'john@gmail.com' })
const pGet = calendarClient.events.get({
  calendarId: 'john@gmail.com',
  eventId: 'xyz123'
})
const pPatch = calendarClient.events.patch({
  calendarId: 'john@gmail.com',
  eventId: 'xyz456',
  requestBody: { colorId: '1' }
})

...

// manually trigger the batch request for the pending calls
signal.schedule()
The max batch size varies per Google API. For example, it is set to 50 for the Calendar API and to 1000 for the People API. Read the docs for the API you are using and configure the maxBatchSize option accordingly.
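As a minimal sketch, using the two limits quoted above (the variable names are only illustrative; check the documentation for the APIs you actually call):
// one fetch implementation per API, sized to that API's documented batch limit
const calendarFetchImpl = batchFetchImplementation({ maxBatchSize: 50 })
const peopleFetchImpl = batchFetchImplementation({ maxBatchSize: 1000 })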
Batching is homogeneous, so you cannot batch Calendar API and People API requests together. Instead, you must make 2 separate batching calls, because there are 2 separate batching endpoints. Concretely, this means you should always provide a fetchImplementation at the client API level, not at the global Google options level:
const fetchImpl = batchFetchImplementation()

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl,
})

const peopleClient = google.people({
  version: 'v1',
  fetchImplementation: fetchImpl,
})

// This will raise an error
await Promise.all([
  calendarClient.events.list(),
  peopleClient.people.get()
])
Do this instead:
const fetchImpl1 = batchFetchImplementation()
const fetchImpl2 = batchFetchImplementation()

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl1,
})

const peopleClient = google.people({
  version: 'v1',
  fetchImplementation: fetchImpl2,
})

await Promise.all([
  calendarClient.events.list(),
  peopleClient.people.get()
])
On August 12, 2020 Google deprecated its global batching endpoints (blog article here). Going forward, it is recommended to use API specific batch endpoints for batching homogeneous requests together. Unfortunately, the official Google APIs Node.js client does not support batching requests together out of the box. The task of composing a batched request and parsing the batch response is left to the developer.
Here is a link to the official guide for doing so with the Calendar API. As you can see, the task consists of handcrafting a multipart/mixed HTTP request composed of multiple raw HTTP requests (one per batched call), and then parsing a multipart/mixed response body composed of multiple raw HTTP responses (one per batched call).
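To give a feel for that manual work, here is a rough sketch of handcrafting such a request against the Calendar API batch endpoint. This is not taken from the official guide: the boundary name, event IDs, and the GOOGLE_ACCESS_TOKEN environment variable are placeholders, and error handling is omitted.
// Illustrative only: two event GET requests hand-packed into one multipart/mixed body.
const accessToken = process.env.GOOGLE_ACCESS_TOKEN ?? ''
const boundary = 'batch_example_boundary'

// Each part is a raw HTTP request preceded by its own part headers.
const parts = ['xyz123', 'xyz456'].map(
  (eventId, i) =>
    `--${boundary}\r\n` +
    'Content-Type: application/http\r\n' +
    `Content-ID: <item${i + 1}>\r\n\r\n` +
    `GET /calendar/v3/calendars/john%40gmail.com/events/${eventId}\r\n`
)

const res = await fetch('https://www.googleapis.com/batch/calendar/v3', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${accessToken}`,
    'Content-Type': `multipart/mixed; boundary=${boundary}`,
  },
  body: parts.join('\r\n') + `\r\n--${boundary}--`,
})

// The response body is itself multipart/mixed: one raw HTTP response per part,
// which still has to be split and parsed by hand.
const rawMultipartBody = await res.text()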
At this point, I see at least 2 reasons why developers would not batch Google APIs requests: the official Node.js client does not support it out of the box, and handcrafting and parsing multipart/mixed HTTP requests and responses seems daunting and error prone.
I decided to write this library when I first encountered the need for batching Google APIs requests in Node.js, so that other developers would not have to face the task of writing and parsing multipart/mixed HTTP requests. The key to the solution is providing your own fetch implementation to the API client you are using. Google exposes a fetchImplementation parameter in the client options (probably for testing purposes) that we can override to intercept requests and group them together. To group the requests, we use Dataloader, which can be configured to batch all requests made in one tick, in a certain time window, or until an external signal is fired.
From a developer's point of view, you do not need to worry about handcrafting the individual raw HTTP requests. You simply use the official Google APIs Node.js client as usual, and the fetch implementation batches the requests for you automatically.
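The following is a minimal sketch of that idea, not the library's actual source: sendBatch is a hypothetical helper standing in for the multipart/mixed encoding and decoding shown earlier, and only the maxBatchSize and batchWindowMs options are modeled.
import DataLoader from 'dataloader'

type PendingRequest = { url: string; init?: RequestInit }

// Hypothetical helper: sends one multipart/mixed request for all pending calls
// and splits the multipart response back into one Response per call.
declare function sendBatch(requests: readonly PendingRequest[]): Promise<Response[]>

function makeBatchingFetch(opts: { maxBatchSize?: number; batchWindowMs?: number } = {}) {
  const loader = new DataLoader<PendingRequest, Response>(
    (requests) => sendBatch(requests),
    {
      cache: false, // every call must actually hit the network
      maxBatchSize: opts.maxBatchSize,
      // Dataloader flushes on the next tick by default; a time window can be used instead
      batchScheduleFn: opts.batchWindowMs
        ? (callback) => setTimeout(callback, opts.batchWindowMs)
        : undefined,
    }
  )
  // same shape as the fetchImplementation option expected by the Google API clients
  return (url: string, init?: RequestInit) => loader.load({ url, init })
}
Because each fetch implementation wraps its own loader pointed at a single batch endpoint, this also explains why every Google API client needs its own batchFetchImplementation instance.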