request-dot-js
A ~1kB wrapper around fetch with convenient error handling, automatic JSON transforms, and support for request timeouts and exponential backoff.
It works in the browser, on the server (Node.js), and in mobile apps (like React Native); it has no dependencies and ships with TypeScript declarations.
Install with `npm i request-dot-js` or `yarn add request-dot-js`.
API: `async request(url[, options])`

- `url`: a string
- `options`: basically the fetch init object

```js
import request from 'request-dot-js'

// in an async function...
const { data, type, headers, status, statusText, url } = await request('https://httpbin.org/get', {
  headers: { 'Custom-Header': 'abcd' },
  params: { a: 'b', c: 'd' }, // query params
})

if (type === 'success') handleSuccess(data)
if (type === 'error') handleError(data)
if (type === 'exception') handleException(data)
```
For all responses returned by request.js, `type` is either `'success'`, `'error'`, or `'exception'`.
If no exception is thrown, `data` is read from the response stream. If `status < 300`, type is `'success'`. If `status >= 300`, type is `'error'`.

If there's a connection error, timeout, or other exception, type is `'exception'`, and `data` is the exception thrown by fetch.
The other properties come from the `Response` object returned by fetch:

- `status`: 200, 204, etc.
- `statusText`: 'OK', 'CREATED', etc.
- `url`: url after redirect(s)
- `headers`: object literal instead of a `Headers` instance; header names are lowercased

If type is `'exception'`, these properties are undefined.
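As a rough illustration of the headers conversion described above (a hypothetical helper, not request.js source), flattening a fetch `Headers` instance into a plain object can look like this:

```js
// Hypothetical helper (not request.js internals): flatten a fetch Headers
// instance into a plain object literal. Headers.entries() yields
// lowercased header names, which is why the result keys are lowercase.
function headersToObject(headers) {
  const out = {}
  for (const [name, value] of headers.entries()) {
    out[name] = value
  }
  return out
}

const h = new Headers({ 'Content-Type': 'application/json' })
console.log(headersToObject(h)) // { 'content-type': 'application/json' }
```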
request.js is built for easy interaction with JSON APIs, the de facto standard for data exchange on the web.
If you pass an object literal or an array for `options.body`, `request` adds the `'content-type': 'application/json'` request header and JSON stringifies the body.

By default, `request` also adds the `'accept': 'application/json'` request header and tries to return parsed JSON for `data`. If you don't want it to do this, pass `jsonOut: false` in `options`, and it will return the fetch `Response` object for `data` instead.
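The serialization rule described above — JSON-stringify plain objects and arrays, leave everything else alone — can be sketched like this (`prepareBody` is a hypothetical helper for illustration, not library internals):

```js
// Hypothetical sketch of the body rule described above (not request.js
// source): plain objects and arrays get JSON-stringified with a JSON
// content-type; anything else (strings, FormData, etc.) passes through.
function prepareBody(body) {
  const isJsonBody =
    Array.isArray(body) ||
    (body !== null && typeof body === 'object' && body.constructor === Object)
  if (isJsonBody) {
    return { body: JSON.stringify(body), headers: { 'content-type': 'application/json' } }
  }
  return { body, headers: {} }
}

console.log(prepareBody({ a: 'b' }))
// { body: '{"a":"b"}', headers: { 'content-type': 'application/json' } }
```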
```js
const { data, type, ...rest } = await request('https://httpbin.org/post', {
  method: 'POST', // default method is 'GET'
  body: { a: 'b', c: 'd' }, // object body automatically passed to JSON.stringify
})

if (type === 'success') console.log(data.url) // data is parsed JSON by default
```

```js
const { data, type } = await request('https://httpbin.org/post', { jsonOut: false })

if (type === 'success') {
  const blob = await data.blob()
  console.log(blob)
}
```
```js
import request from 'request-dot-js'

// retry request up to 5 times, with 1s, 2s, 4s, 8s and 16s delays between retries
const { data, type, ...rest } = await request('https://httpbin.org/get', {
  params: { a: 'b', c: 'd' },
  retry: { retries: 5, delay: 1000 },
})
```

```js
// shouldRetry function
const { data, type, ...rest } = await request('https://httpbin.org/get', {
  retry: {
    shouldRetry: (response, { retries, delay }) => {
      console.log(retries, delay) // do something with current retries and delay if you want
      return response.type === 'exception' || response.status === 500
    },
  },
})
```
The `options` parameter has a special `retry` key that can point to an object with retry options:

- `retries`
- `delay` (in ms)
- `multiplier` (default `2`)
- `shouldRetry` (default `response => response.type === 'exception'`)

If you pass a `retry` object, and your request throws an exception, request.js retries it up to `retries` times and backs off exponentially.
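Under that schedule, the delay before retry *n* is `delay * multiplier ** n`. A small sketch (hypothetical helper, not part of the library) that computes the schedule from the example above:

```js
// Hypothetical helper: compute the backoff schedule implied by the retry
// options -- each successive delay is multiplied by `multiplier` (default 2).
function backoffDelays({ retries, delay, multiplier = 2 }) {
  const delays = []
  for (let i = 0; i < retries; i++) {
    delays.push(delay * multiplier ** i)
  }
  return delays
}

console.log(backoffDelays({ retries: 5, delay: 1000 })) // [ 1000, 2000, 4000, 8000, 16000 ]
```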
If on any retry you regain connectivity and type is `'success'` or `'error'` instead of `'exception'`, the `request` method stops retrying your request and returns the usual `{ data, type, ...rest }`.
If you want to set a custom condition for when to retry a request, pass your own `shouldRetry` function. It receives the usual response, `{ data, type, ...rest }`, and the current backoff values, `{ retries, delay }`. If it returns a falsy value, request.js stops retrying your request.

The `shouldRetry` function also lets you react to individual retries before `request` is done executing all of them, if you want to do that. See the example above.
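For intuition, a generic retry loop with these semantics could be sketched like this (an illustrative sketch, not request.js source; `withRetry` and the `fn` thunk are hypothetical, and here `retries` passed to `shouldRetry` is the number of retries remaining):

```js
// Sketch of a retry loop matching the semantics described (not the
// library's implementation). `fn` performs the request and resolves to a
// { type, ... } response; `shouldRetry` gets the response plus the
// current { retries, delay } backoff values and returns truthy to retry.
async function withRetry(fn, { retries, delay, multiplier = 2, shouldRetry }) {
  let response = await fn()
  for (let i = 0; i < retries; i++) {
    if (!shouldRetry(response, { retries: retries - i, delay })) break
    await new Promise(resolve => setTimeout(resolve, delay))
    delay *= multiplier // exponential backoff
    response = await fn()
  }
  return response
}
```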
Little known fact: timeouts are actually easy to implement with fetch; they're just not supported in older browsers.
The `options` parameter has a special `timeout` key. If a timeout is specified, request.js aborts the request after `timeout` ms and returns an exception response.

```js
const { data, type } = await request('https://httpbin.org/get', { timeout: 1 })
// time out after 1ms -- type will be 'exception'
```
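For context, this is roughly how a fetch timeout can be built with `AbortController` — the standard technique the line above alludes to (`fetchWithTimeout` is a hypothetical name, not the library's source):

```js
// Sketch of the general AbortController timeout technique (not
// request.js internals): abort the request once `timeout` ms elapse.
async function fetchWithTimeout(url, { timeout, ...init } = {}) {
  const controller = new AbortController()
  const timer = setTimeout(() => controller.abort(), timeout)
  try {
    // fetch rejects with an AbortError once the controller aborts
    return await fetch(url, { ...init, signal: controller.signal })
  } finally {
    clearTimeout(timer) // don't leave the timer pending after settling
  }
}
```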
request.js has convenience methods for the following HTTP methods: `DELETE`, `GET`, `HEAD`, `OPTIONS`, `PATCH`, `POST`, and `PUT`.
```js
import request from 'request-dot-js'

const { data, type } = await request.delete('https://httpbin.org/delete')
// renamed on destructuring to avoid redeclaring data and type
const { data: putData, type: putType } = await request.put('https://httpbin.org/put', { body: { a: 'b' } })
```

These allow you to send non-GET requests without passing the `method` key in `options`.
request.js comes with a function that converts a query `params` object to a query string. If you want a fancier `stringify` function, like the one in qs, you can pass your own in `options`.
```js
import qs from 'qs'

const { data, type } = await request('https://httpbin.org/get', { params: { a: 'b' }, stringify: qs.stringify })
```
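A minimal default `stringify` along the lines described — URL-encode each key/value pair and join with `&` — might look like this (a hypothetical sketch, not the library's actual implementation):

```js
// Hypothetical default stringify: URL-encode each key/value pair and
// join with '&' (qs adds nesting, arrays, and more on top of this).
function stringifyParams(params) {
  return Object.entries(params)
    .map(([key, value]) => `${encodeURIComponent(key)}=${encodeURIComponent(value)}`)
    .join('&')
}

console.log(stringifyParams({ a: 'b', c: 'd e' })) // a=b&c=d%20e
```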
request.js works great with TypeScript (it's written in TypeScript). Check out examples here.
Make sure you enable `esModuleInterop` if you're using TypeScript to compile your application. This option is enabled by default if you run `tsc --init`.
Regarding convenience methods, you can't do this:

```js
import request from 'request-dot-js'

request.put(...)
```

Do this instead:

```js
import { put } from 'request-dot-js'

put(...)
```

Also, you can't do this, because `delete` is a restricted keyword:

```js
import { delete as del } from 'request-dot-js'
```

Do this instead:

```js
import { del } from 'request-dot-js'
```
request.js ships with a module that works in the browser, `index.js`, and a module that works on the server, `server.js`. If you're using it on the server, make sure to import everything from `server.js`.

```js
import request from 'request-dot-js/server'
```
Why not axios, or just fetch? Unlike fetch, request.js is very convenient to use: a single `await` reads your data. And unlike either of them, it doesn't require `try / catch` to handle exceptions. `const { data, type } = await request(...)` has all the info you need to handle successful requests, request errors, connection errors, and request timeouts.
This design leads to another feature the others don't have: flexible, built-in support for retries with exponential backoff.
For modern browsers, or React Native, request.js has no dependencies.
If you're targeting older browsers, like Internet Explorer, you need to polyfill fetch and `Promise`, and use Babel to transpile your code.
If you use request.js on the server, node-fetch and abort-controller are the only dependencies.
Clone the repo, then run `yarn`, then `yarn test`.