The undici npm package is an HTTP/1.1 client, written from scratch for Node.js, that is designed to be faster and more efficient than the built-in 'http' and 'https' modules. It provides a low-level API for making HTTP requests and can be used to build higher-level abstractions.
HTTP Request
Make an HTTP request and process the response. This is the basic functionality of undici, allowing you to send HTTP requests and receive responses.
const { request } = require('undici');

(async () => {
  const { statusCode, headers, body } = await request('https://example.com');
  console.log('response received', statusCode);
  for await (const data of body) {
    console.log('data', data);
  }
})();
HTTP Pool
Use a pool of connections to make HTTP requests. This is useful for making a large number of requests to the same server, as it reuses connections between requests.
const { Pool } = require('undici');

const pool = new Pool('https://example.com');

async function query() {
  const { body } = await pool.request({
    path: '/path',
    method: 'GET'
  });
  for await (const data of body) {
    console.log('data', data);
  }
}

query();
HTTP Stream
Stream an HTTP response to a file or another stream. This is useful for handling large responses that you don't want to hold in memory.
const { stream } = require('undici');
const fs = require('fs');

// stream() writes the response body directly into the Writable
// returned by the factory, avoiding an intermediate Readable.
stream(
  'https://example.com',
  { method: 'GET' },
  ({ statusCode }) => {
    console.log('response received', statusCode);
    return fs.createWriteStream('output.txt');
  }
)
  .then(() => console.log('stream succeeded'))
  .catch((err) => console.error('stream failed', err));
HTTP Upgrade
Upgrade an HTTP connection to another protocol, such as WebSockets. This is useful for protocols that start with an HTTP handshake and then upgrade to a different protocol.
const { upgrade } = require('undici');

(async () => {
  const { headers, socket } = await upgrade('https://example.com/path', {
    protocol: 'websocket'
  });
  console.log('upgrade response headers', headers);
  socket.on('data', (chunk) => {
    console.log('data', chunk.toString());
  });
})();
Axios is a promise-based HTTP client for the browser and Node.js. It provides a simple API for making HTTP requests and is often used for its ease of use and wide adoption. Compared to undici, axios is higher-level and more feature-rich, but may not be as performant for certain use cases.
Got is a human-friendly and powerful HTTP request library for Node.js. It supports streams, promises, and provides a rich set of features for making HTTP requests. Got is similar to undici in terms of performance but offers a more comprehensive API surface.
node-fetch is a light-weight module that brings the Fetch API to Node.js. It is designed to mimic the browser fetch API as closely as possible. While undici focuses on HTTP/1.1, node-fetch provides a familiar interface for those used to working with fetch in the browser.
An HTTP/1.1 client, written from scratch for Node.js.
Undici means eleven in Italian. 1.1 -> 11 -> Eleven -> Undici. It is also a Stranger Things reference.
npm i undici
Machine: 2.7 GHz Quad-Core Intel Core i7
Configuration: Node v14.2, HTTP/1.1 without TLS, 100 connections
http - keepalive - pipe x 6,545 ops/sec ±12.47% (64 runs sampled)
undici - pipeline - pipe x 9,560 ops/sec ±3.68% (77 runs sampled)
undici - request - pipe x 9,797 ops/sec ±6.80% (77 runs sampled)
undici - stream - pipe x 11,599 ops/sec ±0.89% (78 runs sampled)
The benchmark is a simple hello world example.
new undici.Client(url, opts)
A basic HTTP/1.1 client, mapped on top of a single TCP/TLS connection. Keepalive is enabled by default, and it cannot be turned off.

url can be a string or a URL object. It should only include the protocol, hostname, and the port.
Options:
- socketTimeout, the timeout after which a socket will time out, in milliseconds. Monitors time between activity on a connected socket. Use 0 to disable it entirely. Default: 30e3 milliseconds (30s).
- requestTimeout, the timeout after which a request will time out, in milliseconds. Monitors time between the request being enqueued and receiving a response. Use 0 to disable it entirely. Default: 30e3 milliseconds (30s).
- maxAbortedPayload, the maximum number of bytes read after which an aborted response will close the connection. Closing the connection will error other inflight requests in the pipeline. Default: 1e6 bytes (1MiB).
- pipelining, the amount of concurrent requests to be sent over the single TCP/TLS connection according to RFC 7230. Default: 1.
- tls, an options object which in the case of https will be passed to tls.connect.
client.request(opts, callback(err, data))
Performs an HTTP request.
Options:
- path
- method
- body, it can be a String, a Buffer, a Uint8Array or a stream.Readable.
- headers, an object with header-value pairs.
- signal, either an AbortController or an EventEmitter.
- requestTimeout, the timeout after which a request will time out, in milliseconds. Monitors time between the request being enqueued and receiving a response. Use 0 to disable it entirely. Default: 30e3 milliseconds (30s).
- idempotent, whether the requests can be safely retried or not. If false, the request won't be sent until all preceding requests in the pipeline have completed. Default: true if method is HEAD or GET.

Headers are represented by an object like this:
{
  'content-length': '123',
  'content-type': 'text/plain',
  connection: 'keep-alive',
  host: 'mysite.com',
  accept: '*/*'
}
Keys are lowercased. Values are not modified.
If you don't specify a host header, it will be derived from the url of the client instance.
The data parameter in callback is defined as follows:
- statusCode
- headers, an object where all keys have been lowercased.
- body, a stream.Readable with the body to read. A user must either fully consume or destroy the body unless there is an error, or no further requests will be processed.
Returns a promise if no callback is provided.
Example:
const { Client } = require('undici')

const client = new Client(`http://localhost:3000`)

client.request({
  path: '/',
  method: 'GET'
}, function (err, data) {
  if (err) {
    // handle this in some way!
    return
  }
  const { statusCode, headers, body } = data

  console.log('response received', statusCode)
  console.log('headers', headers)

  body.setEncoding('utf8')
  body.on('data', console.log)

  client.close()
})
Non-idempotent requests will not be pipelined in order to avoid indirect failures.
Idempotent requests will be automatically retried if they fail due to indirect failure from the request at the head of the pipeline. This does not apply to idempotent requests with a stream request body.
A request may be aborted using either an AbortController or an EventEmitter.

To use AbortController, you will need to npm i abort-controller.
const { AbortController } = require('abort-controller')
const { Client } = require('undici')

const client = new Client('http://localhost:3000')
const abortController = new AbortController()

client.request({
  path: '/',
  method: 'GET',
  signal: abortController.signal
}, function (err, data) {
  console.log(err) // RequestAbortedError
  client.close()
})

abortController.abort()
Alternatively, any EventEmitter that emits an 'abort' event may be used as an abort controller:
const EventEmitter = require('events')
const { Client } = require('undici')

const client = new Client('http://localhost:3000')
const ee = new EventEmitter()

client.request({
  path: '/',
  method: 'GET',
  signal: ee
}, function (err, data) {
  console.log(err) // RequestAbortedError
  client.close()
})

ee.emit('abort')
Destroying the request or response body will have the same effect.
client.stream(opts, factory(data), callback(err))
A faster version of request.

Unlike request, this method expects factory to return a Writable which the response will be written to. This improves performance by avoiding the creation of an intermediate Readable when the user expects to directly pipe the response body to a Writable.
Options:
- ... same as client.request(opts, callback).
- opaque, passed as opaque to factory. Used to avoid creating a closure.

The data parameter in factory is defined as follows:
- statusCode
- headers, an object where all keys have been lowercased.
- opaque
Returns a promise if no callback is provided.
const { Client } = require('undici')
const fs = require('fs')

const client = new Client(`http://localhost:3000`)
const filename = 'output.txt'

client.stream({
  path: '/',
  method: 'GET',
  opaque: filename
}, ({ statusCode, headers, opaque: filename }) => {
  console.log('response received', statusCode)
  console.log('headers', headers)
  return fs.createWriteStream(filename)
}, (err) => {
  if (err) {
    console.error('failure', err)
  } else {
    console.log('success')
  }
})
opaque makes it possible to avoid creating a closure for the factory method:

function (req, res) {
  return client.stream({ ...opts, opaque: res }, proxy)
}

Instead of:

function (req, res) {
  return client.stream(opts, (data) => {
    // Creates a closure to capture `res`.
    proxy({ ...data, opaque: res })
  })
}
client.pipeline(opts, handler(data))
For easy use with stream.pipeline.

Options:
- ... same as client.request(opts, callback).
- objectMode, true if the handler will return an object stream.
- opaque, passed as opaque to handler. Used to avoid creating a closure.

The data parameter in handler is defined as follows:
- statusCode
- headers, an object where all keys have been lowercased.
- opaque
- body, a stream.Readable with the body to read. A user must either fully consume or destroy the body unless there is an error, or no further requests will be processed.

handler should return a Readable from which the result will be read. Usually it should just return the body argument unless some kind of transformation needs to be performed based on e.g. headers or statusCode.

The handler should validate the response and save any required state. If there is an error, it should be thrown.

Returns a Duplex which writes to the request and reads from the response.
const { Client } = require('undici')
const fs = require('fs')
const stream = require('stream')

const client = new Client(`http://localhost:3000`)

stream.pipeline(
  fs.createReadStream('source.raw'),
  client.pipeline({
    path: '/',
    method: 'PUT'
  }, ({ statusCode, headers, body }) => {
    if (statusCode !== 201) {
      throw new Error('invalid response')
    }
    // isZipped and unzip are user-provided helpers.
    if (isZipped(headers)) {
      return stream.pipeline(body, unzip(), () => {})
    }
    return body
  }),
  fs.createWriteStream('response.raw'),
  (err) => {
    if (err) {
      console.error('failed', err)
    } else {
      console.log('succeeded')
    }
  }
)
client.close([callback])
Closes the client and gracefully waits for enqueued requests to complete before invoking the callback.
Returns a promise if no callback is provided.
client.destroy([err][, callback])
Destroys the client abruptly with the given err. All pending and running requests will be aborted and error. Waits until the socket is closed before invoking the callback.
Returns a promise if no callback is provided.
client.pipelining
Property to get and set the pipelining factor.
client.pending
Number of queued requests.
client.running
Number of inflight requests.
client.size
Number of pending and running requests.
client.connected
True if the client has an active connection. The client will lazily create a connection when it receives a request and will destroy it if there is no activity for the duration of the timeout value.
client.full
True if client.size is greater than the client.pipelining factor.

Keeping a client full ensures that once an inflight request finishes, the pipeline will schedule a new one and stay saturated.
client.closed
True after client.close() has been called.
client.destroyed
True after client.destroy() has been called, or after client.close() has been called and the client shutdown has completed.
'connect', emitted when a socket has been created and connected. The client will connect once client.size > 0.

'disconnect', emitted when a socket has disconnected. The first argument of the event is the error which caused the socket to disconnect. The client will reconnect if or once client.size > 0.
new undici.Pool(url, opts)
A pool of Client instances connected to the same upstream target.

Options:
- ... same as Client.
- connections, the number of clients to create. Default: 100.

pool.request(opts, callback)
Calls client.request(opts, callback) on one of the clients.
pool.stream(opts, factory, callback)
Calls client.stream(opts, factory, callback) on one of the clients.

pool.pipeline(opts, handler)
Calls client.pipeline(opts, handler) on one of the clients.

pool.close([callback])
Calls client.close(callback) on all the clients.

pool.destroy([err][, callback])
Calls client.destroy(err, callback) on all the clients.
undici.errors
Undici exposes a variety of error objects that you can use to enhance your error handling.
You can find all the error objects inside the errors key.
const { errors } = require('undici')
| Error | Error Codes | Description |
|---|---|---|
| InvalidArgumentError | UND_ERR_INVALID_ARG | passed an invalid argument. |
| InvalidReturnValueError | UND_ERR_INVALID_RETURN_VALUE | returned an invalid value. |
| SocketTimeoutError | UND_ERR_SOCKET_TIMEOUT | a socket exceeds the socketTimeout option. |
| RequestTimeoutError | UND_ERR_REQUEST_TIMEOUT | a request exceeds the requestTimeout option. |
| RequestAbortedError | UND_ERR_ABORTED | the request has been aborted by the user. |
| ClientDestroyedError | UND_ERR_DESTROYED | trying to use a destroyed client. |
| ClientClosedError | UND_ERR_CLOSED | trying to use a closed client. |
| SocketError | UND_ERR_SOCKET | there is an error with the socket. |
| NotSupportedError | UND_ERR_NOT_SUPPORTED | encountered unsupported functionality. |
This section documents parts of the HTTP/1.1 specification which Undici does not support or does not fully implement.
Undici does not support 1xx informational responses and will either ignore them or treat them as errors.
Undici does not support the Expect request header field. The request body is always immediately sent and the 100 Continue response will be ignored.
Refs: https://tools.ietf.org/html/rfc7231#section-5.1.1
Undici does not support the Upgrade request header field. A 101 Switching Protocols response will cause an UND_ERR_NOT_SUPPORTED error.
Refs: https://tools.ietf.org/html/rfc7230#section-6.7
Undici does not support early hints. A 103 Early Hint response will be ignored.
Refs: https://tools.ietf.org/html/rfc8297
Undici does not support the Trailer response header field. Any response trailer headers will be ignored.
Refs: https://tools.ietf.org/html/rfc7230#section-4.4
Undici will only use pipelining if configured with a pipelining factor greater than 1.
Undici always assumes that connections are persistent and will immediately pipeline requests, without checking whether the connection is persistent. Hence, automatic fallback to HTTP/1.0 or HTTP/1.1 without pipelining is not supported.
Undici will immediately pipeline when retrying requests after a failed connection. However, Undici will not retry the remaining requests in the prior pipeline and will instead error the corresponding callback/promise/stream.
Refs: https://tools.ietf.org/html/rfc2616#section-8.1.2.2
Refs: https://tools.ietf.org/html/rfc7230#section-6.3.2
MIT