The undici npm package is an HTTP/1.1 client, written from scratch for Node.js, that is designed to be faster and more efficient than the built-in 'http' and 'https' modules. It provides a low-level API for making HTTP requests and can be used to build higher-level abstractions.
HTTP Request
Make an HTTP request and process the response. This is the basic functionality of undici, allowing you to send HTTP requests and receive responses.
const { request } = require('undici');

(async () => {
  const { statusCode, headers, body } = await request('https://example.com');
  console.log('response received', statusCode);
  for await (const data of body) {
    console.log('data', data);
  }
})();
HTTP Pool
Use a pool of connections to make HTTP requests. This is useful for making a large number of requests to the same server, as it reuses connections between requests.
const { Pool } = require('undici');

const pool = new Pool('https://example.com');

async function query() {
  const { body } = await pool.request({
    path: '/path',
    method: 'GET'
  });
  for await (const data of body) {
    console.log('data', data);
  }
}

query();
HTTP Stream
Stream an HTTP response to a file or another stream. This is useful for handling large responses that you don't want to hold in memory.
const { stream } = require('undici');
const fs = require('fs');

stream(
  'https://example.com',
  { method: 'GET' },
  // The factory receives the response status code and headers and returns
  // the writable stream that the response body is piped into.
  ({ statusCode, headers }) => fs.createWriteStream('output.txt')
).then(() => {
  console.log('Stream succeeded');
}).catch((err) => {
  console.error('Stream failed', err);
});
HTTP Upgrade
Upgrade an HTTP connection to another protocol, such as WebSockets. This is useful for protocols that start with an HTTP handshake and then upgrade to a different protocol.
const { upgrade } = require('undici');

(async () => {
  const { headers, socket } = await upgrade('https://example.com', {
    path: '/path',
    protocol: 'Websocket'
  });
  console.log('upgrade response headers', headers);
  socket.on('data', (chunk) => {
    console.log('data', chunk.toString());
  });
})();
Axios is a promise-based HTTP client for the browser and Node.js. It provides a simple API for making HTTP requests and is often used for its ease of use and wide adoption. Compared to undici, axios is higher-level and more feature-rich, but may not be as performant for certain use cases.
Got is a human-friendly and powerful HTTP request library for Node.js. It supports streams, promises, and provides a rich set of features for making HTTP requests. Got is similar to undici in terms of performance but offers a more comprehensive API surface.
node-fetch is a lightweight module that brings the Fetch API to Node.js. It is designed to mimic the browser fetch API as closely as possible. While undici focuses on HTTP/1.1, node-fetch provides a familiar interface for those used to working with fetch in the browser.
An HTTP/1.1 client, written from scratch for Node.js.
Undici means eleven in Italian. 1.1 -> 11 -> Eleven -> Undici. It is also a Stranger Things reference.
Have a question about using Undici? Open a Q&A Discussion or join our official OpenJS Slack channel.
npm i undici
The benchmark is a simple hello world example using a number of unix sockets (connections) with a pipelining depth of 10 running on Node 16.
The benchmarks below have the simd feature enabled.
Tests | Samples | Result | Tolerance | Difference with slowest |
---|---|---|---|---|
http - no keepalive | 15 | 4.63 req/sec | ± 2.77 % | - |
http - keepalive | 10 | 4.81 req/sec | ± 2.16 % | + 3.94 % |
undici - stream | 25 | 62.22 req/sec | ± 2.67 % | + 1244.58 % |
undici - dispatch | 15 | 64.33 req/sec | ± 2.47 % | + 1290.24 % |
undici - request | 15 | 66.08 req/sec | ± 2.48 % | + 1327.88 % |
undici - pipeline | 10 | 66.13 req/sec | ± 1.39 % | + 1329.08 % |
Tests | Samples | Result | Tolerance | Difference with slowest |
---|---|---|---|---|
http - no keepalive | 50 | 3546.49 req/sec | ± 2.90 % | - |
http - keepalive | 15 | 5692.67 req/sec | ± 2.48 % | + 60.52 % |
undici - pipeline | 25 | 8478.71 req/sec | ± 2.62 % | + 139.07 % |
undici - request | 20 | 9766.66 req/sec | ± 2.79 % | + 175.39 % |
undici - stream | 15 | 10109.74 req/sec | ± 2.94 % | + 185.06 % |
undici - dispatch | 25 | 10949.73 req/sec | ± 2.54 % | + 208.75 % |
import { request } from 'undici'

const {
  statusCode,
  headers,
  trailers,
  body
} = await request('http://localhost:3000/foo')

console.log('response received', statusCode)
console.log('headers', headers)

for await (const data of body) {
  console.log('data', data)
}

console.log('trailers', trailers)
Using the body mixin from the Fetch Standard.
import { request } from 'undici'

const {
  statusCode,
  headers,
  trailers,
  body
} = await request('http://localhost:3000/foo')

console.log('response received', statusCode)
console.log('headers', headers)
console.log('data', await body.json())
console.log('trailers', trailers)
This section documents our most commonly used API methods. Additional APIs are documented in their own files within the docs folder and are accessible via the navigation list on the left side of the docs site.
undici.request([url, options]): Promise

Arguments:
- url string | URL | UrlObject
- options RequestOptions
  - dispatcher Dispatcher - Default: getGlobalDispatcher
  - method String - Default: PUT if options.body, otherwise GET
  - maxRedirections Integer - Default: 0

Returns a promise with the result of the Dispatcher.request method.
Calls options.dispatcher.request(options).
See Dispatcher.request for more details.
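As a quick illustration of these options, here is a minimal sketch; the local endpoint and JSON payload are placeholders, not part of the undici docs.

import { request } from 'undici'

// POST a JSON body and follow up to 2 redirects (RequestOptions shown above).
const { statusCode, body } = await request('http://localhost:3000/items', {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ name: 'example' }),
  maxRedirections: 2
})

console.log('status', statusCode)
console.log('response', await body.text())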
undici.stream([url, options, ]factory): Promise

Arguments:
- url string | URL | UrlObject
- options StreamOptions
  - dispatcher Dispatcher - Default: getGlobalDispatcher
  - method String - Default: PUT if options.body, otherwise GET
  - maxRedirections Integer - Default: 0
- factory Dispatcher.stream.factory

Returns a promise with the result of the Dispatcher.stream method.
Calls options.dispatcher.stream(options, factory).
See Dispatcher.stream for more details.
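A minimal sketch of streaming a response into a file with a factory; the URL and file name are placeholders.

import { stream } from 'undici'
import fs from 'node:fs'

// The factory receives the response status code and headers and must return
// the Writable that the response body is piped into.
await stream(
  'http://localhost:3000/foo',
  { method: 'GET' },
  ({ statusCode, headers }) => fs.createWriteStream('foo.txt')
)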
undici.pipeline([url, options, ]handler): Duplex

Arguments:
- url string | URL | UrlObject
- options PipelineOptions
  - dispatcher Dispatcher - Default: getGlobalDispatcher
  - method String - Default: PUT if options.body, otherwise GET
  - maxRedirections Integer - Default: 0
- handler Dispatcher.pipeline.handler

Returns: stream.Duplex
Calls options.dispatcher.pipeline(options, handler).
See Dispatcher.pipeline for more details.
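A minimal sketch of composing the returned Duplex with Node streams; the echo endpoint is a placeholder.

import { pipeline } from 'undici'
import { Readable, Writable } from 'node:stream'
import { pipeline as streamPipeline } from 'node:stream/promises'

// The handler receives the response and returns a Readable that becomes the
// readable side of the Duplex returned by undici.pipeline.
await streamPipeline(
  Readable.from('hello'),
  pipeline('http://localhost:3000/echo', { method: 'POST' }, ({ statusCode, body }) => body),
  new Writable({
    write(chunk, encoding, callback) {
      console.log('echoed', chunk.toString())
      callback()
    }
  })
)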
undici.connect([url, options]): Promise

Starts two-way communications with the requested resource using HTTP CONNECT.

Arguments:
- url string | URL | UrlObject
- options ConnectOptions
  - dispatcher Dispatcher - Default: getGlobalDispatcher
  - maxRedirections Integer - Default: 0
- callback (err: Error | null, data: ConnectData | null) => void (optional)

Returns a promise with the result of the Dispatcher.connect method.
Calls options.dispatcher.connect(options).
See Dispatcher.connect for more details.
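A minimal sketch of establishing a tunnel and writing to the raw socket; the origin is a placeholder and assumes a server that handles CONNECT.

import { connect } from 'undici'

// connect() resolves with the raw socket of the established tunnel.
const { statusCode, socket } = await connect('http://localhost:3000', { path: '/' })

console.log('tunnel established', statusCode)
socket.on('data', (chunk) => console.log(chunk.toString()))
socket.write('ping')
socket.end()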
undici.fetch(input[, init]): Promise
Implements fetch.
Only supported on Node 16.5+.
This is experimental and is not yet fully compliant with the Fetch Standard. We plan to ship breaking changes to this feature until it is out of experimental.
Basic usage example:
import { fetch } from 'undici';

async function fetchJson() {
  const res = await fetch('https://example.com')
  const json = await res.json()
  console.log(json);
}
response.body
Node.js has two kinds of streams: web streams, which follow the API of the WHATWG web standard found in browsers, and an older Node-specific streams API. response.body returns a readable web stream. If you would prefer to work with a Node stream you can convert a web stream using Readable.fromWeb().
import { fetch } from 'undici';
import { Readable } from 'node:stream';

async function fetchStream() {
  const response = await fetch('https://example.com')
  const readableWebStream = response.body;
  const readableNodeStream = Readable.fromWeb(readableWebStream);
}
This section documents parts of the Fetch Standard which Undici does not support or does not fully implement.
The Fetch Standard allows users to skip consuming the response body by relying on garbage collection to release connection resources. Undici does the same. However, garbage collection in Node is less aggressive and less deterministic (due to the lack of clear idle periods that browsers have through the rendering refresh rate), which means that leaving the release of connection resources to the garbage collector can lead to excessive connection usage, reduced performance (due to less connection re-use), and even stalls or deadlocks when running out of connections. Therefore, it is highly recommended to always either consume or cancel the response body.
// Do
const headers = await fetch(url)
  .then(async res => {
    for await (const chunk of res.body) {
      // force consumption of body
    }
    return res.headers
  })

// Do not
const headers = await fetch(url)
  .then(res => res.headers)
undici.upgrade([url, options]): Promise

Upgrade to a different protocol. See MDN - HTTP - Protocol upgrade mechanism for more details.

Arguments:
- url string | URL | UrlObject
- options UpgradeOptions
  - dispatcher Dispatcher - Default: getGlobalDispatcher
  - maxRedirections Integer - Default: 0
- callback (error: Error | null, data: UpgradeData) => void (optional)

Returns a promise with the result of the Dispatcher.upgrade method.
Calls options.dispatcher.upgrade(options).
See Dispatcher.upgrade for more details.
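A minimal sketch of upgrading and then reading from the raw socket; the origin is a placeholder and assumes a server that accepts the upgrade.

import { upgrade } from 'undici'

// 'Websocket' is the default protocol value for UpgradeOptions.
const { headers, socket } = await upgrade('http://localhost:3000', {
  path: '/',
  protocol: 'Websocket'
})

console.log('upgraded', headers)
socket.on('data', (chunk) => console.log(chunk.toString()))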
undici.setGlobalDispatcher(dispatcher)

Arguments:
- dispatcher Dispatcher

Sets the global dispatcher used by Common API Methods.
undici.getGlobalDispatcher()
Gets the global dispatcher used by Common API Methods.
Returns: Dispatcher
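For instance, a minimal sketch that installs an Agent as the global dispatcher so the Common API Methods use it implicitly; the Agent options and origin are illustrative values.

import { setGlobalDispatcher, getGlobalDispatcher, Agent, request } from 'undici'

setGlobalDispatcher(new Agent({ keepAliveTimeout: 10_000, connections: 10 }))
console.log(getGlobalDispatcher()) // the Agent set above

// Subsequent calls without an explicit dispatcher go through the global Agent.
const { statusCode } = await request('http://localhost:3000/foo')
console.log(statusCode)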
UrlObject
- port string | number (optional)
- path string (optional)
- pathname string (optional)
- hostname string (optional)
- origin string (optional)
- protocol string (optional)
- search string (optional)
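As an illustration, a request expressed with a UrlObject instead of a URL string; the host, port, and query are placeholders.

import { request } from 'undici'

// Equivalent to request('http://localhost:3000/foo?limit=10')
const { statusCode } = await request({
  protocol: 'http:',
  hostname: 'localhost',
  port: 3000,
  pathname: '/foo',
  search: '?limit=10'
})

console.log(statusCode)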
This section documents parts of the HTTP/1.1 specification which Undici does not support or does not fully implement.
Undici does not support the Expect request header field. The request body is always immediately sent and the 100 Continue response will be ignored.
Refs: https://tools.ietf.org/html/rfc7231#section-5.1.1
Undici will only use pipelining if configured with a pipelining factor greater than 1, as shown in the sketch below.
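A minimal sketch of opting in to pipelining on a Client; the origin, paths, and factor of 10 are illustrative.

import { Client } from 'undici'

// pipelining > 1 allows multiple in-flight requests on a single connection.
const client = new Client('http://localhost:3000', { pipelining: 10 })

// Each body is consumed as soon as its headers arrive, keeping the pipeline draining.
const [a, b] = await Promise.all([
  client.request({ path: '/a', method: 'GET' }).then(({ body }) => body.text()),
  client.request({ path: '/b', method: 'GET' }).then(({ body }) => body.text())
])

console.log(a, b)
await client.close()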
Undici always assumes that connections are persistent and will immediately pipeline requests, without checking whether the connection is persistent. Hence, automatic fallback to HTTP/1.0 or HTTP/1.1 without pipelining is not supported.
Undici will immediately pipeline when retrying requests after a failed connection. However, Undici will not retry the first remaining requests in the prior pipeline and instead error the corresponding callback/promise/stream.
Undici will abort all running requests in the pipeline when any of them are aborted.
MIT
FAQs
An HTTP/1.1 client, written from scratch for Node.js
The npm package undici receives a total of 8,954,924 weekly downloads. As such, undici's popularity was classified as popular.
We found that undici demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 3 open source maintainers collaborating on the project.