
An HTTP/1.1 client, written from scratch for Node.js.
Undici means eleven in Italian. 1.1 -> 11 -> Eleven -> Undici. It is also a Stranger Things reference.
Have a question about using Undici? Open a Q&A Discussion or join our official OpenJS Slack channel.
Looking to contribute? Start by reading the contributing guide.
npm i undici
The benchmark is a simple getting-data example using 50 TCP connections with a pipelining depth of 10, running on Node 22.11.0.
┌────────────────────────┬─────────┬────────────────────┬────────────┬─────────────────────────┐
│ Tests │ Samples │ Result │ Tolerance │ Difference with slowest │
├────────────────────────┼─────────┼────────────────────┼────────────┼─────────────────────────┤
│ 'axios' │ 15 │ '5708.26 req/sec' │ '± 2.91 %' │ '-' │
│ 'http - no keepalive' │ 10 │ '5809.80 req/sec' │ '± 2.30 %' │ '+ 1.78 %' │
│ 'request' │ 30 │ '5828.80 req/sec' │ '± 2.91 %' │ '+ 2.11 %' │
│ 'undici - fetch' │ 40 │ '5903.78 req/sec' │ '± 2.87 %' │ '+ 3.43 %' │
│ 'node-fetch' │ 10 │ '5945.40 req/sec' │ '± 2.13 %' │ '+ 4.15 %' │
│ 'got' │ 35 │ '6511.45 req/sec' │ '± 2.84 %' │ '+ 14.07 %' │
│ 'http - keepalive' │ 65 │ '9193.24 req/sec' │ '± 2.92 %' │ '+ 61.05 %' │
│ 'superagent' │ 35 │ '9339.43 req/sec' │ '± 2.95 %' │ '+ 63.61 %' │
│ 'undici - pipeline' │ 50 │ '13364.62 req/sec' │ '± 2.93 %' │ '+ 134.13 %' │
│ 'undici - stream' │ 95 │ '18245.36 req/sec' │ '± 2.99 %' │ '+ 219.63 %' │
│ 'undici - request' │ 50 │ '18340.17 req/sec' │ '± 2.84 %' │ '+ 221.29 %' │
│ 'undici - dispatch' │ 40 │ '22234.42 req/sec' │ '± 2.94 %' │ '+ 289.51 %' │
└────────────────────────┴─────────┴────────────────────┴────────────┴─────────────────────────┘
Starting with v18, Node.js includes a built-in fetch() implementation powered by undici. However, there are important differences between using the built-in fetch and installing undici as a separate module.
Node.js's built-in fetch is powered by a bundled version of undici:
// Available globally in Node.js v18+
const response = await fetch('https://api.example.com/data');
const data = await response.json();
// Check the bundled undici version
console.log(process.versions.undici); // e.g., "5.28.4"
Installing undici as a separate module gives you access to the latest features and APIs:
npm install undici
import { request, fetch, Agent, setGlobalDispatcher } from 'undici';
// Use undici.request for maximum performance
const { statusCode, headers, body } = await request('https://api.example.com/data');
const data = await body.json();
// Or use undici.fetch with custom configuration
const agent = new Agent({ keepAliveTimeout: 10000 });
setGlobalDispatcher(agent);
const response = await fetch('https://api.example.com/data');
Based on benchmarks, here's the typical performance hierarchy:
- undici.request() - Fastest, most efficient
- undici.fetch() - Good performance, standards compliance
- http/https - Baseline performance
If you're currently using built-in fetch and want to migrate to undici:
// Before: Built-in fetch
const response = await fetch('https://api.example.com/data');
// After: Undici fetch (drop-in replacement)
import { fetch } from 'undici';
const response = await fetch('https://api.example.com/data');
// Or: Undici request (better performance)
import { request } from 'undici';
const { statusCode, body } = await request('https://api.example.com/data');
const data = await body.json();
You can check which version of undici is bundled with your Node.js version:
console.log(process.versions.undici);
Installing undici as a module allows you to use a newer version than what's bundled with Node.js, giving you access to the latest features and performance improvements.
import { request } from 'undici'
const {
statusCode,
headers,
trailers,
body
} = await request('http://localhost:3000/foo')
console.log('response received', statusCode)
console.log('headers', headers)
for await (const data of body) { console.log('data', data) }
console.log('trailers', trailers)
Undici provides a powerful HTTP caching interceptor that follows HTTP caching best practices. Here's how to use it:
import { fetch, Agent, interceptors, cacheStores, setGlobalDispatcher } from 'undici';
// Create a client with cache interceptor
const client = new Agent().compose(interceptors.cache({
// Optional: Configure cache store (defaults to MemoryCacheStore)
store: new cacheStores.MemoryCacheStore({
maxSize: 100 * 1024 * 1024, // 100MB
maxCount: 1000,
maxEntrySize: 5 * 1024 * 1024 // 5MB
}),
// Optional: Specify which HTTP methods to cache (default: ['GET', 'HEAD'])
methods: ['GET', 'HEAD']
}));
// Set the global dispatcher to use our caching client
setGlobalDispatcher(client);
// Now all fetch requests will use the cache
async function getData() {
const response = await fetch('https://api.example.com/data');
// The server should set appropriate Cache-Control headers in the response
// which the cache will respect based on the cache policy
return response.json();
}
// First request - fetches from origin
const data1 = await getData();
// Second request - served from cache if within max-age
const data2 = await getData();
The cache honors Cache-Control and Expires headers and supports ETag and Last-Modified validation.
Undici provides an install() function to add all WHATWG fetch classes to globalThis, making them available globally:
import { install } from 'undici'
// Install all WHATWG fetch classes globally
install()
// Now you can use fetch classes globally without importing
const response = await fetch('https://api.example.com/data')
const data = await response.json()
// All classes are available globally:
const headers = new Headers([['content-type', 'application/json']])
const request = new Request('https://example.com')
const formData = new FormData()
const ws = new WebSocket('wss://example.com')
const eventSource = new EventSource('https://example.com/events')
The install() function adds the following classes to globalThis:
- fetch - The fetch function
- Headers - HTTP headers management
- Response - HTTP response representation
- Request - HTTP request representation
- FormData - Form data handling
- WebSocket - WebSocket client
- CloseEvent, ErrorEvent, MessageEvent - WebSocket events
- EventSource - Server-sent events client
The body mixins are the most common way to format the request/response body. Mixins include:
- .arrayBuffer()
- .blob()
- .json()
- .text()
[!NOTE] The body returned from undici.request does not implement .formData().
Example usage:
import { request } from 'undici'
const {
statusCode,
headers,
trailers,
body
} = await request('http://localhost:3000/foo')
console.log('response received', statusCode)
console.log('headers', headers)
console.log('data', await body.json())
console.log('trailers', trailers)
Note: Once a mixin has been called, the body cannot be reused; calling additional mixins on .body, e.g. .body.json(); .body.text(), will result in a TypeError: unusable being thrown and returned through the Promise rejection.
Should you need to access the body in plain-text after using a mixin, the best practice is to use the .text() mixin first and then manually parse the text to the desired format.
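For example, a minimal sketch of that practice, assuming a local JSON endpoint at localhost:3000:
import { request } from 'undici'
const { body } = await request('http://localhost:3000/foo')
// Consume the body exactly once with .text() ...
const text = await body.text()
// ... then parse manually; calling a second mixin such as body.json() here
// would reject with TypeError: unusable.
const data = JSON.parse(text)
console.log('raw', text)
console.log('parsed', data)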
For more information about their behavior, please reference the body mixin from the Fetch Standard.
This section documents our most commonly used API methods. Additional APIs are documented in their own files within the docs folder and are accessible via the navigation list on the left side of the docs site.
undici.request([url, options]): Promise
Arguments:
- url: string | URL | UrlObject
- options: RequestOptions
  - dispatcher: Dispatcher - Default: getGlobalDispatcher
  - method: String - Default: PUT if options.body, otherwise GET
Returns a promise with the result of the Dispatcher.request method.
Calls options.dispatcher.request(options).
See Dispatcher.request for more details, and request examples for examples.
undici.stream([url, options, ]factory): Promise
Arguments:
- url: string | URL | UrlObject
- options: StreamOptions
  - dispatcher: Dispatcher - Default: getGlobalDispatcher
  - method: String - Default: PUT if options.body, otherwise GET
- factory: Dispatcher.stream.factory
Returns a promise with the result of the Dispatcher.stream method.
Calls options.dispatcher.stream(options, factory).
See Dispatcher.stream for more details.
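A minimal usage sketch, assuming a local server at localhost:3000 and writing the response body to a file on disk:
import { stream } from 'undici'
import { createWriteStream } from 'node:fs'
// The factory receives the response status, headers, and the opaque value,
// and returns the Writable that the body will be piped into.
await stream(
  'http://localhost:3000/foo',
  { method: 'GET', opaque: { path: './foo.out' } },
  ({ statusCode, headers, opaque }) => {
    console.log('response received', statusCode)
    return createWriteStream(opaque.path)
  }
)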
undici.pipeline([url, options, ]handler): Duplex
Arguments:
- url: string | URL | UrlObject
- options: PipelineOptions
  - dispatcher: Dispatcher - Default: getGlobalDispatcher
  - method: String - Default: PUT if options.body, otherwise GET
- handler: Dispatcher.pipeline.handler
Returns: stream.Duplex
Calls options.dispatcher.pipeline(options, handler).
See Dispatcher.pipeline for more details.
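A minimal usage sketch, assuming a local echo endpoint at localhost:3000; the returned Duplex is composed with stream.pipeline so the request body is read from one file and the response body is written to another:
import { pipeline } from 'undici'
import { pipeline as streamPipeline } from 'node:stream/promises'
import { createReadStream, createWriteStream } from 'node:fs'
await streamPipeline(
  createReadStream('./input.txt'),
  // The handler receives the response and returns the stream that becomes
  // the readable side of the Duplex (here, the response body itself).
  pipeline('http://localhost:3000/echo', { method: 'POST' }, ({ statusCode, body }) => {
    console.log('response received', statusCode)
    return body
  }),
  createWriteStream('./output.txt')
)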
undici.connect([url, options]): Promise
Starts two-way communications with the requested resource using HTTP CONNECT.
Arguments:
- url: string | URL | UrlObject
- options: ConnectOptions
  - dispatcher: Dispatcher - Default: getGlobalDispatcher
- callback: (err: Error | null, data: ConnectData | null) => void (optional)
Returns a promise with the result of the Dispatcher.connect method.
Calls options.dispatcher.connect(options).
See Dispatcher.connect for more details.
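A minimal usage sketch, assuming a local server at localhost:3000 that accepts CONNECT requests:
import { connect } from 'undici'
const { statusCode, socket } = await connect('http://localhost:3000', { path: '/' })
console.log('connected', statusCode)
// socket is a raw duplex stream to the other end of the tunnel
socket.on('data', (chunk) => console.log('received', chunk.toString()))
socket.write('ping')
socket.end()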
undici.fetch(input[, init]): Promise
Implements fetch.
Basic usage example:
import { fetch } from 'undici'
const res = await fetch('https://example.com')
const json = await res.json()
console.log(json)
You can pass an optional dispatcher to fetch as:
import { fetch, Agent } from 'undici'
const res = await fetch('https://example.com', {
// Mocks are also supported
dispatcher: new Agent({
keepAliveTimeout: 10,
keepAliveMaxTimeout: 10
})
})
const json = await res.json()
console.log(json)
request.body
A body can be of the following types:
- ArrayBuffer
- ArrayBufferView
- AsyncIterables
- Blob
- Iterables
- String
- URLSearchParams
- FormData
In this implementation of fetch, request.body also accepts Async Iterables, which are not part of the Fetch Standard.
import { fetch } from 'undici'
const data = {
async *[Symbol.asyncIterator]() {
yield 'hello'
yield 'world'
},
}
await fetch('https://example.com', { body: data, method: 'POST', duplex: 'half' })
Besides text data and buffers, FormData can also utilize streams via Blob objects:
import { openAsBlob } from 'node:fs'
const file = await openAsBlob('./big.csv')
const body = new FormData()
body.set('file', file, 'big.csv')
await fetch('http://example.com', { method: 'POST', body })
request.duplex
- 'half'
In this implementation of fetch, request.duplex must be set if request.body is a ReadableStream or an Async Iterable; however, even though the value must be set to 'half', it is actually full duplex. For more detail, refer to the Fetch Standard.
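For example, a sketch using a ReadableStream body (the URL is a placeholder):
import { fetch } from 'undici'
// duplex: 'half' is required whenever the body is a stream
const body = new ReadableStream({
  start (controller) {
    controller.enqueue(new TextEncoder().encode('hello world'))
    controller.close()
  }
})
await fetch('https://example.com', { method: 'POST', body, duplex: 'half' })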
response.body
Node.js has two kinds of streams: web streams, which follow the API of the WHATWG web standard found in browsers, and an older Node-specific streams API. response.body returns a readable web stream. If you would prefer to work with a Node stream, you can convert a web stream using Readable.fromWeb().
import { fetch } from 'undici'
import { Readable } from 'node:stream'
const response = await fetch('https://example.com')
const readableWebStream = response.body
const readableNodeStream = Readable.fromWeb(readableWebStream)
This section documents parts of the HTTP/1.1 and Fetch Standard that Undici does not support or does not fully implement.
Unlike browsers, Undici does not implement CORS (Cross-Origin Resource Sharing) checks by default. This means:
- Requests to any origin are allowed without preflight checks
- No validation of Access-Control-Allow-Origin headers is performed
This behavior is intentional for server-side environments where CORS restrictions are typically unnecessary. If your application requires CORS-like protections, you will need to implement these checks manually.
The Fetch Standard allows users to skip consuming the response body by relying on garbage collection to release connection resources.
Garbage collection in Node.js is less aggressive and less deterministic (due to the lack of clear idle periods that browsers have through the rendering refresh rate), which means that leaving the release of connection resources to the garbage collector can lead to excessive connection usage, reduced performance (due to less connection re-use), and even stalls or deadlocks when running out of connections. Therefore, it is important to always either consume or cancel the response body.
// Do
const { body, headers } = await fetch(url);
for await (const chunk of body) {
// force consumption of body
}
// Do not
const { headers } = await fetch(url);
However, if you only need the headers, it might be better to use the HEAD request method, which obviates the need to consume or cancel the response body. See MDN - HTTP - HTTP request methods - HEAD for more details.
const headers = await fetch(url, { method: 'HEAD' })
.then(res => res.headers)
Note that consuming the response body is mandatory for request:
// Do
const { body, headers } = await request(url);
await body.dump(); // force consumption of body
// Do not
const { headers } = await request(url);
The Fetch Standard requires implementations to exclude certain headers from requests and responses. In browser environments, some headers are forbidden so the user agent remains in full control over them. In Undici, these constraints are removed to give more control to the user.
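For example, a header that browsers treat as forbidden, such as User-Agent, can be set explicitly (the URL and value here are placeholders):
import { fetch } from 'undici'
// A browser would ignore an explicit User-Agent; undici sends it as given.
await fetch('https://example.com', {
  headers: { 'user-agent': 'my-custom-client/1.0' }
})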
Undici limits the number of Content-Encoding layers in a response to 5 to prevent resource exhaustion attacks. If a server responds with more than 5 content-encodings (e.g., Content-Encoding: gzip, gzip, gzip, gzip, gzip, gzip), the fetch will be rejected with an error. This limit matches the approach taken by curl and urllib3.
undici.upgrade([url, options]): Promise
Upgrade to a different protocol. See MDN - HTTP - Protocol upgrade mechanism for more details.
Arguments:
- url: string | URL | UrlObject
- options: UpgradeOptions
  - dispatcher: Dispatcher - Default: getGlobalDispatcher
- callback: (error: Error | null, data: UpgradeData) => void (optional)
Returns a promise with the result of the Dispatcher.upgrade method.
Calls options.dispatcher.upgrade(options).
See Dispatcher.upgrade for more details.
undici.setGlobalDispatcher(dispatcher)
- dispatcher: Dispatcher
Sets the global dispatcher used by Common API Methods. The global dispatcher is shared among compatible undici modules, including the undici that is bundled internally with Node.js.
undici.getGlobalDispatcher()
Gets the global dispatcher used by Common API Methods.
Returns: Dispatcher
undici.setGlobalOrigin(origin)
- origin: string | URL | undefined
Sets the global origin used in fetch.
If undefined is passed, the global origin will be reset. This will cause Response.redirect, new Request(), and fetch to throw an error when a relative path is passed.
setGlobalOrigin('http://localhost:3000')
const response = await fetch('/api/ping')
console.log(response.url) // http://localhost:3000/api/ping
undici.getGlobalOrigin()
Gets the global origin used in fetch.
Returns: URL
UrlObject
- port: string | number (optional)
- path: string (optional)
- pathname: string (optional)
- hostname: string (optional)
- origin: string (optional)
- protocol: string (optional)
- search: string (optional)
Undici does not support the Expect request header field. The request body is always immediately sent and the 100 Continue response will be ignored.
Refs: https://tools.ietf.org/html/rfc7231#section-5.1.1
Undici will only use pipelining if configured with a pipelining factor greater than 1. It is also important to pass blocking: false in the request options to properly pipeline requests.
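A minimal sketch of that configuration, assuming a local server at localhost:3000:
import { Client } from 'undici'
// A pipelining factor greater than 1 allows multiple in-flight requests on
// the same connection.
const client = new Client('http://localhost:3000', { pipelining: 10 })
const responses = await Promise.all([
  client.request({ path: '/a', method: 'GET', blocking: false }),
  client.request({ path: '/b', method: 'GET', blocking: false })
])
for (const { statusCode, body } of responses) {
  console.log('status', statusCode)
  await body.dump() // always consume (or cancel) the body
}
await client.close()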
Undici always assumes that connections are persistent and will immediately pipeline requests, without checking whether the connection is persistent. Hence, automatic fallback to HTTP/1.0 or HTTP/1.1 without pipelining is not supported.
Undici will immediately pipeline when retrying requests after a failed connection. However, Undici will not retry the first remaining requests in the prior pipeline and instead error the corresponding callback/promise/stream.
Undici will abort all running requests in the pipeline when any of them are aborted.
Since it is not possible to manually follow an HTTP redirect on the server-side,
Undici returns the actual response instead of an opaqueredirect filtered one
when invoked with a manual redirect. This aligns fetch() with the other
implementations in Deno and Cloudflare Workers.
Refs: https://fetch.spec.whatwg.org/#atomic-http-redirect-handling
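For example, a sketch with redirect: 'manual' (the URL is a placeholder):
import { fetch } from 'undici'
const res = await fetch('https://example.com/old-path', { redirect: 'manual' })
// Unlike a browser's opaqueredirect response, the actual redirect status and
// headers are visible here.
console.log(res.status)                  // e.g. 301 or 302 if the server redirects
console.log(res.headers.get('location')) // the redirect target, if any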
If you experience problems when connecting to a remote server that your DNS servers resolve to an IPv6 address (AAAA record) first, your local router or ISP may have trouble connecting to IPv6 networks. In that case undici will throw an error with code UND_ERR_CONNECT_TIMEOUT.
If the target server resolves to both an IPv6 and an IPv4 (A record) address and you are using a compatible Node version (18.3.0 and above), you can fix the problem by providing the autoSelectFamily option (supported by both undici.request and undici.Agent), which will enable the family autoselection algorithm when establishing the connection.
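A minimal sketch of enabling it on an Agent (the URL is a placeholder):
import { request, Agent } from 'undici'
// Enable the family autoselection algorithm when establishing the
// connection (requires Node.js 18.3.0 or above).
const agent = new Agent({ autoSelectFamily: true })
const { statusCode, body } = await request('https://example.com', { dispatcher: agent })
console.log('status', statusCode)
await body.dump()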
Undici aligns with the Node.js LTS schedule. The following table shows the supported versions:
| Undici Version | Bundled in Node.js | Node.js Versions Supported | End of Life |
|---|---|---|---|
| 5.x | 18.x | ≥14.0 (tested: 14, 16, 18) | 2024-04-30 |
| 6.x | 20.x, 22.x | ≥18.17 (tested: 18, 20, 21, 22) | 2026-04-30 |
| 7.x | 24.x | ≥20.18.1 (tested: 20, 22, 24) | 2027-04-30 |
MIT
Axios is a promise-based HTTP client for the browser and Node.js. It provides a simple API for making HTTP requests and is often used for its ease of use and wide adoption. Compared to undici, axios is higher-level and more feature-rich, but may not be as performant for certain use cases.
Got is a human-friendly and powerful HTTP request library for Node.js. It supports streams, promises, and provides a rich set of features for making HTTP requests. Got is similar to undici in terms of performance but offers a more comprehensive API surface.
node-fetch is a light-weight module that brings the Fetch API to Node.js. It is designed to mimic the browser fetch API as closely as possible. While undici focuses on HTTP/1.1, node-fetch provides a familiar interface for those used to working with fetch in the browser.