The undici npm package is an HTTP/1.1 client, written from scratch for Node.js, that is designed to be faster and more efficient than the built-in 'http' and 'https' modules. It provides a low-level API for making HTTP requests and can be used to build higher-level abstractions.
HTTP Request
Make an HTTP request and process the response. This is the basic functionality of undici, allowing you to send HTTP requests and receive responses.
const { request } = require('undici');

(async () => {
  const { statusCode, headers, body } = await request('https://example.com');
  console.log('response received', statusCode);
  for await (const data of body) {
    console.log('data', data);
  }
})();
HTTP Pool
Use a pool of connections to make HTTP requests. This is useful for making a large number of requests to the same server, as it reuses connections between requests.
const { Pool } = require('undici');

const pool = new Pool('https://example.com');

async function query() {
  const { body } = await pool.request({
    path: '/path',
    method: 'GET'
  });
  for await (const data of body) {
    console.log('data', data);
  }
}

query();
HTTP Stream
Stream an HTTP response to a file or another stream. This is useful for handling large responses that you don't want to hold in memory.
const { stream } = require('undici');
const fs = require('fs');

// Uses undici's stream method: the factory returns a Writable that the response body is written to.
stream('https://example.com', {
  method: 'GET'
}, ({ statusCode, headers }) => {
  console.log('response received', statusCode);
  return fs.createWriteStream('output.txt');
}).then(() => {
  console.log('stream succeeded');
}).catch((err) => {
  console.error('stream failed', err);
});
HTTP Upgrade
Upgrade an HTTP connection to another protocol, such as WebSockets. This is useful for protocols that start with an HTTP handshake and then upgrade to a different protocol.
const { Client } = require('undici');

// Uses client.upgrade to perform the protocol upgrade handshake.
(async () => {
  const client = new Client('https://example.com');
  const { headers, socket } = await client.upgrade({
    path: '/path',
    protocol: 'Websocket'
  });
  console.log('upgrade response headers', headers);
  socket.on('data', (chunk) => {
    console.log('data', chunk.toString());
  });
})();
Axios is a promise-based HTTP client for the browser and Node.js. It provides a simple API for making HTTP requests and is often used for its ease of use and wide adoption. Compared to undici, axios is higher-level and more feature-rich, but may not be as performant for certain use cases.
Got is a human-friendly and powerful HTTP request library for Node.js. It supports streams, promises, and provides a rich set of features for making HTTP requests. Got is similar to undici in terms of performance but offers a more comprehensive API surface.
node-fetch is a lightweight module that brings the Fetch API to Node.js. It is designed to mimic the browser fetch API as closely as possible. While undici focuses on HTTP/1.1 performance, node-fetch provides a familiar interface for those used to working with fetch in the browser.
An HTTP/1.1 client, written from scratch for Node.js.
Undici means eleven in Italian. 1.1 -> 11 -> Eleven -> Undici. It is also a Stranger Things reference.
npm i undici
Machine: AMD EPYC 7502P
Node 15
http - keepalive x 12,028 ops/sec ±2.60% (265 runs sampled)
undici - pipeline x 31,321 ops/sec ±0.77% (276 runs sampled)
undici - request x 36,612 ops/sec ±0.71% (277 runs sampled)
undici - stream x 41,291 ops/sec ±0.90% (268 runs sampled)
undici - dispatch x 47,319 ops/sec ±1.17% (263 runs sampled)
The benchmark is a simple hello world example using a single unix socket with pipelining.
import { request } from 'undici'

const {
  statusCode,
  headers,
  trailers,
  body
} = await request('http://localhost:3000/foo')

console.log('response received', statusCode)
console.log('headers', headers)

for await (const data of body) {
  console.log('data', data)
}

console.log('trailers', trailers)
new undici.Client(url, opts)
A basic HTTP/1.1 client, mapped on top of a single TCP/TLS connection. Pipelining is disabled by default.
Requests are not guaranteed to be dispatched in order of invocation.
url can be a string or a URL object. It should only include the protocol, hostname, and port.
Options:
socketPath: String|Null, an IPC endpoint, either Unix domain socket or Windows named pipe. Default: null.
keepAliveTimeout: Number, the timeout after which a socket without active requests will time out. Monitors time between activity on a connected socket. This value may be overridden by keep-alive hints from the server. Default: 4e3 milliseconds (4s).
keepAliveMaxTimeout: Number, the maximum allowed keepAliveTimeout when overridden by keep-alive hints from the server. Default: 600e3 milliseconds (10min).
keepAliveTimeoutThreshold: Number, a number subtracted from server keep-alive hints when overriding keepAliveTimeout to account for timing inaccuracies caused by e.g. transport latency. Default: 1e3 milliseconds (1s).
headersTimeout: Number, the timeout after which a request will time out, in milliseconds. Monitors time between receiving complete headers. Use 0 to disable it entirely. Default: 30e3 milliseconds (30s).
bodyTimeout: Number, the timeout after which a request will time out, in milliseconds. Monitors time between receiving body data. Use 0 to disable it entirely. Default: 30e3 milliseconds (30s).
pipelining: Number, the amount of concurrent requests to be sent over the single TCP/TLS connection according to RFC7230. Carefully consider your workload and environment before enabling concurrent requests, as pipelining may reduce performance if used incorrectly. Pipelining is sensitive to network stack settings as well as head-of-line blocking caused by e.g. long running requests. Set to 0 to disable keep-alive connections. Default: 1.
tls: Object|Null, an options object which in the case of https will be passed to tls.connect. Default: null.
maxHeaderSize: Number, the maximum length of request headers in bytes. Default: 16384 (16KiB).
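For illustration, a minimal sketch of constructing a Client with a few of the options above; the origin URL and the option values are placeholders, not recommendations:

const { Client } = require('undici')

// Placeholder origin and values, chosen only to illustrate the constructor options.
const client = new Client('http://localhost:3000', {
  pipelining: 4,          // allow up to 4 concurrent requests on the single connection
  keepAliveTimeout: 10e3, // close an idle socket after 10 seconds
  headersTimeout: 10e3    // fail a request if complete headers take longer than 10 seconds
})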
client.request(opts[, callback(err, data)]): Promise|Void
Performs an HTTP request.
Options:
path: String
method: String
opaque: Any
body: String|Buffer|Uint8Array|stream.Readable|Null. Default: null.
headers: Object|Array|Null, an object with header-value pairs or a flat array of alternating header names and values (['header1', 'value1', 'header2', 'value2']). Default: null.
signal: AbortSignal|EventEmitter|Null. Default: null.
idempotent: Boolean, whether the requests can be safely retried or not. If false, the request won't be sent until all preceding requests in the pipeline have completed. Default: true if method is HEAD or GET.

Headers are represented by an object like this:
{
  'content-length': '123',
  'content-type': 'text/plain',
  connection: 'keep-alive',
  host: 'mysite.com',
  accept: '*/*'
}
Or an array like this:
[
  'content-length', '123',
  'content-type', 'text/plain',
  'connection', 'keep-alive',
  'host', 'mysite.com',
  'accept', '*/*'
]
Keys are lowercased. Values are not modified.
If you don't specify a host header, it will be derived from the url of the client instance.
The data parameter in callback is defined as follows:
statusCode: Number
opaque: Any
headers: Object, an object where all keys have been lowercased.
trailers: Object, an object where all keys have been lowercased. This object starts out empty and will be mutated to contain trailers after body has emitted 'end'.
body: stream.Readable, the response payload. A user must either fully consume or destroy the body unless there is an error, or no further requests will be processed.

Returns a promise if no callback is provided.
Example:
const { Client } = require('undici')

const client = new Client(`http://localhost:3000`)

client.request({
  path: '/',
  method: 'GET'
}, function (err, data) {
  if (err) {
    // handle this in some way!
    return
  }

  const {
    statusCode,
    headers,
    trailers,
    body
  } = data

  console.log('response received', statusCode)
  console.log('headers', headers)

  body.setEncoding('utf8')
  body.on('data', console.log)
  body.on('end', () => {
    console.log('trailers', trailers)
  })

  client.close()
})
Non-idempotent requests will not be pipelined in order to avoid indirect failures.
Idempotent requests will be automatically retried if they fail due to indirect failure from the request at the head of the pipeline. This does not apply to idempotent requests with a stream request body.
A request can be aborted using either an AbortController or an EventEmitter. To use AbortController in Node.js versions earlier than 15, you will need to install a shim - npm i abort-controller.
const { Client } = require('undici')

const client = new Client('http://localhost:3000')
const abortController = new AbortController()

client.request({
  path: '/',
  method: 'GET',
  signal: abortController.signal
}, function (err, data) {
  console.log(err) // RequestAbortedError
  client.close()
})

abortController.abort()
Alternatively, any EventEmitter that emits an 'abort' event may be used as an abort controller:
const EventEmitter = require('events')
const { Client } = require('undici')

const client = new Client('http://localhost:3000')
const ee = new EventEmitter()

client.request({
  path: '/',
  method: 'GET',
  signal: ee
}, function (err, data) {
  console.log(err) // RequestAbortedError
  client.close()
})

ee.emit('abort')
Destroying the request or response body will have the same effect.
client.stream(opts, factory(data)[, callback(err)]): Promise|Void
A faster version of request.

Unlike request, this method expects factory to return a Writable which the response will be written to. This improves performance by avoiding the creation of an intermediate Readable when the user expects to directly pipe the response body to a Writable.
Options:
Same as client.request(opts[, callback]).

The data parameter in factory is defined as follows:
statusCode: Number
headers: Object, an object where all keys have been lowercased.
opaque: Any

The data parameter in callback is defined as follows:
opaque: Any
trailers: Object, an object where all keys have been lowercased.

Returns a promise if no callback is provided.
const { Client } = require('undici')
const fs = require('fs')

const client = new Client(`http://localhost:3000`)
const filename = 'output.txt' // example destination file

client.stream({
  path: '/',
  method: 'GET',
  opaque: filename
}, ({ statusCode, headers, opaque: filename }) => {
  console.log('response received', statusCode)
  console.log('headers', headers)
  return fs.createWriteStream(filename)
}, (err) => {
  if (err) {
    console.error('failure', err)
  } else {
    console.log('success')
  }
})
opaque makes it possible to avoid creating a closure for the factory method:
function (req, res) {
  return client.stream({ ...opts, opaque: res }, proxy)
}
Instead of:
function (req, res) {
  return client.stream(opts, (data) => {
    // Creates closure to capture `res`.
    proxy({ ...data, opaque: res })
  })
}
client.pipeline(opts, handler(data)): Duplex
For easy use with stream.pipeline.

Options:
Same as client.request(opts, callback), plus:
objectMode: Boolean, true if the handler will return an object stream. Default: false.

The data parameter in handler is defined as follows:
statusCode: Number
headers: Object, an object where all keys have been lowercased.
opaque: Any
body: stream.Readable, the response payload. A user must either fully consume or destroy the body unless there is an error, or no further requests will be processed.

handler should return a Readable from which the result will be read. Usually it should just return the body argument unless some kind of transformation needs to be performed based on e.g. headers or statusCode.

The handler should validate the response and save any required state. If there is an error, it should be thrown.

Returns a Duplex which writes to the request and reads from the response.
const { Client } = require('undici')
const fs = require('fs')
const stream = require('stream')

const client = new Client(`http://localhost:3000`)

stream.pipeline(
  fs.createReadStream('source.raw'),
  client.pipeline({
    path: '/',
    method: 'PUT',
  }, ({ statusCode, headers, body }) => {
    if (statusCode !== 201) {
      throw new Error('invalid response')
    }

    // isZipped() and unzip() are placeholder helpers for the example.
    if (isZipped(headers)) {
      return stream.pipeline(body, unzip(), () => {})
    }

    return body
  }),
  fs.createWriteStream('response.raw'),
  (err) => {
    if (err) {
      console.error('failed', err)
    } else {
      console.log('succeeded')
    }
  }
)
client.upgrade(opts[, callback(err, data)]): Promise|Void
Upgrade to a different protocol.

Options:
path: String
opaque: Any
method: String. Default: GET.
headers: Object|Null, an object with header-value pairs. Default: null.
signal: AbortSignal|EventEmitter|Null. Default: null.
protocol: String, a string of comma-separated protocols, in descending preference order. Default: Websocket.

The data parameter in callback is defined as follows:
headers: Object, an object where all keys have been lowercased.
socket: Duplex
opaque: Any

Returns a promise if no callback is provided.
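As a rough sketch of the promise form, assuming a placeholder server on localhost that accepts an Upgrade handshake on the placeholder path /ws:

const { Client } = require('undici')

const client = new Client('http://localhost:3000')

// Placeholder origin and path; the server is assumed to accept the upgrade.
client.upgrade({
  path: '/ws',
  protocol: 'Websocket'
}).then(({ headers, socket }) => {
  console.log('upgrade headers', headers)
  socket.on('data', (chunk) => {
    console.log('data', chunk.toString())
  })
}).catch((err) => {
  console.error('upgrade failed', err)
})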
client.connect(opts[, callback(err, data)]): Promise|Void
Starts two-way communications with the requested resource.

Options:
path: String
opaque: Any
headers: Object|Null, an object with header-value pairs. Default: null.
signal: AbortSignal|EventEmitter|Null. Default: null.

The data parameter in callback is defined as follows:
statusCode: Number
headers: Object, an object where all keys have been lowercased.
socket: Duplex
opaque: Any

Returns a promise if no callback is provided.
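A minimal sketch of establishing a tunnel with client.connect, assuming a placeholder server on localhost that handles the CONNECT method:

const { Client } = require('undici')

const client = new Client('http://localhost:3000')

// Placeholder origin; the server is assumed to handle CONNECT.
client.connect({ path: '/' }, (err, data) => {
  if (err) {
    console.error('connect failed', err)
    return
  }
  const { statusCode, headers, socket } = data
  console.log('tunnel established', statusCode, headers)
  socket.on('data', (chunk) => {
    console.log(chunk.toString())
  })
})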
client.dispatch(opts, handler): Void
This is the low level API which all the preceding APIs are implemented on top of.
This API is expected to evolve through semver-major versions and is less stable than the preceding higher level APIs. It is primarily intended for library developers who implement higher level APIs on top of this.
Multiple handler methods may be invoked in the same tick.
Options:
path: String
method: String
body: String|Buffer|Uint8Array|stream.Readable|Null. Default: null.
headers: Object|Null, an object with header-value pairs. Default: null.
idempotent: Boolean, whether the requests can be safely retried or not. If false, the request won't be sent until all preceding requests in the pipeline have completed. Default: true if method is HEAD or GET.

The handler parameter is defined as follows:
onConnect(abort), invoked before the request is dispatched on the socket. May be invoked multiple times when a request is retried because the request at the head of the pipeline failed.
abort(): Void, abort the request.

onUpgrade(statusCode, headers, socket): Void, invoked when the request is upgraded, either due to an Upgrade header or the CONNECT method.
statusCode: Number
headers: Array|Null
socket: Duplex

onHeaders(statusCode, headers, resume): Boolean, invoked when the status code and headers have been received. May be invoked multiple times due to 1xx informational headers.
statusCode: Number
headers: Array|Null, an array of key-value pairs. Keys are not automatically lowercased.
resume(): Void, resume onData after returning false.

onData(chunk): Boolean, invoked when response payload data is received.
chunk: Buffer

onComplete(trailers): Void, invoked when the response payload and trailers have been received and the request has completed.
trailers: Array|Null

onError(err): Void, invoked when an error has occurred.
err: Error

The caller is responsible for handling the body argument, in terms of 'error' events and destroy()-ing it, up until the onConnect handler has been invoked.
client.close([callback]): Promise|Void
Closes the client and gracefully waits for enqueued requests to complete before invoking the callback.
Returns a promise if no callback is provided.
client.destroy([err][, callback]): Promise|Void
Destroy the client abruptly with the given err. All the pending and running requests will be asynchronously aborted and will error. Waits until the socket is closed before invoking the callback. Since this operation is asynchronously dispatched there might still be some progress on dispatched requests.
Returns a promise if no callback is provided.
client.url: URL
Returns the url passed to the Client or Pool constructor.
client.pipelining: Number
Property to get and set the pipelining factor.
client.pending: Number
Number of queued requests.
client.running: Number
Number of inflight requests.
client.size: Number
Number of pending and running requests.
client.connected: Boolean|Integer
Truthy if the client has an active connection. The client will lazily create a connection when it receives a request and will destroy it if there is no activity for the duration of the timeout value.
client.busy: Boolean
True if pipeline is saturated or blocked. Indicates whether dispatching further requests is meaningful.
client.closed: Boolean
True after client.close() has been called.
client.destroyed: Boolean
True after client.destroy() has been called, or client.close() has been called and the client shutdown has completed.
'drain', emitted when the pipeline is no longer fully saturated.
'connect', emitted when a socket has been created and connected. The client will connect once client.size > 0.
'disconnect', emitted when the socket has disconnected. The first argument of the event is the error which caused the socket to disconnect. The client will reconnect if or once client.size > 0.
new undici.Pool(url, opts)
A pool of Client instances connected to the same upstream target. Implements the same API as Client with a few minor differences.

Requests are not guaranteed to be dispatched in order of invocation.
Options:
Same as new undici.Client(url, opts), plus:
connections, the number of clients to create. Default: 10.

'connect', emitted when a client has connected. The first argument is the Client instance.
'disconnect', emitted when a client has disconnected. The first argument is the Client instance, the second is the error that caused the disconnection.
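A short sketch of creating a Pool with a non-default connections value; the origin and the value 4 are placeholders:

const { Pool } = require('undici')

// Up to 4 Client instances will be created for this placeholder origin.
const pool = new Pool('http://localhost:3000', { connections: 4 })

pool.request({ path: '/', method: 'GET' }, (err, data) => {
  if (err) {
    console.error(err)
    return
  }
  data.body.resume() // the body must be consumed or destroyed
  data.body.on('end', () => pool.close())
})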
new undici.Agent(opts)
options: undici.Pool.options - options passed through to the Pool constructor.

Returns: Agent

Returns a new Agent instance for use with pool-based requests or the following top-level methods: request, pipeline, and stream.
agent.get(origin): Pool
origin: string - a pool origin to be retrieved from the Agent.

This method retrieves Pool instances from the Agent. If the pool does not exist, it is automatically added. You do not need to manually close these pools as they are automatically removed using a WeakCache based on WeakRef and FinalizationRegistry.

The following methods request, pipeline, and stream utilize this feature.
agent.close(): Promise
Returns a Promise.all operation closing all of the pool instances in the Agent instance. This calls pool.close under the hood.
agent.destroy(): Promise
Returns a Promise.all operation destroying all of the pool instances in the Agent instance. This calls pool.destroy under the hood.
undici.setGlobalAgent(agent)
agent: Agent

Sets the global agent used by the request, pipeline, and stream methods.

The default global agent creates undici.Pools with no maximum number of connections.

The agent must only implement the Agent API; it does not need to extend from it.
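Putting the pieces together, a sketch of installing a global agent and then using the top-level request method; the connections value and URL are placeholders, and the Agent options are assumed to be forwarded to the Pool constructor as described above:

const { Agent, setGlobalAgent, request } = require('undici')

// Assumption: Agent options are passed through to each Pool it creates.
setGlobalAgent(new Agent({ connections: 10 }))

request('http://localhost:3000/foo')
  .then(({ statusCode, body }) => {
    console.log('status', statusCode)
    body.resume() // consume the body so further requests can proceed
  })
  .catch(console.error)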
undici.request(url[, opts]): Promise
url: string | URL | object
opts: { agent: Agent } & client.request.opts

url may contain path. opts may not contain path. opts.method is GET by default.

Calls pool.request(opts) on the pool returned from either the globalAgent (see setGlobalAgent) or the agent passed to the opts argument.

Returns a promise with the result of the request method.
undici.stream(url, opts, factory): Promise
url: string | URL | object
opts: { agent: Agent } & client.stream.opts
factory: client.stream.factory

url may contain path. opts may not contain path.

See client.stream for details on the opts and factory arguments.

Calls pool.stream(opts, factory) on the pool returned from either the globalAgent (see setGlobalAgent) or the agent passed to the opts argument.

Result is returned in the factory function. See client.stream for more details.
undici.pipeline(url, opts, handler): Duplex
url: string | URL | object
opts: { agent: Agent } & client.pipeline.opts
handler: client.pipeline.handler

url may contain path. opts may not contain path.

See client.pipeline for details on the opts and handler arguments.

Calls pool.pipeline(opts, handler) on the pool returned from either the globalAgent (see setGlobalAgent) or the agent passed to the opts argument.

See client.pipeline for more details.
undici.upgrade(url[, opts]): Promise
url: string | URL | object
opts: { agent: Agent } & client.upgrade.opts

url may contain path. opts may not contain path.
undici.connect(url[, opts]): Promise
url: string | URL | object
opts: { agent: Agent } & client.connect.opts

url may contain path. opts may not contain path.
undici.errors
Undici exposes a variety of error objects that you can use to enhance your error handling.
You can find all the error objects inside the errors key.
const { errors } = require('undici')
Error | Error Code | Description
---|---|---
InvalidArgumentError | UND_ERR_INVALID_ARG | passed an invalid argument
InvalidReturnValueError | UND_ERR_INVALID_RETURN_VALUE | returned an invalid value
RequestAbortedError | UND_ERR_ABORTED | the request has been aborted by the user
ClientDestroyedError | UND_ERR_DESTROYED | trying to use a destroyed client
ClientClosedError | UND_ERR_CLOSED | trying to use a closed client
SocketError | UND_ERR_SOCKET | there is an error with the socket
NotSupportedError | UND_ERR_NOT_SUPPORTED | encountered unsupported functionality
ContentLengthMismatchError | UND_ERR_CONTENT_LENGTH_MISMATCH | body does not match content-length header
InformationalError | UND_ERR_INFO | expected error with reason
TrailerMismatchError | UND_ERR_TRAILER_MISMATCH | trailers did not match specification
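For example, a sketch of checking for a specific error type after aborting a request; it assumes, per the table above, that the error instances carry the listed code values:

const { Client, errors } = require('undici')

const client = new Client('http://localhost:3000')
const abortController = new AbortController()

client.request({
  path: '/',
  method: 'GET',
  signal: abortController.signal
}, (err) => {
  if (err instanceof errors.RequestAbortedError) {
    console.log('aborted:', err.code) // expected to be 'UND_ERR_ABORTED'
  } else if (err) {
    console.error('request failed', err)
  }
  client.close()
})

abortController.abort()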
This section documents parts of the HTTP/1.1 specification which Undici does not support or does not fully implement.
Undici does not support the Expect request header field. The request body is always immediately sent and the 100 Continue response will be ignored.
Refs: https://tools.ietf.org/html/rfc7231#section-5.1.1
Undici will only use pipelining if configured with a pipelining factor greater than 1.
Undici always assumes that connections are persistent and will immediately pipeline requests, without checking whether the connection is persistent. Hence, automatic fallback to HTTP/1.0 or HTTP/1.1 without pipelining is not supported.
Undici will immediately pipeline when retrying requests after a failed connection. However, Undici will not retry the first remaining requests in the prior pipeline and instead error the corresponding callback/promise/stream.
Refs: https://tools.ietf.org/html/rfc2616#section-8.1.2.2
Refs: https://tools.ietf.org/html/rfc7230#section-6.3.2
MIT