Tinybench

Benchmark your code easily with Tinybench, a simple, tiny, and lightweight 10KB (2KB minified and gzipped) benchmarking library!
You can run your benchmarks in multiple JavaScript runtimes; Tinybench is based entirely on Web APIs, with proper timing using
process.hrtime or performance.now.
- Accurate and precise timing based on the environment
- Statistically analyzed latency and throughput values: standard deviation, margin of error, variance, percentiles, etc.
- Concurrency support
- Event and EventTarget compatible events
- No dependencies
If you need more tiny libraries like tinypool or tinyspy, please consider submitting an RFC.
Installing
$ npm install -D tinybench
Usage
You can start benchmarking by instantiating the Bench class and adding benchmark tasks to it.
import { Bench } from 'tinybench'

const bench = new Bench({ name: 'simple benchmark', time: 100 })

bench
  .add('faster task', () => {
    console.log('I am faster')
  })
  .add('slower task', async () => {
    await new Promise(resolve => setTimeout(resolve, 1))
    console.log('I am slower')
  })

await bench.run()

console.log(bench.name)
console.table(bench.table())
The add method accepts a task name and a task function, so it can benchmark
it! This method returns a reference to the Bench instance, so you can chain it
to add another task to the same instance.
Note that task names must always be unique within an instance, because Tinybench
stores tasks by name in a Map.
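Because tasks are keyed by name, you can look one up later with getTask (the name below comes from the usage example above):

const task = bench.getTask('faster task')
console.log(task?.name) // 'faster task'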
Also note that Tinybench does not log any results by default. You can extract the relevant stats
from bench.tasks or any other API after running the benchmark and process them however you want.
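For example, after a run you can iterate bench.tasks and read each task's result. A minimal sketch; the fields shown are the ones also used by the table-converter example below:

for (const task of bench.tasks) {
  // result may be undefined if the task has not run yet
  console.log(task.name, task.result?.latency.mean, task.result?.throughput.mean)
}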
More usage examples can be found in the examples directory.
Docs
Events
Both the Task and Bench classes extend EventTarget, so you can attach listeners to the different event types on each class instance using the standard addEventListener and removeEventListener methods.
// runs on every task's cycle in this bench
bench.addEventListener('cycle', (evt) => {
  const task = evt.task!;
});

// runs only on this specific task's cycle ('faster task' is from the usage example above)
const task = bench.getTask('faster task')!;
task.addEventListener('cycle', (evt) => {
  console.log(evt.task!.name);
});
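Because these are the standard EventTarget methods, a handler registered by reference can be detached again with removeEventListener:

const onCycle = (evt) => {
  console.log(evt.task!.name);
};
bench.addEventListener('cycle', onCycle);
// later, stop listening
bench.removeEventListener('cycle', onCycle);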
Async Detection
Tinybench automatically detects if a task function is asynchronous by
checking if provided function is an AsyncFunction or if it returns a
Promise, by calling the provided function once.
You can also explicitly set the async option to true or false when adding
a task, thus avoiding the detection. This can be useful, for example, for
functions that return a Promise but are actually synchronous.
const bench = new Bench()

// explicitly marked async; detection is skipped
bench.add('asyncTask', async () => {}, { async: true })

// explicitly marked sync; detection is skipped
bench.add('syncTask', () => {}, { async: false })

// returns a Promise and is awaited like an async function
bench.add('syncTaskReturningPromiseAsAsync', () => {
  return Promise.resolve()
}, { async: true })

// returns a Promise but is timed as a synchronous call
bench.add('syncTaskReturningPromiseAsSync', () => {
  return Promise.resolve()
}, { async: false })

await bench.run()
Concurrency
- When mode is set to null (default), concurrency is disabled.
- When mode is set to 'task', each task's iterations (calls of a task function) run concurrently.
- When mode is set to 'bench', different tasks within the bench run concurrently.

For example, to run a task's cycles concurrently:
bench.threshold = 10 // cap the number of concurrent runs
bench.concurrency = 'task' // run each task's iterations concurrently
await bench.run()
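And a minimal sketch of the 'bench' mode, where whole tasks (rather than a single task's iterations) run concurrently, capped by threshold:

const bench = new Bench()

bench
  .add('first task', async () => {
    await new Promise(resolve => setTimeout(resolve, 1))
  })
  .add('second task', async () => {
    await new Promise(resolve => setTimeout(resolve, 1))
  })

bench.threshold = 2 // at most two tasks in flight at once
bench.concurrency = 'bench'
await bench.run()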
Convert task results for console.table()
You can convert the benchmark results to a table format suitable for
console.table() using the bench.table() method.
const table = bench.table()
console.table(table)
You can also customize the table output by passing a converter function to the table method.
import { Bench, type ConsoleTableConverter, formatNumber, mToNs, type Task } from 'tinybench'

const defaultConverter: ConsoleTableConverter = (
  task: Task
): Record<string, number | string> => {
  const state = task.result.state
  return {
    'Task name': task.name,
    ...(state === 'aborted-with-statistics' || state === 'completed'
      ? {
          'Latency avg (ns)': `${formatNumber(mToNs(task.result.latency.mean))} \xb1 ${task.result.latency.rme.toFixed(2)}%`,
          'Latency med (ns)': `${formatNumber(mToNs(task.result.latency.p50))} \xb1 ${formatNumber(mToNs(task.result.latency.mad))}`,
          'Throughput avg (ops/s)': `${Math.round(task.result.throughput.mean).toString()} \xb1 ${task.result.throughput.rme.toFixed(2)}%`,
          'Throughput med (ops/s)': `${Math.round(task.result.throughput.p50).toString()} \xb1 ${Math.round(task.result.throughput.mad).toString()}`,
          Samples: task.result.latency.samplesCount,
        }
      : state !== 'errored'
        ? {
            'Latency avg (ns)': 'N/A',
            'Latency med (ns)': 'N/A',
            'Throughput avg (ops/s)': 'N/A',
            'Throughput med (ops/s)': 'N/A',
            Samples: 'N/A',
            Remarks: state,
          }
        : {
            Error: task.result.error.message,
            Stack: task.result.error.stack ?? 'N/A',
          }),
    ...(state === 'aborted-with-statistics' && {
      Remarks: state,
    }),
  }
}
const bench = new Bench({ name: 'custom table benchmark', time: 100 })
// ... add tasks and await bench.run(), then:
console.table(bench.table(defaultConverter))
Retaining Samples
By default Tinybench does not keep the samples for latency and throughput to
minimize memory usage. Enable sample retention if you need the raw samples for
plotting, custom analysis, or exporting results.
You can enable sample retention at the bench level by setting the
retainSamples option to true when creating a Bench instance:
const bench = new Bench({ retainSamples: true })
You can also enable sample retention for a single task by setting the
retainSamples option to true when adding it:
bench.add('task with samples', () => {}, { retainSamples: true })
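After the run, the raw samples are then available on the result statistics. A minimal sketch, assuming the retained samples are exposed on result.latency.samples and result.throughput.samples:

await bench.run()

const task = bench.getTask('task with samples')
console.log(task?.result?.latency.samples) // raw latency samples
console.log(task?.result?.throughput.samples) // raw throughput samples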
Timestamp Providers
Tinybench can utilize different timestamp providers for measuring time intervals.
By default it uses performance.now().
The timestampProvider option can be set when creating a Bench instance. It
accepts either a TimestampProvider object or a shorthand for one of the common
providers: hrtimeNow or performanceNow.
If you use the Bun runtime, you can also use the bunNanoseconds shorthand.
You can also set timestampProvider to auto to let Tinybench choose the most
precise timestamp provider available in the runtime.
import { Bench } from 'tinybench'

const bench = new Bench({
  // other shorthands: 'performanceNow' (the default), 'auto', and 'bunNanoseconds' on Bun
  timestampProvider: 'hrtimeNow'
})
If you want to provide a custom timestamp provider, you can create an object that implements
the TimestampProvider interface:
import { Bench, TimestampProvider } from 'tinybench'

const dateNowTimestampProvider: TimestampProvider = {
  name: 'dateNow',
  fn: Date.now,
  // Date.now() already returns milliseconds, so both conversions are the identity
  toMs: ts => ts,
  fromMs: ts => ts
}

const bench = new Bench({
  timestampProvider: dateNowTimestampProvider
})
You can also set the now option to a function that returns the current timestamp.
It will be converted to a TimestampProvider internally.
import { Bench } from 'tinybench'
const bench = new Bench({
  now: Date.now
})
Aborting Benchmarks
Tinybench supports aborting benchmarks using AbortSignal at both the bench and task levels:
Bench-level Abort
Abort all tasks in a benchmark by passing a signal to the Bench constructor:
const controller = new AbortController()
const bench = new Bench({ signal: controller.signal })

bench
  .add('task1', () => {})
  .add('task2', () => {})

// aborting before run() aborts all tasks in the bench
controller.abort()
await bench.run()
Task-level Abort
Abort individual tasks without affecting other tasks by passing a signal to the task options:
const controller = new AbortController()
const bench = new Bench()

bench
  .add('abortable task', () => {}, { signal: controller.signal })
  .add('normal task', () => {})

// only 'abortable task' is aborted; 'normal task' still runs
controller.abort()
await bench.run()
Abort During Execution
You can abort benchmarks while they're running:
const controller = new AbortController()
const bench = new Bench({ time: 10000 })

bench.add('long task', async () => {
  await new Promise(resolve => setTimeout(resolve, 100))
}, { signal: controller.signal })

// abort after 1 second, while the 10-second run is still in progress
setTimeout(() => controller.abort(), 1000)
await bench.run()
Abort Events
Both Bench and Task emit abort events when aborted:
const controller = new AbortController()
const bench = new Bench()

bench.add('task', () => {}, { signal: controller.signal })

const task = bench.getTask('task')
task.addEventListener('abort', () => {
  console.log('Task aborted!')
})
bench.addEventListener('abort', () => {
  console.log('Bench received abort event!')
})

controller.abort()
await bench.run()
Note: When a task is aborted, task.result.aborted will be true, and the task will have completed any iterations that were running when the abort signal was received.
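A short sketch of inspecting an aborted task after the run, using the result states shown in the table-converter example above:

const aborted = bench.getTask('task')
if (aborted?.result?.state === 'aborted-with-statistics') {
  // statistics cover the iterations that finished before the abort
  console.log('partial samples:', aborted.result.latency.samplesCount)
}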
Prior art
Authors
Credits
Contributing
Feel free to create issues/discussions and then PRs for the project!
Your sponsorship can make a huge difference in continuing our work in open source!