
contextor is a simple but powerful tool that helps you pass a context along an asynchronous process.
Note that contextor is built on async hooks, which have been available since Node.js 8 and are still experimental as of Node.js 12.
Here is a simple example with an Express request context:

const express = require('express');
const contextor = require('contextor');

const app = express();

let id = 0;

function getCurrentRequestId() {
  // Retrieve the id of the request stored in the current context.
  return contextor.get('request').id;
}

function logSomething(message) {
  console.log({
    requestId: getCurrentRequestId(),
    message
  });
}

app.use((req, res, next) => {
  req.id = id++;
  // Create a new context and add the current request and response objects to it.
  contextor.create()
    .set('request', req)
    .set('response', res);
  next();
});

app.use((req, res, next) => {
  logSomething('something');
  next();
});

app.get('/', (req, res) => {
  res.send({
    requestId: req.id,
    contextRequestId: getCurrentRequestId()
  });
  // `requestId` and `contextRequestId` should be the same!
});

app.listen(3000);
Run the following command to add the package to your dependencies:
$ npm install --save contextor
// CommonJS
const contextor = require('contextor');
// ES6 modules
import contextor from 'contextor';
You can create a context by simply calling the following method:
contextor.create();
This will create a context associated with the current asynchronous resource and all its descendant resources, overriding any context inherited from its ancestors.
You can set a value in the current context:
contextor.set('foo', 'bar');
You can get a value from the current context:
contextor.get('foo');
This will throw a ReferenceError if the key does not exist.
Instead, you can specify a default value in case the key does not exist:
contextor.get('foo', 'bar');
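The lookup semantics above can be sketched with a plain Map standing in for the current context (the get helper here is a hypothetical stand-in, not contextor's code):

```javascript
// A plain Map stands in for the current context (illustration only).
const context = new Map();

function get(key, defaultValue) {
  if (!context.has(key)) {
    // Mirror the documented behavior: throw when the key is missing
    // and no default value was given.
    if (arguments.length < 2) {
      throw new ReferenceError(`Key "${key}" does not exist in the context`);
    }
    return defaultValue;
  }
  return context.get(key);
}

context.set('foo', 'bar');
console.log(get('foo'));        // logs 'bar'
console.log(get('baz', 'qux')); // logs 'qux' (the default value)
```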
Some environment variables are available to customize context cleaning:

- CONTEXTOR_CLEAN_CHECK_CONTEXT_SIZE: number of created contexts before a cleaning is executed (default: 100)
- CONTEXTOR_CONTEXT_TTL: TTL of contexts in ms; set to 0 to make it infinite (default: 6e4)

Contextor creates a context for each async resource; bad usage can lead to memory leaks.
The getMemoryUsage function has been built to help you investigate this kind of issue:
const { inspect } = require('util');
const contextor = require('contextor');

const memoryUsage = contextor.getMemoryUsage();
console.log(inspect(memoryUsage, {
compact: false,
colors: true,
depth: 6,
}));
Many npm scripts are available to help with testing:
$ npm run {script}
- check: lint and check unit and integration tests
- lint: lint
- lint-fix: try to fix lint automatically
- test: check unit tests
- test-coverage: check coverage of unit tests
- test-debug: debug unit tests
- test-watch: work in TDD!

Use npm run check to check that everything is ok.
If you want to contribute, just fork this repository and make a pull request!
Your development must respect these rules:
You must keep test coverage at 100%.