asylum-serve
Micro — Async HTTP microservices
async and await
The following example, sleep.js, will wait before responding (without blocking!):
import { send } from 'micro';
import sleep from 'then-sleep';
export default async function (req, res) {
await sleep(500);
send(res, 200, 'Ready!');
}
To run the microservice on port 3000, use the micro command:
$ micro -p 3000 sleep.js
Note: micro requires Node 0.12 or later
Install from NPM:
$ npm init
$ npm install micro --save
Then in your package.json:
"scripts": {
"start": "micro -p 3000 index.js"
}
Then write your index.js (see above for an example). To run your
app and make it listen on http://localhost:3000 run:
$ npm start
Usage: micro [options] <file>
Options:
-h, --help output usage information
-V, --version output the version number
-p, --port Port to listen on (3000)
-n, --no-babel Skip Babel transformation
By default, micro will transpile the target file and its relative dependencies so that ES6 and async/await work for you.
For production, we recommend you first transpile and use --no-babel to make bootup time much faster. That said, if you don't care about how long it takes to boot, the default flags are perfectly suitable for production.
Read more about Transpilation to understand what transformations are recommended.
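Following that recommendation, a build step can pre-transpile the service before boot. A minimal sketch of such a package.json, assuming babel-cli is installed and the sources live in src/ (both names are illustrative):

```json
{
  "scripts": {
    "build": "babel src --out-dir build",
    "start": "micro -p 3000 --no-babel build/index.js"
  }
}
```

With this layout, npm run build happens once at deploy time and npm start boots without paying the Babel cost.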
micro(fn, { onError = null })
This function is exposed as the default export.
Use import micro from 'micro' or require('micro').
Returns a http.Server that uses the provided fn as the request handler.
The supplied function is run with await. It can be async!
If supplied, the onError function is invoked with req, res, err (see Error Handling).
Example:
import micro from 'micro';
import sleep from 'then-sleep';
const srv = micro(async function (req, res) {
await sleep(500);
res.writeHead(200);
res.end('woot');
});
srv.listen(3000);
json(req, { limit = '1mb' })
Use import { json } from 'micro' or require('micro').json.
Buffers and parses the incoming body and returns it.
Exposes an async function that can be run with await.
limit is the maximum amount of data aggregated before parsing; if it is exceeded, an Error is thrown with statusCode set to 413 (see Error Handling). It can be a Number of bytes or a string like '1mb'.
If JSON parsing fails, an Error is thrown with statusCode set to 400 (see Error Handling)
Example:
import { json, send } from 'micro';
export default async function (req, res) {
const data = await json(req);
console.log(data.price);
send(res, 200);
}
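The '1mb'-style limit strings resolve to byte counts. Micro delegates this to its body-parsing dependency; the following plain-JavaScript sketch (an illustration, not micro's actual implementation) shows how such a limit can be normalized:

```javascript
// Resolve a limit like '1mb', '100kb', or a plain Number into a byte count.
// Illustrative only: micro's real parsing lives in its dependencies.
function parseLimit(limit) {
  if (typeof limit === 'number') return limit;
  const match = /^(\d+(?:\.\d+)?)\s*(b|kb|mb|gb)$/i.exec(limit);
  if (!match) throw new Error(`Unparsable limit: ${limit}`);
  const units = { b: 1, kb: 1024, mb: 1024 ** 2, gb: 1024 ** 3 };
  return Math.floor(parseFloat(match[1]) * units[match[2].toLowerCase()]);
}
```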
send(res, statusCode, data = null)
Use import { send } from 'micro' or require('micro').send.
statusCode is a Number with the HTTP status code, and must always be supplied.
If data is supplied and is an object, it's automatically serialized as JSON. Content-Type and Content-Length are automatically set.
If JSON serialization fails (for example, if a cyclical reference is found), a 400 error is thrown (see Error Handling).
Example
import { send } from 'micro';
export default async function (req, res) {
send(res, 400, { error: 'Please use a valid email' });
}
sendError(req, res, error)
Use import { sendError } from 'micro' or require('micro').sendError.
This is the default onError handler: it sets the response status from error.statusCode and sends error.message as the body. During development (when NODE_ENV is set to 'development'), stacks are printed out with console.error and also sent in responses.
Error Handling
Micro allows you to write robust microservices. This is accomplished primarily by bringing sanity back to error handling and avoiding callback soup.
If an error is thrown and not caught by you, the response will automatically be 500. Important: during development mode (if the env variable NODE_ENV is 'development'), error stacks will be printed as console.error and included in the responses.
If the Error object that's thrown contains a statusCode property, that's used as the HTTP code to be sent. Let's say you want to write a rate limiting module:
import rateLimit from 'my-rate-limit';
export default async function (req, res) {
await rateLimit(req);
// … your code
}
If the API endpoint is abused, it can throw an error like so:
if (tooMany) {
const err = new Error('Rate limit exceeded');
err.statusCode = 429;
throw err;
}
The nice thing about this model is that the statusCode is merely a suggestion. The user can override it:
try {
await rateLimit(req);
} catch (err) {
if (err.statusCode === 429) {
// perhaps send 500 instead?
send(res, 500);
}
}
If the error is based on another error that Micro caught, like a JSON.parse exception, then originalError will point to it.
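The shape of such a wrapped error can be sketched in plain JavaScript (wrapError below is a hypothetical helper for illustration, not micro's API):

```javascript
// Hypothetical helper showing the wrapped-error shape: a statusCode for the
// response plus an originalError pointing at the underlying failure.
function wrapError(statusCode, message, originalError) {
  const err = new Error(message);
  err.statusCode = statusCode;
  err.originalError = originalError;
  return err;
}

let caught;
try {
  try {
    JSON.parse('{ not: valid');               // the underlying failure
  } catch (parseErr) {
    throw wrapError(400, 'Invalid JSON', parseErr);
  }
} catch (err) {
  caught = err;                               // what your handler would see
}
// caught.statusCode is 400; caught.originalError is the SyntaxError
```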
If a generic error is caught, the status will be set to 500.
In order to set up your own error handling mechanism, you can pass a custom onError function to micro:
const myErrorHandler = async (req, res, err) => {
// your own logging here
res.writeHead(500);
res.end('error!');
};
micro(handler, { onError: myErrorHandler });
Micro makes tests compact and a pleasure to read and write. We recommend ava, a highly parallel micro test framework with built-in support for async tests:
import test from 'ava';
import listen from './listen';
import { send } from 'micro';
import request from 'request-promise';
test('my endpoint', async t => {
const fn = async function (req, res) {
send(res, 200, { test: 'woot' });
};
const url = await listen(fn);
const body = await request(url);
t.same(JSON.parse(body).test, 'woot');
});
Look at the test/_listen helper for a function that returns a URL with an ephemeral port every time it's called.
The Babel configuration micro uses is:
{
"presets": ["es2015"],
"plugins": [
"transform-runtime",
"syntax-async-functions",
"transform-async-to-generator"
]
}
This requires the following npm modules (versions may vary):
{
"babel-plugin-syntax-async-functions": "6.3.13",
"babel-plugin-transform-async-to-generator": "6.4.6",
"babel-plugin-transform-runtime": "6.4.3",
"babel-preset-es2015": "6.3.13"
}
You can use the micro CLI for npm start:
{
"name": "my-microservice",
"dependencies": {
"micro": "x.y.z"
},
"scripts": {
"start": "micro -p 3000 microservice.js"
}
}
Then your Dockerfile can look like this:
FROM node:argon
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm install
COPY . /usr/src/app
EXPOSE 3000
CMD [ "npm", "start" ]
Contributing
Run gulp help to see available tasks. Before submitting a pull request, run gulp lint and gulp test. The code style is standard plus semicolons.
FAQs
Async HTTP microservices runtime
The npm package asylum-serve receives a total of 12 weekly downloads; as such, it is classified as not popular.
We found that asylum-serve has an unhealthy version release cadence and low project activity: the last version was released a year ago. It has 1 open source maintainer collaborating on the project.