
@jschr/lambda-response
Express-like API for sending responses from Lambda Integration Proxy to API Gateway.
Includes a CLI tool and express middleware for local development.
npm install @jschr/lambda-response
const { Response } = require('@jschr/lambda-response')

export default function handler(event, context) {
  const res = new Response()

  context.succeed(res.send('OK'))
  // => { statusCode: 200, body: 'OK' }

  context.succeed(res.json({ foo: 'bar' }))
  // => { statusCode: 200, body: '{"foo":"bar"}' }

  context.succeed(res.status(404).json({ message: 'Not found.' }))
  // => { statusCode: 404, body: '{"message":"Not found."}' }

  context.succeed(res.redirect('https://github.com'))
  // => { statusCode: 302, headers: { Location: 'https://github.com' } }
}
Even more express-like with async/await:
const { Response } = require('@jschr/lambda-response')

async function route(req, res) {
  const data = await someAsyncFunction(req.query.id)

  if (data) {
    res.json(data)
  } else {
    res.status(404).json({ message: 'Not found.' })
  }
}

export default function handler(event, context) {
  const req = { query: event.queryStringParameters || {} }
  const res = new Response()

  route(req, res)
    .then(() => context.succeed(res))
    .catch(context.fail)
}
Default headers can be passed when creating a new response:
const headers = { 'Content-Type': 'application/json' }
const res = new Response({ headers })
Or on an instance:
const res = new Response()
const headers = { 'Content-Type': 'application/json' }
res.set(headers)
CORS is enabled by default. You can pass in cors options when creating a new response:
const cors = { origin: 'example.com', methods: ['GET'], headers: ['X-Api-Key'] }
const res = new Response({ cors })
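For example, a handler could return JSON through a CORS-configured response like this (a minimal sketch that only combines the options above with the handler pattern shown earlier):

const { Response } = require('@jschr/lambda-response')

export default function handler(event, context) {
  // Only allow GET requests from example.com that send an X-Api-Key header.
  const cors = { origin: 'example.com', methods: ['GET'], headers: ['X-Api-Key'] }
  const res = new Response({ cors })

  context.succeed(res.json({ ok: true }))
}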
Check out the tests for more examples.
You can use the CLI for local development. If you've installed @jschr/lambda-response globally:
$ lambda-response foo/bar.default --port 8080
Where foo/bar is the path to your lambda handler and default is the exported function.
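For reference, foo/bar.default would point at a module shaped like the handlers above (a hypothetical foo/bar.js, shown only for illustration):

// foo/bar.js
const { Response } = require('@jschr/lambda-response')

export default function handler(event, context) {
  const res = new Response()
  context.succeed(res.send('OK'))
}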
For advanced use cases you can use the lambda-response express middleware:
import * as express from 'express'
import { middleware } from '@jschr/lambda-response'
import handler from './foo/bar'
const app = express()
app.use(middleware(handler))
app.listen(8080)
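Once the server is listening, the handler can be exercised like any local HTTP endpoint, for example (a hypothetical smoke test, assuming Node 18+ for the built-in fetch and an ES module context for top-level await):

// Hypothetical smoke test against the express server started above.
const res = await fetch('http://localhost:8080/')
console.log(res.status, await res.text())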