## Introduction
Inspired by the awslabs aws-serverless-express library, tailor-made for the Fastify web framework.
No use of internal sockets; it makes use of Fastify's inject function.
Seems faster (as the name implies) than aws-serverless-express and aws-serverless-fastify 😉
## 👨🏻‍💻 Installation
```bash
$ npm install aws-lambda-fastify
```
## Options
aws-lambda-fastify can take options by passing them as the second argument: `awsLambdaFastify(app, options)`
| property | description | default value |
| --- | --- | --- |
| `binaryMimeTypes` | Array of binary MIME types to handle | `[]` |
| `serializeLambdaArguments` | Activate the serialization of the lambda Event and Context in the HTTP headers `x-apigateway-event` and `x-apigateway-context` | `false` (was `true` for <v2.0.0) |
| `decorateRequest` | Decorates the Fastify request with the lambda Event and Context: `request.awsLambda.event` and `request.awsLambda.context` | `true` |
| `decorationPropertyName` | The property name used for the request decoration | `awsLambda` |
| `callbackWaitsForEmptyEventLoop` | See the official AWS documentation | `undefined` |
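For example, a minimal sketch of passing options (the values below are illustrative, not defaults):

```js
const awsLambdaFastify = require('aws-lambda-fastify')
const app = require('./app')

const proxy = awsLambdaFastify(app, {
  binaryMimeTypes: ['application/octet-stream', 'image/png'],
  decorationPropertyName: 'lambda' // event/context then live at request.lambda.event / request.lambda.context
})

exports.handler = proxy
```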
## 📖 Example
### lambda.js
```js
const awsLambdaFastify = require('aws-lambda-fastify')
const app = require('./app')

const proxy = awsLambdaFastify(app)

exports.handler = proxy
```
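If you prefer an explicit handler signature, the proxy can also be wrapped in the usual Lambda callback or promise styles; a minimal sketch of the equivalent variants:

```js
// equivalent ways to export the handler
exports.handler = (event, context, callback) => proxy(event, context, callback)
// or promise-based:
exports.handler = (event, context) => proxy(event, context)
// or async/await:
exports.handler = async (event, context) => proxy(event, context)
```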
### app.js
```js
const fastify = require('fastify')

const app = fastify()
app.get('/', (request, reply) => reply.send({ hello: 'world' }))

if (require.main === module) {
  // called directly, i.e. "node app.js"
  app.listen(3000, (err) => {
    if (err) console.error(err)
    console.log('server listening on 3000')
  })
} else {
  // required as a module => executed on AWS Lambda
  module.exports = app
}
```
When executed in your Lambda function we don't need to listen on a specific port, so we just export the `app` in this case. The `lambda.js` file will use this export.
When you execute your Fastify application as usual, i.e. `node app.js` (the detection for this could be `require.main === module`), you can listen on your port as normal, so you can still run your Fastify application locally.
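You can also invoke the exported handler directly, e.g. in a local test, without API Gateway in front of it. A minimal sketch, assuming an API Gateway proxy-style event (the event below is illustrative and omits many fields real events carry):

```js
const { handler } = require('./lambda')

// illustrative minimal API Gateway proxy-style event
const event = {
  httpMethod: 'GET',
  path: '/',
  headers: {},
  queryStringParameters: null,
  body: null,
  isBase64Encoded: false
}

handler(event, {}).then((response) => {
  console.log(response.statusCode, response.body) // e.g. 200 {"hello":"world"}
})
```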
## 📣 Hint
The original lambda event and context are passed via the Fastify request and can be used like this:
```js
app.get('/', (request, reply) => {
  const event = request.awsLambda.event
  const context = request.awsLambda.context
})
```
If you do not like this, you can disable it by setting the `decorateRequest` option to `false`, as sketched below.
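A minimal sketch of disabling the decoration:

```js
const awsLambdaFastify = require('aws-lambda-fastify')
const app = require('./app')

// request.awsLambda will not be available in route handlers
const proxy = awsLambdaFastify(app, { decorateRequest: false })
exports.handler = proxy
```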
Alternatively, if you set the `serializeLambdaArguments` option to `true`, the original lambda event and context are passed via headers and can be used like this:
```js
app.get('/', (request, reply) => {
  const event = JSON.parse(decodeURIComponent(request.headers['x-apigateway-event']))
  const context = JSON.parse(decodeURIComponent(request.headers['x-apigateway-context']))
})
```
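A minimal sketch of enabling that option:

```js
const awsLambdaFastify = require('aws-lambda-fastify')
const app = require('./app')

// the event/context are URI-encoded JSON strings in the request headers
const proxy = awsLambdaFastify(app, { serializeLambdaArguments: true })
exports.handler = proxy
```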
## ⚡️ Some basic performance metrics
```
aws-lambda-fastify (decorateRequest: false) x 56,892 ops/sec ±3.73% (79 runs sampled)
aws-lambda-fastify x 56,571 ops/sec ±3.52% (82 runs sampled)
aws-lambda-fastify (serializeLambdaArguments: true) x 56,499 ops/sec ±3.56% (76 runs sampled)
serverless-http x 45,867 ops/sec ±4.42% (83 runs sampled)
aws-serverless-fastify x 17,937 ops/sec ±1.83% (86 runs sampled)
aws-serverless-express x 16,647 ops/sec ±2.88% (87 runs sampled)
Fastest is aws-lambda-fastify (decorateRequest: false), aws-lambda-fastify
```
## ⚠️ Considerations
- For apps that may not see traffic for several minutes at a time, you could see cold starts
- Stateless only
- API Gateway has a timeout of 29 seconds, while Lambda has a maximum execution time of 15 minutes. (An Application Load Balancer has no timeout limit, so there the Lambda maximum execution time is the relevant limit.)
- If you are using another web framework (Connect, Express, Koa, Restana, Sails, Hapi, Fastify, Restify) or want to use a more generic serverless proxy framework, have a look at: serverless-http
## 🎖 Who is using it?
The logos displayed on this page are property of the respective organisations and are not distributed under the same license as aws-lambda-fastify (MIT).