
@corva/local-testing-framework
Local Testing Framework (LTF) is designed for developing Corva Lambda applications and generating workloads on local environments and remote machines.
corva-api-lite - simplified and fast Corva API
redis - for apps that need to save state between runs
mongo - for apps that need to save state between runs
Use npm to install the application globally:
npm i -g @corva/local-testing-framework
If there are no breaking changes:
npm update -g @corva/local-testing-framework
Otherwise, you need to reinstall the LTF.
Currently the app supports external MongoDB, local CSV and JSON files, and MongoDB JS scripts.
To use an external MongoDB, first disable the local MongoDB setup by setting infra.mongo.url:
{
infra: {
mongo: {
url: 'mongodb://external-mongo.com',
},
}
}
To use local MongoDB, enable its setup by setting infra.mongo.image and infra.mongo.port:
{
infra: {
mongo: {
image: 'mongo:latest',
port: 27018
},
}
}
For CSV (TSV) and JSON data, specify the directory where the data you want to import is located. It should follow this structure:
import-dir/
├── database_1/
│ ├── collection_1.csv
│ ├── collection_2.json
│ ├── collection_3.tsv
│ └── collection_4.json
└── database_2/
├── collection_5.json
└── collection_6.tsv
NOTE if you're exporting JSON data from MongoDB, use the --jsonArray flag to export valid JSON.
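For example, a single collection can be exported into the layout above with mongoexport (the database and collection names here are illustrative, not from the source):

```shell
# Export one collection as a valid JSON array into the import directory
mongoexport --db database_1 --collection collection_2 \
  --jsonArray --out import-dir/database_1/collection_2.json
```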
This app allows importing data into local MongoDB via MongoDB scripts. Compared to CSV/JSON import, this enables more advanced configuration of your data.
Here's an example of such script:
/**
* @var {import('mongodb').Db} db
*/
var error = true;
/**
* @type {import('mongodb').Db}
*/
// @ts-ignore
var mydb = db.getSiblingDB('test-database');
/**
* @type {import('mongodb').Collection}
*/
// @ts-ignore
var myCollection = mydb.testcollection;
var res = [
myCollection.drop(),
myCollection.createIndex({ myfield: 1 }, { unique: true }),
myCollection.createIndex({ thatfield: 1 }),
myCollection.createIndex({ thatfield: 1 }),
myCollection.insert({ myfield: 'hello', thatfield: 'testing' }),
myCollection.insert({ myfield: 'hello2', thatfield: 'testing' }),
myCollection.insert({ myfield: 'hello3', thatfield: 'testing' }),
myCollection.insert({ myfield: 'hello3', thatfield: 'testing' }),
];
// @ts-ignore
printjson(res);
if (error) {
// @ts-ignore
print('Error, exiting');
// @ts-ignore
quit(1);
}
To use this kind of import, you should provide the path to a directory with such JS files.
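To sanity-check such a script before handing it to LTF, you can run it directly with the Mongo shell against the local container (the port matches the infra.mongo.port default; the script path is a placeholder):

```shell
# Run an import script against the local MongoDB container
mongosh --port 27018 path/to/import-script.js
```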
For a Node.js-based example, see the project in the examples directory.
To run Lambda App via LTF, you'll need to create a Docker image. Use the following example:
FROM lambci/lambda:nodejs12.x
COPY . /var/task
If you need languages other than Node.js, you can find suitable base images on Docker Hub.
NOTE the corva init command can generate a Dockerfile for you.
If you want to use a prebuilt image, specify the lambda.image option.
If no lambda.image option is provided, or no such image is found, the LTF will try to build the image from sources on each run.
To build your image, run the following command:
docker build -t <lambda_name> .
Replace <lambda_name> with your actual Lambda App name so it is easier to recognize which images you have.
After a global install (npm i -g @corva/local-testing-framework) the CLI is available as corva; from a source checkout, run node bin/corva.js instead.
$ corva --help
corva <command>
Commands:
corva.js init [cwd] Generate .corvarc file
corva.js local Run lambda function locally
corva.js completion generate completion script
Options:
--skip-version-upgrade Display warning on version update [boolean] [default: false]
--help Show help [boolean]
--version Show version number
$ corva init --help
corva init [cwd]
Generate .corvarc file
Positionals:
cwd Directory where to generate a .corvarc file [default: "."]
Options:
--skip-version-upgrade Display warning on version update [boolean] [default: false]
--help Show help [boolean]
--version Show version number [boolean]
$ corva local --help
corva.js local
Run lambda function locally
Lambda options
--lambda.image Lambda docker image [string]
--lambda.type Lambda type [string] [choices: "scheduler", "stream"] [default: "scheduler"]
--lambda.handler Lambda handler function name [string] [default: "index.handler"]
--lambda.env Additional env KEY=VALUE pairs that should be passed to lambda [array] [default: ["LOG_LEVEL=DEBUG"]]
Infrastructure options
--infra.stopContainers Decide to stop the containers after the test run [boolean] [default: false]
--infra.redis.url External Redis URL for lambda cache [string]
--infra.redis.image Docker image for Redis [string] [default: "redis:latest"]
--infra.redis.port Docker port for redis [string] [default: 6380]
--infra.mongo.url External MongoDB URL for lambda cache [string]
--infra.mongo.image Docker image for MongoDB [string] [default: "mongo"]
--infra.mongo.port Docker port for MongoDB [string] [default: 27018]
Source options
--source.import Should the test data be imported or not [boolean] [default: true]
--source.dir Directory where the test-sources are located (relative to the project root) [string] [default: "./test-sources"]
--source.database Source database for events [string] [default: "corva"]
--source.collection Events source collection [string] [default: "runner#wits"]
Event options
--event.json Pass single event to lambda from provided file [string]
--event.assetId Event asset id [number] [default: 1234]
--event.companyId Event company id [number] [default: 1]
--event.appKey App key [string] [default: "my-company.my-drilling-app"]
--event.sourceType Event source type [string] [choices: "drilling", "drillout", "wireline", "frac"] [default: "drilling"]
--event.config.batchSize Max amount of records in event [number] [default: 10]
--event.config.interval Event invocation interval in seconds (min 60): [number] [default: 60]
--event.config.logType [choices: "time", "depth"] [default: "time"]
Export options
--export.enabled Enables data exporting [boolean] [default: false]
--export.collections Collections to export [array] [default: []]
--export.format Format for export [string] [choices: "json", "csv"] [default: "json"]
--export.fields Fields to be exported in csv [string] [default: {}]
Options:
--skip-version-upgrade Display warning on version update [boolean] [default: false]
--help Show help [boolean]
Before running your app, you should decide what type of events it consumes. Currently, the app re-runner supports Real-Time (Stream) and Polling (Scheduled) apps.
Here are some examples of such events:
Stream Event:
[
{
"metadata": {
"apps": {
"corva.wits-depth-summary": {
"app_connection_id": 123
}
},
"app_stream_id": 456
},
"records": [
{
"asset_id": 1,
"timestamp": 1546300800,
"company_id": 24,
"version": 1,
"data": {
"hole_depth": 99.4,
"weight_on_bit": 1,
"state": "Some unnecessary drilling that's excluded"
}
},
{
"asset_id": 1,
"timestamp": 1546300800,
"company_id": 24,
"version": 1,
"data": {
"hole_depth": 99.4,
"weight_on_bit": 1,
"state": "Rotary Drilling"
}
},
{
"asset_id": 1,
"timestamp": 1546300900,
"company_id": 24,
"version": 1,
"data": {
"hole_depth": 99.5,
"weight_on_bit": 1,
"state": "Rotary Drilling"
}
},
{
"asset_id": 1,
"timestamp": 1546301000,
"company_id": 24,
"version": 1,
"data": {
"hole_depth": 99.9,
"weight_on_bit": 1,
"state": "Rotary Drilling"
}
},
{
"asset_id": 1,
"timestamp": 1546301100,
"company_id": 24,
"version": 1,
"data": {
"hole_depth": 100.3,
"weight_on_bit": 1,
"state": "Rotary Drilling"
}
},
{
"asset_id": 1,
"timestamp": 1546301200,
"company_id": 24,
"version": 1,
"data": {
"hole_depth": 100.5,
"weight_on_bit": 1,
"state": "Rotary Drilling"
}
},
{
"asset_id": 1,
"timestamp": 1546301300,
"company_id": 24,
"version": 1,
"data": {
"hole_depth": 100.6,
"weight_on_bit": 1,
"state": "Rotary Drilling"
}
}
]
}
]
Scheduled Event:
{
"collection": "operations",
"source_type": "drilling",
"environment": "qa",
"interval": 300,
"schedule_start": 1578420000000,
"schedule_end": 1578420300000,
"asset_id": 16280
}
Based on the app type, LTF generates one of these event types. By default, for stream apps it queries the WITS collection and creates events from the very beginning of the well until drilling is completed. For scheduled apps, it takes the interval between the first and last WITS records and generates 300-second events within that interval.
It's also possible to launch the app for a single event; see the LTF help for how.
To create a .corvarc configuration file, run corva init in your $CWD.
Example:
{
"version": 4,
"lambda": {
"type": "scheduler",
"handler": "index.handler",
"env": [
"LOG_LEVEL=DEBUG"
],
"port": 9002
},
"infra": {
"stopContainers": false,
"restartContainers": false,
"redis": {
"image": "redis:latest",
"port": 6380
},
"mongo": {
"image": "mongo",
"port": 27018
},
"apiLite": {
"image": "corva/corva-api-lite:latest",
"mode": "data",
"port": 3008
}
},
"source": {
"import": true,
"dir": "./test-sources",
"database": "corva",
"collection": "runner#wits",
"limits": {}
},
"event": {
"assetId": 1234,
"companyId": 1,
"sourceType": "drilling",
"config": {
"batchSize": 10,
"interval": 60
},
"appKey": "my-company.my-drilling-app",
"options": {}
},
"export": {
"enabled": true,
"format": "json",
"collections": [
"corva#destination-collection"
]
},
"debugOutput": "file"
}
Here's an example of how to launch LTF:
corva local --lambda.env MY_ENV_VARIABLE="<some_value>" --lambda.image=test-lambda
--lambda.image=test-lambda - the name/URI of the Docker image of your application
--lambda.env MY_ENV_VARIABLE="<some_value>" - environment variables that will be passed to the lambda Docker container
You may override values from the config file with CLI options.
Useful variables:
LOG_LEVEL controls which log messages are printed (info by default), e.g. LOG_LEVEL=debug allows debug, info, warn, and error messages, while LOG_LEVEL=error permits error messages only
LOG_THRESHOLD_MESSAGE_SIZE controls how many symbols are printed per logger invocation (1000 by default), e.g. LOG_THRESHOLD_MESSAGE_SIZE=10000. This will not affect the deployed app.
LOG_THRESHOLD_MESSAGE_COUNT controls how many log messages are allowed per lambda invocation (15 by default), e.g. LOG_THRESHOLD_MESSAGE_COUNT=100. This will not affect the deployed app.
NOTES:
Priority of the environment variables:
CLI lambda.env options
.corvarc file lambda.env options
settings.env section from manifest.json file.
The CLI lambda.env options replace the options provided in .corvarc.
CLI/.corvarc options extend the options provided in manifest.json.
Only the settings.env section from manifest.json is applied on app deployment.
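The priority order above behaves like plain object merging, where later sources replace keys from earlier ones. A sketch (the variable values are assumptions, not from the source):

```javascript
// Sketch of env layering: manifest.json < .corvarc < CLI options.
const manifestEnv = { LOG_LEVEL: 'INFO', API_URL: 'https://example.com' }; // settings.env from manifest.json
const corvarcEnv = { LOG_LEVEL: 'DEBUG' };                                 // lambda.env from .corvarc
const cliEnv = { MY_ENV_VARIABLE: 'some_value' };                          // --lambda.env CLI options

// Later spreads override earlier keys
const effectiveEnv = { ...manifestEnv, ...corvarcEnv, ...cliEnv };
// effectiveEnv.LOG_LEVEL === 'DEBUG'
```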
In general, LTF follows these steps: it starts the required containers (redis, mongo, corva-api-lite, your app), imports the test data, generates events, and invokes your app through corva-api-lite.
To save the app's debug output to a file, set the debugOutput config option to file.
To stop the containers after the test run, use infra.stopContainers and set it to true.
To restart the containers on each run, use infra.restartContainers and set it to true.
It is possible to export the content of the MongoDB collections in JSON or CSV format.
The export results will be available in the output directory in your $CWD. The file name format is $LAMBDA_NAME_$ASSETID_$TIMESTAMP_$COLLECTION.$FORMAT, e.g. dev-center-gamma-depth_1234_1615369834679_corva#data.drillstring.csv
To export in JSON, set the export property to:
{
enabled: true,
format: 'json',
collections: ['corva#example1', 'corva#example2'] // list of collections you want to export
}
To export in CSV, set the export property to:
{
enabled: true,
format: 'csv',
collections: ['corva#example1', 'corva#example2'], // list of collections you want to export
fields: {
'corva#example1': ['field1', 'field2', 'deep.structure.field'],
'corva#example2': ['some-other-field', 'field_with_dashes']
}
}
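The dotted keys in fields (e.g. 'deep.structure.field') suggest nested document paths flattened into CSV columns. A hypothetical sketch of such path resolution (illustrative only, not LTF's actual implementation):

```javascript
// Resolve a dot path like 'deep.structure.field' against a document (illustrative).
function resolvePath(doc, path) {
  return path
    .split('.')
    .reduce((obj, key) => (obj == null ? undefined : obj[key]), doc);
}

const doc = { field1: 'a', deep: { structure: { field: 42 } } };
resolvePath(doc, 'deep.structure.field'); // 42
```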