
document-service
Document Service allows consumers to get and update data about Endpoint Documents.
The Document Service manages the state related to documents that are shared between our customers and operations teams. All requests to create, access, or modify documents - by both internal and external systems - are serviced here.
Further documentation on the service can be found here.
Note: To support multi-tenancy in our platform, apps/document-service/src/AppModule.ts#32 configures the application to use the common endpoint auth guard. The guard is enabled only in non-local environments so that local development is not forced through auth.
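As a rough sketch (not the actual guard wiring, which lives in AppModule.ts#32), the enable/disable decision boils down to something like the following; the function name is hypothetical:

```typescript
// Hypothetical sketch of the decision made in AppModule.ts#32: the common
// endpoint auth guard is registered in every environment except local,
// so local development keeps a simpler developer experience.
function shouldEnableAuthGuard(deployEnv: string | undefined): boolean {
  return deployEnv !== undefined && deployEnv !== "local";
}
```

In practice this would be driven by something like `shouldEnableAuthGuard(process.env.DEPLOY_ENV)` when the module is configured.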
This service is written in TypeScript, using the NestJS framework. It is packaged and run as a Docker container, which internally contains the NodeJS runtime.
Other significant technologies to note are:
To run the service locally you will, at minimum, need Docker installed. See the local development instructions for details. If you want to run scripts from package.json outside of the Docker container, ensure that your local Node version is compatible with the version range listed in package.json -> engines -> node.
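A quick way to sanity-check your local Node version against an engines range is a major-version comparison; the sketch below is illustrative only, and the ">=16 <19" range is an assumption, not what package.json actually declares:

```typescript
// Extract the major version from a Node version string such as "v18.12.1".
function majorVersion(version: string): number {
  return Number(version.replace(/^v/, "").split(".")[0]);
}

// Hypothetical check against an assumed engines range of ">=16 <19";
// substitute whatever package.json -> engines -> node actually says.
function satisfiesEngines(version: string, minMajor: number, maxMajorExclusive: number): boolean {
  const major = majorVersion(version);
  return major >= minMajor && major < maxMajorExclusive;
}

// e.g. satisfiesEngines(process.version, 16, 19)
```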
The service is backed by PostgreSQL. TypeORM manages the repository layer and migrations.
docker-compose up -d db
The run scripts in package.json allow easy execution of the five basic migration commands: create, generate, show, run, and revert.
yarn orm:mig:show:local
It shows all ready-for-execution migrations by iterating over the compiled files in the dist/migrations folder.
Note: you must build the app first to produce the JavaScript-format migrations.
yarn orm:mig:create -n <migration_name>
It generates a migration class with empty up() and down() methods under the following path:
./migrations/<numeric_timestamp>-<migration_name>.ts, with class name <migration_name><numeric_timestamp>
Example
migrations/1655932765632-MigrationName.ts
yarn orm:mig:revert:local
It reverts the queries executed by the most recent migration run, i.e. it runs the queries defined in the down() method of the migration classes.
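For illustration, a filled-in migration might look like the sketch below. The class name, table, and SQL are hypothetical, and TypeORM's QueryRunner is replaced by a minimal local interface so the sketch is self-contained (a real generated migration implements TypeORM's MigrationInterface):

```typescript
// Minimal stand-in for TypeORM's QueryRunner, just enough for this sketch.
interface QueryRunner {
  query(sql: string): Promise<unknown>;
}

// Hypothetical migration; the class name follows the
// <migration_name><numeric_timestamp> pattern described above.
export class AddExampleColumn1655932765632 {
  // Executed when migrations are applied (the `run` command).
  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "document" ADD "example" text`);
  }

  // Executed by the `revert` command to undo this migration.
  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "document" DROP COLUMN "example"`);
  }
}
```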
Ask devops to perform the following actions:
It is possible to execute the Document Service almost entirely using local infrastructure. This is made possible by auto-mocking several external services and using localstack to mock AWS infrastructure.
If one has not been created already, create a .env file at the root of the application. This file should be automatically git-ignored. The file should contain values for the following variables:

<your NPM token>
LAUNCH_DARKLY_SDK_KEY - the LaunchDarkly SDK key for local development. This can be found in AWS Secrets Manager.

To start the document-service and all necessary infrastructure, run
docker-compose up
This will start the service using local as the DEPLOY_ENV and watch for any changes via the start:dev script.
To start the document-service in debug mode, first bring up the stack:
docker-compose up
Then find and stop the document-service container so you can run the service directly:
docker container ls
docker container stop <container-id-of-document-service>
Finally, start the service from your shell with the desired environment:
DEPLOY_ENV={ENV} yarn start:dev
Run dev
DEPLOY_ENV=local yarn start:dev
Scaffolds commit messages and enforces the required format using Endpoint's Commitizen configuration.
yarn commit
Runs linter
yarn lint
Fixes linter errors
yarn lint:fix
Resets node_modules, clears yarn cache, and reinstalls node_modules
yarn reset-modules
Runs integration tests using Cypress.
yarn run test:cypress
Use this space to describe any requirements that need to be set up before the app can be deployed to production.
DEPLOY_ENV - designates which of our environments this service is running in (i.e. local, development, integration, test, sandbox, staging, production).
DYNAMO_REGION - the AWS region to connect to for dynamo. In development or testing environments with a localstack dynamo instance, this value should be local.
TODO: Document why this env variable needs to be set and what precisely it influences.
PG_HOST - the URL of the PostgreSQL (RDS) host to connect to. This value is configured in the ep-k8s repo, in the document-service/config/.ENV_NAME.env file.
PG_PASSWORD - the password for the specified PG_HOST. This value is configured and stored in AWS Secrets Manager.
SENTRY_DSN - the DSN URL from Sentry that enables Sentry logging of errors. If not set, Sentry error reporting is disabled.
S3_SERVER_SIDE_ENCRYPTION - the type of encryption to set for objects stored in S3. Information about server-side encryption (SSE) can be found in the S3 SSE documentation.
S3_REGION - the AWS region where the S3 bucket for this service resides.
S3_BUCKET_NAME - the name of the S3 bucket to connect to.
ENABLE_REQUEST_LOGGING - enables logging for incoming requests through the Endpoint Core Module when set to true (see docs).
PROVIDER_TEMPLATES_SYNCHRONIZE_CIRCUIT_BREAKER_MILLISECONDS - the maximum timeout, in milliseconds, for synchronizing document templates from the provider.
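As an illustration of how DEPLOY_ENV could be validated at startup, here is a minimal sketch; the function name and validation strategy are assumptions, and the real service may instead rely on a NestJS config schema:

```typescript
// The environments listed for DEPLOY_ENV above.
const KNOWN_ENVS = [
  "local", "development", "integration", "test",
  "sandbox", "staging", "production",
] as const;

type DeployEnv = (typeof KNOWN_ENVS)[number];

// Hypothetical startup check: fail fast on a missing or unknown DEPLOY_ENV.
function resolveDeployEnv(raw: string | undefined): DeployEnv {
  if (raw === undefined || !(KNOWN_ENVS as readonly string[]).includes(raw)) {
    throw new Error(`DEPLOY_ENV must be one of: ${KNOWN_ENVS.join(", ")}`);
  }
  return raw as DeployEnv;
}
```

In practice this would be called as `resolveDeployEnv(process.env.DEPLOY_ENV)` before the app bootstraps.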
To run unit tests:
yarn test
We use jest-dynalite to enable validation of dynamo queries.
Because not all unit test suites need an instance of dynalite, it is only spun up by manually including it in your tests. The simplest way is to include the following import at the top of your test suite:
import "jest-dynalite/withDb";
This will add the required setup and teardown hooks for dynalite. If you need more granular control, see further options in the Advanced Setup instructions.
User Admin credentials are required to run the Cypress tests. The credentials are stored in the Cypress Test User (dev, int, stg) entry in the Engineering vault. Put them in a cypress.env.json file at the project root; this file is already git-ignored and thus safe for secrets:
{
  "ADMIN_USER_NAME": "{username}",
  "ADMIN_USER_PASSWORD": "{password}"
}
Example: to run the Cypress tests locally against a local service process using the development config, start the service in one shell and run the following in another. Cypress is set up to run the integration tests.
yarn run test:cypress
By default, the tests run against your local server. An env variable can be passed to Cypress to specify which environment to run against (development, integration, staging).
yarn run test:cypress -e name=staging
Note: transactionId and documentId will be automatically provided as env vars when running the Cypress run script. For more details, see /cypress/plugins/setUpTransaction/setUpTransaction.ts.

Tags help us organize and run tests based on the type of case they are. Right now, we only tag integration test cases. We use the tagging from the cypress/grep package to tag and run cases.

To add a tag to your test case, add { tags: ['@integration'] } to your it statement.
Ex.
it('C290718 - Open Order - Prelim conversion to Escrow', { tags: ['@integration'] }, () => {
To run tagged test cases you'll need to add grepTags={tags} to your cypress run script, where {tags} is replaced with the text of your tag
Ex.
yarn cypress open --env name=integration,grepTags='@integration'
We use K6 for our performance tests. You'll need to have K6 installed on your machine before attempting to execute the tests.
This script measures the time between a file upload completing and that file being available in Verse. It accomplishes this by uploading a file to document-service and then polling the service for that document's etag. It is configured to simulate 5 virtual users and have each user upload 100 files for a total of 500 uploads.
The following environment variables are required:
TX - a transaction ID
TOKEN - a token to include in the Authorization header of outgoing requests
FILE - a path to a file on your local machine to upload

Make sure you're connected to the VPN, then run:
k6 run -e TX="TRANSACTION_ID" -e TOKEN="TOKEN" -e FILE="PATH_TO_FILE" upload-latency.js
Be sure to substitute a valid transaction ID and authorization token for the TX and TOKEN environment variables.
This script measures the availability of downloading a file. It is configured to simulate 5 virtual users, each downloading 300 files, for a total of 1500 downloads. The script can set up a document, upload it, and then download it.
This script uses the config.json file to set its environment variables:
EP2_URL - ep2-services endpoint (auth token generator)
DOCUMENT_SERVICE_URL - document-service url
FILE_PATH - file to upload and download
COGNITO_CLIENT_ID - cognito client id to get Auth Tokens
COGNITO_CLIENT_SECRET - cognito client secret to get Auth Tokens

Make sure you're connected to the VPN, then run:
yarn build
k6 run download-availability.js
This script measures the availability of uploading a file. It is configured to simulate 5 virtual users, each uploading 20 files, for a total of 100 uploads. The script can set up a document and upload the document's content.
This script uses the config.json file to set its environment variables:
EP2_URL - ep2-services endpoint (auth token generator)
DOCUMENT_SERVICE_URL - document-service url
FILE_PATH - file to upload and download
COGNITO_CLIENT_ID - cognito client id to get Auth Tokens
COGNITO_CLIENT_SECRET - cognito client secret to get Auth Tokens

Make sure you're connected to the VPN, then run:
yarn build
We recommend executing just a single scenario to avoid uploading a large number of files to our environments. Scenario list:
k6 run --env SCENARIO=constant_request_rate upload-availability.js