
@redocly/christi
You MUST have a working API server running in order to run the tests.
Install christi:
npm i @redocly/christi --registry http://3.236.95.236:8000/
Auto-generate a test config from your OpenAPI definition and run it:
christi auto-generate-config <your-OAS-definition-file>
christi test-file.yaml
Create your own flows (chained requests) for testing.
christi <your-test-file> [-f | --flow] [-v | --verbose]
Run API tests based on the test-file config.
Options
| Option | Type | Description |
|---|---|---|
| -f, --flow | Array<String> | Flow names to run. Example: christi test-file.yaml --flow first-flow second-flow |
| -v, --verbose | boolean | Verbose mode. Example: christi test-file.yaml --verbose |
| --har-output | string | Path to the HAR file where request logs are saved. Example: christi test-file.yaml --har-output='logs.har' |
Run the tests with the following command: christi <your-test-file>.
christi auto-generate-config <your-OAS-definition-file> [-o | --output-file] [--extended]
Auto-generates the test config file based on the OpenAPI definition file. If examples are provided in the OpenAPI definition, they are used as input data for test requests. If only a schema is provided, the config is generated with fake data based on the definition schema. By default, data for requests comes from the definition at run time; to materialize the tests with the data, use the --extended option.
:::info Tip
Use the --extended option only to learn how christi uses data from a definition. In most cases it is better to use the data from the definition at run time.
:::
Options
| Option | Type | Description |
|---|---|---|
| -o, --output-file | string | Path to the generated test config file. If no file name is provided, the default auto-generate.test.yaml is used. Example: christi auto-generate-config OAS-file.yaml -o=example.tests.yaml |
| --extended | boolean | By default, data for requests comes from the definition at run time. This option generates a test config file with data populated from the definition. Example: christi auto-generate-config OAS-file.yaml -o=example.tests.yaml --extended |
christi [--version] [--help]
Prints the version or help information.
The configuration file is a YAML file with the following fixed fields.
| Property | Type | Description |
|---|---|---|
| flows | Map[string, [TestCase]] | REQUIRED. Defines sequences of API requests and expected responses. |
| before | [TestCase] | List of calls that run before the flows begin and before constructing defaults. |
| definition | string | Path to your OAS definition. |
| inherit | string | Definition data reuse strategy. Possible values: auto, none. Default value: auto. |
| serverUrl | string | Base URL prepended to every request that uses a relative path. |
| defaults | Defaults | Default parameters used for every request. |
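To see how these fields fit together, here is a minimal test-file sketch; the server URL, paths, and credentials are hypothetical placeholders, not values from this package.
```yaml
# Hypothetical test file combining the top-level fields described above.
definition: ./openapi.yaml          # path to your OAS definition
serverUrl: https://api.example.com  # base for requests with a relative path
inherit: auto                       # default definition data reuse strategy
defaults:
  parameters:
    - in: header
      name: Authorization
      value: Bearer ${secrets.TOKEN}
before:
  - name: signIn
    path: /signin
    method: post
    requestBody:
      email: some-email@mail.com
      password: "****"
    expect:
      status: 200
flows:
  smoke:
    - path: /items
      method: get
      expect:
        status: 200
```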
Each step in a flow is a TestCase object with the following properties.
| Property | Type | Description |
|---|---|---|
| path | string | REQUIRED. The API path, appended to the server URL. |
| method | string | REQUIRED. HTTP method. Possible values: GET, POST, PUT, PATCH, DELETE, OPTIONS (TODO: add more). |
| name | string | Unique name assigned to the test, useful for output and request chaining. |
| requestBody | object | JSON (YAML) object. |
| parameters | [Parameter] | List of parameters and values. |
| expect | Expect | Expectations of the response. |
| assert | Assertion | Custom assertion. See the assertion description for more details. |
| inherit | string | Definition data reuse strategy. Possible values: auto, none. Default value: auto. |
| content-type | string | Request body content type. It is different from the Content-Type header. Use it only if your server doesn't accept that header. |
This example contains the flows object:
```yaml
flows:
  my-testing-flow:
    - path: https://jsonplaceholder.typicode.com/posts/1
      method: get
      expect:
        status: 200
        body:
          matchesObject:
            id: 1
```
Each flow is isolated and consists of steps declared as an array. Each step MUST contain the path and method properties. You can also specify a name to be able to reference its response in later steps.
You can also create a chain of requests and reference each response (see How to reference responses):
```yaml
flows:
  my-testing-flow:
    - name: createItem
      path: /items
      method: post
      requestBody:
        title: foo
      expect:
        status: 201
    - path: /items/{id}
      method: get
      parameters:
        - name: id
          in: path
          value: ${responses.createItem.body.id}
      expect:
        status: 200
        schema:
          type: object
          properties:
            id:
              type: number
            title:
              type: string
```
The status and schema fields can be omitted and inferred from the OAS definition.
The Expect object describes the expected response.
| Property | Type | Description |
|---|---|---|
| status | integer \| [integer] | HTTP status code of the response (or a list of acceptable status codes). |
| schema | object | JSON schema. If an OpenAPI definition is provided, the appropriate schema is determined automatically; setting this value overrides it. |
| body | Body | Expectations about the response body. |
| mimeType | string | Expected mime type of the response. |
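For example, a hedged sketch of an expect block that uses a status list, an explicit mime type, and a schema override; the endpoint is hypothetical.
```yaml
flows:
  check-item:
    - path: /items/1
      method: get
      expect:
        status: [200, 304]          # any listed status code is accepted
        mimeType: application/json  # expected response mime type
        schema:                     # overrides the schema inferred from the OAS definition
          type: object
          properties:
            id:
              type: number
```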
The Body object defines how the response body is compared.
| Property | Type | Description |
|---|---|---|
| equal | any | Compares the response body for strict equality with the passed value. |
| matches | string | Checks that the response body matches the passed string. |
| matchesObject | object | Compares properties in the response body to those in the defined object. |
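A short sketch contrasting the three comparators; the paths and values are hypothetical.
```yaml
flows:
  body-comparisons:
    - path: /status
      method: get
      expect:
        body:
          matches: OK        # the response body must match this string
    - path: /items/1
      method: get
      expect:
        body:
          matchesObject:     # only the listed properties are compared
            id: 1
    - path: /items/1
      method: get
      expect:
        body:
          equal:             # the whole body must be strictly equal to this value
            id: 1
            title: foo
```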
The Parameter object describes a single request parameter.
| Property | Type | Description |
|---|---|---|
| name | string | REQUIRED. Parameter name. |
| in | string | REQUIRED. Parameter location. Possible values: query, header, path, cookie. |
| value | any | REQUIRED. Parameter value. |
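The examples in this document only use the path and header locations; a hedged fragment of a step's parameters list using the query and cookie locations follows (parameter names are hypothetical).
```yaml
parameters:
  - in: query
    name: limit
    value: 10
  - in: cookie
    name: session
    value: ${secrets.SESSION}
```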
The Defaults object holds parameters applied to every request.
| Property | Type | Description |
|---|---|---|
| parameters | [Parameter] | Default parameters to include in every request. |
```yaml
defaults:
  parameters:
    - in: header
      name: Authorization
      value: Bearer ${secrets.TOKEN}
```
The runner reads secrets from environment variables:
TOKEN=<your-token> christi <your-test-file>
before configures API requests that run before all of the flows and before constructing Defaults. It has the same structure as a flow.
```yaml
before:
  - path: /signin
    name: signIn
    method: post
    inherit: none
    requestBody:
      email: some-email@mail.com
      password: "****"
    expect:
      status: 200
```
Inheritance works the same as for flows. The only difference is that you can reference responses from this block in your flows:
```yaml
flows:
  my-testing-flow:
    - name: createItem
      path: /items
      method: post
      parameters:
        - in: header
          name: Authorization
          value: Bearer ${ beforeResponses.signIn.body.token }
      requestBody:
        title: foo
```
Or you can reference it from the defaults object:
```yaml
defaults:
  parameters:
    - in: header
      name: Authorization
      value: Bearer ${beforeResponses.signIn.body.token}
```
Responses are scoped to their flow. To extract any data (status, body, etc.) from another response, that response must be declared prior to the current request and must have a name field provided.
Example:
${responses.<yourRequestName>.body} or ${responses.<yourRequestName>.status}
There can be only one ${...} reference per value (for example, you cannot concatenate multiple references into a single value). If it is embedded in a string (for example, Bearer ${responses.signIn.body.token}), it is treated as a string.
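To make the single-reference rule concrete, a hedged sketch; the unsupported concatenation is shown only as a comment.
```yaml
parameters:
  - in: header
    name: Authorization
    # Valid: a single reference embedded in a string.
    value: Bearer ${responses.signIn.body.token}
    # Not supported: concatenating two references into one value, e.g.
    # value: ${responses.signIn.body.token}-${responses.signIn.body.id}
```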
See Context for more data available.
If you provided an OAS definition, you can reuse its data in the flows.
You can define the strategy using the inherit keyword in each step of the flow.
By default it is inherit: auto, which means that the runner tries to figure out parameters (only required ones), requestBody, expect.status, and expect.schema from the definition if it exists.
If you want to redefine any of these fields, you can do it explicitly.
You can opt out of any data from the definition by using inherit: none on a test step.
It is also possible to define the default inherit strategy at the top level for the whole test file.
The runner first applies the data from the definition (least priority) if provided, then defaults, and then the flows (highest priority).
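A hedged sketch of mixing strategies: the file opts out of definition data by default, while one step opts back in; the paths are hypothetical.
```yaml
definition: ./openapi.yaml
inherit: none          # top-level default: do not reuse definition data
flows:
  items:
    - path: /items
      method: get
      inherit: auto    # this step infers required parameters, expect.status, and expect.schema
    - path: /items/{id}
      method: get      # this step inherits nothing, so everything is explicit
      parameters:
        - in: path
          name: id
          value: 1
      expect:
        status: 200
```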
Context is accessible by using the ${ <fieldsFromContext> } notation.
| Property | Type | Description |
|---|---|---|
| secrets | Map[string, string] | Secrets passed as environment variables. |
| definition | Map[string, any] | JSON representation of the linked OAS definition. |
| responses | Map[string, any] | Responses from requests of the specific flow. NOTE: responses are scoped to their flow (you cannot access a response from a different flow). Responses are filled in as the flow progresses through its requests. See How to reference responses for details. |
| requests | Map[string, any] | Request data of the specific flow. NOTE: requests are scoped to their flow (you cannot access a request from a different flow). |
| beforeResponses | Map[string, any] | Responses from the TestCase items that run before the flows as part of the setup process. |
| beforeRequests | Map[string, any] | Requests of the TestCase items that run before the flows as part of the setup process. |
| faker | Faker Object | Faker to generate random data. See Faker Object for more details. |
| storage | Storage | Key-value storage object. |
You can use the key-value storage in the Context to store and retrieve data between requests. Example:
```yaml
flows:
  using-storage:
    - path: /items
      method: post
      assert: |
        (expect, response, ctx) => {
          ctx.storage.setItem('created-item-id', response.body.id)
        }
    - path: /items/{id}
      method: get
      parameters:
        - in: path
          name: id
          value: ${storage.data.created-item-id}
      expect:
        status: 200
        body:
          matchesObject:
            id: ${storage.data.created-item-id}
```
If you have an OAS definition, you may provide the path to the file:
definition: ./path/to/definition.yaml
and then reference it in a flow or in the defaults section, e.g.: ${definition.paths.items/{id}.responses.201.content.application/json.schema}.
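For illustration, a hedged sketch that reuses a schema from the linked definition in an expect block; the reference expression is copied from the example above, and the flow itself is hypothetical.
```yaml
definition: ./path/to/definition.yaml
flows:
  create-item:
    - path: /items
      method: post
      requestBody:
        title: foo
      expect:
        status: 201
        # Same reference expression as shown above.
        schema: ${definition.paths.items/{id}.responses.201.content.application/json.schema}
```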
The faker object generates random data of the following types.
| Data Type | Type | Usage example | Output example |
|---|---|---|---|
| string | FakeString | ${ faker.string.fullName() } | Camille Mohr |
| number | FakeNumber | ${ faker.number.integer({ min: 10, max: 20 }) } | 12 |
| address | FakeAddress | ${ faker.address.city() } | Lake Raoulfort |
| date | FakeDate | ${ faker.date.past() } | Sat Oct 20 2018 04:19:38 GMT-0700 (Pacific Daylight Time) |
Example:
```yaml
flows:
  create-feedback:
    - name: addFeedback
      path: /feedback
      method: post
      requestBody:
        contentId: ${ faker.string.uuid() }
        rating: '${ faker.number.integer({ min: 1, max: 5 }) }'
        suggestion: A suggestion.
        sentiment: true
        reason: one
```
FakeString methods:
| Method | Parameters | Usage example |
|---|---|---|
| userName | - | ${ faker.string.userName() } |
| firstName | - | ${ faker.string.firstName() } |
| lastName | - | ${ faker.string.lastName() } |
| fullName | - | ${ faker.string.fullName() } |
| email | { provider?: string } | ${ faker.string.email({ provider: 'gmail' }) } |
| uuid | - | ${ faker.string.uuid() } |
| string | { length?: number } | ${ faker.string.string({ length: 5 }) } |
FakeNumber methods:
| Method | Parameters | Usage example |
|---|---|---|
| integer | { min?: number; max?: number } | ${ faker.number.integer({ max: 30 }) } |
| float | { min?: number; max?: number; precision?: number } | ${ faker.number.float({ precision: 0.001 }) } |
FakeAddress methods:
| Method | Parameters | Usage example |
|---|---|---|
| city | - | ${ faker.address.city() } |
| country | - | ${ faker.address.country() } |
| zipCode | - | ${ faker.address.zipCode() } |
| street | - | ${ faker.address.street() } |
FakeDate methods:
| Method | Parameters | Usage example |
|---|---|---|
| past | - | ${ faker.date.past() } |
| future | - | ${ faker.date.future() } |
You can write your own custom assertions by providing a string containing a valid JavaScript function, as in the following example.
```yaml
assert: |
  (expect, response) => {
    expect(response.body.message).match(/OK/i);
  }
```
Or use a reference to a .js file, as in the following example.
```yaml
assert:
  $ref: ./custom-assertion.js
```
The assertion function accepts the following arguments.
| Argument | Description |
|---|---|
| expect | Expect function. (Caveat: currently it supports function expressions only, e.g. () => {} or (function a () {}). A function declaration must be wrapped in the grouping operator () in order for vm.Script to parse it properly.) |
| response | Response of the request under testing. |
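To illustrate the caveat about function declarations, a hedged sketch; the endpoint is hypothetical and the message check mirrors the inline example above.
```yaml
flows:
  health-check:
    - path: /status
      method: get
      assert: |
        (function checkStatus(expect, response) {
          // A function declaration must be wrapped in () so vm.Script can parse it.
          expect(response.body.message).match(/OK/i);
        })
```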