> [!NOTE]
> This package is under development, so expect breaking changes in future releases.
# Cypress AI

🧪 Cypress AI command that generates E2E tests using an LLM (Large Language Model):

```js
cy.ai(string)
```
## Prerequisites

## Install

NPM:

```sh
npm install cy-ai --save-dev
```

Yarn:

```sh
yarn add cy-ai --dev
```
## Usage

If you're using TypeScript, import the command using ES2015 syntax:

```sh
echo "import 'cy-ai'" >> cypress/support/commands.ts
```

Or if you're using JavaScript, use CommonJS require:

```sh
echo "require('cy-ai')" >> cypress/support/commands.js
```

Start the Ollama server:

```sh
ollama serve
```

Download the LLM:

```sh
ollama pull qwen2.5-coder
```

Write a test:

```js
it('visits example.com', () => {
  cy.ai('go to https://example.com and see heading "Example Domain"')
})
```
> [!TIP]
> If you're running Chrome, disable `chromeWebSecurity` so the LLM requests aren't blocked by CORS:
>
> ```js
> import { defineConfig } from 'cypress'
>
> export default defineConfig({
>   chromeWebSecurity: false,
> })
> ```
## `cy.ai`

Generate Cypress tests with AI:

```js
cy.ai(string[, options])
```

### `llm`

LangChain Runnable to invoke. Defaults to a prompt template using the Ollama model `qwen2.5-coder`.
Use a different large language model:

```js
import { Ollama } from '@langchain/ollama'
import { prompt } from 'cy-ai'

const llm = new Ollama({
  model: 'codellama',
  numCtx: 16384,
})
const chain = prompt.pipe(llm)

cy.ai('prompt', { llm: chain })
```
Or customize the template:

```js
import { PromptTemplate } from '@langchain/core/prompts'
import { Ollama } from '@langchain/ollama'

const llm = new Ollama({
  model: 'codellama',
  numCtx: 16384,
})
const prompt = PromptTemplate.fromTemplate(`
You are writing an E2E test step with Cypress.

Rules:

1. Return JavaScript Cypress code without "describe" and "it".

Task: {task}

HTML:

\`\`\`html
{html}
\`\`\`
`)
const chain = prompt.pipe(llm)

cy.ai('prompt', { llm: chain })
```
> [!IMPORTANT]
> Don't forget to pull the Ollama model:
>
> ```sh
> ollama pull codellama
> ```
### `log`

Whether to display the command logs. Defaults to `true`:

```js
cy.ai('prompt', { log: true })
```

Hide Cypress and console logs:

```js
cy.ai('prompt', { log: false })
```
### `regenerate`

Whether to regenerate the Cypress step with AI. Defaults to `false`:

```js
cy.ai('prompt', { regenerate: false })
```

Regenerate the Cypress step with AI:

```js
cy.ai('prompt', { regenerate: true })
```
### `timeout`

Time to wait in milliseconds. Defaults to 2 minutes:

```js
cy.ai('prompt', { timeout: 120000 })
```

Set the timeout to 5 minutes:

```js
cy.ai('prompt', { timeout: 1000 * 60 * 5 })
```
## `cy.aiConfig`

Configure global options for `cy.ai`:

```js
cy.aiConfig(options)
```

### `options`

Override the default options:

```js
cy.aiConfig({
  llm: chain,
  log: false,
  regenerate: true,
  timeout: 1000 * 60 * 3,
})
```
Set the timeout to 5 minutes:

```js
cy.aiConfig({
  timeout: 1000 * 60 * 5,
})
```
You can also point the `llm` option at a different provider, such as Anthropic.
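For example, a sketch using LangChain's `@langchain/anthropic` package (an assumption here, not something this package ships; it requires an `ANTHROPIC_API_KEY` environment variable, and the model name is illustrative):

```js
import { ChatAnthropic } from '@langchain/anthropic'
import { StringOutputParser } from '@langchain/core/output_parsers'
import { prompt } from 'cy-ai'

// Assumes ANTHROPIC_API_KEY is set; the model name is illustrative.
const llm = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
})

// Chat models return a message object, so parse the reply down to a string.
const chain = prompt.pipe(llm).pipe(new StringOutputParser())

cy.ai('prompt', { llm: chain })
```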
## How It Works

- A prompt is created from your task, the HTML body, and the template.
- The prompt is sent to the LLM server.
- The LLM server responds with Cypress code.
- The Cypress code is cleaned and run.
- If the steps pass, the code is saved to `cypress/e2e/**/__generated__/*.json`.
- If the steps fail, an error is thrown, and the LLM response can be inspected in the browser console.

When running tests, if the generated Cypress code already exists, the command reuses it. To regenerate a step, enable the `regenerate` option or delete the generated code in `cypress/e2e/**/__generated__/*.json`.
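The generate-or-reuse flow above can be sketched as a plain function (hypothetical names; a `Map` stands in for the `__generated__` JSON cache, and this is not the package's actual implementation):

```js
// Sketch of the generate-or-reuse flow: reuse cached Cypress code for a
// task unless regeneration is requested, otherwise ask the LLM again.
const cache = new Map() // stands in for cypress/e2e/**/__generated__/*.json

function aiStep(task, generate, { regenerate = false } = {}) {
  // Reuse previously generated code unless regeneration is requested.
  if (!regenerate && cache.has(task)) {
    return cache.get(task)
  }
  const code = generate(task) // stands in for the LLM round trip
  cache.set(task, code) // the real command saves only after the step passes
  return code
}

// The second call returns cached code without touching the "LLM".
let llmCalls = 0
const fakeLlm = (task) => {
  llmCalls += 1
  return `// Cypress code for: ${task}`
}
aiStep('visit example.com', fakeLlm)
aiStep('visit example.com', fakeLlm)
aiStep('visit example.com', fakeLlm, { regenerate: true })
```

After these three calls, `llmCalls` is 2: the second call was served from the cache, and only the `regenerate: true` call forced a fresh generation.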
> [!WARNING]
> If you have tests with duplicate titles (`describe` and `it`), the generated tests may fail.
## Release

Releases are automated with Release Please.

## Resources

## License

MIT