# Visual Regression Tester

An NPM package that exports functions to create visual regression tests.
## Setup

### Requirements

- Docker (see Docker installation steps below)

### Getting started
- Clone the `visual-regression-tester` repository
- Switch to the project root directory
- Check out the `master` branch by executing `git checkout master`
- Execute `npm login --registry=https://porscheui.jfrog.io/porscheui/api/npm/npm/`
- Enter username, password (Artifactory API Key, not Artifactory password!) and e-mail address when asked in the terminal
- Execute `cat ~/.npmrc`, find the following line `//porscheui.jfrog.io/porscheui/api/npm/npm/:_authToken=` and copy the generated npm registry token from the file to your clipboard
- Create an `.env` file within the project root directory (never push this file to Git because it will contain secrets; by default it's ignored by `.gitignore`)
- Add the npm registry token in the following format: `PORSCHE_NPM_REGISTRY_TOKEN=YOUR_TOKEN_GOES_HERE`
- Make sure the Docker app is running
- Run `./docker.sh run-install` (this may take up to several minutes at first start, depending on your internet connection)

**Note:** `./docker.sh run-install` should be executed after every pull.
### Setup Prettier

- Go to WebStorm *Preferences*
- Click on the *Plugins* tab and search for *prettier*
- Install Prettier
- In *Preferences*, go to *Languages and Frameworks* -> *JavaScript* -> *Prettier*
- Set *Prettier Package* to `{PATH_TO_YOUR_DIRECTORY}/node_modules/prettier`
- Change *Run for files* to `{**/*,*}.{js,ts,jsx,tsx,vue,scss,json,css}`
- Check the *on save* checkbox and apply. You should be good to go.
- If you have to exclude code from being prettified, see the Prettier configuration
### Docker installation steps

- Register your Docker account on Docker Hub
- Download the Docker app to your machine and log in
- Start Docker
## Start

- Switch to the project root directory
- Run `./docker.sh run-start` (starts the test server for the visual-regression-tester itself)
## Build

- Switch to the project root directory
- Run `./docker.sh run-build` (builds the releasable visual-regression-tester npm package)
## Test

- Switch to the project root directory
- Run `./docker.sh run-test` (executes the tests for the visual-regression-tester)
## Dependency updates

Every week, we update our NPM packages:

- Switch to the project root directory
- Run `./docker.sh run-upgrade`. This outputs the dependencies you might want to update. Select the NPM dependencies to be updated and press Enter.
- Afterwards, run `./docker.sh run-test` to execute the automated tests and make sure the application still works.
## Get the Visual Regression Tester up & running within an application

It's highly recommended to execute the visual regression tester within a Docker container to get reliable test results across any operating system and machine.
### Installation

- Be sure that your project is configured to install npm packages from the Porsche UI Artifactory instance
- Run `npm install --save-dev @porsche-design-system/visual-regression-tester` or `yarn add --dev @porsche-design-system/visual-regression-tester`
### How to start

Check out the Basic integration example to see how to get the visual regression tester up and running.
## API

### VisualRegressionTester

#### Constructor

The constructor expects two parameters:

- `browser: Browser`
- `options: VisualRegressionTestOptions` (optional)

`browser` should be a Puppeteer `Browser` instance. Check the basic integration example for how to create a Puppeteer browser.
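As a minimal sketch of the setup (the basic integration example remains the canonical reference): this assumes a Jest-style test environment, that `VisualRegressionTester` is a named export of the package, and a placeholder `baseUrl`:

```ts
import * as puppeteer from 'puppeteer';
import { VisualRegressionTester } from '@porsche-design-system/visual-regression-tester';

let browser: puppeteer.Browser;
let vrt: VisualRegressionTester;

beforeAll(async () => {
  // Puppeteer Browser instance: the first (required) constructor parameter
  browser = await puppeteer.launch();
  // second parameter (VisualRegressionTestOptions) is optional; see below
  vrt = new VisualRegressionTester(browser, { baseUrl: 'http://localhost:8080' });
});

afterAll(async () => {
  await browser.close();
});
```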
#### test() Method

In the actual visual regression test you have to call the `test(snapshotId: string, scenario: Function, options: TestOptions = {elementSelector: '', maskSelectors: [], regressionSuffix: ''})` method in your expect block, taking a unique name for the reference shot as the first parameter (`snapshotId: string`).

Within the scenario callback function passed as the second parameter (`scenario: Function`), you call the `goTo()` method with the extended URL (which will be concatenated with the `baseUrl`), as well as `click()`, `hover()`, `focus()` and `type()` if necessary, to prepare the state to compare.
The `goTo()`, `click()`, `hover()`, `focus()` and `type()` methods accept the following optional parameters: `networkIdleTimeout: number` and `maxInflightRequests: number`. Loading is considered finished when there are no more than `maxInflightRequests` network connections for at least `networkIdleTimeout` ms.
As a third and optional parameter (`options: TestOptions`) you can pass the following options:

- `elementSelector: string = ''` - pass a CSS selector for the element (the selector is allowed to match exactly one element only) that should be included in your visual regression test.
- `maskSelectors: string[] = []` - pass a string array of CSS selectors for the elements that should be ignored in your visual regression test. If `maskSelectors` is used in combination with `elementSelector`, those two selectors are concatenated automatically to match elements nested in `elementSelector`.
- `regressionSuffix: string = ''` - pass a string to add a suffix to regression filenames.
To make use of Puppeteer's `Page` instance within the `scenario: Function`, call the `getPage()` method and apply any supported Puppeteer method like `click()`, `hover()` or `type()`.
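A hedged sketch combining the pieces above, assuming the `vrt` instance from the constructor sketch and a Jest-style `it` block; the route, the selectors and the falsy assertion on the `test()` result follow the pattern of the basic integration example and are placeholders, not guaranteed API:

```ts
it('should have no visual regression on the button page', async () => {
  expect(
    await vrt.test(
      'button', // snapshotId: unique name of the reference shot
      async () => {
        // scenario: prepare the state to compare; the path is concatenated with baseUrl
        await vrt.goTo('/button'); // optionally accepts networkIdleTimeout / maxInflightRequests
        await vrt.hover('#some-button'); // placeholder selector
        // for anything not covered by the built-in helpers, use Puppeteer's Page directly
        const page = vrt.getPage();
        await page.keyboard.press('Tab');
      },
      { elementSelector: '#app', maskSelectors: ['.current-time'] } // placeholder selectors
    )
  ).toBeFalsy();
});
```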
### VisualRegressionTestOptions

- `viewports` - selects the viewports of your browser
- `deviceScaleFactor` - specifies the device scale factor (can be thought of as DPR)
- `fixturesDir` - directory where the reference shots will be saved
- `resultsDir` - directory where the diffing and regression shots will be saved
- `tolerance` - the tolerance range for your visual regression diffs
- `baseUrl` - the base URL of the page you would like to test
- `timeout` - the timeout limit for page load
- `mode` - defines the method by which the height for the snapshot is determined

**Note:** All `VisualRegressionTestOptions` are optional; these are the default options:

- `viewports: [320, 480, 760, 1000, 1300, 1760]`
- `deviceScaleFactor: 1`
- `fixturesDir: 'vrt/fixtures'`
- `resultsDir: 'vrt/results'`
- `tolerance: 0`
- `baseUrl: 'http://localhost'`
- `timeout: 30000`
- `mode: 'auto'`
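For illustration, a sketch overriding a subset of these defaults at construction time (assuming a `browser` instance as in the constructor sketch above); omitted options keep their default values:

```ts
const vrt = new VisualRegressionTester(browser, {
  viewports: [320, 1000, 1760],     // test fewer viewports than the default set
  tolerance: 0.1,                   // allow small diffs instead of the strict default of 0
  baseUrl: 'http://localhost:8080', // placeholder URL of the app under test
});
```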
## What to do when tests are failing

- Switch to your `resultsDir` directory. Here you can find the corresponding `diff` and `regression` images.
- Check if you would like to accept the changes.
  - If yes: Replace the reference shot in the `fixturesDir` folder with the corresponding one in the `resultsDir` folder, and afterwards manually delete the images in the `resultsDir` directory.
  - If no: Recheck your code and run the tests again when you think you have fixed it.