overlooker
This package is a set of utilities for setting up frontend profiling in your CI/CD pipeline.
Along the way, you get detailed information about what your pages contain.
Overlooker runs batch profiling of a set of pages and collects all the frontend metrics that matter.
You can set the number of measurement runs per page,
write scripts whose execution speed should be measured,
analyze the resources loaded by your pages, and much more.
You can also compare the collected data with previous profiles
and get a report on what has changed on your pages.
For performance testing, you can define thresholds
that limit performance degradation in your project and help you track down exactly what caused it.
Installation
npm i overlooker
Usage
First, let's figure out how to start profiling.
const config = {
  host: 'https://example.com',
  throttling: {
    cpu: 1,            // CPU slowdown multiplier
    network: 'WiFi'    // network throttling preset
  },
  cookies: [{
    name: 'cookie_name',
    value: 'cookie_value',
    domain: 'example.com'
  }],
  cache: {
    type: 'wpr'
  },
  count: 10,           // number of measurement runs per page
  platform: 'desktop',
  pages: [{
    name: 'main',
    url: '/',
    layers: {
      meaningfulLayer: '.selector'
    },
    actions: [{        // scripts whose execution speed is measured
      name: 'test-action',
      action: async (page) => {
        await page.click('button');
        await page.waitForSelector('#loaded-image');
      }
    }]
  }, {
    name: 'category',
    url: '/'
  }],
  logger: (msg) => console.log(msg),
  buildData: {
    url: '/build.json' // build data produced by bundle-internals-plugin
  },
  requests: {
    ignore: (url) => url.includes('ad'),
    merge: (url) => url.includes('stats'),
    internalTest: (url) => url.startsWith('https://example.com') || url.startsWith('https://example.io')
  }
};
Most of the parameters are optional, and a minimal config can look like this:
const config = {
  host: 'https://example.com',
  pages: [{
    name: 'main',
    url: '/'
  }],
  count: 10
};
To collect product-centric metrics, you can use standard browser APIs on your pages:
Element Timing API
<span elementtiming="some-element-paint"></span>
User Timing API
const config = {
  customMetrics: {
    timing: /^product-timing\.(.*?)$/i
  }
};
and on your page execute a mark whose name matches that pattern:
performance.mark('product-timing.some-metric');
These metrics will be collected during profiling and included in the resulting JSON.
Running the profiler itself is straightforward:
const { profile } = require('overlooker');
const profileResult = await profile({
  ...config,
  host: 'https://master.example.com'
});
await db.saveProfile(revision, profileResult);
As a result, you will receive performance data for your pages in JSON format.
I recommend saving the data, keyed by the identifier of the measured revision, to your favorite database or directly to the file system (do not forget about rotation).
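If you go with the file system, a minimal sketch could look like this (the profiles directory, the saveProfileToDisk helper and the revision variable are illustrative, not part of overlooker):
const fs = require('fs/promises');
const path = require('path');

// Store one profile per revision as profiles/<revision>.json.
const saveProfileToDisk = async (revision, profileResult) => {
  const dir = path.join(process.cwd(), 'profiles');
  await fs.mkdir(dir, { recursive: true });
  await fs.writeFile(path.join(dir, `${revision}.json`), JSON.stringify(profileResult));
};

await saveProfileToDisk(revision, profileResult);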
Impact Analysis (types)
To reduce the cost of profiling and speed it up, I recommend using the impact analyzer for your pages.
To use it, you need impact analysis data from a previous run.
const { impactAnalysis, affectConfigByImpact } = require('overlooker');
const impactData = await impactAnalysis(
  masterDescription,
  config,
  (element) => element.request.url.includes('ad')
);
await db.saveImpactData(revision, impactData);
const impactedConfig = affectConfigByImpact(config, impactData);
Executing this code gives you a configuration containing only the pages that have changed compared to the other revision.
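You can then feed the reduced configuration back into the profiler, reusing the profile call shown above (a sketch):
// Profile only the pages that the impact analysis flagged as changed.
const profileResult = await profile(impactedConfig);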
Impact analysis data is best saved in a database too.
Once profiling is finished, you can compare the result against a profile from an earlier revision,
or against the profile of the revision your branch was forked from.
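One way to obtain that fork-point revision in CI is to ask git for the merge base (a sketch; the origin/master branch name is an assumption about your setup):
const { execSync } = require('child_process');

// Revision where the current branch diverged from master.
const masterRevision = execSync('git merge-base origin/master HEAD').toString().trim();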
const { comparePages } = require('overlooker');
const profileDataFeature = await db.getProfileByRevision(featureRevision);
const profileDataMaster = await db.getProfileByRevision(masterRevision);
const comparison = comparePages(profileDataMaster, profileDataFeature);
The comparison gives you the full difference for all metrics,
requests and modules in chunks (if you used bundle-internals-plugin), and this data
can be used to analyze the performance impact of your code changes.
Analyzing the comparison results (types)
There is also a separate method for checking the comparison results against custom thresholds.
But first, let's see how to define thresholds for your performance metrics;
they allow you to set the deviation limits you need.
Example of thresholds:
const thresholds = {
  default: {
    'percent.stats.userCentric.timeToInteractive.median': 0.05,
    'percent.stats.elementsTimings.**.median': 0.05
  },
  main: {
    'percent.stats.custom.timings.*.median': 0.1,
    'percent.stats.custom.userCentric.{timeToInteractive, speedIndex}.median': 0.1
  }
};
And you can use these thresholds to check your comparison:
const { check } = require('overlooker');
const result = check(comparison, thresholds);
if (!result.success) {
  process.exit(1);
}
You can also use this approach for performance budgeting by running the check method
on a single profile.
const { check } = require('overlooker');
const result = check(profileDataFeature, budget);
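A budget here is a thresholds-like object applied to a single profile; the metric path and limit below are purely illustrative and not taken from overlooker's documentation:
// Hypothetical budget: same path syntax as the thresholds above,
// but checked against the absolute values of one profile.
const budget = {
  main: {
    'stats.userCentric.timeToInteractive.median': 3000
  }
};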
Tool for getting build data
Bundle Internals Plugin
Tool for viewing trace events
flame-chart-js