wdio-performancetotal-service
Comparing version 4.0.1-1 to 4.0.2
@@ -78,3 +78,3 @@ "use strict";
         if (!root) {
-            this._logger.error("Performance-Total error: Can't get root folder");
+            this._logger.error("Can't get root folder");
             return "";
@@ -96,3 +96,3 @@ }
         catch (err) {
-            this._logger.error(`Performance-Total error: can't create dir ${dirPath}: ${err}`);
+            this._logger.error(`can't create directory ${dirPath}: ${err}`);
         }
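For context, the two hunks above touch what appears to be the service's results-directory helper. A minimal TypeScript sketch of the pattern, reconstructed from the hunks (the class name, method name, and logger shape are assumptions, not the package's exact source):

```typescript
import * as fs from "fs";

class ResultsDirHelper {
    // _logger is assumed to be the package's Plogger instance (see the next file)
    constructor(private _logger: { error(msg: string): void }) {}

    // Resolve the output directory under the project root; log and bail out on failure
    ensureDir(root: string, dirPath: string): string {
        if (!root) {
            // After this change, the "performance-total error:" prefix is added by the logger itself
            this._logger.error("Can't get root folder");
            return "";
        }
        try {
            fs.mkdirSync(dirPath, { recursive: true });
        }
        catch (err) {
            this._logger.error(`can't create directory ${dirPath}: ${err}`);
        }
        return dirPath;
    }
}
```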
@@ -9,5 +9,5 @@ "use strict";
     error(message) {
-        console.error(message);
+        console.error(`performance-total error: ${message}`);
     }
 }
 exports.Plogger = Plogger;
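This is the companion change: rather than repeating the prefix at every call site, the Plogger class now prepends it once. A sketch of the resulting class, reconstructed from the hunk (only the `error` method appears in the diff):

```typescript
class Plogger {
    // Every error now carries the package prefix exactly once,
    // so call sites can pass plain messages
    error(message: string): void {
        console.error(`performance-total error: ${message}`);
    }
}

export { Plogger };
```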
 {
   "name": "wdio-performancetotal-service",
-  "version": "4.0.1-1",
+  "version": "4.0.2",
   "description": "WebdriverIO service for analyzing test flow performance",
@@ -25,2 +25,3 @@ "main": "build/app.js",
   "@types/mocha": "^10.0.1",
+  "@types/node": "^22.10.5",
   "@types/objects-to-csv": "^1.3.1",
   "@wdio/cli": "^9.5.1",
 # performance-total
 Note:<br/>
+For WebdriverIO v9 use version 4.x.x.<br/>
 For WebdriverIO v8 use version 3.x.x.<br/>
@@ -13,3 +14,4 @@ For WebdriverIO v7 use version 2.x.x.<br/>
-<h2>Installation</h2>
+## Installation
 The easiest way to install this module as a dev dependency is by using the following command:
@@ -21,7 +23,7 @@
-<h2>Usage</h2>
+## Usage
 Add wdio-performancetotal-service to your `wdio.conf.js`:
-```
+```typescript
 exports.config = {
@@ -35,3 +37,3 @@ // ...
-```
+```typescript
 exports.config = {
@@ -55,5 +57,5 @@ // ...
-<h2>Options</h2>
+## Options
-<h3>disableAppendToExistingFile</h3>
+### disableAppendToExistingFile
@@ -63,8 +65,12 @@ When set to `true`, new test runs will start fresh and overwrite any existing performance data.
-<h3>performanceResultsFileName</h3>
+> **⚠️ Caution:**
+>
+> This action will delete all your performance data permanently. Ensure that you have a backup before proceeding.
+### performanceResultsFileName
 You can set the default results file name (`performance-results`).
 A newly created results file normally overwrites the old file. If you want to keep old files, it is recommended to add a timestamp to the file name. For example:
-```
+```typescript
 ...
@@ -75,10 +81,12 @@ performanceResultsFileName: `performance-results_${new Date().getTime()}`
-<h3>dropResultsFromFailedTest</h3>
+### dropResultsFromFailedTest
 Default is `false`. When the value is set to `true`, performance analysis from failed tests is excluded.
-<h3>recentDays</h3>
-To set the number of days to consider for performance analysis set the number of days. Partial days are also supported (e.g. `recentDays: 0.5`)
+### recentDays
+Default is `0` (no limit). Set the number of days of data to include in the analysis. Partial days are also supported (e.g. `recentDays: 0.5`).
+### performanceResultsDirectory
 You can override the default path for the results directory in the project's root dir.
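Taken together, the options documented in this section (including `analyzeByBrowser`, covered just below) are passed where the service is registered. A hedged sketch of a `wdio.conf.js` entry; the `['performancetotal', { ... }]` registration shape is an assumption based on standard WebdriverIO service configuration, and the directory path is an example only:

```typescript
exports.config = {
    // ...
    services: [
        ['performancetotal', {
            disableAppendToExistingFile: false, // keep appending to existing results
            performanceResultsFileName: `performance-results_${new Date().getTime()}`, // timestamped file name
            dropResultsFromFailedTest: true,    // exclude failed tests from the analysis
            recentDays: 0.5,                    // only analyze samples from the last 12 hours
            performanceResultsDirectory: 'results-dir/performance-total-results', // example path
            analyzeByBrowser: true              // also break the analysis down by browser type
        }]
    ],
    // ...
};
```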
@@ -93,12 +101,13 @@ For example:
-<h3>analyzeByBrowser</h3>
-Default is `false`. If true, the performance data would be grouped also by the browser type.
+### analyzeByBrowser
+Default is `false`. If `true`, the performance data is also analyzed by browser type.
-<h2>Usage in test</h2>
-Just import <b>performancetotal</b> where you need it, whether it be in your test file or any other class. This object provides methods for measuring performance data in your tests, including sampleStart and sampleEnd for starting and ending performance measurements.
+## Usage in test
+Just import __performancetotal__ where you need it, whether in your test file or any other class. This object provides methods for measuring performance data in your tests, including `sampleStart` and `sampleEnd` for starting and ending performance measurements.
 Here's an example of how you might use the performancetotal object to measure the startup performance of two websites:
-```
+```typescript
 // This test case measures the startup performance of Github and SourceForge using the performancetotal object.
@@ -134,3 +143,3 @@
-```
+```typescript
 // Get the time taken for a single sample
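// Illustrative sketch (not part of the diff): a complete measurement flow using
// the methods this README names - sampleStart, sampleEnd, and getSampleTime.
// The sample name and URL are examples only; `browser` is WebdriverIO's global.
it('measures Github startup', async () => {
    performancetotal.sampleStart('startup-github');
    await browser.url('https://github.com/');
    performancetotal.sampleEnd('startup-github');
    // Read back the elapsed time for this single sample
    const githubStartupTime = performancetotal.getSampleTime('startup-github');
});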
@@ -141,8 +150,12 @@ const sampleTime = performancetotal.getSampleTime(sampleName);
-<h2>Getting the results</h2>
+## Getting the results
 When all the tests are completed, a new results directory is created in your project's root folder (the default directory name is `performance-results`). Inside this directory, two files are created: `performance-results.json` and `performance-results.csv`. These files contain analyzed data for each sample, including the average time, standard error of the mean (SEM), number of samples, minimum value, maximum value, earliest time, and latest time. You can use this data to identify performance regressions or improvements over time.
-<h2>Typescript support</h2>
+## Typescript support
 Typescript is supported for this plugin.
+## Support
+For support and suggestions, feel free to contact me at [tzur.paldi@outlook.com](mailto:tzur.paldi@outlook.com).
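For reference, the analyzed fields listed under "Getting the results" map naturally onto one record per sample. A hedged TypeScript sketch of what an entry in `performance-results.json` carries; the field names here are assumptions derived from the README's description, not the package's exact schema:

```typescript
interface PerformanceResultEntry {
    name: string;          // the sample name passed to sampleStart/sampleEnd
    avgTime: number;       // average time across all samples, in ms
    sem: number;           // standard error of the mean
    repeats: number;       // number of samples collected
    minValue: number;      // fastest sample, in ms
    maxValue: number;      // slowest sample, in ms
    earliestTime: string;  // timestamp of the earliest sample
    latestTime: string;    // timestamp of the latest sample
    brName?: string;       // browser type, when analyzeByBrowser is enabled
}
```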
No v1
Quality: Package is not semver >=1. This means it is not stable and does not support ^ ranges.
Found 1 instance in 1 package.