pytest-codspeed


Pytest plugin to create CodSpeed benchmarks

Documentation: https://codspeed.io/docs/reference/pytest-codspeed

Installation

pip install pytest-codspeed

Usage

Creating benchmarks

In a nutshell, pytest-codspeed offers two approaches to creating performance benchmarks that integrate seamlessly with your existing test suite.

Use @pytest.mark.benchmark to measure entire test functions automatically:

import pytest
from statistics import median

@pytest.mark.benchmark
def test_median_performance():
    # The whole test function is measured as a single benchmark
    data = [1, 2, 3, 4, 5]
    output = median(data)
    assert output == 3

Since this measures the entire function, you might want to use the benchmark fixture for precise control over what code gets measured:

def test_sum_squares(benchmark):
    data = [1, 2, 3, 4, 5]
    # Only the code inside the benchmark() call is measured
    result = benchmark(lambda: sum(i**2 for i in data))
    assert result == 55
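
Because benchmarks are plain pytest tests, they compose with the rest of pytest. As a rough sketch (the sum_squares helper and the input sizes below are made up for illustration, and the fixture is assumed to follow the pytest-benchmark-style convention of forwarding extra positional arguments to the benchmarked callable), a parametrized benchmark could look like this:

import pytest

def sum_squares(values):
    return sum(i**2 for i in values)

# Hypothetical input sizes; each parametrization is collected as its own benchmark
@pytest.mark.parametrize("size", [10, 1_000, 100_000])
def test_sum_squares_scaling(benchmark, size):
    data = list(range(size))
    # Positional arguments after the callable are forwarded to it
    # (assumed pytest-benchmark-compatible calling convention)
    result = benchmark(sum_squares, data)
    assert result == sum_squares(data)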

Check out the full documentation for more details.

Testing the benchmarks locally

If you want to run the benchmark tests locally, you can use the --codspeed pytest flag:

$ pytest tests/ --codspeed
============================= test session starts ====================
platform darwin -- Python 3.13.0, pytest-7.4.4, pluggy-1.5.0
codspeed: 3.0.0 (enabled, mode: walltime, timer_resolution: 41.7ns)
rootdir: /home/user/codspeed-test, configfile: pytest.ini
plugins: codspeed-3.0.0
collected 1 item

tests/test_sum_squares.py .                                    [ 100%]

                         Benchmark Results
┏━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━┳━━━━━━━━┓
┃     Benchmark  ┃ Time (best) ┃ Rel. StdDev ┃ Run time ┃ Iters  ┃
┣━━━━━━━━━━━━━━━━╋━━━━━━━━━━━━━╋━━━━━━━━━━━━━╋━━━━━━━━━━╋━━━━━━━━┫
┃test_sum_squares┃     1,873ns ┃        4.8% ┃    3.00s ┃ 66,930 ┃
┗━━━━━━━━━━━━━━━━┻━━━━━━━━━━━━━┻━━━━━━━━━━━━━┻━━━━━━━━━━┻━━━━━━━━┛
=============================== 1 benchmarked ========================
=============================== 1 passed in 4.12s ====================

Running the benchmarks in your CI

You can use the CodSpeedHQ/action to run the benchmarks in GitHub Actions and upload the results to CodSpeed.

Here is an example of a GitHub Actions workflow that runs the benchmarks and reports the results to CodSpeed on every push to the main branch and every pull request:

name: CodSpeed

on:
  push:
    branches:
      - "main" # or "master"
  pull_request:
  # `workflow_dispatch` allows CodSpeed to trigger backtest
  # performance analysis in order to generate initial data.
  workflow_dispatch:

jobs:
  benchmarks:
    name: Run benchmarks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.13"

      - name: Install dependencies
        run: pip install -r requirements.txt

      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          run: pytest tests/ --codspeed
