.. image:: http://img.shields.io/pypi/v/pytest-replay.svg
   :target: https://pypi.python.org/pypi/pytest-replay

.. image:: https://anaconda.org/conda-forge/pytest-replay/badges/version.svg
   :target: https://anaconda.org/conda-forge/pytest-replay

.. image:: https://github.com/ESSS/pytest-replay/workflows/test/badge.svg
   :target: https://github.com/ESSS/pytest-replay/actions?query=workflow%3Atest

.. image:: https://img.shields.io/pypi/pyversions/pytest-replay.svg
   :target: https://pypi.python.org/pypi/pytest-replay

.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
   :target: https://github.com/psf/black

Saves previous test runs and allows re-executing them to reproduce crashes or flaky tests.

This `pytest`_ plugin was generated with `Cookiecutter`_ along with `@hackebrot`_'s `cookiecutter-pytest-plugin`_ template.

This plugin helps to reproduce random or flaky behavior when running tests with ``xdist``. ``pytest-xdist``
executes tests in a non-predictable order, making it hard to reproduce locally a behavior seen in CI because
there's no convenient way to track which test executed in which worker.

This plugin records the node ids executed by each worker in the directory given by the ``--replay-record-dir=<dir>``
flag, and ``--replay=<file>`` can be used to re-run the tests from a previous run. For example::

    $ pytest -n auto --replay-record-dir=build/tests/replay

This will generate one file per worker, where each line is a JSON object with the following content:
node identification, start time, end time, and outcome. Note that the node id usually appears twice:
a line is written as soon as the test starts, so if a test suddenly crashes there is still a record
that it started. After the test finishes, ``pytest-replay`` adds another JSON line with the complete
information.

This is also useful to analyze concurrent tests which might have some kind of race condition and
interfere with each other.

For example, worker ``gw1`` will generate a file ``.pytest-replay-gw1.txt`` with contents like this::

{"nodeid": "test_foo.py::test[1]", "start": 0.000}
{"nodeid": "test_foo.py::test[1]", "start": 0.000, "finish": 1.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[3]", "start": 1.5}
{"nodeid": "test_foo.py::test[3]", "start": 1.5, "finish": 2.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[5]", "start": 2.5}
{"nodeid": "test_foo.py::test[5]", "start": 2.5, "finish": 3.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[7]", "start": 3.5}
{"nodeid": "test_foo.py::test[7]", "start": 3.5, "finish": 4.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[8]", "start": 4.5}
{"nodeid": "test_foo.py::test[8]", "start": 4.5, "finish": 5.5, "outcome": "passed"}

If there is a crash or a flaky failure in the tests of the worker ``gw1``, one can take that file from
the CI server and execute the tests in the same order with::

    $ pytest --replay=.pytest-replay-gw1.txt

Hopefully this will make it easier to reproduce the problem and fix it.
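
Because each test writes one line when it starts and a second one when it finishes, a replay file also shows
which test was running when a worker died. The sketch below is not part of ``pytest-replay`` (the helper and
file names are only examples); it scans a replay file for tests that have a start record but no matching
finish record:

.. code-block:: python

    import json


    def unfinished_tests(replay_path):
        """Return node ids that started but never finished, mapped to their start time."""
        started = {}
        with open(replay_path, encoding="utf-8") as fp:
            for line in fp:
                line = line.strip()
                if not line:
                    continue
                record = json.loads(line)
                if "finish" in record:
                    # The second line for a node id carries the complete information.
                    started.pop(record["nodeid"], None)
                else:
                    started[record["nodeid"]] = record["start"]
        return started


    # Report suspects from the worker file shown above.
    for nodeid, start in unfinished_tests(".pytest-replay-gw1.txt").items():
        print(f"started but never finished: {nodeid} (start={start})")
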
Version added: 1.6

In cases where it is necessary to add new metadata to the replay file to make the test reproducible,
``pytest-replay`` provides a fixture called ``replay_metadata`` that allows new information to be added
using the ``metadata`` attribute.

Example:

.. code-block:: python

    import random

    import numpy as np
    import pytest


    @pytest.fixture
    def rng(replay_metadata):
        # Record the seed in the replay file (or reuse the recorded one when replaying).
        seed = replay_metadata.metadata.setdefault("seed", random.randint(0, 100))
        return np.random.default_rng(seed=seed)


    def test_random(rng):
        data = rng.standard_normal((100, 100))
        assert data.shape == (100, 100)

When run with ``pytest-replay``, this generates a replay file similar to:

.. code-block:: json

    {"nodeid": "test_bar.py::test_random", "start": 0.000}
    {"nodeid": "test_bar.py::test_random", "start": 0.000, "finish": 1.5, "outcome": "passed", "metadata": {"seed": 12}}
FAQ
---

1. ``pytest`` has its own `cache <https://docs.pytest.org/en/latest/cache.html>`_, why use a different mechanism?

   The internal cache saves its data using ``json``, which is not suitable in the event of a crash because
   the file will not be readable.

2. Shouldn't the ability to select tests from a file be part of the ``pytest`` core?

   Sure, but let's try to use this a bit as a separate plugin before proposing its inclusion into the core.

Installation
------------

You can install ``pytest-replay`` via `pip`_ from `PyPI`_::

    $ pip install pytest-replay

Or with ``conda``::

    $ conda install -c conda-forge pytest-replay

Contributing
------------
Contributions are very welcome.
Tests can be run with `tox`_ if you are using a native Python installation.

To run tests with `conda <https://conda.io/docs/>`_, first create a virtual environment and execute tests
from there (conda with Python 3.5+ in the root environment)::

    $ python -m venv .env
    $ .env\scripts\activate
    $ pip install -e . pytest-xdist
    $ pytest tests

Releases
--------

Follow these steps to make a new release:

1. Create a branch ``release-X.Y.Z`` from ``master``;
2. Update ``CHANGELOG.rst``;
3. Push a tag ``X.Y.Z``;
4. GitHub Actions will deploy to PyPI automatically.

Afterwards, update the recipe in `pytest-replay-feedstock <https://github.com/conda-forge/pytest-replay-feedstock>`_.

Distributed under the terms of the `MIT`_ license.

If you encounter any problems, please `file an issue`_ along with a detailed description.

.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _@hackebrot: https://github.com/hackebrot
.. _MIT: http://opensource.org/licenses/MIT
.. _BSD-3: http://opensource.org/licenses/BSD-3-Clause
.. _GNU GPL v3.0: http://www.gnu.org/licenses/gpl-3.0.txt
.. _Apache Software License 2.0: http://www.apache.org/licenses/LICENSE-2.0
.. _cookiecutter-pytest-plugin: https://github.com/pytest-dev/cookiecutter-pytest-plugin
.. _file an issue: https://github.com/ESSS/pytest-replay/issues
.. _pytest: https://github.com/pytest-dev/pytest
.. _tox: https://tox.readthedocs.io/en/latest/
.. _pip: https://pypi.python.org/pypi/pip/
.. _PyPI: https://pypi.python.org/pypi