nrt-pytest-soft-asserts
Advanced tools
Assert | Description | Example | Return |
---|---|---|---|
assert_true(condition, message=None) | Verify that condition is True. | soft_asserts.assert_true(a == b) | True if assertion passes, False if assertion fails. |
assert_false(condition, message=None) | Verify that condition is False. | soft_asserts.assert_false(a == b) | True if assertion passes, False if assertion fails. |
assert_equal(first, second, message=None) | Verify that first is equal to second. | soft_asserts.assert_equal(a, b) | True if assertion passes, False if assertion fails. |
assert_not_equal(first, second, message=None) | Verify that first is not equal to second. | soft_asserts.assert_not_equal(a, b) | True if assertion passes, False if assertion fails. |
assert_is(first, second, message=None) | Verify that first and second are the same object. | soft_asserts.assert_is(a, b) | True if assertion passes, False if assertion fails. |
assert_is_not(first, second, message=None) | Verify that first and second are not the same object. | soft_asserts.assert_is_not(a, b) | True if assertion passes, False if assertion fails. |
assert_is_none(obj, message=None) | Verify that obj is None. | soft_asserts.assert_is_none(a) | True if assertion passes, False if assertion fails. |
assert_is_not_none(obj, message=None) | Verify that obj is not None. | soft_asserts.assert_is_not_none(a) | True if assertion passes, False if assertion fails. |
assert_in(obj, container, message=None) | Verify that obj is in container. | soft_asserts.assert_in(a, [a, b, c]) | True if assertion passes, False if assertion fails. |
assert_not_in(obj, container, message=None) | Verify that obj is not in container. | soft_asserts.assert_not_in(a, [b, c]) | True if assertion passes, False if assertion fails. |
assert_is_instance(obj, cls, message=None) | Verify that obj is instance of cls. | soft_asserts.assert_is_instance(a, A) | True if assertion passes, False if assertion fails. |
assert_is_not_instance(obj, cls, message=None) | Verify that obj is not instance of cls. | soft_asserts.assert_is_not_instance(a, B) | True if assertion passes, False if assertion fails. |
assert_almost_equal(first, second, delta, message=None) | Verify that first is almost equal to second, i.e. the difference between them is less than or equal to delta. | soft_asserts.assert_almost_equal(1.001, 1.002, 0.1) | True if assertion passes, False if assertion fails. |
assert_not_almost_equal(first, second, delta, message=None) | Verify that first is not almost equal to second, i.e. the difference between them is greater than delta. | soft_asserts.assert_not_almost_equal(1.001, 1.002, 0.00001) | True if assertion passes, False if assertion fails. |
assert_raises(exception, method: Callable, *args, **kwargs) | Verify that calling method raises exception. | soft_asserts.assert_raises(TypeError, sum, 'a', 2) | True if assertion passes, False if assertion fails. |
assert_raises_with(exception, message=None) | Verify that the code in the 'with' block raises exception. | with soft_asserts.assert_raised_with(ValueError): raise ValueError(ERROR_MESSAGE_1) | |
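For example, both exception assertions can be used in a single test. This is a minimal sketch, following the method names used in the table and example above (assert_raises and the assert_raised_with context manager):

```python
from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

soft_asserts = SoftAsserts()


def test_exception_asserts():
    # Passes: sum('a', 2) raises TypeError.
    soft_asserts.assert_raises(TypeError, sum, 'a', 2)

    # Passes: the code inside the 'with' block raises ValueError.
    with soft_asserts.assert_raised_with(ValueError):
        raise ValueError('error message')

    soft_asserts.assert_all()
```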
At the end of each test, the soft asserts are verified and the test is marked as failed if any of them failed.
To verify the soft asserts in the middle of a test, call soft_asserts.assert_all().
assert_all() raises AssertionError if any of the asserts failed.
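A minimal sketch of verifying soft asserts in the middle of a test (the values here are illustrative):

```python
from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

soft_asserts = SoftAsserts()


def test_assert_all_in_the_middle():
    soft_asserts.assert_equal(1, 1)
    soft_asserts.assert_true(2 > 1)

    # Raises AssertionError here if any of the asserts above failed,
    # so the rest of the test does not run.
    soft_asserts.assert_all()

    soft_asserts.assert_in('a', ['a', 'b'])
    soft_asserts.assert_all()
```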
A test can be divided into steps. If one of the asserts in a step fails, the step is added to the list of failed steps, and a later test can be skipped if it depends on the failed step.
Example:
To make a test be skipped when a step fails, a custom marker should be created.
The following is an example of such a custom marker; users can create their own.
In the conftest.py file:
```python
import pytest


@pytest.fixture(autouse=True)
def run_before_test(request):
    # Read the 'soft_asserts' marker parameters of the current test.
    markers = request.node.own_markers

    for marker in markers:
        if marker.name == 'soft_asserts':
            marker_params = marker.kwargs
            soft_asserts = marker_params['soft_asserts']
            skip_steps = marker_params['skip_steps']

            # Skip the test if any step it depends on has already failed.
            for step in skip_steps:
                if soft_asserts.is_step_in_failure_steps(step):
                    pytest.skip(f'Skipped because [{step}] failed.')
```
In the test file:

```python
import pytest

from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

soft_asserts = SoftAsserts()

STEP_1 = 'step_1'
STEP_2 = 'step_2'


def test_assert_with_steps():
    soft_asserts.set_step(STEP_1)
    # result is False
    result = soft_asserts.assert_true(False)
    # prints False
    print(result)

    soft_asserts.set_step(STEP_2)
    soft_asserts.assert_true(False)

    # From this point on, steps will not be attached to failed asserts.
    soft_asserts.unset_step()
    soft_asserts.assert_true(False)

    soft_asserts.assert_all()


@pytest.mark.soft_asserts(soft_asserts=soft_asserts, skip_steps=[STEP_1])
def test_skip_if_step_1_fail():
    soft_asserts.assert_true(True)


@pytest.mark.soft_asserts(soft_asserts=soft_asserts, skip_steps=[STEP_2])
def test_skip_if_step_2_fail():
    soft_asserts.assert_true(True)
```
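In this example, test_assert_with_steps fails asserts under both STEP_1 and STEP_2, so both steps end up in the failure steps list. The autouse fixture from conftest.py then skips test_skip_if_step_1_fail and test_skip_if_step_2_fail, because the steps they depend on have failed.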
Each assertion failure can be printed when it happens, by attaching either a logger or a print method to the SoftAsserts instance.
The failure message format is:
`message [file_path: line_number] code_line`
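Assuming that format, a failed assert_equal(i, j, f'{i} is different from {j}') with i = 1 and j = 2, located in a hypothetical file tests/test_example.py at line 12, would be reported roughly as:

```
1 is different from 2 [tests/test_example.py: 12] soft_asserts.assert_equal(i, j, f'{i} is different from {j}')
```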
Example with a logger:

```python
import logging

from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

logger = logging.getLogger('test')

soft_asserts = SoftAsserts()

# logger will be used to print a message after each failed assert.
soft_asserts.set_logger(logger)


def test_assert_true_fail():
    i = 1
    j = 2
    # logger.error() prints a message for each assert that fails.
    soft_asserts.assert_true(i + j == 5)
    # f'{i} is different from {j}' will be printed by logger.error() after the assert fails.
    soft_asserts.assert_equal(i, j, f'{i} is different from {j}')
    soft_asserts.assert_all()
```
Example with a print method:

```python
from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts


def print_method(message):
    print(message)


soft_asserts = SoftAsserts()

# print_method will be used to print a message after each failed assert.
soft_asserts.set_print_method(print_method)


def test_assert_true_fail():
    i = 1
    j = 2
    # print_method prints a message for each assert that fails.
    soft_asserts.assert_true(i + j == 5)
    # f'{i} is different from {j}' will be printed by print_method after the assert fails.
    soft_asserts.assert_equal(i, j, f'{i} is different from {j}')
    soft_asserts.assert_all()
```
Wiki: https://github.com/etuzon/python-nrt-pytest-soft-asserts/wiki