nrt-pytest-soft-asserts
Assert | Description | Example | Return |
---|---|---|---|
assert_true(condition, message=None) | Verify that condition is True. | soft_asserts.assert_true(a == b) | True if assertion passes, False if assertion fails. |
assert_false(condition, message=None) | Verify that condition is False. | soft_asserts.assert_false(a == b) | True if assertion passes, False if assertion fails. |
assert_equal(first, second, message=None) | Verify that first is equal to second. | soft_asserts.assert_equal(a, b) | True if assertion passes, False if assertion fails. |
assert_not_equal(first, second, message=None) | Verify that first is not equal to second. | soft_asserts.assert_not_equal(a, b) | True if assertion passes, False if assertion fails. |
assert_is(first, second, message=None) | Verify that first and second are the same object. | soft_asserts.assert_is(a, b) | True if assertion passes, False if assertion fails. |
assert_is_not(first, second, message=None) | Verify that first and second are not the same object. | soft_asserts.assert_is_not(a, b) | True if assertion passes, False if assertion fails. |
assert_is_none(obj, message=None) | Verify that obj is None. | soft_asserts.assert_is_none(a) | True if assertion passes, False if assertion fails. |
assert_is_not_none(obj, message=None) | Verify that obj is not None. | soft_asserts.assert_is_not_none(a) | True if assertion passes, False if assertion fails. |
assert_in(obj, container, message=None) | Verify that obj is in container. | soft_asserts.assert_in(a, [a, b, c]) | True if assertion passes, False if assertion fails. |
assert_not_in(obj, container, message=None) | Verify that obj is not in container. | soft_asserts.assert_not_in(a, [b, c]) | True if assertion passes, False if assertion fails. |
assert_is_instance(obj, cls, message=None) | Verify that obj is an instance of cls. | soft_asserts.assert_is_instance(a, A) | True if assertion passes, False if assertion fails. |
assert_is_not_instance(obj, cls, message=None) | Verify that obj is not an instance of cls. | soft_asserts.assert_is_not_instance(a, B) | True if assertion passes, False if assertion fails. |
assert_almost_equal(first, second, delta, message=None) | Verify that first is almost equal to second: the difference is less than or equal to delta. | soft_asserts.assert_almost_equal(1.001, 1.002, 0.1) | True if assertion passes, False if assertion fails. |
assert_not_almost_equal(first, second, delta, message=None) | Verify that first is not almost equal to second: the difference is greater than delta. | soft_asserts.assert_not_almost_equal(1.001, 1.002, 0.00001) | True if assertion passes, False if assertion fails. |
assert_raises(exception, method: Callable, *args, **kwargs) | Verify that executing method raises exception. | soft_asserts.assert_raises(TypeError, sum, 'a', 2) | True if assertion passes, False if assertion fails. |
assert_raised_with(exception, message=None) | Verify that execution inside the 'with' block raises exception. | with soft_asserts.assert_raised_with(ValueError): raise ValueError(ERROR_MESSAGE_1) | See the usage sketch after this table. |
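A minimal sketch of the two exception assertions, assuming the context-manager name used in the example column (assert_raised_with) and the callable form of assert_raises shown in the table:

```python
from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

soft_asserts = SoftAsserts()


def test_exception_asserts():
    # Records a failure if calling sum('a', 2) does not raise TypeError.
    soft_asserts.assert_raises(TypeError, sum, 'a', 2)

    # Records a failure if the 'with' block does not raise ValueError
    # (assuming assert_raised_with is the context-manager name).
    with soft_asserts.assert_raised_with(ValueError):
        raise ValueError('error message')

    soft_asserts.assert_all()
```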
At the end of each test, the soft asserts are verified, and the test is marked as failed if any of them failed.
To verify the soft asserts in the middle of a test, call soft_asserts.assert_all().
assert_all() raises AssertionError if any of the asserts failed.
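As a minimal sketch, assuming only the API shown in the table above, a basic test without steps looks like this; both failures are recorded, and assert_all() raises a single AssertionError at the end:

```python
from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

soft_asserts = SoftAsserts()


def test_basic_soft_asserts():
    # Each failed assert is recorded instead of stopping the test.
    soft_asserts.assert_equal(1, 2)
    soft_asserts.assert_true(isinstance('a', int))

    # Raises AssertionError because the asserts above failed.
    soft_asserts.assert_all()
```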
A test can be divided into steps.
If one of the asserts in a step fails, the step is added to the list of failure steps,
and a later test can be skipped if it depends on the failed step.
Example:
To make a test be skipped when a step fails, a custom marker should be created.
The following is an example of such a custom marker, but users can create their own.
In the conftest.py file:
```python
import pytest


@pytest.fixture(autouse=True)
def run_before_test(request):
    markers = request.node.own_markers

    for marker in markers:
        if marker.name == 'soft_asserts':
            marker_params = marker.kwargs
            soft_asserts = marker_params['soft_asserts']
            skip_steps = marker_params['skip_steps']

            for step in skip_steps:
                if soft_asserts.is_step_in_failure_steps(step):
                    pytest.skip(f'Skipped because [{step}] failed.')
```
```python
import pytest

from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

soft_asserts = SoftAsserts()

STEP_1 = 'step_1'
STEP_2 = 'step_2'


def test_assert_with_steps():
    soft_asserts.set_step(STEP_1)
    # result is False
    result = soft_asserts.assert_true(False)
    # prints False
    print(result)

    soft_asserts.set_step(STEP_2)
    soft_asserts.assert_true(False)

    # From this point on, steps will not be attached to failed asserts
    soft_asserts.unset_step()
    soft_asserts.assert_true(False)

    soft_asserts.assert_all()


@pytest.mark.soft_asserts(soft_asserts=soft_asserts, skip_steps=[STEP_1])
def test_skip_if_step_1_fail():
    soft_asserts.assert_true(True)


@pytest.mark.soft_asserts(soft_asserts=soft_asserts, skip_steps=[STEP_2])
def test_skip_if_step_2_fail():
    soft_asserts.assert_true(True)
```
Each assertion failure can be printed as it occurs.
This is done by setting a logger or by setting a print method.
The failure message format is:

`message [file_path: line_number] code_line`
```python
import logging

from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts

logger = logging.getLogger('test')

soft_asserts = SoftAsserts()

# logger will be used to print a message after each assert failure.
soft_asserts.set_logger(logger)


def test_assert_true_fail():
    i = 1
    j = 2
    # logger.error() will print a message to the console for each assert that fails
    soft_asserts.assert_true(i + j == 5)
    # f'{i} is different from {j}' will be printed by logger.error() after the assert fails
    soft_asserts.assert_equal(i, j, f'{i} is different from {j}')
    soft_asserts.assert_all()
```
```python
from nrt_pytest_soft_asserts.soft_asserts import SoftAsserts


def print_method(message):
    print(message)


soft_asserts = SoftAsserts()

# print_method will be used to print a message after each assert failure.
soft_asserts.set_print_method(print_method)


def test_assert_true_fail():
    i = 1
    j = 2
    # print_method will print a message to the console for each assert that fails
    soft_asserts.assert_true(i + j == 5)
    # f'{i} is different from {j}' will be printed by print_method after the assert fails
    soft_asserts.assert_equal(i, j, f'{i} is different from {j}')
    soft_asserts.assert_all()
```
Wiki: https://github.com/etuzon/python-nrt-pytest-soft-asserts/wiki