A human-friendly framework for testing and evaluating LLMs, RAGs, and chatbots.
ContextCheck is an open-source framework designed to evaluate, test, and validate large language models (LLMs), Retrieval-Augmented Generation (RAG) systems, and chatbots. It provides tools to automatically generate queries, request completions, detect regressions, perform penetration tests, and assess hallucinations, ensuring the robustness and reliability of these systems. ContextCheck is configurable via YAML and can be integrated into continuous integration (CI) pipelines for automated testing.
Tests are defined in .yaml files, and results are rendered with the rich package for clear, readable displays.

Install the package directly from PyPI using pip:
pip install ccheck
After installation, you can access the ccheck CLI command:
ccheck --help
This will display all available options and help you get started with using ContextCheck.
If you wish to contribute to the project or modify it for your own use, you can set up a development environment using Poetry.
git clone https://github.com/<your_username>/contextcheck.git
cd contextcheck
poetry install
poetry shell
You can then access the ccheck CLI command using:
poetry run ccheck --help
Please refer to the examples/ folder for the tutorial.
To run a single test scenario, or several at once:
ccheck --output-type console --filename path/to/file.yaml
ccheck --output-type console --filename path/to/file.yaml path/to/another_file.yaml
To automatically stop the CI/CD process if any tests fail, add the --exit-on-failure flag. A failed test will cause the script to exit with code 1:
ccheck --exit-on-failure --output-type console --folder my_tests
Set the OPENAI_API_KEY environment variable to be able to run:
tests/scenario_openai.yaml
tests/scenario_defaults.yaml
Contributions are welcome!
To run tests:
poetry run pytest tests/
To include tests which require calling LLM APIs (currently OpenAI and Ollama), run one of:
poetry run pytest --openai # includes tests that use OpenAI API
poetry run pytest --ollama # includes tests that use Ollama API
poetry run pytest --openai --ollama # includes tests that use both OpenAI and Ollama API
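Custom pytest flags like these are typically wired up in a conftest.py via pytest_addoption. The following is a minimal sketch of that pattern, not the project's actual conftest: it assumes API-dependent tests are tagged with hypothetical openai / ollama markers and skips them unless the matching flag is passed.

```python
# conftest.py (illustrative sketch, not ContextCheck's real implementation)
import pytest

def pytest_addoption(parser):
    # Register opt-in flags; both default to off so plain `pytest` stays offline.
    parser.addoption("--openai", action="store_true",
                     help="include tests that call the OpenAI API")
    parser.addoption("--ollama", action="store_true",
                     help="include tests that call the Ollama API")

def pytest_collection_modifyitems(config, items):
    # Skip marked tests when their flag was not supplied on the command line.
    for item in items:
        if "openai" in item.keywords and not config.getoption("--openai"):
            item.add_marker(pytest.mark.skip(reason="needs --openai"))
        if "ollama" in item.keywords and not config.getoption("--ollama"):
            item.add_marker(pytest.mark.skip(reason="needs --ollama"))
```

A test file would then mark its API-dependent cases with `@pytest.mark.openai` or `@pytest.mark.ollama`.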
Made with ❤️ by the Addepto Team
ContextCheck is an extension of the ContextClue product, created by the Addepto team. This project is the result of our team’s dedication, combining innovation and expertise.
Like what we’re building? ⭐ Give it a star to support its development!
This project is licensed under the MIT License - see the LICENSE file for details