# ai_changelog
**ai_changelog** is a Python project that automatically generates changelog files summarizing code changes, using AI. It uses LangChain with LLM providers such as OpenAI to analyze Git commit diffs and produce natural-language descriptions of the changes, keeping the changelog up to date without manual effort.
This README was originally written by Claude, an LLM from Anthropic.
## Usage

```
usage: ai_changelog [-h] [--provider {openai,anthropic,anyscale}] [--model MODEL]
                    [--temperature TEMPERATURE] [--max_tokens MAX_TOKENS]
                    [--hub_prompt HUB_PROMPT] [--context_lines CONTEXT_LINES]
                    [--max_concurrency MAX_CONCURRENCY] [-v]
                    refs

Process command line arguments.

positional arguments:
  refs                  Reference comparison with standard git syntax

options:
  -h, --help            show this help message and exit
  --provider {openai,anthropic,anyscale}
                        Model API provider
  --model MODEL         Model name
  --temperature TEMPERATURE
                        Model temperature
  --max_tokens MAX_TOKENS
                        Max tokens in output
  --hub_prompt HUB_PROMPT
                        Prompt to pull from LangChain Hub
  --context_lines CONTEXT_LINES
                        Number of context lines for each commit
  --max_concurrency MAX_CONCURRENCY
                        Number of concurrent connections to the LLM provider
                        (0 means no limit)
  -v, --verbose         Run LangChain in verbose mode
```
http://github.com/joshuasundance-swca/ai_changelog
## Local install

To generate a changelog locally:

```shell
pip install ai_changelog
ai_changelog --help
ai_changelog main..HEAD
```
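The `refs` argument uses standard git range syntax: `main..HEAD` selects the commits reachable from `HEAD` but not from `main`. A quick way to see exactly which commits a range covers, sketched in a throwaway repository (the identity and commit messages are placeholders; `git init -b` needs Git ≥ 2.28):

```shell
# Build a scratch repo to illustrate the main..HEAD range
cd "$(mktemp -d)"
git init -q -b main
git config user.email "demo@example.com"   # placeholder identity
git config user.name "demo"
git commit -q --allow-empty -m "initial commit on main"
git checkout -q -b feature
git commit -q --allow-empty -m "first feature commit"
git commit -q --allow-empty -m "second feature commit"

# main..HEAD = commits on the current branch that main lacks
git rev-list --count main..HEAD   # prints 2
git log --oneline main..HEAD      # lists the two feature commits
```

These are the same commits that `ai_changelog main..HEAD` would summarize.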
## Docker

```shell
docker pull joshuasundance/ai_changelog:latest

docker run \
    --env-file .env \
    -v /local_repo_dir:/container_dir_in_repo \
    -w /container_dir_in_repo \
    joshuasundance/ai_changelog:latest \
    main..HEAD
```
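The `--env-file .env` flag expects a file of `KEY=value` lines. A minimal example — the key values are placeholders, and you only need the variables for your chosen provider:

```shell
# Contents of .env (placeholder values)
OPENAI_API_KEY=sk-your-key-here
# For other providers, set their keys instead, e.g.:
# ANTHROPIC_API_KEY=your-anthropic-key
```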
## GitHub Workflow

The `ai_changelog_main_pr.yml` workflow runs on pushes to `main`. It generates summaries for the new commits, appends them to `AI_CHANGELOG.md`, and commits the updated file back to the PR branch.
```shell
ai_changelog origin/main^..origin/main
ai_changelog origin/main..HEAD
```
Another workflow was written to commit an updated changelog to an incoming PR before it was merged; it worked well, but seemed less useful.
## Configuration

- Set environment variables as needed for your provider of choice (the default requires `OPENAI_API_KEY`).
- Set LangSmith environment variables to enable LangSmith integration, if applicable.
- Use command line arguments.
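For a local install, the configuration above can be done with environment variables, for example (key values are placeholders; the LangSmith variable names follow LangChain's standard tracing conventions):

```shell
# Default provider (OpenAI) -- required
export OPENAI_API_KEY="sk-your-key-here"

# Optional: LangSmith tracing via LangChain's standard variables
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_API_KEY="your-langsmith-key"
export LANGCHAIN_PROJECT="ai_changelog"
```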
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## TODO

- Testing
- Get CodeLlama working reliably in CI/CD (currently hit-or-miss on structured output)