Readme-ai is a developer tool that auto-generates README.md files using a combination of data extraction and generative AI. Simply provide a repository URL or local path to your codebase, and a well-structured and detailed README file will be generated for you.
## Motivation
Readme-ai streamlines documentation creation and maintenance, enhancing developer productivity. This project aims to enable developers of all skill levels, across all domains, to better understand, use, and contribute to open-source software.
> [!IMPORTANT]
> Readme-ai is currently under development with an opinionated configuration and setup. It is vital to review all generated text from the LLM API to ensure it accurately represents your project.
Offline mode is useful for generating a boilerplate README at no cost. View the offline README.md example here!
## 🧩 Features
### Flexible README Generation
Readme-ai uses a balanced approach to building README files, combining data extraction and generative AI to create comprehensive and informative documentation.
- **Data Extraction & Analysis:** File parsers and analyzers extract project metadata, dependencies, and other relevant details. This data both populates many sections of the README and provides context to the LLM API.
- **Generative Content:** For more abstract or creative sections, readme-ai uses LLM APIs to generate content that is both informative and engaging, including the project slogan, overview, features table, and file summaries.
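As a rough illustration of the extraction half, dependency names can be pulled from a requirements file with standard tools (a minimal sketch using a made-up requirements file; readme-ai's actual parsers are more thorough):

```sh
# Minimal sketch of dependency extraction (not readme-ai's actual parser):
# write a sample requirements file, then strip version specifiers to get names.
printf 'click==8.1.7\npydantic>=2.0\ntenacity~=8.2\n' > requirements-example.txt
cut -d'=' -f1 requirements-example.txt | sed 's/[<>~!].*//'
# prints the package names: click, pydantic, tenacity
```

Names like these can seed the dependency section of the README and be passed along as context for the generative sections.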
### CLI Customization
Over a dozen CLI options are available to customize the README generation process:
- **LLM Options:** Run the tool with OpenAI, Ollama, Google Gemini, or in offline mode.
- **Offline Mode:** Generate a README without making API calls; readme-ai can still populate a significant portion of the README using metadata collected during preprocessing.
- **Project Badges:** Choose from an array of badge styles, colors, and alignments.
- **Project Logo:** Select from the default set, upload your own, or let the LLM give it a try!
A few examples of the CLI options in action:
- Default output (no options provided to the CLI)
- `--alignment left --badge-style flat-square --image cloud`
- `--alignment left --badge-style flat --image gradient`
- `--badge-style flat --image custom`
- `--badge-style skills-light --image grey`
- `--badge-style flat-square`
- `--badge-style flat --image black`
See the Configuration section for a complete list of CLI options.
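For instance, several of the flags above can be combined in a single run (an illustrative invocation using only flags shown in this section; adjust to taste):

```sh
readmeai --repository https://github.com/eli64s/readme-ai \
    --alignment left \
    --badge-style flat-square \
    --image cloud
```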
- **OpenAI:** Recommended; requires an account setup and API key.
- **Ollama:** Free and open-source, potentially slower and more resource-intensive.
- **Google Gemini:** Requires a Google Cloud account and API key.
- **Offline Mode:** Generates a boilerplate README without making API calls.
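The fallback order implied above can be sketched in shell (illustrative only: readme-ai selects the backend from its `--api` flag, and this snippet merely mirrors the modes listed here using the environment variable names documented in the Usage section):

```sh
# Illustrative fallback: pick an --api value from whichever credentials are set.
# This is not readme-ai's internal selection logic.
if [ -n "${OPENAI_API_KEY:-}" ]; then
    API=openai
elif [ -n "${OLLAMA_HOST:-}" ]; then
    API=ollama
elif [ -n "${GOOGLE_API_KEY:-}" ]; then
    API=gemini
else
    API=offline
fi
echo "readmeai --repository <repo_url> --api $API"
```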
## ⚙️ Installation
### Using pip
```sh
pip install readmeai
```
> [!TIP]
> Use pipx to install and run Python command-line applications without causing dependency conflicts with other packages!
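With pipx already on your PATH, for example:

```sh
pipx install readmeai
```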
### Using Docker
```sh
docker pull zeroxeli/readme-ai:latest
```
### Using conda
```sh
conda install -c conda-forge readmeai
```
### From source
#### Clone and install

Clone the repository and change directory:

```sh
$ git clone https://github.com/eli64s/readme-ai
$ cd readme-ai
```
#### Using bash

```sh
$ bash setup/setup.sh
```
#### Using poetry

```sh
$ poetry install
```
Similarly, you can use pipenv or pip to install the dependencies from requirements.txt.
## 🤖 Usage
### Environment Variables
#### Using OpenAI
Set your OpenAI API key as an environment variable.
```sh
# Using Linux or macOS
$ export OPENAI_API_KEY=<your_api_key>

# Using Windows
$ set OPENAI_API_KEY=<your_api_key>
```
#### Using Ollama
Set Ollama local host as an environment variable.
```sh
$ export OLLAMA_HOST=127.0.0.1
$ ollama pull mistral:latest  # llama2, etc.
$ ollama serve  # run if not using the Ollama desktop app
```
For more details, check out the Ollama repository.
#### Using Google Gemini
Set your Google API key as an environment variable.

```sh
$ export GOOGLE_API_KEY=<your_api_key>
```
### Run the CLI
#### Using pip
```sh
# Using OpenAI API
readmeai --repository https://github.com/eli64s/readme-ai --api openai

# Using Ollama local model
readmeai --repository https://github.com/eli64s/readme-ai --api ollama --model mistral
```
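Offline mode can be run the same way (the `--api offline` value here is an assumption based on the offline mode described in the Features section; check `readmeai --help` for the exact flag):

```sh
# Generate a boilerplate README without any API calls
readmeai --repository https://github.com/eli64s/readme-ai --api offline
```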