The Open-Source Slack AI App
This repository is a ready-to-run, self-hosted Slack AI app that lets you summarize threads and channels on demand using OpenAI (support for alternative and open-source LLMs will be added if there's demand). The official Slack AI product looks great, but with limited access and add-on pricing, I decided to open-source the version I built in September 2023. Learn more about how and why I built an open-source Slack AI.

Once it's up and running (instructions for the whole process are provided below), all of your Slack users will be able to generate the following in both public and private channels:
- Thread summaries - Generate a detailed summary of any Slack thread (powered by GPT-3.5-Turbo)
- Channel overviews - Generate an outline of the channel's purpose based on the extended message history (powered by an ensemble of NLP models and a little GPT-4 to explain the analysis in natural language)
- Channel summaries since - Generate a detailed summary of a channel's messages since a given point in time (powered by GPT-3.5-Turbo). Now with support for custom prompts! e.g. `/tldr_since anonymize the summary`. Note: this doesn't include threads yet.
- Full channel summaries (experimental) - Generate a detailed summary of a channel's extended history (powered by GPT-3.5-Turbo). Now with support for custom prompts! e.g. `/tldr_extended anonymize the summary`. Note: this can get very long!
Table of Contents
- Getting Started
- Prerequisites
- Installation
- Slack app configuration
- Usage
- Customization
- Testing
- Future Enhancements
- Contributing
- License
Getting Started
Follow these instructions to get a copy of the project up and running on your local machine for development and testing
purposes.
Prerequisites
Ensure you have the following preconfigured or installed on your local development machine:
- Python 3 and Poetry
- ngrok (or another way to expose a local server to the internet)
- An OpenAI API key
- A Slack workspace where you have permission to create and install apps
Installation
- Clone the repository to your local machine.
- Navigate to the project directory.
- Install the required Python packages using Poetry:
poetry install
- Download the spaCy English language model (an optional sanity check for this is included after the install steps):
poetry run python -m spacy download en_core_web_md
- Create a `.env` file in the root directory of the project and fill it with your API keys and tokens, using the `example.env` file as a template:
cp example.env .env && open .env
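With the installation steps complete, you can optionally confirm that the spaCy model and your environment variables are in place before starting the server. This is only an illustrative check, not project code: `SLACK_BOT_TOKEN` and `SLACK_APP_TOKEN` are the names used in the Slack app configuration below, while `OPENAI_API_KEY` and the use of python-dotenv are assumptions; treat `example.env` as the authoritative list of variables.

```python
# Optional post-install sanity check (illustrative only; not part of the project code).
# Save as e.g. check_setup.py and run with: poetry run python check_setup.py
import os

import spacy
from dotenv import load_dotenv  # python-dotenv; add it if it isn't already a dependency

# 1. Confirm the downloaded spaCy model loads (raises OSError if the download was skipped).
nlp = spacy.load("en_core_web_md")
print(f"spaCy model loaded: {nlp.meta['lang']}_{nlp.meta['name']}")

# 2. Confirm the .env file provides the expected credentials.
#    OPENAI_API_KEY is an assumed name; defer to example.env for the real template.
load_dotenv()
for key in ("SLACK_BOT_TOKEN", "SLACK_APP_TOKEN", "OPENAI_API_KEY"):
    print(f"{key}: {'set' if os.getenv(key) else 'MISSING'}")
```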
Slack app configuration
Make a copy of `manifest.json` and change the request URL to your ngrok or server URL.
Create a new Slack app at https://api.slack.com/apps and configure it using your modified manifest file.
You shouldn't need to make any other changes, but you can adjust the name, description, and other copy-related settings.
If you wish to rename the slash commands, you'll need to modify `slack_server.py`.
Once the app is configured, retrieve the "Bot User OAuth Token" from the "Install App" page and add it to your `.env` file as `SLACK_BOT_TOKEN`.
Then, on the Basic Information page, under the App-Level Tokens heading, create a token with the `connections:write` scope and add it to your `.env` file as `SLACK_APP_TOKEN`.
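Once both tokens are in `.env`, a short script like the following can confirm Slack accepts them before you start the server. It assumes `slack_sdk` is available in the project's environment (likely for a Python Slack app, but check `pyproject.toml` to be sure); everything here is a sketch, not part of the project code.

```python
# Sanity-check the Slack tokens from .env (illustrative only).
import os

from dotenv import load_dotenv           # python-dotenv
from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError

load_dotenv()

try:
    # auth.test confirms the bot token and reports which workspace it belongs to.
    identity = WebClient(token=os.environ["SLACK_BOT_TOKEN"]).auth_test()
    print(f"Bot token OK for workspace: {identity['team']}")

    # apps.connections.open only succeeds for an app-level token with connections:write.
    WebClient(token=os.environ["SLACK_APP_TOKEN"]).apps_connections_open()
    print("App-level token OK (connections:write)")
except SlackApiError as err:
    print(f"Slack rejected a token: {err.response['error']}")
```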
Usage
To run the application, start the FastAPI server:
poetry run uvicorn ossai.slack_server:app --reload
You'll then need to expose the server to the internet using ngrok. Run ngrok with the following command:
ngrok http 8000
Then update the request URL in your Slack app's settings with the ngrok URL.
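Before updating the Slack app settings, you can confirm that both the local server and the ngrok tunnel respond. The check below hits FastAPI's auto-generated `/docs` page, which exists unless the app disables it; the ngrok hostname is a placeholder for the one your tunnel prints.

```python
# Verify that the local server and the ngrok tunnel are reachable (illustrative only).
import urllib.request

URLS = (
    "http://localhost:8000/docs",                  # local uvicorn server
    "https://your-subdomain.ngrok-free.app/docs",  # replace with your ngrok URL
)

for url in URLS:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"{url} -> HTTP {resp.status}")
    except Exception as exc:  # connection refused, DNS failure, HTTP error, etc.
        print(f"{url} -> {exc}")
```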
Customization
The main customization options are:
- Channel Summary: customize the ChatGPT prompt in `topic_analysis.py`
- Thread Summary: customize the ChatGPT prompt in `summarizer.py`
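The actual contents of those files aren't reproduced here, so the snippet below is only a hypothetical illustration of the kind of edit involved; the real prompt strings and their names live in `summarizer.py` and `topic_analysis.py`.

```python
# Hypothetical example only: the real prompt text and variable names are defined
# in summarizer.py and topic_analysis.py.
THREAD_SUMMARY_PROMPT = (
    "You are a helpful assistant that summarizes Slack threads. "
    "Write a concise, bulleted summary, attribute key decisions to the people "
    "who made them, and list any open questions at the end."
)
```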
Testing
This project uses `pytest` and `pytest-cov` to run tests and measure test coverage.
Follow these steps to run the tests with coverage:
1. Navigate to the project root directory.
2. Run the following command to execute the tests with coverage:
   pytest --cov=ossai tests/
   This command runs all the tests in the `tests/` directory and generates a coverage report for the `ossai` module.
3. After running the tests, you will see a report in your terminal that shows the percentage of code covered by tests and highlights any lines that are not covered.
Please note that if you're using a virtual environment, make sure it's activated before running these commands.
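If you prefer to launch the same run from Python (for example, from an IDE run configuration) or want a browsable HTML report, the coverage options below are standard `pytest-cov` flags rather than anything specific to this project:

```python
# Programmatic equivalent of the command above; --cov-report=html also writes
# an annotated report to the htmlcov/ directory.
import sys

import pytest

exit_code = pytest.main(
    ["--cov=ossai", "--cov-report=term-missing", "--cov-report=html", "tests/"]
)
sys.exit(exit_code)
```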
Future Enhancements
Contributing
Contributions are more than welcome! Please read `CONTRIBUTING.md` for details on how to submit feedback, bugs, feature requests, enhancements, or your own pull requests.
License
This project is licensed under the GPL-3.0 License - see the `LICENSE.md` file for details.