Fraggle

The simplest RAG API for Python. Build a question and answer interface to your own content in minutes.

pip install fraggle
fraggle index
fraggle serve
curl -X POST http://localhost:8000/api/ask \
  -H "Content-Type: application/json" \
  -d '{"question": "What is Fraggle?"}'

Fraggle is the successor to my badly named microllama project, modernised with:

  • Provider-agnostic LLM support via any-llm
  • Modern LangChain (0.3+)
  • Simple deployment and usage

Installation

pip install fraggle

Quick Start

  1. Prepare your content in a source.json file:

[
  {
    "content": "Fraggle is a RAG API for building Q&A interfaces.",
    "title": "What is Fraggle",
    "url": "https://example.com/docs"
  },
  {
    "content": "You can use OpenAI or Anthropic models with Fraggle.",
    "title": "Supported Models"
  }
]

  2. Set your API key:

export OPENAI_API_KEY="your-key-here"
# or
export ANTHROPIC_API_KEY="your-key-here"

  3. Create an index of embeddings:

fraggle index

  4. Start the API server:

fraggle serve

  5. Query your content:

curl -X POST http://localhost:8000/api/ask \
  -H "Content-Type: application/json" \
  -d '{"question": "What is Fraggle?"}'

Configuration

Configure Fraggle with environment variables:

  • SOURCE_JSON_PATH: Path to your content JSON file (default: source.json)
  • INDEX_PATH: Path to store/load the FAISS index (default: faiss_index)
  • LLM_PROVIDER: LLM provider, openai or anthropic (default: openai)
  • LLM_MODEL: Model ID, e.g. gpt-4o, claude-3-5-sonnet-20241022 (default: gpt-4o-mini)
  • EMBEDDINGS_PROVIDER: Embeddings provider (default: openai)
  • EMBEDDINGS_MODEL: Embeddings model (default: text-embedding-3-small)
  • CHUNK_SIZE: Text chunk size for indexing (default: 1000)
  • CHUNK_OVERLAP: Overlap between chunks (default: 100)
  • K_CONTEXT_DOCS: Number of documents to retrieve (default: 4)
  • UVICORN_HOST: Host to bind the server to (default: 0.0.0.0)
  • UVICORN_PORT: Port to bind the server to (default: 8000)
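
For example, to index with smaller chunks and retrieve more documents per question (the values are illustrative, not tuned recommendations):

# illustrative values; adjust for your content
export CHUNK_SIZE=500
export CHUNK_OVERLAP=50
export K_CONTEXT_DOCS=6
fraggle index
fraggle serve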

Using Anthropic Claude

export ANTHROPIC_API_KEY="your-key-here"
export LLM_PROVIDER="anthropic"
export LLM_MODEL="claude-3-5-sonnet-20241022"
fraggle serve

CLI Commands

fraggle serve

Start the API server. Automatically serves the frontend at / if the frontend/ directory exists.
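
For example, to bind the server to a different port, set the UVICORN_PORT variable from the configuration table (the port here is illustrative):

# serve on port 9000 instead of the default 8000
export UVICORN_PORT=9000
fraggle serve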

fraggle index

Create a FAISS index from your source documents.

Options:

  • --source: Path to source JSON file (default: source.json)
  • --output: Path to save the index (default: faiss_index)
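
For example, to index a custom source file into a custom location (both paths are illustrative):

fraggle index --source docs/source.json --output my_index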

fraggle make-front-end

Generate a simple HTML frontend for your Q&A interface.

Options:

  • --output: Directory for frontend files (default: frontend)
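
For example, generating the frontend into the documented default directory, which fraggle serve then picks up automatically:

fraggle make-front-end
fraggle serve
# the Q&A page is now served at http://localhost:8000/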

fraggle make-dockerfile

Generate a Dockerfile for containerized deployment.

API Endpoints

POST /api/ask

Non-streaming question answering.

Request:

{
  "question": "What is Fraggle?"
}

Response:

{
  "answer": "Fraggle is a RAG API for building Q&A interfaces to your content."
}

POST /api/stream

Streaming question answering (Server-Sent Events).

Request:

{
  "question": "What is Fraggle?"
}

Response: SSE stream of text chunks.
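
A minimal sketch of consuming the stream with curl, where -N disables output buffering so chunks print as they arrive:

curl -N -X POST http://localhost:8000/api/stream \
  -H "Content-Type: application/json" \
  -d '{"question": "What is Fraggle?"}'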

Deployment

Docker

fraggle make-dockerfile
docker build -t fraggle .
docker run -p 8000:8000 -e OPENAI_API_KEY=your-key fraggle
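
A sketch of the same run using the Anthropic settings from the configuration section (the key value is a placeholder):

docker run -p 8000:8000 \
  -e ANTHROPIC_API_KEY=your-key \
  -e LLM_PROVIDER=anthropic \
  -e LLM_MODEL=claude-3-5-sonnet-20241022 \
  fraggle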

Pre-building the Index

For faster startup, create the index at build time by uncommenting the RUN fraggle index line in the Dockerfile.

Development

# Clone the repo
git clone https://github.com/tomdyson/fraggle.git
cd fraggle

# Install with uv
uv sync

# Run in development mode
uv run fraggle serve

Comparison with microllama

Fraggle improves on microllama by:

  • Provider-agnostic: Use any LLM via any-llm
  • Modern dependencies: LangChain 0.3+
  • Better naming: Focus on RAG, not LLMs
  • Same simple API and deployment story

License

MIT

Publishing a New Version

Fraggle uses GitHub Actions to automatically publish to PyPI when you push a version tag:

# 1. Update the version in pyproject.toml
# 2. Commit and tag the release
git add pyproject.toml
git commit -m "Bump version to 0.1.x"
git tag v0.1.x
git push && git push --tags

The GitHub Actions workflow will automatically build and publish to PyPI.

First-time setup: Configure PyPI Trusted Publishing at https://pypi.org/manage/account/publishing/ with:

  • PyPI Project Name: fraggle
  • Owner: tomdyson
  • Repository: fraggle
  • Workflow: publish.yml
  • Environment: pypi
