
# 📦 StackScribe: AI-Powered Code Documentation
StackScribe is a next-generation CLI tool and NPM module that automatically generates professional, high-quality comments for your code. It scans your codebase, identifies functions and APIs, and leverages powerful Large Language Models (LLMs) to write detailed explanations—saving you hours of manual documentation.
## ⚠️ Under Development
Currently supports JavaScript and TypeScript. Support for additional languages like Python, Java, and more is planned in future releases.
## 🚀 Usage

Getting started with StackScribe is quick, whether you use it as a CLI tool or as a module in your project.
### Install the package

```bash
npm install stackscribe
```
### Commands Guide

```bash
npx stackscribe --help     # list available commands and options
npx stackscribe --version  # print the installed version
```
### 1️⃣ Command-Line (CLI) Usage
#### Step 1: Configure Your API Key (One-time setup)
Ollama runs locally and requires no key. For other providers, set your API key:
```bash
npx stackscribe config --provider groq --apiKey YOUR_GROQ_API_KEY
npx stackscribe config --provider openai --apiKey YOUR_OPENAI_API_KEY
npx stackscribe config --provider gemini --apiKey YOUR_GEMINI_API_KEY
```
#### Step 2: Run on Your Codebase
Generate comments for a single file or an entire directory:

```bash
# Entire directory with an explicit provider and model
npx stackscribe run --path ./src --provider groq --model llama3-8b-8192

# Single file with Gemini
npx stackscribe run --path ./my_script.py --provider gemini --model gemini-1.5-pro

# Single file using the configured defaults
npx stackscribe run --path ./src/myFile.js
```
### 2️⃣ Programmatic (Module) Usage
Integrate StackScribe directly into scripts, CI/CD pipelines, or other tools.
#### Step 1: Install the package

```bash
npm install stackscribe
```
#### Step 2: Import and use

```js
import { main } from "stackscribe";

// Use the configured default provider and model
main("./src");

// Specify a provider
main("./src", "gemini");

// Specify a provider and a model
main("./src", "groq", "llama3-8b-8192");
```
## 📝 About the Project
StackScribe is designed to be simple, powerful, and easy to integrate into modern development workflows.
### Key Features
- **🤖 Intelligent Code Analysis**
  Uses AST parsing to accurately detect all functions and API calls in JavaScript/TypeScript code while skipping irrelevant sections. Other languages are currently handled without AST parsing (full support is on the roadmap).
- **🔗 Multi-Provider LLM Support**
  Choose from OpenAI, Google Gemini, and Groq, or run offline with Ollama. Specify a model for each provider for maximum control.
- **✍️ High-Quality Comment Generation**
  Generates professional, list-style explanations of function logic, purpose, and expected inputs/outputs, even for an undocumented handler like:

  ```js
  function loginUser(req, res) {
  }
  ```
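For a bare handler like `loginUser` above, the generated comment might look something like this (an illustrative mock-up, not actual tool output):

```javascript
/**
 * Handles a user login request.
 * - Purpose: authenticates a user and starts a session.
 * - Inputs: `req` (HTTP request carrying credentials), `res` (HTTP response).
 * - Output: sends a success or error response to the client; returns nothing.
 */
function loginUser(req, res) {
}
```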
### Project Architecture

```
stackscribe/
├─ bin/
│  └─ stackscribe.ts    # CLI commands (yargs)
├─ src/
│  ├─ index.ts          # Main logic (CLI + module)
│  ├─ parser.ts         # AST parsing
│  ├─ annotator.ts      # Insert comments
│  ├─ llm/
│  │  ├─ openai.ts      # LLM wrappers
│  │  ├─ gemini.ts
│  │  ├─ groq.ts
│  │  └─ ollama.ts
│  ├─ config.ts         # API keys & config
│  └─ utils.ts          # Helper functions
├─ package.json
└─ README.md
```
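To make the `annotator.ts` step concrete, here is a deliberately simplified sketch of comment insertion. Note the real annotator works on a Babel AST and re-prints code with `recast`; the regex and the `commentFor` callback below are purely illustrative, not the tool's actual API:

```javascript
// Simplified sketch: prepend a placeholder comment above each top-level
// function declaration. (The real tool uses AST parsing, not a regex.)
function annotate(source, commentFor) {
  return source.replace(
    /^function\s+(\w+)/gm,
    (match, name) => `// ${commentFor(name)}\n${match}`
  );
}

const input = "function add(a, b) {\n  return a + b;\n}\n";
const output = annotate(input, (name) => `TODO: LLM-generated comment for ${name}`);
console.log(output);
```

In the real pipeline, `commentFor` would be replaced by a call to the selected LLM provider, and the AST-based approach avoids false matches inside strings or nested scopes.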
### Core Technologies

- **Code Parsing:** `@babel/parser`, `@babel/traverse`
- **Code Generation:** `recast` (preserves formatting)
- **LLM SDKs:** `openai`, `@google/generative-ai`, `groq-sdk`, `ollama`
- **CLI Framework:** `yargs`
## 🛠 Development Roadmap
- **Phase 1 (MVP):** JS/TS support with OpenAI — ✅ Completed
- **Phase 2 (Multi-Provider):** Gemini, Groq, Ollama integration — ✅ Completed
- **Phase 3 (Config):** Robust CLI + `stackscribe.json` for project defaults — 🟡 Partially Completed
- **Phase 4 (Multi-Language):** Python, Java, etc. support — 🟡 Partially Completed
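Once the Phase 3 config work is complete, a project-level `stackscribe.json` might look something like the following. This is a hypothetical shape based on the CLI flags above (`--path`, `--provider`, `--model`); the actual field names are not finalized:

```json
{
  "path": "./src",
  "provider": "groq",
  "model": "llama3-8b-8192"
}
```

API keys would stay in the separate `stackscribe config` store rather than in this file, so the file can be committed safely.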