

stackscribe

Source: npm · Latest version: 0.0.3 · Maintainers: 1

📦 StackScribe: AI-Powered Code Documentation

StackScribe is a next-generation CLI tool and NPM module that automatically generates professional, high-quality comments for your code. It scans your codebase, identifies functions and APIs, and leverages powerful Large Language Models (LLMs) to write detailed explanations—saving you hours of manual documentation.

⚠️ Under Development

Currently supports JavaScript and TypeScript. Support for additional languages such as Python and Java is planned for future releases.

🚀 Usage

Getting started with StackScribe is quick, whether as a CLI tool or a module in your project.

Install the package

npm install stackscribe

Commands Guide

# List available commands
npx stackscribe --help

# Show the installed version
npx stackscribe --version

1️⃣ Command-Line (CLI) Usage

Step 1: Configure Your API Key (One-time setup)

Ollama runs locally and requires no key. For other providers, set your API key:

# Groq example
npx stackscribe config --provider groq --apiKey YOUR_GROQ_API_KEY

# OpenAI
npx stackscribe config --provider openai --apiKey YOUR_OPENAI_API_KEY

# Gemini
npx stackscribe config --provider gemini --apiKey YOUR_GEMINI_API_KEY

Step 2: Run on Your Codebase

Generate comments for a single file or an entire directory:

# Run on './src' directory using Groq
npx stackscribe run --path ./src --provider groq --model llama3-8b-8192

# Run on a Python file using Gemini
npx stackscribe run --path ./my_script.py --provider gemini --model gemini-1.5-pro

# Use the default provider from your config
npx stackscribe run --path ./src/myFile.js

2️⃣ Programmatic (Module) Usage

Integrate StackScribe directly into scripts, CI/CD pipelines, or other tools.

Step 1: Install the package

npm install stackscribe

Step 2: Import and use

import { main } from "stackscribe";

// Run on './src' directory using default provider
main("./src");

// Specify a provider
main("./src", "gemini");

// Specify provider and model
main("./src", "groq", "llama3-8b-8192");
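For CI pipelines, it can help to derive the provider and model from environment variables rather than hard-coding them. The helper below is an illustrative sketch, not part of stackscribe's API; the `SCRIBE_*` variable names and `resolveLlmOptions` are made up for this example.

```javascript
// Sketch: pick LLM options from environment variables, with an
// offline-friendly default. resolveLlmOptions and the SCRIBE_* names
// are illustrative only, not part of stackscribe.
function resolveLlmOptions(env) {
  return {
    provider: env.SCRIBE_PROVIDER || "ollama", // Ollama runs locally, no API key
    model: env.SCRIBE_MODEL, // undefined lets the provider's default apply
  };
}

// A CI script would then hand the result to stackscribe's exported main():
//   import { main } from "stackscribe";
//   const { provider, model } = resolveLlmOptions(process.env);
//   main("./src", provider, model);
```

Keeping the defaults offline-safe means the pipeline still runs without any secret configured.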

📝 About the Project

StackScribe is designed to be simple, powerful, and easy to integrate into modern development workflows.

Key Features

  • 🤖 Intelligent Code Analysis: uses AST parsing to accurately detect functions and API calls in JavaScript/TypeScript code while skipping irrelevant sections; other languages are processed in full, without AST analysis.

  • 🔗 Multi-Provider LLM Support: choose from OpenAI, Google Gemini, Groq, or run offline with Ollama. Specify a model per provider for maximum control.

  • ✍️ High-Quality Comment Generation: produces professional, list-style explanations of a function's logic, purpose, and expected inputs/outputs.

Example Comment Output

/**
 * Handles user login by validating credentials and issuing a JWT.
 * 1. Extracts email and password from the request body.
 * 2. Validates credentials against the database.
 * 3. On success, generates access and refresh tokens.
 * 4. Returns the tokens to the client.
 * @param {object} req - Express request object
 * @param {object} res - Express response object
 */
function loginUser(req, res) {
   // ... function logic
}

Project Architecture

stackscribe/
├─ bin/
│  └─ stackscribe.ts       # CLI commands (yargs)
├─ src/
│  ├─ index.ts             # Main logic (CLI + module)
│  ├─ parser.ts            # AST parsing
│  ├─ annotator.ts         # Insert comments
│  ├─ llm/
│  │  ├─ openai.ts         # LLM wrappers
│  │  ├─ gemini.ts
│  │  ├─ groq.ts
│  │  └─ ollama.ts
│  ├─ config.ts            # API keys & config
│  └─ utils.ts             # Helper functions
├─ package.json
└─ README.md

Core Technologies

  • Code Parsing: @babel/parser, @babel/traverse
  • Code Generation: recast (preserves formatting)
  • LLM SDKs: openai, @google/generative-ai, groq-sdk, ollama
  • CLI Framework: yargs
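As a rough illustration of the parse-then-annotate flow, the sketch below finds function declarations and prepends a comment block above each one. It is a deliberately simplified stand-in: the real pipeline uses @babel/parser for a proper AST and recast to preserve formatting, not a regex.

```javascript
// Simplified stand-in for the parser + annotator stages. A regex finds
// top-level `function name(...)` declarations; a caller-supplied function
// produces the comment text (in stackscribe, this is where the LLM runs).
function annotate(source, commentFor) {
  return source.replace(
    /^(function\s+(\w+))/gm,
    (whole, decl, name) => `/** ${commentFor(name)} */\n${decl}`
  );
}

const annotated = annotate(
  "function loginUser(req, res) {}",
  (name) => `TODO: document ${name}`
);
// annotated now begins with "/** TODO: document loginUser */"
```

A real AST keeps this robust against arrow functions, class methods, and comments that merely mention the word `function`, which is why the project relies on Babel rather than pattern matching.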

🛠 Development Roadmap

  • Phase 1 (MVP): JS/TS support with OpenAI — ✅ Completed
  • Phase 2 (Multi-Provider): Gemini, Groq, Ollama integration — ✅ Completed
  • Phase 3 (Config): Robust CLI + stackscribe.json for project defaults — 🟡 Partially Completed
  • Phase 4 (Multi-Language): Python, Java, etc. support — 🟡 Partially Completed

Keywords

ai


Package last updated on 12 Sep 2025
