
stack-replayer
Turn cryptic error logs into reproducible bugs, replay scripts, and fix suggestions — with optional AI.
npm install stack-replayer
# or
pnpm add stack-replayer
# or
yarn add stack-replayer
The library works immediately without any configuration or API keys:
import { replayBug } from "stack-replayer";
try {
// Your code that might throw
const user = null;
console.log(user.name); // TypeError!
} catch (err) {
const errorLog = err instanceof Error ? err.stack ?? String(err) : String(err);
const result = await replayBug(errorLog);
console.log(result.explanation);
console.log(result.reproductionSteps);
console.log(result.suggestedFix);
}
Output:
TypeError occurred: "Cannot read properties of null (reading 'name')"
This error was thrown at /home/user/app.js:5 in function "<anonymous>".
The error likely indicates a runtime issue in your code. Review the stack trace and the code at the specified location for potential bugs.
Set two environment variables and your analysis gets dramatically smarter:
export AI_BUG_REPLAYER_PROVIDER=openai
export OPENAI_API_KEY=sk-...
Then run the same code as above. The library automatically detects the OpenAI configuration and uses it for enhanced analysis.
Run a local LLM with Ollama (completely free, no API keys):
# Install and start Ollama
ollama pull llama3
ollama serve &
# Configure environment
export AI_BUG_REPLAYER_PROVIDER=ollama
# Optional: export OLLAMA_MODEL=llama3
# Optional: export OLLAMA_BASE_URL=http://localhost:11434
Now your same code uses local AI with no external API calls or costs.
# Install the CLI globally
npm install -g stack-replayer

# Analyze an error log file
stack-replayer --log error.log

# Or read the log from stdin
cat error.log | stack-replayer

# Also execute the generated replay script
stack-replayer --log error.log --run

# Resolve file paths against a specific project root
stack-replayer --log error.log --root /path/to/project

# Emit machine-readable JSON
stack-replayer --log error.log --json > result.json
replayBug(errorLog, options?)
Convenience function for one-line bug replay.

Parameters:
- errorLog: string - The error log or stack trace
- options?: object
  - llmClient?: LlmClient - Custom LLM client (overrides auto-detection)
  - dryRun?: boolean - If true, don't execute the replay script (default: false)
  - projectRoot?: string - Project root directory
  - metadata?: object - Additional context (nodeVersion, os, etc.)

Returns: Promise<BugReplayResult>
interface BugReplayResult {
explanation: string;
reproductionSteps: string[];
replayScript: string;
suggestedFix?: string;
suggestedPatch?: string;
suggestedTest?: string;
sandboxResult?: {
success: boolean;
reproduced: boolean;
stdout: string;
stderr: string;
exitCode: number | null;
};
}
AiBugReplayer Class
For more control, use the class directly:
import { AiBugReplayer, OpenAiLlmClient } from "stack-replayer";
const replayer = new AiBugReplayer({
llmClient: new OpenAiLlmClient({
apiKey: process.env.OPENAI_API_KEY!,
model: "gpt-4o-mini"
}),
dryRun: false
});
const result = await replayer.replay({
errorLog: errorStack,
projectRoot: "/path/to/project",
metadata: {
nodeVersion: process.version,
os: process.platform
}
});
import { OpenAiLlmClient } from "stack-replayer";
const client = new OpenAiLlmClient({
apiKey: "sk-...",
model: "gpt-4o-mini", // optional
baseURL: "https://api.openai.com/v1" // optional
});
import { OllamaLlmClient } from "stack-replayer";
const client = new OllamaLlmClient({
baseUrl: "http://localhost:11434",
model: "llama3"
});
import { HttpLlmClient } from "stack-replayer";
const client = new HttpLlmClient({
baseUrl: "https://your-api.com/v1/chat/completions",
apiKey: "your-key",
model: "your-model"
});
Implement the LlmClient interface:
import { LlmClient, ParsedErrorLog, BugReplayInput } from "stack-replayer";
class MyCustomLlmClient implements LlmClient {
async generateReplay(parsed: ParsedErrorLog, input: BugReplayInput) {
// Your custom logic here
return {
explanation: "...",
reproductionSteps: ["..."],
replayScript: "...",
suggestedFix: "..."
};
}
}
| Variable | Description | Default |
|---|---|---|
| AI_BUG_REPLAYER_PROVIDER | LLM provider: openai or ollama | None (no-AI mode) |
| OPENAI_API_KEY | OpenAI API key | - |
| OPENAI_MODEL | OpenAI model to use | gpt-4o-mini |
| OPENAI_BASE_URL | Custom OpenAI endpoint | https://api.openai.com/v1 |
| OLLAMA_BASE_URL | Ollama server URL | http://localhost:11434 |
| OLLAMA_MODEL | Ollama model to use | llama3 |
import { replayBug } from "stack-replayer";
process.on('uncaughtException', async (err) => {
console.error('Uncaught exception:', err);
const analysis = await replayBug(err.stack ?? String(err), {
projectRoot: process.cwd(),
metadata: {
nodeVersion: process.version,
os: process.platform,
timestamp: new Date().toISOString()
}
});
// Send to your logging service
await sendToLoggingService({
error: err,
analysis: analysis.explanation,
suggestedFix: analysis.suggestedFix
});
});
import { replayBug } from "stack-replayer";
afterEach(async function() {
if (this.currentTest?.state === 'failed') {
const err = this.currentTest.err;
if (err?.stack) {
const analysis = await replayBug(err.stack);
console.log('\n🔍 AI Analysis:');
console.log(analysis.explanation);
console.log('\n💡 Suggested Fix:');
console.log(analysis.suggestedFix);
}
}
});
const result = await replayBug(errorLog, { dryRun: true });
// Only get analysis and script, don't execute
console.log(result.replayScript);
MIT
Contributions welcome! Please read our contributing guidelines and submit PRs.
Built with ❤️ by the open source community.