
ai-patterns
Battle-tested TypeScript patterns for building rock-solid AI applications.
We provide developers with battle-tested tools for resilient AI workflows: retry logic, circuit breakers, rate limiting, human-in-the-loop escalation, and more — all with complete type safety and composability. Inspired by Vercel AI SDK's developer experience.
✨ Battle-Tested Patterns - Retry, Circuit Breaker, Timeout, Rate Limiter, Fallback, Cache, Debounce, Throttle, Bulkhead, A/B Testing, Cost Tracking, Prompt Versioning, Response Validation, Context Window Management, Reflection Loop, and more
🎨 Elegant Composition - Compose patterns together for complex workflows
🔒 Type-Safe - Full TypeScript support with generics and strict mode
🧩 Composable - Patterns work together seamlessly for robust workflows
📊 Observable - Built-in lifecycle callbacks for monitoring and debugging
🪶 Lightweight - Zero dependencies, minimal overhead
⚡ Production-Ready - Build solid AI applications with confidence
🎯 Developer-Friendly - Inspired by Vercel AI SDK's excellent DX
💰 Cost Control - Track and control AI spending in real-time
🧪 Experimentation - A/B test prompts and models to optimize performance
```sh
npm install ai-patterns
# or
yarn add ai-patterns
# or
pnpm add ai-patterns
```
```ts
import { retry } from 'ai-patterns';

// Retry any async function
const result = await retry({
  execute: () => fetch('https://api.example.com/data'),
  maxAttempts: 3
});

console.log(result.value);
```
```ts
import { retry } from 'ai-patterns';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const result = await retry({
  execute: async () => {
    const { text } = await generateText({
      model: openai('gpt-4-turbo'),
      prompt: 'Explain quantum computing',
      maxRetries: 0 // Disable Vercel's built-in retry
    });
    return text;
  },
  maxAttempts: 3
});

console.log(result.value);
```
💡 Note: While the Vercel AI SDK has built-in retry (`maxRetries: 2`), ai-patterns gives you more flexibility:
- 🎛️ Custom backoff strategies (exponential, linear, fixed)
- 📊 Detailed observability (attempts, delays, errors)
- 🔄 Cross-provider fallback (OpenAI → Claude → Gemini)
- 🎯 Advanced retry logic (conditional, circuit breakers)
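To make the custom backoff strategies concrete, here is a minimal sketch of how a delay schedule could be computed. The function name and signature are illustrative, not the ai-patterns API:

```ts
// Sketch: how a backoff schedule could compute the delay before each retry.
// The strategy names mirror the ones listed above; illustrative only.
type BackoffStrategy = "fixed" | "linear" | "exponential";

function backoffDelay(
  strategy: BackoffStrategy,
  attempt: number, // 1-based attempt number
  baseMs: number
): number {
  switch (strategy) {
    case "fixed":
      return baseMs;
    case "linear":
      return baseMs * attempt;
    case "exponential":
      return baseMs * 2 ** (attempt - 1);
  }
}
```

With a 100ms base, exponential backoff waits 100ms, 200ms, then 400ms before successive attempts, while linear waits 100ms, 200ms, 300ms.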
Building AI applications? You're probably facing these challenges:
❌ Copy-pasting retry logic across every API call
❌ No circuit breakers — one API failure brings down your entire app
❌ Constantly hitting rate limits with no systematic handling
❌ No human oversight for edge cases that need review
With ai-patterns:
✅ Battle-tested patterns ready to use out of the box
✅ Compose like Lego blocks — combine patterns seamlessly
✅ Full type safety — catch errors at compile time
✅ Zero dependencies — lightweight and production-ready
Before ai-patterns:

```ts
// 50+ lines of retry logic with exponential backoff,
// jitter, error classification, timeout handling...
let attempt = 0;
const maxAttempts = 3;
while (attempt < maxAttempts) {
  try {
    // ... complex retry logic
    break; // success
  } catch (error) {
    attempt++;
    // ... backoff calculation
    // ... error handling
  }
}
```
After ai-patterns:

```ts
const result = await retry({
  execute: () => callAPI(),
  maxAttempts: 3
});
```
That's it. Simple, reliable, production-ready.
Use `defineCircuitBreaker` and `defineRateLimiter` for patterns that maintain state:

```ts
import { defineCircuitBreaker } from 'ai-patterns';

const breaker = defineCircuitBreaker({
  execute: (prompt: string) => callAPI(prompt),
  failureThreshold: 5,
  resetTimeout: 60000
});

// Reuse the same instance across calls
await breaker('First call');
await breaker('Second call');

console.log(breaker.getState()); // Check circuit state
```
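Conceptually, a circuit breaker is a small state machine (closed → open → half-open). A self-contained sketch of the idea, independent of the ai-patterns implementation:

```ts
// Minimal circuit breaker state machine, for illustration only.
// "closed" = calls pass through; "open" = fail fast; "half-open" = probe.
type CircuitState = "closed" | "open" | "half-open";

class SimpleBreaker {
  private failures = 0;
  private openedAt = 0;
  private state: CircuitState = "closed";

  constructor(
    private failureThreshold: number,
    private resetTimeoutMs: number,
    private now: () => number = Date.now // injectable clock for testing
  ) {}

  getState(): CircuitState {
    // After the reset timeout, allow a single probe call through.
    if (this.state === "open" && this.now() - this.openedAt >= this.resetTimeoutMs) {
      this.state = "half-open";
    }
    return this.state;
  }

  call<T>(fn: () => T): T {
    const state = this.getState();
    if (state === "open") throw new Error("circuit open: failing fast");
    try {
      const result = fn();
      this.failures = 0;
      this.state = "closed"; // success closes the circuit again
      return result;
    } catch (err) {
      this.failures++;
      if (state === "half-open" || this.failures >= this.failureThreshold) {
        this.state = "open";
        this.openedAt = this.now();
      }
      throw err;
    }
  }
}
```

While open, calls fail immediately instead of hammering a broken service; after `resetTimeoutMs`, one probe call decides whether the circuit closes again.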
Compose patterns together for robust workflows using the compose() function:
```ts
import { compose, withRetry, withTimeout, withFallback } from 'ai-patterns';

// Create a reusable composed function
const robustAI = compose<string, string>([
  withFallback({ fallback: () => "Sorry, service unavailable" }),
  withTimeout({ duration: 10000 }),
  withRetry({
    maxAttempts: 3,
    backoffStrategy: "exponential",
  })
]);

// Use it anywhere
const result = await robustAI(callAI, "Explain quantum computing");
```
Tip: You can also nest patterns directly if you prefer explicit control flow.
For advanced composition strategies, see the pattern reference below:
| Pattern | Description | Use Case | Docs |
|---|---|---|---|
| compose | Functional pattern composition | Complex AI pipelines | 📖 |
| retry | Automatic retry with exponential backoff | Unstable APIs, network issues | 📖 |
| timeout | Time limits with AbortSignal support | Long-running operations | 📖 |
| fallback | Execute alternatives on failure | Multi-provider failover | 📖 |
| defineCircuitBreaker | Protect against failing services | External API calls | 📖 |
| defineRateLimiter | Control request throughput | API rate limiting | 📖 |
| Pattern | Description | Use Case | Docs |
|---|---|---|---|
| memoize | Cache function results with TTL | Response caching | 📖 |
| defineDebounce | Delay execution until silence period | User input handling | 📖 |
| defineThrottle | Limit execution frequency | API call throttling | 📖 |
| defineBulkhead | Isolate resources with concurrency limits | Resource isolation | 📖 |
| deadLetterQueue | Handle failed operations | Error recovery | 📖 |
| Pattern | Description | Use Case | Docs |
|---|---|---|---|
| fanOut | Parallel processing with concurrency control | Batch operations | 📖 |
| saga | Distributed transactions with compensation | Multi-step workflows | 📖 |
| conditionalBranch | Route based on conditions | Dynamic workflow routing | 📖 |
| Pattern | Description | Use Case | Docs |
|---|---|---|---|
| humanInTheLoop | AI → Human escalation | Content moderation | 📖 |
| smartContextWindow | Manage context token limits automatically | Long conversations, chat apps | 📖 |
| reflectionLoop | AI self-critique and iterative improvement | High-quality content generation | 📖 |
| idempotency | Prevent duplicate operations | Payment processing | 📖 |
| Pattern | Description | Use Case | Docs |
|---|---|---|---|
| abTest | Test multiple variants simultaneously | Prompt optimization, model selection | 📖 |
| costTracking | Monitor and control AI spending | Budget management, cost optimization | 📖 |
| versionedPrompt | Manage prompt versions with rollback | Prompt experimentation, gradual rollout | 📖 |
| validateResponse | Validate AI responses with auto-retry | Quality assurance, business rules | 📖 |
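As an illustration of the costTracking idea from the table above, a budget guard can be as simple as accumulating per-call cost and failing fast once the budget is exceeded. The function name, prices, and API shape here are hypothetical:

```ts
// Sketch: a tiny cost tracker that enforces a spending budget.
// Prices and API shape are hypothetical, for illustration only.
function createCostTracker(budgetUsd: number, usdPerToken: number) {
  let spentUsd = 0;
  return {
    // Record a call's token usage; throw once the budget is exhausted.
    record(tokens: number): void {
      spentUsd += tokens * usdPerToken;
      if (spentUsd > budgetUsd) {
        throw new Error(`budget exceeded: $${spentUsd.toFixed(4)} > $${budgetUsd}`);
      }
    },
    total: () => spentUsd,
  };
}
```

Failing fast on the call that crosses the budget turns silent overspend into an explicit, catchable error.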
```ts
import { retry, timeout } from 'ai-patterns';

const result = await retry({
  execute: async () => {
    return await timeout({
      execute: () => fetch('https://api.example.com/data'),
      timeoutMs: 5000
    });
  },
  maxAttempts: 3
});
```
```ts
import { fallback } from 'ai-patterns';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

const result = await fallback({
  execute: async () => {
    const { text } = await generateText({
      model: openai('gpt-4-turbo'),
      prompt: 'Explain quantum computing'
    });
    return text;
  },
  fallback: async () => {
    const { text } = await generateText({
      model: anthropic('claude-3-5-sonnet-20241022'),
      prompt: 'Explain quantum computing'
    });
    return text;
  }
});
```
```ts
import { fanOut } from 'ai-patterns';
import { embed } from 'ai';
import { openai } from '@ai-sdk/openai';

const chunks = [
  { id: '1', text: 'Introduction to ML' },
  { id: '2', text: 'Deep learning basics' },
  // ... more chunks
];

const result = await fanOut({
  items: chunks,
  execute: async (chunk) => {
    const { embedding } = await embed({
      model: openai.embedding('text-embedding-3-small'),
      value: chunk.text
    });
    return { id: chunk.id, embedding };
  },
  concurrency: 5
});
```
```ts
import { compose, retryMiddleware, timeoutMiddleware } from 'ai-patterns/composition';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Compose multiple patterns functionally
const robustAI = compose([
  timeoutMiddleware({ duration: 10000 }),
  retryMiddleware({ maxAttempts: 3, backoffStrategy: 'exponential' })
]);

// Use the composed function
const result = await robustAI(
  async (prompt: string) => {
    const { text } = await generateText({
      model: openai('gpt-4-turbo'),
      prompt
    });
    return text;
  },
  'Explain quantum computing'
);
```
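Mechanically, this style of composition wraps the base function right-to-left, so the first middleware in the array becomes the outermost layer. A sketch of the idea (not ai-patterns' actual implementation):

```ts
// Sketch: functional middleware composition. The first middleware in the
// list ends up outermost, matching the ordering shown above. Illustrative only.
type AsyncFn<I, O> = (input: I) => Promise<O>;
type Middleware<I, O> = (next: AsyncFn<I, O>) => AsyncFn<I, O>;

function composeSketch<I, O>(middlewares: Middleware<I, O>[]) {
  return (fn: AsyncFn<I, O>, input: I): Promise<O> => {
    const wrapped = middlewares.reduceRight((next, mw) => mw(next), fn);
    return wrapped(input);
  };
}

// A logging middleware makes the onion-style call order visible.
const tag =
  (name: string, log: string[]): Middleware<string, string> =>
  (next) =>
  async (input) => {
    log.push(`${name}:before`);
    const out = await next(input);
    log.push(`${name}:after`);
    return out;
  };
```

Each middleware sees the call on the way in and the result on the way out, which is what lets a timeout layer cancel a retry layer beneath it.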
Each pattern has detailed documentation and a simple runnable example; more examples are coming soon.
All patterns follow a consistent API design:
```ts
const result = await pattern({
  execute: () => yourFunction(),
  // pattern-specific options...
});
```
See the API Reference for complete details.
Built with TypeScript strict mode for maximum type safety:
```ts
// Full type inference with generics
interface User {
  id: string;
  name: string;
  email: string;
}

const result = await retry<User>({
  execute: async () => {
    return await fetchUser();
  }
});

// result.value is typed as User
const user: User = result.value;
console.log(user.email); // ✅ Full autocomplete
```
Contributions are welcome! Please read our Contributing Guide.
MIT © Serge KOKOUA
Inspired by:
Built with ❤️ by Serge KOKOUA
Empowering developers to build solid and robust AI applications.