ai-patterns

npm · v1.3.3 · License: MIT · TypeScript

Battle-tested TypeScript patterns for building rock-solid AI applications.

We provide developers with battle-tested tools for resilient AI workflows: retry logic, circuit breakers, rate limiting, human-in-the-loop escalation, and more — all with complete type safety and composability. Inspired by Vercel AI SDK's developer experience.

Features

  • Battle-Tested Patterns - Retry, Circuit Breaker, Timeout, Rate Limiter, Fallback, Cache, Debounce, Throttle, Bulkhead, A/B Testing, Cost Tracking, Prompt Versioning, Response Validation, Context Window Management, Reflection Loop, and more
  • 🎨 Elegant Composition - Compose patterns together for complex workflows
  • 🔒 Type-Safe - Full TypeScript support with generics and strict mode
  • 🧩 Composable - Patterns work together seamlessly for robust workflows
  • 📊 Observable - Built-in lifecycle callbacks for monitoring and debugging
  • 🪶 Lightweight - Zero dependencies, minimal overhead
  • ⚡ Production-Ready - Build solid AI applications with confidence
  • 🎯 Developer-Friendly - Inspired by Vercel AI SDK's excellent DX
  • 💰 Cost Control - Track and control AI spending in real-time
  • 🧪 Experimentation - A/B test prompts and models to optimize performance

Installation

npm install ai-patterns
# or
yarn add ai-patterns
# or
pnpm add ai-patterns

Quick Start

Simple Retry

import { retry } from 'ai-patterns';

// Retry any async function
const result = await retry({
  execute: () => fetch('https://api.example.com/data'),
  maxAttempts: 3
});

console.log(result.value);

With Vercel AI SDK

import { retry } from 'ai-patterns';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const result = await retry({
  execute: async () => {
    const { text } = await generateText({
      model: openai('gpt-4-turbo'),
      prompt: 'Explain quantum computing',
      maxRetries: 0 // Disable Vercel's built-in retry
    });
    return text;
  },
  maxAttempts: 3
});

console.log(result.value);

💡 Note: While the Vercel AI SDK has built-in retry (maxRetries: 2 by default), ai-patterns gives you more flexibility:

  • 🎛️ Custom backoff strategies (exponential, linear, fixed)
  • 📊 Detailed observability (attempts, delays, errors)
  • 🔄 Cross-provider fallback (OpenAI → Claude → Gemini)
  • 🎯 Advanced retry logic (conditional, circuit breakers)
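The three backoff strategies differ only in how the delay grows with the attempt number. A minimal, self-contained sketch of that difference (illustrative, not the ai-patterns internals):

```typescript
// Hypothetical helper (not the ai-patterns API): per-attempt delay
// for the three backoff strategies mentioned above.
type BackoffStrategy = "exponential" | "linear" | "fixed";

function backoffDelay(
  strategy: BackoffStrategy,
  attempt: number, // 1-based attempt number
  baseMs: number
): number {
  switch (strategy) {
    case "exponential":
      return baseMs * 2 ** (attempt - 1); // 100, 200, 400, ...
    case "linear":
      return baseMs * attempt; // 100, 200, 300, ...
    case "fixed":
      return baseMs; // 100, 100, 100, ...
  }
}

console.log(backoffDelay("exponential", 3, 100)); // 400
```

In practice a random jitter is usually added on top of the computed delay so that many clients retrying at once don't stampede the API in lockstep.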

Why ai-patterns?

Building AI applications? You're probably facing these challenges:

  • ❌ Copy-pasting retry logic across every API call
  • ❌ No circuit breakers — one API failure brings down your entire app
  • ❌ Constantly hitting rate limits with no systematic handling
  • ❌ No human oversight for edge cases that need review

With ai-patterns:

  • ✅ Battle-tested patterns ready to use out of the box
  • ✅ Compose like Lego blocks — combine patterns seamlessly
  • ✅ Full type safety — catch errors at compile time
  • ✅ Zero dependencies — lightweight and production-ready

Before ai-patterns:

// 50+ lines of retry logic with exponential backoff,
// jitter, error classification, timeout handling...
let attempt = 0;
const maxAttempts = 3;
while (attempt < maxAttempts) {
  try {
    // ... complex retry logic
  } catch (error) {
    // ... backoff calculation
    // ... error handling
  }
}

After ai-patterns:

const result = await retry({
  execute: () => callAPI(),
  maxAttempts: 3
});

That's it. Simple, reliable, production-ready.

Advanced Usage

Stateful Patterns

Use defineCircuitBreaker and defineRateLimiter for patterns that maintain state:

import { defineCircuitBreaker } from 'ai-patterns';

const breaker = defineCircuitBreaker({
  execute: (prompt: string) => callAPI(prompt),
  failureThreshold: 5,
  resetTimeout: 60000
});

// Reuse the same instance across calls
await breaker('First call');
await breaker('Second call');
console.log(breaker.getState()); // Check circuit state
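Under the hood, a circuit breaker is a small state machine: it opens after a run of consecutive failures, fails fast while open, and lets one trial call through after the reset timeout. A self-contained sketch of the idea (`createBreaker` is a hypothetical helper, not the ai-patterns implementation):

```typescript
type CircuitState = "closed" | "open" | "half-open";

// Minimal circuit-breaker sketch: after `failureThreshold` consecutive
// failures the circuit opens and calls fail fast; once `resetTimeoutMs`
// elapses, a single trial call is allowed ("half-open").
function createBreaker<T>(
  fn: () => Promise<T>,
  failureThreshold: number,
  resetTimeoutMs: number
) {
  let state: CircuitState = "closed";
  let failures = 0;
  let openedAt = 0;

  async function call(): Promise<T> {
    if (state === "open") {
      if (Date.now() - openedAt < resetTimeoutMs) {
        throw new Error("circuit open: failing fast");
      }
      state = "half-open"; // allow one trial call
    }
    try {
      const result = await fn();
      state = "closed"; // success resets the breaker
      failures = 0;
      return result;
    } catch (err) {
      failures++;
      if (state === "half-open" || failures >= failureThreshold) {
        state = "open";
        openedAt = Date.now();
      }
      throw err;
    }
  }

  return { call, getState: () => state };
}
```

The key property is that while the circuit is open, the wrapped function is never invoked, so a struggling provider gets breathing room instead of a retry storm.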

Pattern Composition

Compose patterns together for robust workflows using the compose() function:

import { compose, withRetry, withTimeout, withFallback } from 'ai-patterns';

// Create a reusable composed function
const robustAI = compose<string, string>([
  withFallback({ fallback: () => "Sorry, service unavailable" }),
  withTimeout({ duration: 10000 }),
  withRetry({
    maxAttempts: 3,
    backoffStrategy: "exponential",
  })
]);

// Use it anywhere
const result = await robustAI(callAI, "Explain quantum computing");

Tip: You can also nest patterns directly if you prefer explicit control flow.


Patterns

Core Patterns

| Pattern | Description | Use Case |
| --- | --- | --- |
| `compose` | Functional pattern composition | Complex AI pipelines |
| `retry` | Automatic retry with exponential backoff | Unstable APIs, network issues |
| `timeout` | Time limits with AbortSignal support | Long-running operations |
| `fallback` | Execute alternatives on failure | Multi-provider failover |
| `defineCircuitBreaker` | Protect against failing services | External API calls |
| `defineRateLimiter` | Control request throughput | API rate limiting |

Advanced Patterns

| Pattern | Description | Use Case |
| --- | --- | --- |
| `memoize` | Cache function results with TTL | Response caching |
| `defineDebounce` | Delay execution until a silence period | User input handling |
| `defineThrottle` | Limit execution frequency | API call throttling |
| `defineBulkhead` | Isolate resources with concurrency limits | Resource isolation |
| `deadLetterQueue` | Handle failed operations | Error recovery |

Orchestration Patterns

| Pattern | Description | Use Case |
| --- | --- | --- |
| `fanOut` | Parallel processing with concurrency control | Batch operations |
| `saga` | Distributed transactions with compensation | Multi-step workflows |
| `conditionalBranch` | Route based on conditions | Dynamic workflow routing |

AI-Specific Patterns

| Pattern | Description | Use Case |
| --- | --- | --- |
| `humanInTheLoop` | AI → Human escalation | Content moderation |
| `smartContextWindow` | Manage context token limits automatically | Long conversations, chat apps |
| `reflectionLoop` | AI self-critique and iterative improvement | High-quality content generation |
| `idempotency` | Prevent duplicate operations | Payment processing |

Experimentation & Monitoring

| Pattern | Description | Use Case |
| --- | --- | --- |
| `abTest` | Test multiple variants simultaneously | Prompt optimization, model selection |
| `costTracking` | Monitor and control AI spending | Budget management, cost optimization |
| `versionedPrompt` | Manage prompt versions with rollback | Prompt experimentation, gradual rollout |
| `validateResponse` | Validate AI responses with auto-retry | Quality assurance, business rules |
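Validation with auto-retry boils down to a generate-then-check loop: regenerate until the validator accepts the output or attempts run out. A generic sketch of that idea (the names here are illustrative, not the library's API):

```typescript
// Hypothetical helper: re-invoke `generate` until `isValid` accepts
// the result, up to `maxAttempts` attempts.
async function generateValidated<T>(
  generate: () => Promise<T>,
  isValid: (value: T) => boolean,
  maxAttempts: number
): Promise<T> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const value = await generate();   // regenerate on each attempt
    if (isValid(value)) return value; // accept the first valid response
  }
  throw new Error(`no valid response after ${maxAttempts} attempts`);
}
```

For AI responses, the validator is typically a schema check or a business rule (e.g. "the answer must be valid JSON with a `summary` field"), and a failed check often feeds error details back into the next prompt.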

Pattern Examples

Robust API Call

import { retry, timeout } from 'ai-patterns';

const result = await retry({
  execute: async () => {
    return await timeout({
      execute: () => fetch('https://api.example.com/data'),
      timeoutMs: 5000
    });
  },
  maxAttempts: 3
});

AI Agent with Fallback

import { fallback } from 'ai-patterns';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

const result = await fallback({
  execute: async () => {
    const { text } = await generateText({
      model: openai('gpt-4-turbo'),
      prompt: 'Explain quantum computing'
    });
    return text;
  },
  fallback: async () => {
    const { text } = await generateText({
      model: anthropic('claude-3-5-sonnet-20241022'),
      prompt: 'Explain quantum computing'
    });
    return text;
  }
});

Data Processing Pipeline

import { fanOut } from 'ai-patterns';
import { embed } from 'ai';
import { openai } from '@ai-sdk/openai';

const chunks = [
  { id: '1', text: 'Introduction to ML' },
  { id: '2', text: 'Deep learning basics' },
  // ... more chunks
];

const result = await fanOut({
  items: chunks,
  execute: async (chunk) => {
    const { embedding } = await embed({
      model: openai.embedding('text-embedding-3-small'),
      value: chunk.text
    });
    return { id: chunk.id, embedding };
  },
  concurrency: 5
});
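The `concurrency: 5` option caps how many `execute` calls are in flight at once. A self-contained sketch of concurrency-limited fan-out (illustrative, not the ai-patterns internals):

```typescript
// Run `execute` over all items while keeping at most `concurrency`
// promises in flight, preserving input order in the results.
async function fanOutSketch<T, R>(
  items: T[],
  execute: (item: T) => Promise<R>,
  concurrency: number
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;

  // Each worker repeatedly claims the next unprocessed index.
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++; // safe: no await between check and claim
      results[i] = await execute(items[i]);
    }
  }

  // Start up to `concurrency` workers draining the shared queue.
  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```

This worker-pool shape is what keeps a batch of, say, 1,000 embedding calls from hitting the provider all at once while still using the full concurrency budget.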

Composing Patterns with Middleware

import { compose, retryMiddleware, timeoutMiddleware } from 'ai-patterns/composition';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Compose multiple patterns functionally
const robustAI = compose([
  timeoutMiddleware({ duration: 10000 }),
  retryMiddleware({ maxAttempts: 3, backoffStrategy: 'exponential' })
]);

// Use the composed function
const result = await robustAI(
  async (prompt: string) => {
    const { text } = await generateText({
      model: openai('gpt-4-turbo'),
      prompt
    });
    return text;
  },
  'Explain quantum computing'
);


Examples

Basic Examples

Each pattern has a simple runnable example.


Real-World Examples

Coming soon:

  • E-commerce - Order processing with saga, retry, and idempotency
  • AI Agent - Chatbot with human escalation and circuit breakers
  • Microservices - API gateway with rate limiting and retries

Documentation


API Reference

All patterns follow a consistent API design:

const result = await pattern({
  execute: () => yourFunction(),
  // pattern-specific options...
});

See the API Reference for complete details.

Type Safety

Built with TypeScript strict mode for maximum type safety:

// Full type inference with generics
interface User {
  id: string;
  name: string;
  email: string;
}

const result = await retry<User>({
  execute: async () => {
    return await fetchUser();
  }
});

// result.value is typed as User
const user: User = result.value;
console.log(user.email); // ✅ Full autocomplete

Contributing

Contributions are welcome! Please read our Contributing Guide.

License

MIT © Serge KOKOUA

Acknowledgments

Inspired by the Vercel AI SDK's developer experience.

Built with ❤️ by Serge KOKOUA

Empowering developers to build solid and robust AI applications.


Package last updated on 10 Nov 2025
