Pulse AI Utils

A powerful TypeScript library for AI-powered applications with multi-provider LLM support. Provides unified interfaces for OpenAI, Gemini, Claude, and 100+ models via OpenRouter.

✨ Features

  • πŸ€– Multi-Provider Support: OpenAI, Gemini, Claude, and 100+ models
  • πŸ”„ Unified Interface: Same API for all providers
  • πŸ”‘ BYOK Support: Bring Your Own Key for provider-specific APIs
  • πŸ“Š Structured Data: Built-in Zod schema validation
  • 🌐 Web Search: AI-powered web queries with caching
  • 🎯 Type-Safe: Full TypeScript support with proper types

Installation

npm install pulse-ai-utils

πŸš€ Quick Start

import { OpenAIHelper, OpenRouter } from 'pulse-ai-utils';

// Auto-loads API keys from your .env file
const openai = new OpenAIHelper();  // Uses OPENAI_API_KEY from .env

// Auto-loads keys for different providers
const gemini = OpenRouter.forGemini();  // Uses OPENROUTER_API_KEY + GEMINI_API_KEY from .env
const claude = OpenRouter.forClaude();  // Uses OPENROUTER_API_KEY + CLAUDE_API_KEY from .env

// Use remote config for dynamic model selection (recommended)
const smartOpenai = await OpenAIHelper.createWithRemoteConfig();
const smartGemini = await OpenRouter.forGeminiWithRemoteConfig();
const webOptimized = await OpenRouter.createForWebFetching();

// Or pass keys explicitly if needed
const customOpenai = new OpenAIHelper('your-openai-key');
const customGemini = OpenRouter.forGemini('openrouter-key', 'gemini-key');

Environment Configuration

The library automatically reads API keys from your project's root .env file. Simply create a .env file in your project root:

# .env file in your project root
OPENAI_API_KEY=your-openai-key
OPENROUTER_API_KEY=your-openrouter-key
GEMINI_API_KEY=your-gemini-key
CLAUDE_API_KEY=your-claude-key

No need to call dotenv.config() - the library handles this automatically!
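
For clarity, the automatic loading is a convenience, not a requirement. Assuming the constructors simply fall back to process.env when no key is passed (a sketch, not the library's internals), these two forms are equivalent:

import { OpenAIHelper } from 'pulse-ai-utils';

// With the .env file above in place, both lines pick up the same key.
const fromEnv = new OpenAIHelper();                             // reads OPENAI_API_KEY for you
const explicit = new OpenAIHelper(process.env.OPENAI_API_KEY);  // the same thing, spelled out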

Remote Configuration (Firebase Remote Config)

The library also supports dynamic model selection via Firebase Remote Config. This allows you to change models without code deployments:

# Firebase Remote Config Parameters (optional)
pulse-ai-util-openai-model=gpt-4o-mini                    # OpenAI model selection
pulse-ai-util-openrouter-model=google/gemini-2.0-flash-exp  # OpenRouter model
pulse-ai-util-gemini-model=google/gemini-2.0-flash-exp      # Gemini-specific model
pulse-ai-web-openrouter-model=claude-3-5-sonnet-20241022    # Optimized for web fetching

These Remote Config values take precedence over environment variables, so you can switch models at runtime without redeploying.

Required Environment Variables

At least one of these is required:

OPENAI_API_KEY=your-openai-key                    # For direct OpenAI access
OPENROUTER_API_KEY=your-openrouter-key            # For multi-provider access via OpenRouter

Optional Environment Variables

# Provider-specific keys for BYOK (Bring Your Own Key)
GEMINI_API_KEY=your-gemini-key                    # Google AI Studio key
CLAUDE_API_KEY=your-claude-key                    # Anthropic Claude key

# Model Selection (optional - smart defaults provided)
OPENROUTER_MODEL=google/gemini-2.0-flash-exp      # Default OpenRouter model
GEMINI_MODEL=google/gemini-2.0-flash-exp          # Default Gemini model

# Optional Configuration
OPENROUTER_BASE_URL=https://openrouter.ai/api/v1  # Custom OpenRouter URL
LLM_CACHE_DB_ID=llmCache                          # Firestore cache collection

Usage

LLMQueryHandler

Fetch arbitrary structured data using web search.

import { LLMQueryHandler } from 'pulse-ai-utils';

const queryHandler = new LLMQueryHandler('your-api-key');

// Use in an Express route
app.post('/llm-query', (req, res) => queryHandler.query(req, res));

πŸ€– LLM Providers

OpenAI Helper - Direct OpenAI API

import { OpenAIHelper } from 'pulse-ai-utils';

// Auto-loads from OPENAI_API_KEY env var, or pass explicitly
const openai = new OpenAIHelper(undefined, undefined, 'gpt-4o-mini');
// Or with explicit key: new OpenAIHelper('your-api-key', undefined, 'gpt-4o-mini');

// 🌟 Recommended: Use remote config for dynamic model selection
const smartOpenai = await OpenAIHelper.createWithRemoteConfig();

// Update model from remote config for long-running processes
await openai.updateModelFromRemoteConfig();

// Fetch structured data from the web  
const result = await openai.fetchStructuredDataFromWeb({
  prompt: 'Find upcoming tech events in San Francisco',
  zodSchema: yourZodSchema,
  userLocation: { 
    type: 'approximate',
    country: 'US', 
    region: 'CA',
    city: 'San Francisco'
  },
  locationGranularity: 'city',
});

// Get available OpenAI models
const models = await openai.getAvailableModels();
// Returns: ['gpt-4o-mini', 'gpt-4', 'gpt-3.5-turbo', ...]
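
The zodSchema passed to fetchStructuredDataFromWeb above is an ordinary Zod schema describing the shape you want back. The fields below are made up purely for illustration; define whatever structure your use case needs:

import { z } from 'zod';

// Hypothetical schema for the "upcoming tech events" prompt above.
const yourZodSchema = z.object({
  events: z.array(
    z.object({
      name: z.string(),
      date: z.string(),               // e.g. an ISO 8601 date string
      venue: z.string().optional(),
      url: z.string().url().optional(),
    })
  ),
});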

OpenRouter - Universal Multi-Provider Access

Access 100+ models from multiple providers through a unified interface:

import { OpenRouter } from 'pulse-ai-utils';

// Auto-loads from environment variables (.env file)
const router = new OpenRouter();

// Factory methods auto-load from .env - no keys needed!
const gemini = OpenRouter.forGemini();  // Uses OPENROUTER_API_KEY + GEMINI_API_KEY
const claude = OpenRouter.forClaude();  // Uses OPENROUTER_API_KEY + CLAUDE_API_KEY  
const gpt = OpenRouter.forGPT();        // Uses OPENROUTER_API_KEY

// 🌟 Recommended: Use remote config for dynamic model selection
const smartRouter = await OpenRouter.createWithRemoteConfig();
const smartGemini = await OpenRouter.forGeminiWithRemoteConfig();
const webOptimized = await OpenRouter.createForWebFetching(); // Uses pulse-ai-web-openrouter-model

// Update model from remote config for long-running processes
await router.updateModelFromRemoteConfig();
await router.updateModelFromRemoteConfig(true); // Use web-optimized model

// Or pass keys explicitly if needed
const customGemini = OpenRouter.forGemini('openrouter-key', 'gemini-key');
const customClaude = OpenRouter.forClaude('openrouter-key', 'claude-key');

// Get available models with provider info
const models = await router.getAvailableModels();
// Returns: [
//   { id: 'google/gemini-2.0-flash-exp', name: 'Gemini 2.0 Flash', provider: 'Google' },
//   { id: 'anthropic/claude-3.5-sonnet', name: 'Claude 3.5 Sonnet', provider: 'Anthropic' },
//   ...
// ]
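
Because each entry carries provider information, the documented return shape makes it easy to narrow the list; for example, a small illustrative snippet:

// Pick out only the Anthropic model IDs from the list above.
const anthropicIds = (await router.getAvailableModels())
  .filter((model) => model.provider === 'Anthropic')
  .map((model) => model.id);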

Model Selection Guide

// 🌟 Best: Remote config with dynamic model selection (recommended)
const smartOpenai = await OpenAIHelper.createWithRemoteConfig();
const smartGemini = await OpenRouter.forGeminiWithRemoteConfig();
const webOptimized = await OpenRouter.createForWebFetching(); // Special web-optimized model

// βœ… Good: Environment variables (auto-loads from .env)
const openai = new OpenAIHelper();
const gemini = OpenRouter.forGemini();
const claude = OpenRouter.forClaude();

// βœ… Fallback: Explicit configuration
const customGemini = OpenRouter.forGemini(undefined, undefined, 'google/gemini-2.0-flash-exp');
const customClaude = OpenRouter.forClaude(undefined, undefined, 'anthropic/claude-3.5-sonnet');

Remote Config Priority Order

  1. Firebase Remote Config (highest priority) - pulse-ai-util-*-model
  2. Environment Variables (.env file) - OPENAI_MODEL, OPENROUTER_MODEL, etc.
  3. Default Values (fallback) - gpt-4o-mini, google/gemini-2.0-flash-exp, etc.
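
In other words, each helper resolves its model roughly as in the sketch below (the function is illustrative, not part of the library's API):

// Illustrative only: resolution order, highest priority first.
function resolveModel(
  remoteConfigModel: string | undefined,  // e.g. pulse-ai-util-openai-model
  envModel: string | undefined,           // e.g. process.env.OPENAI_MODEL
  defaultModel: string                    // e.g. 'gpt-4o-mini'
): string {
  return remoteConfigModel ?? envModel ?? defaultModel;
}

// resolveModel(undefined, 'gpt-4o', 'gpt-4o-mini') -> 'gpt-4o'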

Utility Functions

import { 
  getSchemaByCategory, 
  sanitizeId, 
  zodToJsonSchema 
} from 'pulse-ai-utils';

// Get schema for a category
const schema = getSchemaByCategory('events');

// Sanitize an ID
const cleanId = sanitizeId('https://example.com/path/');
// Result: 'example.com-path'

// Convert Zod schema to JSON schema
const jsonSchema = zodToJsonSchema(myZodSchema);

License

0BSD

Versioning and Publishing

To release a new version of this package:

  1. Open package.json in this directory and update the version field to the desired version (for example, "3.4.1").
  2. In your terminal, make sure you're in this directory:
     cd lib
  3. Build and publish to npm:
     npm run build
     npm publish

Your new version will be published under the latest tag on npm.
