openai-plugins

Latest version: 0.0.19 (npm)
Maintainers: 1
Published versions: 18

openai-mcp

A TypeScript library that provides an OpenAI-compatible client for the Model Context Protocol (MCP).

Installation

npm install openai-mcp

Features

  • OpenAI API compatibility - works as a drop-in replacement for the OpenAI client
  • Connects to local or remote Model Context Protocol servers
  • Supports tool use and function calling
  • Rate limiting and retry logic built in
  • Configurable logging
  • TypeScript type definitions included

Usage

import { OpenAI } from 'openai-mcp';

// Create an OpenAI-compatible client connected to an MCP server
const openai = new OpenAI({
  mcp: {
    // MCP server URL(s) to connect to
    serverUrls: ['http://localhost:3000/mcp'],
    
    // Optional: set log level (debug, info, warn, error)
    logLevel: 'info',
    
    // Additional configuration options
    // modelName: 'gpt-4',             // Default model to use
    // disconnectAfterUse: true,       // Auto-disconnect after use
    // maxToolCalls: 15,               // Max number of tool calls per conversation
    // toolTimeoutSec: 60,             // Timeout for tool calls
  }
});

// Use the client like a standard OpenAI client
const response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello, how are you today?' }
  ]
});

console.log(response.choices[0].message.content);
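
Streaming is listed among the examples but not shown in this README. Assuming the client mirrors the OpenAI SDK's streaming contract, where stream: true makes the call resolve to an async iterable of delta chunks, a streamed request might look like this sketch:

import { OpenAI } from 'openai-mcp';

const openai = new OpenAI({
  mcp: { serverUrls: ['http://localhost:3000/mcp'] }
});

// Sketch only: assumes the OpenAI SDK's streaming contract, where
// `stream: true` returns an async iterable of incremental chunks.
const stream = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Tell me a short story.' }],
  stream: true
});

for await (const chunk of stream) {
  // Each chunk carries an incremental delta; print tokens as they arrive.
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}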

Logging Configuration

import { setMcpLogLevel } from 'openai-mcp';

// Set log level to one of: 'debug', 'info', 'warn', 'error'
setMcpLogLevel('info');

Environment Variables

The library also supports configuration through environment variables:

# MCP Server URL(s) - comma separated for multiple servers
MCP_SERVER_URL=http://localhost:3000/mcp

# API Keys for different model providers
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
GEMINI_API_KEY=your-gemini-api-key
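
With these variables set, the client can be constructed without an explicit mcp block, as the Multi-Model example below does. This sketch assumes the constructor falls back to MCP_SERVER_URL and the provider API keys when no options are passed:

import { OpenAI } from 'openai-mcp';

// Assumes MCP_SERVER_URL (and any provider API keys) are set in the
// environment, so no explicit configuration object is needed.
const openai = new OpenAI();

const reply = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Ping' }]
});

console.log(reply.choices[0].message.content);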

Multi-Model Support

The library supports routing requests to different model providers based on the model name:

import { OpenAI } from 'openai-mcp';

const openai = new OpenAI();

// Uses OpenAI API
const gpt4Response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello GPT-4' }]
});

// Uses Anthropic API
const claudeResponse = await openai.chat.completions.create({
  model: 'claude-3',
  messages: [{ role: 'user', content: 'Hello Claude' }]
});

// Uses Google Gemini API
const geminiResponse = await openai.chat.completions.create({
  model: 'gemini-pro',
  messages: [{ role: 'user', content: 'Hello Gemini' }]
});

Examples

The examples/ directory contains various usage examples:

  • Basic Usage: Simple chat completion request
  • Streaming: Stream responses token by token
  • Multi-Model: Use OpenAI, Anthropic, and Gemini models
  • Tools Usage: Function/tool calling with MCP (sketched below)
  • Custom Logging: Configure and use the logging system

See the Examples README for more details on running these examples.
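
Tool calling is advertised under Features and in the examples list, but its surface is not documented in this README. Assuming the client accepts the standard OpenAI tools parameter, as a drop-in replacement would, a function-calling request might look like the following sketch (the get_weather tool is illustrative):

import { OpenAI } from 'openai-mcp';

const openai = new OpenAI({
  mcp: { serverUrls: ['http://localhost:3000/mcp'] }
});

// Sketch only: assumes the standard OpenAI `tools` schema.
const response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',  // illustrative tool name
        description: 'Get the current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city']
        }
      }
    }
  ]
});

// If the model chose to call a tool, the call appears on the message;
// `arguments` is a JSON-encoded string in the OpenAI schema.
const toolCalls = response.choices[0].message.tool_calls;
if (toolCalls?.length) {
  console.log(toolCalls[0].function.name, toolCalls[0].function.arguments);
}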

Development

To build the library:

npm run build

To run tests:

npm test

Keywords

openai

Package last updated on 20 Apr 2025
