deca-chat

A simple wrapper for OpenAI's Chat API that includes the ability to use custom base URLs.

Latest version: 1.3.2 (npm)

DecaChat

A lightweight and easy-to-use wrapper for OpenAI's Chat API. DecaChat provides a simple interface for creating chat-based applications with OpenAI's GPT models.

Features

  • 🚀 Simple, intuitive API
  • 📝 TypeScript support
  • 💾 Conversation management
  • ⚙️ Configurable parameters
  • 🛡️ Built-in error handling
  • 🌐 Custom base URL support
  • 🔄 Conversation history management
  • 🤖 System message configuration
  • 👋 Customizable introduction messages
  • 📦 Zero dependencies (except OpenAI SDK)

Installation

npm install deca-chat

Quick Start

import { DecaChat } from 'deca-chat';

// Initialize the chat client
const chat = new DecaChat({
  apiKey: 'your-openai-api-key'
});

// Send a message and get a response
async function example() {
  const response = await chat.chat('Hello, how are you?');
  console.log(response);
}

Configuration

The DecaChat constructor accepts a configuration object with the following options:

interface DecaChatConfig {
  apiKey: string;         // Required: Your OpenAI API key
  model?: string;         // Optional: Default 'gpt-4o-mini'
  baseUrl?: string;       // Optional: Default 'https://api.openai.com/v1'
  maxTokens?: number;     // Optional: Default 1000
  temperature?: number;   // Optional: Default 0.7
  intro?: string;         // Optional: Custom introduction message
  systemMessage?: string; // Optional: Initial system message
  useBrowser?: boolean;   // Optional: Enable browser usage (Ensure API keys are secured!)
}
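
For orientation, here is a sketch of a fully specified configuration; every value below is a placeholder rather than a recommendation, and useBrowser should stay off unless you accept exposing the key in the browser:

import { DecaChat } from 'deca-chat';

// A fully specified configuration. Every field except apiKey is optional;
// the values shown are placeholders, not recommendations.
const chat = new DecaChat({
  apiKey: process.env.OPENAI_API_KEY ?? '',
  model: 'gpt-4o-mini',
  baseUrl: 'https://api.openai.com/v1',
  maxTokens: 1000,
  temperature: 0.7,
  intro: 'Hello! How can I help you today?',
  systemMessage: 'You are a helpful assistant.',
  useBrowser: false // enable only if you accept exposing the key to the browser
});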

API Reference

Constructor

const chat = new DecaChat(config); // config: DecaChatConfig

Methods

setSystemMessage(message: string): void

Sets the system message for the conversation. This resets the conversation history and starts with the new system message.

chat.setSystemMessage('You are a helpful assistant specialized in JavaScript.');

setIntro(message: string): void

Sets a custom introduction message that will be sent to the user when starting a new conversation.

chat.setIntro("Hi! I'm your AI assistant. How can I help you today?");

async chat(message: string): Promise<string>

Sends a message and returns the assistant's response. The message and response are automatically added to the conversation history.

const response = await chat.chat('What is a closure in JavaScript?');

clearConversation(): void

Clears the entire conversation history, including the system message.

chat.clearConversation();
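
Note that setSystemMessage (described above) already resets the history on its own, so clearConversation is the option for wiping everything, system message included. A small sketch of both paths:

// Wipe everything, including the system message
chat.clearConversation();

// Or: start fresh under a new system message
// (setSystemMessage resets the history on its own)
chat.setSystemMessage('You are a helpful assistant specialized in SQL.');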

getConversation(): ChatMessage[]

Returns the current conversation history, including system messages, user messages, and assistant responses.

const history = chat.getConversation();
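
The ChatMessage type is not spelled out in this README; judging from the conversation dump under Managing Conversations below, it presumably has a shape along these lines (an inference, not the library's published definition):

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}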

Example Usage

Basic Chat Application

import { DecaChat } from 'deca-chat';

async function example() {
  // Initialize with custom configuration including system message
  const chat = new DecaChat({
    apiKey: 'your-openai-api-key',
    model: 'gpt-4',
    maxTokens: 2000,
    temperature: 0.8,
    intro: "Hello! I'm your coding assistant. Ask me anything about programming!",
    systemMessage: 'You are a helpful coding assistant specialized in JavaScript.'
  });

  // The system message can also be set after initialization
  chat.setSystemMessage('You are now a Python expert.');

  // The intro message will be automatically sent when starting a conversation
  const response = await chat.chat('How do I create a React component?');
  console.log('Assistant:', response);
}

Custom API Endpoint

const chat = new DecaChat({
  apiKey: 'your-api-key',
  baseUrl: 'https://your-custom-endpoint.com/v1',
  model: 'gpt-4o-mini'
});
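
The same option can point DecaChat at any OpenAI-compatible server; the URL, model name, and key below are illustrative assumptions, not values from this package's documentation:

// URL, model name, and the placeholder key are assumptions for illustration
const localChat = new DecaChat({
  apiKey: 'local-placeholder-key', // apiKey is required even if the server ignores it
  baseUrl: 'http://localhost:8080/v1',
  model: 'llama-3-8b-instruct'
});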

Managing Conversations

// Start with a system message
chat.setSystemMessage('You are a helpful assistant.');

// Have a conversation
await chat.chat('What is TypeScript?');
await chat.chat('How does it differ from JavaScript?');

// Get the conversation history
const history = chat.getConversation();
console.log(history);
/* Output:
[
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is TypeScript?' },
  { role: 'assistant', content: '...' },
  { role: 'user', content: 'How does it differ from JavaScript?' },
  { role: 'assistant', content: '...' }
]
*/

// Clear the conversation and start fresh
chat.clearConversation();

Error Handling

The chat method throws errors when:

  • The API key is invalid
  • The API request fails
  • Rate limits are exceeded
  • Network errors occur
  • An invalid model is specified
  • The custom endpoint is unreachable

Always wrap API calls in try-catch blocks for proper error handling:

try {
  const response = await chat.chat('Hello');
  console.log(response);
} catch (error) {
  console.error('Chat error:', error);
}
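
To survive transient failures such as rate limiting, a simple retry wrapper can sit on top of chat(). The exact error shapes depend on the underlying OpenAI SDK, so this sketch retries every error and uses arbitrary backoff values:

// Retry a chat call with exponential backoff (attempt count and delays are arbitrary)
async function chatWithRetry(message: string, attempts = 3): Promise<string> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await chat.chat(message);
    } catch (error) {
      lastError = error;
      // wait 1s, 2s, 4s, ... before the next attempt
      await new Promise((resolve) => setTimeout(resolve, 1000 * 2 ** i));
    }
  }
  throw lastError;
}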

Best Practices

  • System Messages: Set appropriate system messages to guide the assistant's behavior
  • Conversation Management: Clear conversations when starting new topics
  • Error Handling: Always implement proper error handling
  • Resource Management: Monitor token usage and conversation length
  • API Key Security: Never expose your API key in client-side code; load it from the environment or a server-side secret store instead (see the sketch after this list)
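
A minimal sketch of keeping the key out of client code and version control, assuming a Node environment where the key is supplied via the OPENAI_API_KEY environment variable:

import { DecaChat } from 'deca-chat';

// Read the key from the environment so it never appears in source control
const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
  throw new Error('OPENAI_API_KEY is not set');
}

const chat = new DecaChat({ apiKey });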

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

License

MIT License - see the LICENSE file for details.

Keywords

openai
