@agent-client/aa

v2.0.1 (latest, published on npm)

A lightweight, type-safe streaming chat client SDK for AutoAgents (AA) conversations. Built with TypeScript and native Web APIs.

Features

  • 🔄 Streaming Support - Real-time SSE (Server-Sent Events) streaming
  • 🎯 Type Safe - Full TypeScript support with comprehensive type definitions
  • 🪶 Zero Dependencies - Uses native browser APIs only
  • ⚡️ Async Generator - Modern AsyncGenerator pattern for clean stream handling
  • 🎨 Rich Metadata - Supports reasoning content, thinking time, and custom metadata
  • 🛡️ Error Handling - Robust error handling and stream cleanup

Installation

npm install @agent-client/aa
yarn add @agent-client/aa
pnpm add @agent-client/aa
bun add @agent-client/aa

Quick Start

import { chat } from '@agent-client/aa';

// Simple and clean API
const chatStream = chat('https://api.example.com/chat', {
  token: 'your-auth-token',
  body: {
    agentId: 'agent-123',
    userChatInput: 'Hello, how are you?',
  },
});

// Process streaming messages
for await (const { messages, conversationId, chatId, chunk, rawChunk, sseLine } of chatStream) {
  // Structured messages
  console.log('Current messages:', messages);
  console.log('Conversation ID:', conversationId);
  console.log('Chat ID:', chatId);
  
  // Raw data access (optional, for advanced use)
  if (chunk) console.log('Parsed chunk:', chunk);
  if (rawChunk) console.log('Raw JSON:', rawChunk);
  if (sseLine) console.log('SSE line:', sseLine);
}

API Reference

chat(url, options)

Main function to initiate a streaming chat conversation.

Parameters:

  • url (string): The API endpoint URL
  • options (object):
    • token (string): Authentication token
    • body (ChatRequestBody): Request payload
    • signal (AbortSignal, optional): Abort signal for cancellation
    • onopen (function, optional): Callback when connection opens

Returns: AsyncGenerator<ChatMessageStreamYield>

Types

ChatRequestBody

interface ChatRequestBody {
  agentId: string;
  chatId?: string;
  userChatInput: string;
  files?: { fileId: string; fileName: string; fileUrl: string; }[];
  images?: { url: string }[];
  kbIdList?: number[];
  database?: {
    databaseUuid: string;
    tableNames: string[];
  };
  state?: any;
  trialOperation?: boolean;
}
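To make the optional fields concrete, here is a hedged sketch of building a request payload. The interface is copied from this README; the IDs, file URL, and knowledge-base ID are made-up placeholder values, not real identifiers.

```typescript
// ChatRequestBody as defined in this README (abbreviated to the fields used here).
interface ChatRequestBody {
  agentId: string;
  chatId?: string;
  userChatInput: string;
  files?: { fileId: string; fileName: string; fileUrl: string }[];
  images?: { url: string }[];
  kbIdList?: number[];
  trialOperation?: boolean;
}

// Hypothetical payload: a question about an attached file, scoped to one knowledge base.
const body: ChatRequestBody = {
  agentId: 'agent-123',
  userChatInput: 'Summarize the attached report',
  files: [
    { fileId: 'file-1', fileName: 'report.pdf', fileUrl: 'https://example.com/report.pdf' },
  ],
  kbIdList: [42],
};

console.log(JSON.stringify(body));
```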

ChatMessage

interface ChatMessage {
  content: string;
  role: "assistant" | "user";
  messageId: string;
  loading: boolean;
  contentType?: "q_file" | "q_image";
  metadata?: Record<string, {
    complete: boolean;
    result?: any[];
    type?: string;
  }>;
  type: "text" | "table" | "buttons" | "result_file";
  reasoningContent?: string;
  thinkingElapsedMillSecs?: number;
  __raw?: any;
}
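A quick sketch of consuming these fields, assuming the interface above; `formatMessage` is a hypothetical helper for display, not an SDK export.

```typescript
// ChatMessage fields from this README that matter for display (subset).
interface ChatMessage {
  content: string;
  role: 'assistant' | 'user';
  messageId: string;
  loading: boolean;
  reasoningContent?: string;
  thinkingElapsedMillSecs?: number;
}

// Hypothetical helper: render a message as plain text, surfacing the
// optional reasoning content and thinking time when present.
function formatMessage(msg: ChatMessage): string {
  const lines = [`${msg.role}: ${msg.content}${msg.loading ? ' …' : ''}`];
  if (msg.reasoningContent) {
    const secs = ((msg.thinkingElapsedMillSecs ?? 0) / 1000).toFixed(1);
    lines.push(`  (reasoned for ${secs}s: ${msg.reasoningContent})`);
  }
  return lines.join('\n');
}

console.log(formatMessage({
  content: 'Hello!',
  role: 'assistant',
  messageId: 'm1',
  loading: false,
  reasoningContent: 'Greeting the user',
  thinkingElapsedMillSecs: 1500,
}));
```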

ChatMessageStreamYield

interface ChatMessageStreamYield {
  messages: ChatMessage[];       // Array of accumulated messages
  conversationId?: string;        // Conversation identifier
  chatId?: string;                // Chat identifier for continuation
  chunk?: ChatStreamChunk;        // Parsed chunk object
  rawChunk?: string;              // Raw JSON text (without "data:" prefix)
  sseLine?: string;               // Complete SSE line (with "data:" prefix)
}

Note - Data Layers:

  • sseLine: Complete SSE protocol line, e.g., "data: {\"content\":\"hello\"}"
  • rawChunk: JSON text only (without data: prefix), e.g., "{\"content\":\"hello\"}"
  • chunk: Parsed JSON object, e.g., {content: "hello"}

Use sseLine for protocol-level debugging or logging, rawChunk for custom JSON parsing, and chunk for direct data access.
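The relationship between the three layers can be sketched for a single line, assuming the standard "data:" framing shown above:

```typescript
// One SSE line and the two derived layers the SDK exposes alongside it.
const sseLine = 'data: {"content":"hello"}';          // protocol-level line
const rawChunk = sseLine.replace(/^data:\s*/, '');    // JSON text, prefix stripped
const chunk = JSON.parse(rawChunk) as { content: string }; // parsed object

console.log(chunk.content); // "hello"
```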

Advanced Usage

With Abort Signal

const controller = new AbortController();
const chatStream = chat('https://api.example.com/chat', {
  token: 'your-token',
  body: { agentId: 'agent-123', userChatInput: 'Hello!' },
  signal: controller.signal,
});

// Cancel the request after 5 seconds
setTimeout(() => controller.abort(), 5000);

With Connection Callback

const chatStream = chat('https://api.example.com/chat', {
  token: 'your-token',
  body: { agentId: 'agent-123', userChatInput: 'Hello!' },
  onopen: () => {
    console.log('Connection established!');
  },
});

React Integration Example

import { useState, useEffect } from 'react';
import { chat, ChatMessage } from '@agent-client/aa';

function ChatComponent() {
  const [messages, setMessages] = useState<ChatMessage[]>([]);
  const [conversationId, setConversationId] = useState<string>('');

  const sendMessage = async (input: string) => {
    const chatStream = chat('https://api.example.com/chat', {
      token: 'your-token',
      body: {
        agentId: 'agent-123',
        userChatInput: input,
        chatId: conversationId,
      },
    });

    for await (const { messages, conversationId: convId } of chatStream) {
      setMessages(messages);
      if (convId) setConversationId(convId);
    }
  };

  return (
    <div>
      {messages.map((msg) => (
        <div key={msg.messageId}>
          <strong>{msg.role}:</strong> {msg.content}
          {msg.loading && <span>...</span>}
        </div>
      ))}
    </div>
  );
}

Low-Level APIs

createChatSSEStream(url, options)

Creates a ReadableStream for SSE data.

import { createChatSSEStream } from '@agent-client/aa';

const stream = await createChatSSEStream('https://api.example.com/chat', {
  token: 'your-token',
  body: { agentId: 'agent-123', userChatInput: 'Hello!' },
});

createChatMessageStream(stream)

Processes a ReadableStream and yields structured messages.

import { createChatSSEStream, createChatMessageStream } from '@agent-client/aa';

const stream = await createChatSSEStream(url, options);
for await (const result of createChatMessageStream(stream)) {
  console.log(result.messages);
}

Protocol

The SDK expects Server-Sent Events (SSE) in the following format:

data: {"chatId":"123","conversationId":"456","content":"Hello","complete":false,"finish":false,...}
data: {"chatId":"123","conversationId":"456","content":" World","complete":true,"finish":false,...}
data: [DONE]

Stream Markers

  • complete: true - Current message is complete (may have more messages in conversation)
  • finish: true - Entire conversation stream is finished
  • data: [DONE] - Alternative stream termination marker
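The markers above can be checked in order when consuming raw lines. This is a hedged sketch; the `complete`/`finish` field names come from the protocol description, and `classifyLine` is a hypothetical helper, not an SDK export.

```typescript
type LineKind = 'done' | 'stream-finished' | 'message-complete' | 'partial';

// Classify one SSE data line by the markers defined in the protocol above.
function classifyLine(line: string): LineKind {
  const payload = line.replace(/^data:\s*/, '').trim();
  if (payload === '[DONE]') return 'done';           // alternative terminator
  const chunk = JSON.parse(payload) as { complete?: boolean; finish?: boolean };
  if (chunk.finish) return 'stream-finished';        // whole conversation done
  if (chunk.complete) return 'message-complete';     // current message done
  return 'partial';                                  // still streaming content
}

console.log(classifyLine('data: {"content":"Hello","complete":false,"finish":false}')); // "partial"
console.log(classifyLine('data: [DONE]')); // "done"
```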

Browser Support

This package uses native browser APIs:

  • fetch API
  • ReadableStream API
  • TextDecoder API
  • AsyncGenerator support

Requires modern browsers with ES2022 support. For older browsers, use appropriate polyfills.
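A minimal capability check before deciding whether polyfills are needed. These globals are standard Web APIs (and are also available in Node.js 18+); the check itself is a sketch, not part of the SDK.

```typescript
// Detect the native APIs this package relies on.
const hasStreamingSupport =
  typeof fetch === 'function' &&
  typeof ReadableStream === 'function' &&
  typeof TextDecoder === 'function';

console.log(hasStreamingSupport ? 'streaming supported' : 'polyfills needed');
```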

Publishing

To publish this package to npm:

# First time: Login to npm
npm run login

# Then: One-command publish
npm run publish:now

License

Apache-2.0

Contributing

Contributions are welcome! Please read our contributing guidelines and code of conduct.

Support

  • GitHub Issues: https://github.com/align-dev/align/issues
  • Email: contact@align.com

Keywords

ai

Package last updated on 20 Nov 2025
