@copilotkit/runtime

1.8.14 · latest · npm · 17K weekly downloads · 1 maintainer · 401 versions
CopilotKit is the open-source framework for integrating powerful AI Copilots into any application. Easily implement custom AI Chatbots, AI Agents, AI Textareas, and more.




Documentation

To get started with CopilotKit, please check out the documentation.

LangFuse Logging Integration

CopilotKit now supports LangFuse logging integration to help you monitor, analyze, and debug your LLM requests and responses.

Setup

To enable LangFuse logging, you can configure it when initializing the CopilotRuntime:

import { CopilotRuntime, OpenAIAdapter } from "@copilotkit/runtime";
import { LangfuseClient } from "langfuse";

// Initialize your LangFuse client
const langfuse = new LangfuseClient({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  baseUrl: process.env.LANGFUSE_BASE_URL,
});

// Create a CopilotRuntime with LangFuse logging enabled
const runtime = new CopilotRuntime({
  adapter: new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY }),
  logging: {
    enabled: true,
    progressive: true, // Set to false for buffered logging
    logger: {
      logRequest: (data) => langfuse.trace({ name: "LLM Request", input: data }),
      logResponse: (data) => langfuse.trace({ name: "LLM Response", output: data }),
      logError: (errorData) => langfuse.trace({ name: "LLM Error", metadata: errorData }),
    },
  },
});

Configuration Options

The logging configuration accepts the following options:

  • enabled (boolean): Enable or disable logging (default: false)
  • progressive (boolean): When true, logs each chunk as it's streamed. When false, logs the complete response (default: true)
  • logger (object): Contains callback functions for logging:
    • logRequest: Called when an LLM request is made
    • logResponse: Called when an LLM response is received
    • logError: Called when an error occurs during an LLM request
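The logger only needs to supply these three callbacks. As a sketch, they can be captured in a small in-memory logger, which is handy for tests; the `createMemoryLogger` helper and its types here are our own illustration, not part of the published API:

```typescript
// A minimal in-memory logger matching the three-callback shape described above.
// The names LogEntry/createMemoryLogger are illustrative, not CopilotKit exports.
type LogEntry = { kind: "request" | "response" | "error"; payload: unknown };

function createMemoryLogger() {
  const entries: LogEntry[] = [];
  return {
    entries, // inspect this array in tests
    logRequest: (data: unknown) => entries.push({ kind: "request", payload: data }),
    logResponse: (data: unknown) => entries.push({ kind: "response", payload: data }),
    logError: (errorData: unknown) => entries.push({ kind: "error", payload: errorData }),
  };
}

const memLogger = createMemoryLogger();
memLogger.logRequest({ model: "gpt-4o" });
memLogger.logResponse({ text: "hello" });
console.log(memLogger.entries.length); // 2
```

Passing such an object as `logger` lets you assert on exactly what the runtime logged, without any external service.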

Custom Logging Integrations

You can integrate with any logging service by implementing the logger interface:

const runtime = new CopilotRuntime({
  adapter: new OpenAIAdapter({ apiKey: "YOUR_API_KEY" }),
  logging: {
    enabled: true,
    progressive: false,
    logger: {
      logRequest: (data) => {
        // Implement your custom logging logic
        console.log("LLM Request:", JSON.stringify(data));
      },
      logResponse: (data) => {
        // Implement your custom logging logic
        console.log("LLM Response:", JSON.stringify(data));
      },
      logError: (error) => {
        // Implement your custom error logging
        console.error("LLM Error:", error);
      },
    },
  },
});

This lets you send logs to any system or service you prefer.
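For example, the same interface can forward events to an HTTP log collector. This is a sketch under assumptions: the endpoint URL, event names, and payload shape below are illustrative, and the injectable `fetchImpl` parameter is our own addition to make the helper testable:

```typescript
// Sketch: forward log events to a hypothetical HTTP collector.
// Nothing here is a CopilotKit export; only the three callback names matter.
function createHttpLogger(endpoint: string, fetchImpl: typeof fetch = fetch) {
  const send = (event: string, body: unknown) =>
    fetchImpl(endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ event, body, ts: Date.now() }),
    }).catch((err) => console.error("log delivery failed:", err));

  return {
    logRequest: (data: unknown) => send("llm.request", data),
    logResponse: (data: unknown) => send("llm.response", data),
    logError: (errorData: unknown) => send("llm.error", errorData),
  };
}

// Usage (endpoint is illustrative):
// const runtime = new CopilotRuntime({
//   logging: { enabled: true, progressive: false, logger: createHttpLogger("https://logs.example.com/ingest") },
// });
```

Delivery errors are swallowed and printed rather than thrown, so a logging outage never breaks the LLM request path.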

Keywords

copilotkit

Package last updated on 06 Jun 2025