@databricks/langchainjs

LangChain TypeScript integration for Databricks Model Serving.

This package provides a ChatDatabricks class that integrates with the LangChain ecosystem, allowing you to use Databricks Model Serving endpoints with LangChain's chat model interface.

Features

  • Compatible with LangChain's BaseChatModel interface
  • Supports streaming responses
  • Supports tool/function calling
  • Multiple endpoint APIs: Chat Completions and Responses
  • Automatic authentication via Databricks SDK
  • MCP (Model Context Protocol) integration for dynamic tool loading

Requirements

  • Node.js >= 18.0.0
  • A Databricks workspace with Model Serving enabled

Installation

npm install @databricks/langchainjs

Quick Start

import { ChatDatabricks } from "@databricks/langchainjs";

const model = new ChatDatabricks({
  model: "databricks-claude-sonnet-4-5",
});

const response = await model.invoke("Hello, how are you?");
console.log(response.content);

Endpoint APIs

ChatDatabricks can target either the Chat Completions API or the Responses API, selected with the useResponsesApi option:

Chat Completions

OpenAI-compatible chat completions for Foundation Models.

const model = new ChatDatabricks({
  model: "databricks-claude-sonnet-4-5",
  useResponsesApi: false, // can be omitted
});

Responses

Rich output with reasoning, citations, and function calls.

const model = new ChatDatabricks({
  model: "databricks-gpt-5-2",
  useResponsesApi: true,
});

Authentication

ChatDatabricks uses the Databricks SDK for authentication, which automatically detects credentials from:

  • Environment variables (DATABRICKS_HOST, DATABRICKS_TOKEN)
  • Databricks CLI config (~/.databrickscfg)
  • Azure CLI / Managed Identity
  • Google Cloud credentials
  • OAuth M2M (Service Principal)
// Credentials are automatically detected
const model = new ChatDatabricks({
  model: "your-model",
});

Environment Variables

The following environment variables are supported:

Variable                            Description
DATABRICKS_HOST                     Workspace or account URL
DATABRICKS_TOKEN                    Personal access token
DATABRICKS_CLIENT_ID                OAuth client ID / Azure client ID
DATABRICKS_CLIENT_SECRET            OAuth client secret / Azure client secret
DATABRICKS_ACCOUNT_ID               Databricks account ID (for account-level operations)
DATABRICKS_AZURE_TENANT_ID          Azure tenant ID
DATABRICKS_GOOGLE_SERVICE_ACCOUNT   GCP service account email
DATABRICKS_AUTH_TYPE                Force a specific auth type
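
For example, to authenticate as a service principal via M2M OAuth, you might set the variables like this (all values are placeholders; the auth type name follows the Databricks unified authentication convention, where `oauth-m2m` selects machine-to-machine OAuth):

```shell
# Placeholder values – substitute your own workspace URL and service principal
export DATABRICKS_HOST=https://your-workspace.databricks.com
export DATABRICKS_CLIENT_ID=your-sp-client-id
export DATABRICKS_CLIENT_SECRET=your-sp-client-secret
# Force M2M OAuth instead of letting the SDK pick from the default chain
export DATABRICKS_AUTH_TYPE=oauth-m2m
```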

Explicit Auth

You can also pass credentials directly via the auth field:

const model = new ChatDatabricks({
  model: "your-model",
  auth: {
    host: "https://your-workspace.databricks.com",
    token: "dapi...",
  },
});

Streaming

const stream = await model.stream("Tell me a story");

for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}

Tool Calling

const modelWithTools = model.bindTools([
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current weather for a location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
        },
        required: ["location"],
      },
    },
  },
]);

const response = await modelWithTools.invoke("What's the weather in NYC?");

if (response.tool_calls) {
  for (const toolCall of response.tool_calls) {
    console.log(`Tool: ${toolCall.name}`);
    console.log(`Args: ${JSON.stringify(toolCall.args)}`);
  }
}

Using LangChain Tools

import { z } from "zod";
import { tool } from "@langchain/core/tools";

const weatherTool = tool(
  async ({ location }) => {
    return `The weather in ${location} is sunny, 72°F`;
  },
  {
    name: "get_weather",
    description: "Get weather for a location",
    schema: z.object({
      location: z.string().describe("City and state"),
    }),
  }
);

const modelWithTools = model.bindTools([weatherTool]);

Using with LangChain Agents

ChatDatabricks works with LangChain's createAgent:

import { createAgent } from "langchain";
import { ChatDatabricks } from "@databricks/langchainjs";

const model = new ChatDatabricks({
  model: "databricks-claude-sonnet-4-5",
});

const agent = createAgent({
  llm: model,
  tools: [weatherTool, searchTool],
});

const result = await agent.invoke("What's the weather in Paris?");

MCP (Model Context Protocol) Integration

Connect to MCP servers to dynamically load tools from Databricks services and external APIs.

Connecting to Databricks MCP Servers

import { MultiServerMCPClient } from "@langchain/mcp-adapters";
import { ChatDatabricks, buildMCPServerConfig, DatabricksMCPServer } from "@databricks/langchainjs";

// Create MCP server for Databricks SQL (host resolved from DATABRICKS_HOST env var)
const sqlServer = new DatabricksMCPServer({
  name: "dbsql",
  path: "/api/2.0/mcp/sql",
});

// Build config and create client
const mcpServers = await buildMCPServerConfig([sqlServer]);
const client = new MultiServerMCPClient({ mcpServers });
const tools = await client.getTools();

// Use with ChatDatabricks
const model = new ChatDatabricks({ model: "databricks-claude-sonnet-4-5" });
const modelWithTools = model.bindTools(tools);

const response = await modelWithTools.invoke("Query the sales table");

// Clean up when done
await client.close();

Factory Methods for Databricks Services

// Unity Catalog Functions
const ucServer = DatabricksMCPServer.fromUCFunction(
  "catalog",
  "schema",
  "function_name" // optional - omit to expose all functions in schema
);

// Vector Search
const vectorServer = DatabricksMCPServer.fromVectorSearch(
  "catalog",
  "schema",
  "index_name" // optional
);

// Genie Space
const genieServer = DatabricksMCPServer.fromGenieSpace("space_id");

Multiple MCP Servers

import { MultiServerMCPClient } from "@langchain/mcp-adapters";
import { buildMCPServerConfig } from "@databricks/langchainjs";

const mcpServers = await buildMCPServerConfig([sqlServer, ucServer, vectorServer]);
const client = new MultiServerMCPClient({
  mcpServers,
  throwOnLoadError: false, // Continue if some servers fail
  prefixToolNameWithServerName: true, // Avoid tool name conflicts
});

const tools = await client.getTools();
console.log(`Loaded ${tools.length} tools`);

Explicit Authentication per Server

Each DatabricksMCPServer can use different credentials via auth:

// Server using service principal (M2M OAuth)
const server1 = new DatabricksMCPServer({
  name: "workspace-1",
  path: "/api/2.0/mcp/sql",
  auth: {
    host: "https://workspace-1.databricks.com",
    clientId: process.env.SP_CLIENT_ID,
    clientSecret: process.env.SP_CLIENT_SECRET,
  },
});

// Server using personal access token
const server2 = new DatabricksMCPServer({
  name: "workspace-2",
  path: "/api/2.0/mcp/sql",
  auth: {
    host: "https://workspace-2.databricks.com",
    token: process.env.DATABRICKS_TOKEN_WS2,
  },
});

// Server using default auth chain (env vars, CLI config, etc.)
const server3 = new DatabricksMCPServer({
  name: "default-workspace",
  path: "/api/2.0/mcp/sql",
});

Generic MCP Servers

For non-Databricks MCP servers, use MCPServer:

import { MCPServer } from "@databricks/langchainjs";

const externalServer = new MCPServer({
  name: "external-api",
  url: "https://api.example.com/mcp",
  headers: { "X-API-Key": process.env.API_KEY },
  timeout: 30, // seconds
});

Configuration Options

const model = new ChatDatabricks({
  // Required
  model: "your-model-name",

  // Use Responses API instead of Chat Completions (optional)
  useResponsesApi: true,

  // Model parameters (optional)
  temperature: 0.7,
  maxTokens: 1000,
  stop: ["\n\n"],
});

Call-Time Options

Options can also be passed at call time:

const response = await model.invoke("Hello", {
  temperature: 0.5,
  maxTokens: 100,
  stop: ["."],
});

Multi-Turn Conversations

import { HumanMessage, AIMessage, SystemMessage } from "@langchain/core/messages";

const response = await model.invoke([
  new SystemMessage("You are a helpful assistant."),
  new HumanMessage("What's the capital of France?"),
  new AIMessage("The capital of France is Paris."),
  new HumanMessage("What's its population?"),
]);

Examples

See the examples folder for complete working examples.

# Copy the example env file and fill in your credentials
cp .env.example .env.local

# Edit .env.local with your environment variables
# Then run the example
npm run example

Alternatively, set environment variables directly:

export DATABRICKS_HOST=https://your-workspace.databricks.com
export DATABRICKS_TOKEN=dapi...

# Run the basic example
npm run example

# Run the tools example
npm run example:tools

# Run the MCP example
npm run example:mcp

Development

# Install dependencies
npm install

# Build
npm run build

# Run unit tests
npm test

# Run integration tests (requires Databricks credentials)
npm run test:integration

# Type check
npm run typecheck

# Lint and format
npm run lint
npm run format

Contributing

See CONTRIBUTING.md for development guidelines.

Package last updated on 29 Jan 2026
