
@databricks/langchainjs
LangChain TypeScript integration for Databricks Model Serving.
This package provides a ChatDatabricks class that integrates with the LangChain ecosystem, allowing you to use Databricks Model Serving endpoints with LangChain's chat model interface.
It implements LangChain's `BaseChatModel` interface.

Install the package with npm:

```shell
npm install @databricks/langchainjs
```
```typescript
import { ChatDatabricks } from "@databricks/langchainjs";

const model = new ChatDatabricks({
  model: "databricks-claude-sonnet-4-5",
});

const response = await model.invoke("Hello, how are you?");
console.log(response.content);
```
ChatDatabricks supports both the Chat Completions API and the Responses API, selected via the `useResponsesApi` option:
Chat Completions (default): OpenAI-compatible chat completions for Foundation Models.

```typescript
const model = new ChatDatabricks({
  model: "databricks-claude-sonnet-4-5",
  useResponsesApi: false, // the default; can be omitted
});
```
Responses API: rich output with reasoning, citations, and function calls.

```typescript
const model = new ChatDatabricks({
  model: "databricks-gpt-5-2",
  useResponsesApi: true,
});
```
ChatDatabricks uses the Databricks SDK for authentication, which automatically detects credentials from:

- Environment variables (`DATABRICKS_HOST`, `DATABRICKS_TOKEN`)
- The Databricks CLI configuration file (`~/.databrickscfg`)

```typescript
// Credentials are automatically detected
const model = new ChatDatabricks({
  model: "your-model",
});
```
The following environment variables are supported:
| Variable | Description |
|---|---|
| DATABRICKS_HOST | Workspace or account URL |
| DATABRICKS_TOKEN | Personal access token |
| DATABRICKS_CLIENT_ID | OAuth client ID / Azure client ID |
| DATABRICKS_CLIENT_SECRET | OAuth client secret / Azure client secret |
| DATABRICKS_ACCOUNT_ID | Databricks account ID (for account-level operations) |
| DATABRICKS_AZURE_TENANT_ID | Azure tenant ID |
| DATABRICKS_GOOGLE_SERVICE_ACCOUNT | GCP service account email |
| DATABRICKS_AUTH_TYPE | Force specific auth type |
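As a quick sanity check before constructing a model, the two most common variables from the table above can be validated in code. This helper is a sketch, not part of the package:

```typescript
// Throws if the minimum credential variables are unset.
// Call with process.env in a real application.
function checkCredentials(env: Record<string, string | undefined>): void {
  const missing = ["DATABRICKS_HOST", "DATABRICKS_TOKEN"].filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
}
```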
You can also pass credentials directly via the `auth` field:

```typescript
const model = new ChatDatabricks({
  model: "your-model",
  auth: {
    host: "https://your-workspace.databricks.com",
    token: "dapi...",
  },
});
```
Streaming is supported via `stream`:

```typescript
const stream = await model.stream("Tell me a story");
for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}
```
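If the full reply is needed as a single string, the chunks can be accumulated as they arrive. This small helper is a sketch independent of the SDK; it assumes each chunk exposes a string `content`, as in the loop above:

```typescript
// Accumulate streamed chunk contents into one string.
async function collectStream(
  chunks: AsyncIterable<{ content: unknown }>
): Promise<string> {
  let text = "";
  for await (const chunk of chunks) {
    if (typeof chunk.content === "string") {
      text += chunk.content;
    }
  }
  return text;
}
```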
Tools can be bound with `bindTools` using OpenAI-style function definitions, and any resulting tool calls are exposed on the response:

```typescript
const modelWithTools = model.bindTools([
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current weather for a location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
        },
        required: ["location"],
      },
    },
  },
]);

const response = await modelWithTools.invoke("What's the weather in NYC?");
if (response.tool_calls) {
  for (const toolCall of response.tool_calls) {
    console.log(`Tool: ${toolCall.name}`);
    console.log(`Args: ${JSON.stringify(toolCall.args)}`);
  }
}
```
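To complete the loop, the requested calls must be dispatched to local implementations and the results returned to the model. A minimal sketch of that dispatch step (the executor map and the weather stub are hypothetical, not part of the package):

```typescript
type ToolCall = { name: string; args: Record<string, unknown> };

// Hypothetical local implementations keyed by tool name.
const executors: Record<string, (args: any) => string> = {
  get_weather: ({ location }: { location: string }) =>
    `The weather in ${location} is sunny, 72°F`,
};

// Run each tool call the model requested and collect the results.
function runToolCalls(toolCalls: ToolCall[]): string[] {
  return toolCalls.map((tc) => {
    const fn = executors[tc.name];
    if (!fn) throw new Error(`Unknown tool: ${tc.name}`);
    return fn(tc.args);
  });
}
```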
Tools can also be defined with LangChain's `tool` helper and a Zod schema:

```typescript
import { z } from "zod";
import { tool } from "@langchain/core/tools";

const weatherTool = tool(
  async ({ location }) => {
    return `The weather in ${location} is sunny, 72°F`;
  },
  {
    name: "get_weather",
    description: "Get weather for a location",
    schema: z.object({
      location: z.string().describe("City and state"),
    }),
  }
);

const modelWithTools = model.bindTools([weatherTool]);
```
ChatDatabricks works with LangChain's createAgent:
```typescript
import { createAgent } from "langchain";
import { ChatDatabricks } from "@databricks/langchainjs";

const model = new ChatDatabricks({
  model: "databricks-claude-sonnet-4-5",
});

const agent = createAgent({
  llm: model,
  tools: [weatherTool, searchTool],
});

const result = await agent.invoke("What's the weather in Paris?");
```
Connect to MCP servers to dynamically load tools from Databricks services and external APIs.
```typescript
import { MultiServerMCPClient } from "@langchain/mcp-adapters";
import { ChatDatabricks, buildMCPServerConfig, DatabricksMCPServer } from "@databricks/langchainjs";

// Create MCP server for Databricks SQL (host resolved from DATABRICKS_HOST env var)
const sqlServer = new DatabricksMCPServer({
  name: "dbsql",
  path: "/api/2.0/mcp/sql",
});

// Build config and create client
const mcpServers = await buildMCPServerConfig([sqlServer]);
const client = new MultiServerMCPClient({ mcpServers });
const tools = await client.getTools();

// Use with ChatDatabricks
const model = new ChatDatabricks({ model: "databricks-claude-sonnet-4-5" });
const modelWithTools = model.bindTools(tools);
const response = await modelWithTools.invoke("Query the sales table");

// Clean up when done
await client.close();
```
`DatabricksMCPServer` also provides factory methods for common Databricks services:

```typescript
// Unity Catalog Functions
const ucServer = DatabricksMCPServer.fromUCFunction(
  "catalog",
  "schema",
  "function_name" // optional - omit to expose all functions in schema
);

// Vector Search
const vectorServer = DatabricksMCPServer.fromVectorSearch(
  "catalog",
  "schema",
  "index_name" // optional
);

// Genie Space
const genieServer = DatabricksMCPServer.fromGenieSpace("space_id");
```
Multiple servers can be combined in a single client:

```typescript
import { MultiServerMCPClient } from "@langchain/mcp-adapters";
import { buildMCPServerConfig } from "@databricks/langchainjs";

const mcpServers = await buildMCPServerConfig([sqlServer, ucServer, vectorServer]);
const client = new MultiServerMCPClient({
  mcpServers,
  throwOnLoadError: false, // Continue if some servers fail
  prefixToolNameWithServerName: true, // Avoid tool name conflicts
});

const tools = await client.getTools();
console.log(`Loaded ${tools.length} tools`);
```
Each `DatabricksMCPServer` can use different credentials via `auth`:

```typescript
// Server using service principal (M2M OAuth)
const server1 = new DatabricksMCPServer({
  name: "workspace-1",
  path: "/api/2.0/mcp/sql",
  auth: {
    host: "https://workspace-1.databricks.com",
    clientId: process.env.SP_CLIENT_ID,
    clientSecret: process.env.SP_CLIENT_SECRET,
  },
});

// Server using personal access token
const server2 = new DatabricksMCPServer({
  name: "workspace-2",
  path: "/api/2.0/mcp/sql",
  auth: {
    host: "https://workspace-2.databricks.com",
    token: process.env.DATABRICKS_TOKEN_WS2,
  },
});

// Server using default auth chain (env vars, CLI config, etc.)
const server3 = new DatabricksMCPServer({
  name: "default-workspace",
  path: "/api/2.0/mcp/sql",
});
```
For non-Databricks MCP servers, use `MCPServer`:

```typescript
import { MCPServer } from "@databricks/langchainjs";

const externalServer = new MCPServer({
  name: "external-api",
  url: "https://api.example.com/mcp",
  headers: { "X-API-Key": process.env.API_KEY },
  timeout: 30, // seconds
});
```
The full set of constructor options:

```typescript
const model = new ChatDatabricks({
  // Required
  model: "your-model-name",

  // Use Responses API instead of Chat Completions (optional)
  useResponsesApi: true,

  // Model parameters (optional)
  temperature: 0.7,
  maxTokens: 1000,
  stop: ["\n\n"],
});
```
Options can also be passed at call time:
```typescript
const response = await model.invoke("Hello", {
  temperature: 0.5,
  maxTokens: 100,
  stop: ["."],
});
```
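Assuming call-time options take precedence over constructor defaults (the usual LangChain behavior), the precedence can be pictured as a shallow merge. This is an illustration, not the library's actual implementation:

```typescript
interface CallOptions {
  temperature?: number;
  maxTokens?: number;
  stop?: string[];
}

// Later values win: call-time overrides replace constructor defaults.
function mergeOptions(defaults: CallOptions, overrides: CallOptions): CallOptions {
  return { ...defaults, ...overrides };
}
```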
Conversations are expressed with LangChain message objects:

```typescript
import { HumanMessage, AIMessage, SystemMessage } from "@langchain/core/messages";

const response = await model.invoke([
  new SystemMessage("You are a helpful assistant."),
  new HumanMessage("What's the capital of France?"),
  new AIMessage("The capital of France is Paris."),
  new HumanMessage("What's its population?"),
]);
```
See the examples folder for complete working examples.
```shell
# Copy the example env file and fill in your credentials
cp .env.example .env.local

# Edit .env.local with your environment variables
# Then run the example
npm run example
```
Alternatively, set environment variables directly:
```shell
export DATABRICKS_HOST=https://your-workspace.databricks.com
export DATABRICKS_TOKEN=dapi...

# Run the basic example
npm run example

# Run the tools example
npm run example:tools

# Run the MCP example
npm run example:mcp
```
To build and test the package locally:

```shell
# Install dependencies
npm install

# Build
npm run build

# Run unit tests
npm test

# Run integration tests (requires Databricks credentials)
npm run test:integration

# Type check
npm run typecheck

# Lint and format
npm run lint
npm run format
```
See CONTRIBUTING.md for development guidelines.