
@unroute/sdk
Unroute is a TypeScript framework that connects language models (LLMs) to Model Context Protocol (MCP) servers, letting you build agents that use resources and tools without being overwhelmed by JSON schemas.
⚠️ This repository is work in progress and in alpha. Not recommended for production use yet. ⚠️
Installation
npm install @unroute/sdk
In this example, we'll use the OpenAI client with Exa search capabilities.
npm install @unroute/mcp-exa
The following code sets up OpenAI and connects to an Exa MCP server. In this case, we're running the server locally within the same process, so it's just a simple passthrough.
import { Connection } from "@unroute/sdk"
import { OpenAIHandler } from "@unroute/sdk/openai"
import * as exa from "@unroute/mcp-exa"
import { OpenAI } from "openai"

const openai = new OpenAI()
const connection = await Connection.connect({
  exa: {
    server: exa.createServer({
      apiKey: process.env.EXA_API_KEY,
    }),
  },
})
Now you can make your LLM aware of the available tools from Exa.
// Create a handler
const handler = new OpenAIHandler(connection)

const response = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "In 2024, did OpenAI release GPT-5?" }],
  // Pass the tools to the OpenAI call
  tools: await handler.listTools(),
})

// Obtain the tool outputs as new messages
const toolMessages = await handler.call(response)
Using this, you can easily enable your LLM to call tools and obtain the results.
However, the LLM often needs to call a tool, read its response, and keep processing the tool's output before it can give a final answer.
In that case, loop the LLM call, appending the new messages each turn, until there are no more toolMessages.
Example:
import type { ChatCompletionMessageParam } from "openai/resources/chat/completions"

let messages: ChatCompletionMessageParam[] = [
  {
    role: "user",
    content:
      "Deduce Obama's age in number of days. It's November 28, 2024 today. Search to ensure correctness.",
  },
]
const handler = new OpenAIHandler(connection)
let isDone = false
while (!isDone) {
  const response = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages,
    tools: await handler.listTools(),
  })
  // Handle tool calls
  const toolMessages = await handler.call(response)
  // Append the assistant turn, then its tool results
  messages.push(response.choices[0].message)
  messages.push(...toolMessages)
  // Stop once the model replies without requesting any tools
  isDone = toolMessages.length === 0
}
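When the loop exits, the final answer is the last assistant message in messages. A minimal sketch of reading it, using a stubbed transcript (the Msg type and its contents here are hypothetical stand-ins for OpenAI's message objects):

```typescript
// Hypothetical transcript shape after the agent loop has finished.
type Msg = { role: "user" | "assistant" | "tool"; content: string }

const messages: Msg[] = [
  { role: "user", content: "Deduce Obama's age in number of days." },
  { role: "assistant", content: "" }, // turn that requested a tool call
  { role: "tool", content: "(search results)" },
  { role: "assistant", content: "About 23,127 days." },
]

// The loop ends when the model replies without requesting tools,
// so the final answer is the last assistant message.
const finalAnswer = [...messages]
  .reverse()
  .find((m) => m.role === "assistant")?.content
```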
See a full example in the examples directory.
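As a sanity check on the agent example above, the answer the model should converge on can be computed directly. Obama's birthdate, August 4, 1961, is a known fact, and "today" matches the date given in the prompt:

```typescript
// Reference computation for the example prompt above.
const birth = Date.UTC(1961, 7, 4)   // months are 0-indexed: 7 = August
const today = Date.UTC(2024, 10, 28) // 10 = November
const MS_PER_DAY = 24 * 60 * 60 * 1000

// Date.UTC differences are exact multiples of a day here.
const ageInDays = (today - birth) / MS_PER_DAY
console.log(ageInDays) // → 23127
```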
Error: ReferenceError: EventSource is not defined
This error means you're trying to use the EventSource API (which is typically only available in the browser) from Node. You'll have to install the following to use it:
npm install eventsource
npm install -D @types/eventsource
Patch the global EventSource object:
import EventSource from "eventsource"
global.EventSource = EventSource as any
Developing locally:
npm link -ws --include-workspace-root
Version bumping:
npm version patch -ws --include-workspace-root