
token.js-fork
Integrate 200+ LLMs with one TypeScript SDK using OpenAI's format. Free and open source. No proxy server required.
```bash
npm install token.js
```
Set your provider's API key as an environment variable, then import the Token.js client and call the `create` function with a prompt in OpenAI's format, specifying the model and LLM provider using their respective fields.

```bash
OPENAI_API_KEY=<openai api key>
```
```ts
import { TokenJS } from 'token.js'

// Create the Token.js client
const tokenjs = new TokenJS()

async function main() {
  // Create a model response
  const completion = await tokenjs.chat.completions.create({
    // Specify the provider and model
    provider: 'openai',
    model: 'gpt-4o',
    // Define your message
    messages: [
      {
        role: 'user',
        content: 'Hello!',
      },
    ],
  })
  console.log(completion.choices[0])
}

main()
```
We recommend using environment variables to configure the credentials for each LLM provider.
```bash
# OpenAI
OPENAI_API_KEY=
# AI21
AI21_API_KEY=
# Anthropic
ANTHROPIC_API_KEY=
# Cohere
COHERE_API_KEY=
# Gemini
GEMINI_API_KEY=
# Groq
GROQ_API_KEY=
# Mistral
MISTRAL_API_KEY=
# Perplexity
PERPLEXITY_API_KEY=
# OpenRouter
OPENROUTER_API_KEY=
# AWS Bedrock
AWS_REGION_NAME=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
# OpenAI Compatible
OPENAI_COMPATIBLE_API_KEY=
```
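If you keep these credentials in a local `.env` file, a minimal sketch (assuming the third-party `dotenv` package, which Token.js does not bundle) is to load them before creating the client:

```ts
// Load variables from a local .env file into process.env
// (assumes the third-party `dotenv` package is installed)
import 'dotenv/config'
import { TokenJS } from 'token.js'

// Token.js reads provider credentials such as OPENAI_API_KEY from
// process.env, so no API keys need to appear in source code.
const tokenjs = new TokenJS()
```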
Token.js supports streaming responses for all providers that offer it.
```ts
import { TokenJS } from 'token.js'

const tokenjs = new TokenJS()

async function main() {
  const result = await tokenjs.chat.completions.create({
    stream: true,
    provider: 'openai',
    model: 'gpt-4o',
    messages: [
      {
        role: 'user',
        content: `Tell me about yourself.`,
      },
    ],
  })

  for await (const part of result) {
    process.stdout.write(part.choices[0]?.delta?.content || '')
  }
}

main()
```
Token.js supports function calling for all providers and models that offer it.
```ts
import { TokenJS, ChatCompletionTool } from 'token.js'

const tokenjs = new TokenJS()

async function main() {
  const tools: ChatCompletionTool[] = [
    {
      type: 'function',
      function: {
        name: 'get_current_weather',
        description: 'Get the current weather in a given location',
        parameters: {
          type: 'object',
          properties: {
            location: {
              type: 'string',
              description: 'The city and state, e.g. San Francisco, CA',
            },
          },
          required: ['location'],
        },
      },
    },
  ]

  const result = await tokenjs.chat.completions.create({
    provider: 'gemini',
    model: 'gemini-1.5-pro',
    messages: [
      {
        role: 'user',
        content: `What's the weather like in San Francisco?`,
      },
    ],
    tools,
    tool_choice: 'auto',
  })
  console.log(result.choices[0].message.tool_calls)
}

main()
```
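Because requests and responses follow OpenAI's format, acting on a returned tool call works the same way as with the OpenAI SDK: parse the JSON-encoded arguments, run your function, and send the result back as a `tool` message. A minimal sketch continuing inside `main()` above; the `getCurrentWeather` helper is hypothetical, and exact Token.js typings may differ:

```ts
// Continuing inside main(): execute the requested tool and send the
// result back as a 'tool' message (OpenAI's tool-result convention).
const toolCall = result.choices[0].message.tool_calls?.[0]
if (toolCall) {
  // Arguments arrive as a JSON-encoded string
  const args = JSON.parse(toolCall.function.arguments)
  const weather = await getCurrentWeather(args.location) // hypothetical helper

  const followUp = await tokenjs.chat.completions.create({
    provider: 'gemini',
    model: 'gemini-1.5-pro',
    messages: [
      { role: 'user', content: `What's the weather like in San Francisco?` },
      result.choices[0].message, // assistant message containing the tool call
      { role: 'tool', tool_call_id: toolCall.id, content: JSON.stringify(weather) },
    ],
    tools,
  })
  console.log(followUp.choices[0].message.content)
}
```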
Token.js allows you to extend the predefined model list using the extendModelList method. This is useful, for example, for registering AWS Bedrock model IDs that carry a region prefix:
```ts
import { TokenJS } from 'token.js'

// Example in 2 steps: adding an AWS Bedrock Claude model with a region prefix
const tokenjs = new TokenJS()

// Step 1: Register the new model name
tokenjs.extendModelList(
  'bedrock',
  'us.anthropic.claude-3-5-sonnet-20241022-v2:0',
  'anthropic.claude-3-sonnet-20240229-v1:0' // Copy features from an existing model
)

// Step 2: Use the extended model in a chat completion
const result = await tokenjs.chat.completions.create({
  stream: true,
  provider: 'bedrock',
  model: 'us.anthropic.claude-3-5-sonnet-20241022-v2:0' as any, // Type cast required
  messages: [
    {
      role: 'user',
      content: 'Tell me about yourself.',
    },
  ],
})
```
Note: When using extended models, type casting (as any) is required because the extended model name is not part of the SDK's predefined model types.
The third argument, featureSupport, can be either the name of an existing model (whose feature support is copied, as above) or an object with the following fields:
| Feature | Type | Description |
|---|---|---|
| streaming | boolean | Whether the model supports streaming responses |
| json | boolean | Whether the model supports JSON mode |
| toolCalls | boolean | Whether the model supports function calling |
| images | boolean | Whether the model supports image inputs |
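For example, a minimal sketch of registering a model with an explicit feature-support object rather than copying from an existing model (the model ID here is illustrative):

```ts
// Register a model ID (illustrative) with explicit feature flags
// instead of copying them from an existing model.
tokenjs.extendModelList('bedrock', 'us.anthropic.claude-3-5-haiku-20241022-v1:0', {
  streaming: true, // supports streaming responses
  json: true,      // supports JSON mode
  toolCalls: true, // supports function calling
  images: false,   // does not accept image inputs
})
```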
This table provides an overview of the features that Token.js supports from each LLM provider.
| Provider | Chat Completion | Streaming | Function Calling Tool | JSON Output | Image Input |
|---|---|---|---|---|---|
| OpenAI | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Anthropic | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Bedrock | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Mistral | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :heavy_minus_sign: |
| Cohere | :white_check_mark: | :white_check_mark: | :white_check_mark: | :heavy_minus_sign: | :heavy_minus_sign: |
| AI21 | :white_check_mark: | :white_check_mark: | :heavy_minus_sign: | :heavy_minus_sign: | :heavy_minus_sign: |
| Gemini | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Groq | :white_check_mark: | :white_check_mark: | :heavy_minus_sign: | :white_check_mark: | :heavy_minus_sign: |
| Perplexity | :white_check_mark: | :white_check_mark: | :heavy_minus_sign: | :heavy_minus_sign: | :heavy_minus_sign: |
| OpenRouter | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| OpenAI Compatible | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Symbol | Description |
|---|---|
| :white_check_mark: | Supported by Token.js |
| :heavy_minus_sign: | Not supported by the LLM provider, so Token.js cannot support it |
Note: Certain LLMs, particularly older or less capable models, do not support some of the features in this table. For details about these restrictions, see our LLM provider documentation.
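As a hedged illustration of the JSON Output column, and assuming Token.js accepts OpenAI's response_format parameter (an assumption, not confirmed by this table), JSON mode would look like this for a provider and model marked as supported:

```ts
// JSON mode in OpenAI's request format; only valid where the
// table above marks JSON Output as supported for the provider.
const completion = await tokenjs.chat.completions.create({
  provider: 'openai',
  model: 'gpt-4o',
  response_format: { type: 'json_object' }, // assumed to be forwarded as-is
  messages: [
    {
      role: 'user',
      content: 'List three primary colors as a JSON array under the key "colors".',
    },
  ],
})
console.log(JSON.parse(completion.choices[0].message.content ?? '{}'))
```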
See our Contributing guide to learn how to contribute to Token.js.
If there's any way we can improve Token.js, please let us know by opening an issue!
Token.js is free and open source software licensed under MIT.
FAQs
Integrate 9 LLM providers with a single TypeScript SDK using OpenAI's format.
The npm package token.js-fork receives a total of 389 weekly downloads, which classifies it as not popular. The project shows a healthy release cadence and activity: the last version was released less than a year ago, and 2 open source maintainers collaborate on it.