
Security News
Rust RFC Proposes a Security Tab on crates.io for RustSec Advisories
Rust's crates.io team is advancing an RFC to add a Security tab that surfaces RustSec vulnerability and unsoundness advisories directly on crate pages.
ollama-llm-bridge
Universal Ollama LLM Bridge supporting multiple models (Llama, Gemma, etc.) with a unified interface.
```shell
# pnpm (recommended)
pnpm add ollama-llm-bridge llm-bridge-spec ollama zod

# npm
npm install ollama-llm-bridge llm-bridge-spec ollama zod

# yarn
yarn add ollama-llm-bridge llm-bridge-spec ollama zod
```
This package follows the Abstract Model Pattern inspired by the bedrock-llm-bridge:
```
ollama-llm-bridge/
├── models/
│   ├── base/AbstractOllamaModel   # Abstract base class
│   ├── llama/LlamaModel           # Llama implementation
│   ├── gemma/GemmaModel           # Gemma implementation
│   └── gpt-oss/GptOssModel        # GPT-OSS implementation
├── bridge/OllamaBridge            # Main bridge class
├── factory/                       # Factory functions
└── utils/error-handler            # Error handling
```
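The idea behind the Abstract Model Pattern can be sketched as follows. Note that the class and method names here are illustrative, not the package's actual internals: shared request plumbing lives once in an abstract base class, and each model subclass overrides only its model-specific details.

```typescript
// Illustrative sketch of the Abstract Model Pattern (hypothetical names,
// not the actual classes exported by ollama-llm-bridge).
abstract class AbstractModel {
  // Each concrete model declares its own default model ID.
  abstract readonly defaultModel: string;

  // Shared request-building logic lives once in the base class.
  buildRequest(prompt: string, model?: string): { model: string; prompt: string } {
    return { model: model ?? this.defaultModel, prompt };
  }
}

class LlamaLikeModel extends AbstractModel {
  readonly defaultModel = 'llama3.2';
}

class GemmaLikeModel extends AbstractModel {
  readonly defaultModel = 'gemma3n:latest';
}

const llama = new LlamaLikeModel();
console.log(llama.buildRequest('Hello!').model); // 'llama3.2'
```

Adding a new model then means adding one small subclass rather than duplicating the request logic per package.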
```typescript
import { createOllamaBridge } from 'ollama-llm-bridge';

// Create bridge with auto-detected model
const bridge = createOllamaBridge({
  host: 'http://localhost:11434',
  model: 'llama3.2', // or 'gemma3n:latest' or 'gpt-oss-20:b'
  temperature: 0.7,
});

// Simple chat
const response = await bridge.invoke({
  messages: [{ role: 'user', content: [{ type: 'text', text: 'Hello!' }] }],
});
console.log(response.choices[0].message.content[0].text);

// Streaming chat
const stream = bridge.invokeStream({
  messages: [{ role: 'user', content: [{ type: 'text', text: 'Tell me a story' }] }],
});

for await (const chunk of stream) {
  const text = chunk.choices[0]?.message?.content[0]?.text;
  if (text) {
    process.stdout.write(text);
  }
}
```
```typescript
const response = await bridge.invoke({
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'What do you see in this image?' },
        { type: 'image', data: 'base64_encoded_image_data' },
      ],
    },
  ],
});
```
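The `data` placeholder above stands for base64-encoded image bytes. One common way to produce such a string in Node.js (an assumption about typical usage, not an API of this package) is:

```typescript
import { readFileSync } from 'node:fs';

// Encode raw bytes as the base64 string expected in the `data` field.
function toBase64(bytes: Buffer): string {
  return bytes.toString('base64');
}

// e.g. read an image from disk and encode it (the path is illustrative):
// const data = toBase64(readFileSync('./photo.png'));
console.log(toBase64(Buffer.from('hi'))); // 'aGk='
```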
```typescript
import { createOllamaBridge } from 'ollama-llm-bridge';

const bridge = createOllamaBridge({
  host: 'http://localhost:11434',
  model: 'llama3.2', // Required
  temperature: 0.7,
  num_predict: 4096,
});
```
```typescript
import {
  createLlamaBridge,
  createGemmaBridge,
  createGptOssBridge,
  createDefaultOllamaBridge,
} from 'ollama-llm-bridge';

// Llama with defaults
const llamaBridge = createLlamaBridge({
  model: 'llama3.2', // Optional, defaults to 'llama3.2'
  temperature: 0.8,
});

// Gemma with defaults
const gemmaBridge = createGemmaBridge({
  model: 'gemma3n:7b', // Optional, defaults to 'gemma3n:latest'
  num_predict: 1024,
});

// GPT-OSS with defaults
const gptOssBridge = createGptOssBridge({
  model: 'gpt-oss-20:b', // Optional, defaults to 'gpt-oss-20:b'
});

// Default configuration (Llama 3.2)
const defaultBridge = createDefaultOllamaBridge({
  temperature: 0.5, // Override defaults
});
```
Supported models:

- Llama: `llama3.2` (with multi-modal support), `llama3.1`, `llama3`, `llama2`, `llama`
- Gemma: `gemma3n:latest`, `gemma3n:7b`, `gemma3n:2b`, `gemma2:latest`, `gemma2:7b`, `gemma2:2b`, `gemma:latest`, `gemma:7b`, `gemma:2b`
- GPT-OSS: `gpt-oss-20:b`

```typescript
interface OllamaBaseConfig {
  host?: string;        // Default: 'http://localhost:11434'
  model: string;        // Required: Model ID
  temperature?: number; // 0.0 - 1.0
  top_p?: number;       // 0.0 - 1.0
  top_k?: number;       // Integer >= 1
  num_predict?: number; // Max tokens to generate
  stop?: string[];      // Stop sequences
  seed?: number;        // Seed for reproducibility
  stream?: boolean;     // Default: false
}
```
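The numeric ranges in the comments above can be checked up front before constructing a bridge. This is a minimal sketch of such a check (the `validateConfig` helper is illustrative, not part of the package):

```typescript
// Subset of the config fields with documented numeric ranges.
interface OllamaBaseConfigLike {
  model: string;
  temperature?: number;
  top_p?: number;
  top_k?: number;
}

// Returns a list of violations of the documented ranges (empty = valid).
function validateConfig(cfg: OllamaBaseConfigLike): string[] {
  const errors: string[] = [];
  if (!cfg.model) errors.push('model is required');
  if (cfg.temperature !== undefined && (cfg.temperature < 0 || cfg.temperature > 1))
    errors.push('temperature must be in 0.0 - 1.0');
  if (cfg.top_p !== undefined && (cfg.top_p < 0 || cfg.top_p > 1))
    errors.push('top_p must be in 0.0 - 1.0');
  if (cfg.top_k !== undefined && (!Number.isInteger(cfg.top_k) || cfg.top_k < 1))
    errors.push('top_k must be an integer >= 1');
  return errors;
}

console.log(validateConfig({ model: 'llama3.2', temperature: 0.7 })); // []
```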
```typescript
// Get model capabilities
const capabilities = bridge.getMetadata();
console.log(capabilities);
// {
//   name: 'Llama',
//   version: '3.2',
//   description: 'Ollama Llama Bridge',
//   model: 'llama3.2',
//   contextWindow: 8192,
//   maxTokens: 4096
// }

// Check model features
const features = bridge.model.getCapabilities();
console.log(features.multiModal);       // true for Llama 3.2+
console.log(features.streaming);        // true for all models
console.log(features.functionCalling);  // false (coming soon)
```
The bridge provides comprehensive error handling with standardized error types:
```typescript
import { NetworkError, ModelNotSupportedError, ServiceUnavailableError } from 'llm-bridge-spec';

try {
  const response = await bridge.invoke(prompt);
} catch (error) {
  if (error instanceof NetworkError) {
    console.error('Network issue:', error.message);
  } else if (error instanceof ModelNotSupportedError) {
    console.error('Unsupported model:', error.requestedModel);
    console.log('Supported models:', error.supportedModels);
  } else if (error instanceof ServiceUnavailableError) {
    console.error('Ollama server unavailable. Retry after:', error.retryAfter);
  }
}
```
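Since `ServiceUnavailableError` carries a `retryAfter` hint, a caller can retry with backoff. Here is a hedged sketch of one way to derive a retry schedule from that hint (the `backoffSchedule` helper is illustrative and not provided by this package):

```typescript
// Build an exponential backoff schedule (in ms) seeded by the
// server-provided retryAfter hint. Illustrative helper only.
function backoffSchedule(retryAfterMs: number, attempts: number): number[] {
  return Array.from({ length: attempts }, (_, i) => retryAfterMs * 2 ** i);
}

// A caller could sleep for each delay in turn before re-invoking
// bridge.invoke(...), giving up once the schedule is exhausted.
console.log(backoffSchedule(500, 3)); // [500, 1000, 2000]
```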
```typescript
// Create bridge with initial model
const bridge = createOllamaBridge({ model: 'llama3.2' });

// Switch to a different model at runtime
bridge.setModel('gemma3n:latest');

// Get current model
console.log(bridge.getCurrentModel()); // 'gemma3n:latest'

// Get supported models
console.log(bridge.getSupportedModels());
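One way to use `getSupportedModels()` is to fall back through a preference list before calling `setModel`. A minimal sketch, assuming `supported` is the array returned by the bridge (the `pickModel` helper is hypothetical):

```typescript
// Pick the first preferred model that the bridge reports as supported.
function pickModel(preferred: string[], supported: string[]): string | undefined {
  return preferred.find((m) => supported.includes(m));
}

const supported = ['llama3.2', 'gemma3n:latest', 'gpt-oss-20:b'];
console.log(pickModel(['gemma3n:7b', 'gemma3n:latest'], supported)); // 'gemma3n:latest'
```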
```shell
# Run unit tests
pnpm test

# Run tests with coverage
pnpm test:coverage

# Run e2e tests (requires a running Ollama server)
pnpm test:e2e
```
| Feature | llama3-llm-bridge | gemma3n-llm-bridge | ollama-llm-bridge |
|---|---|---|---|
| Code Duplication | ❌ High | ❌ High | ✅ Eliminated |
| Model Support | 🔶 Llama only | 🔶 Gemma only | ✅ Universal |
| Architecture | 🔶 Basic | 🔶 Basic | ✅ Abstract Pattern |
| Extensibility | ❌ Limited | ❌ Limited | ✅ Easy to extend |
| Maintenance | ❌ Multiple packages | ❌ Multiple packages | ✅ Single package |
This project follows the Git Workflow Guide.

```shell
git checkout -b feature/core-new-feature
git commit -m "✅ [TODO 1/3] Add new model support"
pnpm lint && pnpm test:ci && pnpm build
```
MIT License - see the LICENSE file for details.
Made with ❤️ by the LLM Bridge Team
FAQs
Universal Ollama LLM Bridge for multiple models (Llama, Gemma, etc.)
The npm package ollama-llm-bridge receives a total of 3 weekly downloads. As such, ollama-llm-bridge popularity was classified as not popular.
We found that ollama-llm-bridge demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.