@elizaos/plugin-ollama-root
This plugin provides integration with [Ollama](https://ollama.com/)'s local models through the ElizaOS platform. It allows you to leverage locally running LLMs for text generation, embeddings, and object generation.
Ollama enables running large language models locally on your machine. This plugin connects ElizaOS with your local Ollama installation, giving your characters access to powerful language models running on your own hardware.
Make sure you have pulled at least one model (e.g., llama3, gemma3:latest).

Install this plugin in your ElizaOS project:
bun add @elizaos/plugin-ollama
Make sure Ollama is running:
ollama serve
Add the plugin to your character configuration:
"plugins": ["@elizaos/plugin-ollama"]
The plugin requires these environment variables (can be set in .env file or character settings):
"settings": {
"OLLAMA_API_ENDPOINT": "http://localhost:11434/api",
"OLLAMA_SMALL_MODEL": "gemma3:latest",
"OLLAMA_MEDIUM_MODEL": "gemma3:latest",
"OLLAMA_LARGE_MODEL": "gemma3:latest",
"OLLAMA_EMBEDDING_MODEL": "nomic-embed-text:latest"
}
Or in .env file:
OLLAMA_API_ENDPOINT=http://localhost:11434/api
OLLAMA_SMALL_MODEL=gemma3:latest
OLLAMA_MEDIUM_MODEL=gemma3:latest
OLLAMA_LARGE_MODEL=gemma3:latest
OLLAMA_EMBEDDING_MODEL=nomic-embed-text:latest
- OLLAMA_API_ENDPOINT: Ollama API endpoint (default: http://localhost:11434/api)
- OLLAMA_SMALL_MODEL: Model for simpler tasks (default: gemma3:latest)
- OLLAMA_MEDIUM_MODEL: Medium-complexity model (default: gemma3:latest)
- OLLAMA_LARGE_MODEL: Model for complex tasks (default: gemma3:latest)
- OLLAMA_EMBEDDING_MODEL: Model for text embeddings (default: nomic-embed-text:latest)

The plugin provides these model classes:
- TEXT_SMALL: Optimized for fast responses with simpler prompts
- TEXT_LARGE: For complex tasks requiring deeper reasoning
- TEXT_EMBEDDING: Text embedding model
- OBJECT_SMALL: JSON object generation with simpler models
- OBJECT_LARGE: JSON object generation with more complex models

For detailed information about the Ollama API used by this plugin, refer to the official Ollama API documentation.
Generate text using smaller, faster models optimized for quick responses:
const text = await runtime.useModel(ModelType.TEXT_SMALL, {
prompt: "What is the nature of reality?",
stopSequences: [], // optional
});
Generate comprehensive text responses using more powerful models for complex tasks:
const text = await runtime.useModel(ModelType.TEXT_LARGE, {
prompt: "Write a detailed explanation of quantum physics",
stopSequences: [], // optional
maxTokens: 8192, // optional (default: 8192)
temperature: 0.7, // optional (default: 0.7)
frequencyPenalty: 0.7, // optional (default: 0.7)
presencePenalty: 0.7, // optional (default: 0.7)
});
Generate vector embeddings for text, which can be used for semantic search or other vector operations:
const embedding = await runtime.useModel(ModelType.TEXT_EMBEDDING, {
text: "Text to embed",
});
// or
const embedding = await runtime.useModel(
ModelType.TEXT_EMBEDDING,
"Text to embed",
);
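For semantic search, embeddings are typically compared with cosine similarity. The helper below is not part of the plugin; it is a minimal sketch showing how two embedding vectors returned by TEXT_EMBEDDING could be scored.

```typescript
// Hypothetical helper (not provided by the plugin): cosine similarity
// between two embedding vectors of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("Vector dimensions must match");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Sketch of usage with two embeddings from the plugin:
// const e1 = await runtime.useModel(ModelType.TEXT_EMBEDDING, "cats");
// const e2 = await runtime.useModel(ModelType.TEXT_EMBEDDING, "kittens");
// const score = cosineSimilarity(e1, e2); // closer to 1 means more similar
```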
Generate structured JSON objects using faster models:
const object = await runtime.useModel(ModelType.OBJECT_SMALL, {
prompt: "Generate a JSON object representing a user profile",
temperature: 0.7, // optional
});
Generate complex, detailed JSON objects using more powerful models:
const object = await runtime.useModel(ModelType.OBJECT_LARGE, {
prompt: "Generate a detailed JSON object representing a restaurant",
temperature: 0.7, // optional
});
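Local models can occasionally return JSON that is missing fields, so it is worth validating the generated object before using it. The type guard below is an illustrative assumption, not part of the plugin; the `RestaurantLike` shape is invented for this example.

```typescript
// Hypothetical shape for the generated restaurant object (assumed fields).
interface RestaurantLike {
  name: string;
  cuisine: string;
}

// Narrowing type guard: checks the fields exist with the expected types.
function isRestaurantLike(value: unknown): value is RestaurantLike {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.name === "string" && typeof v.cuisine === "string";
}

// Sketch of usage after generation:
// const object = await runtime.useModel(ModelType.OBJECT_LARGE, { ... });
// if (!isRestaurantLike(object)) throw new Error("Unexpected model output");
```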
If you run into problems:

- Verify that Ollama is running and that OLLAMA_API_ENDPOINT points to the correct host and port.
- Pull the model you configured with `ollama pull modelname`, and confirm it is available with `ollama list`.

See LICENSE file for details.