npm: @prismer/agent-core · latest version 0.3.1 · 1 maintainer · 2 weekly downloads

# @prismer/agent-core

Lightweight TypeScript agent runtime. Zero heavy dependencies. OpenAI-compatible.

## Features

- Agent loop — tool calling, sub-agent delegation, doom-loop detection, context guard
- OpenAI-compatible — works with any /chat/completions endpoint (OpenAI, Anthropic, Ollama, etc.)
- File-based memory — keyword recall, zero vector DB dependency. Beats Letta/MemGPT on LoCoMo (86% vs 74%)
- Context compaction — automatic fact extraction + LLM summarization when context overflows
- Lifecycle hooks — `before_prompt`, `before_tool`, `after_tool`, `agent_end`
- Skills — installable SKILL.md extensions with ClawHub (pure JS git clone)
- Channels — Telegram, Cloud IM adapters (auto-detected from env)
- HTTP + WebSocket gateway — zero external dependencies, real-time streaming
- CLI — `lumin agent`, `lumin serve`, `lumin health`
- ~4,900 LOC — single production dependency (Zod)

## Quick Start

### Install

```shell
npm install @prismer/agent-core
```

### Programmatic

```typescript
import { runAgent } from '@prismer/agent-core';

process.env.OPENAI_API_KEY = 'sk-...';

await runAgent({
  type: 'message',
  content: 'What is 2 + 2?',
});
```

### CLI

```shell
# Run agent with a message
lumin agent --message "Hello, world!"

# Start HTTP + WebSocket gateway
lumin serve --port 3001

# Health check
lumin health
```

### WebSocket

```typescript
const ws = new WebSocket('ws://localhost:3001/v1/stream');

ws.onopen = () => {
  ws.send(JSON.stringify({ type: 'chat.send', content: 'Hello!' }));
};

ws.onmessage = (e) => {
  const msg = JSON.parse(e.data);
  if (msg.type === 'text.delta') process.stdout.write(msg.delta);
  if (msg.type === 'chat.final') console.log('\n---\nDone:', msg.toolsUsed);
};
```
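If the gateway restarts, the socket simply closes; a long-lived client will want to reconnect on its own. A minimal sketch of capped exponential backoff (the delay schedule and cap are illustrative assumptions, not part of the package):

```typescript
// Capped exponential backoff: 500ms, 1s, 2s, ... up to a 30s ceiling (values illustrative).
function backoffDelay(attempt: number, baseMs = 500, capMs = 30_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

// Reconnecting wrapper around the /v1/stream socket (client-side sketch, not a package export).
function connectWithRetry(url: string, onMessage: (data: string) => void): void {
  let attempt = 0;
  const open = () => {
    const ws = new WebSocket(url);
    ws.onopen = () => { attempt = 0; };          // reset backoff after a successful connect
    ws.onmessage = (e) => onMessage(String(e.data));
    ws.onclose = () => setTimeout(open, backoffDelay(attempt++));
  };
  open();
}
```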

### HTTP

```shell
# Chat (synchronous)
curl -X POST http://localhost:3001/v1/chat \
  -H 'Content-Type: application/json' \
  -d '{"content": "List files in the workspace"}'

# List tools
curl http://localhost:3001/v1/tools

# Health
curl http://localhost:3001/health
```
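The same chat call from TypeScript. `buildChatRequest` is an illustrative helper, not a package export, and the response is left untyped because its schema lives in the package's docs/API.md:

```typescript
// Build the request for POST /v1/chat (mirrors the curl example above).
function buildChatRequest(
  baseUrl: string,
  content: string,
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: `${baseUrl}/v1/chat`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ content }),
    },
  };
}

// Usage against a running `lumin serve`:
// const { url, init } = buildChatRequest('http://localhost:3001', 'List files in the workspace');
// const reply = await (await fetch(url, init)).json();
```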

## Configuration

All settings are configured via environment variables, with sensible defaults for standalone use.

| Variable | Description | Default |
| --- | --- | --- |
| `OPENAI_API_KEY` | LLM provider API key | (required) |
| `OPENAI_API_BASE_URL` | LLM provider base URL | `https://api.openai.com/v1` |
| `AGENT_DEFAULT_MODEL` | Default model ID | `gpt-4o` |
| `WORKSPACE_DIR` | Working directory | `./workspace` |
| `LUMIN_PORT` | HTTP/WS server port | `3001` |
| `MAX_CONTEXT_CHARS` | Compaction threshold (chars) | `600000` |
| `MODEL_FALLBACK_CHAIN` | Fallback models (comma-separated) | — |
| `PRISMER_PLUGIN_PATH` | Path to workspace plugin | — |
| `TELEGRAM_BOT_TOKEN` | Telegram channel (optional) | — |
| `LOG_LEVEL` | `debug` / `info` / `warn` / `error` | `info` |
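For example, pointing the runtime at a local Ollama server instead of OpenAI (the model names here are illustrative, and local servers typically accept any non-empty API key):

```shell
export OPENAI_API_BASE_URL=http://localhost:11434/v1
export OPENAI_API_KEY=ollama                      # placeholder; local servers ignore it
export AGENT_DEFAULT_MODEL=llama3.1               # illustrative model ID
export MODEL_FALLBACK_CHAIN=gpt-4o-mini,gpt-4o    # tried in order if the primary fails
lumin serve --port 3001
```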

## Custom Tools

```typescript
import { createTool } from '@prismer/agent-core';
import { ToolRegistry } from '@prismer/agent-core/tools';

const tools = new ToolRegistry();

tools.register(createTool(
  'weather',
  'Get current weather for a city',
  {
    type: 'object',
    properties: {
      city: { type: 'string', description: 'City name' },
    },
    required: ['city'],
  },
  async (args) => {
    const res = await fetch(`https://wttr.in/${args.city}?format=j1`);
    return JSON.stringify(await res.json());
  },
));
```

## Architecture

```
@prismer/agent-core
├── Core
│   ├── PrismerAgent             — agent loop + tool execution + doom-loop detection
│   ├── OpenAICompatibleProvider — LLM client (any /chat/completions endpoint)
│   ├── FallbackProvider         — automatic model fallback chain
│   ├── ToolRegistry             — tool registration + JSON Schema specs
│   └── EventBus                 — SSE / WebSocket event streaming
├── Memory
│   ├── MemoryStore (facade)     — store / recall / search / recent
│   └── FileMemoryBackend        — keyword-based, zero-dependency
├── Infrastructure
│   ├── HTTP + WebSocket server  — zero external deps (pure node:http)
│   ├── CLI                      — agent / serve / health commands
│   ├── SessionStore             — session management + compaction state
│   └── Config                   — Zod-validated, env var override
└── Extensions
    ├── HookRegistry             — before_prompt, before_tool, after_tool, agent_end
    ├── SkillLoader              — SKILL.md + YAML frontmatter
    ├── AgentRegistry            — sub-agent delegation via @mention
    └── ChannelManager           — Telegram, Cloud IM adapters
```
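As a rough illustration of the lifecycle-hook pattern that HookRegistry implements: handlers are registered per event and invoked in order when the agent reaches that point in its loop. The class and method names below are assumptions for the sketch, not the package's actual API:

```typescript
type HookName = 'before_prompt' | 'before_tool' | 'after_tool' | 'agent_end';
type Hook = (payload: Record<string, unknown>) => void | Promise<void>;

// Minimal registry sketch: hooks run in registration order for their event.
class MiniHookRegistry {
  private hooks = new Map<HookName, Hook[]>();

  on(name: HookName, fn: Hook): void {
    const list = this.hooks.get(name) ?? [];
    list.push(fn);
    this.hooks.set(name, list);
  }

  async emit(name: HookName, payload: Record<string, unknown>): Promise<void> {
    for (const fn of this.hooks.get(name) ?? []) await fn(payload);
  }
}

// Usage: log every tool invocation before it runs.
const registry = new MiniHookRegistry();
registry.on('before_tool', (p) => console.log('calling tool:', p.tool));
```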

## Subpath Exports

```typescript
import { PrismerAgent } from '@prismer/agent-core/agent';
import { OpenAICompatibleProvider } from '@prismer/agent-core/provider';
import { ToolRegistry } from '@prismer/agent-core/tools';
import { SessionStore } from '@prismer/agent-core/session';
import { MemoryStore, FileMemoryBackend } from '@prismer/agent-core/memory';
import { HookRegistry } from '@prismer/agent-core/hooks';
import { EventBus } from '@prismer/agent-core/sse';
import { loadConfig } from '@prismer/agent-core/config';
import { createLogger } from '@prismer/agent-core/log';
import { VERSION } from '@prismer/agent-core';
```

## Memory System

Zero-dependency file-based memory with keyword recall, tested on the LoCoMo long-term conversation memory benchmark:

| Model | Overall | No Adversarial | vs Letta/MemGPT |
| --- | --- | --- | --- |
| Claude Opus 4.6 | 86% | 95% | +12pp |
| Kimi K2.5 | 63% | 56% | -11pp |
| Letta/MemGPT | ~74% | — | baseline |

A zero-dependency keyword search paired with a strong LLM outperforms Letta's embedding-plus-rerank pipeline.

```typescript
import { MemoryStore } from '@prismer/agent-core/memory';

const memory = new MemoryStore('./workspace');
await memory.store('The calibration coefficient is 0.03847', ['numeric']);
const results = await memory.search('calibration coefficient');
console.log(results); // [{ content: '...', score: 1.0, tags: ['numeric'] }]
```
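The recall mechanism is plain keyword matching rather than embeddings. A toy version of such scoring, using the fraction of query tokens present in a stored entry (the actual FileMemoryBackend ranking may differ):

```typescript
// Score = fraction of query tokens that appear in the entry (toy model of keyword recall).
function keywordScore(query: string, entry: string): number {
  const tokens = query.toLowerCase().split(/\W+/).filter(Boolean);
  const words = new Set(entry.toLowerCase().split(/\W+/).filter(Boolean));
  if (tokens.length === 0) return 0;
  const hits = tokens.filter((t) => words.has(t)).length;
  return hits / tokens.length;
}

keywordScore('calibration coefficient', 'The calibration coefficient is 0.03847'); // 1.0
keywordScore('calibration coefficient', 'Weather in Berlin is sunny');             // 0.0
```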

## Workspace Templates

Drop markdown files into your workspace to customize agent behavior:

| File | Priority | Purpose |
| --- | --- | --- |
| IDENTITY.md / SOUL.md | 10 | Agent identity and persona |
| AGENTS.md | 9 | Sub-agent routing and priorities |
| TOOLS.md | 8 | Tool reference documentation |
| USER.md | 3.5 | User preferences and context |
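For instance, a minimal IDENTITY.md might look like this (the contents are purely illustrative; only the filename convention comes from the table above):

```markdown
# Identity

You are Lumin, a concise assistant for this workspace.
Prefer short answers, and cite file paths when referencing workspace files.
```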

## Skills

Install agent skills from git repositories:

```shell
# Via agent tool call
clawhub install research-paper
clawhub install https://github.com/user/my-skill.git
clawhub list
clawhub search "data analysis"
```

Each skill is a directory with a SKILL.md file (YAML frontmatter + markdown body) that gets injected into the system prompt.
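A skill's SKILL.md might look like the sketch below. The frontmatter field names are assumptions; the source only specifies that the file combines YAML frontmatter with a markdown body that is injected into the system prompt:

```markdown
---
name: research-paper
description: Summarize and critique academic papers
---

# Research Paper Skill

When the user shares a paper, extract the abstract, method, and key results,
then produce a one-paragraph critique.
```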

## API Reference

See docs/API.md for the complete HTTP, WebSocket, and IPC protocol documentation.

## Contributing

```shell
git clone https://github.com/prismer-ai/agent-core.git
cd agent-core
npm install
npm run build      # TypeScript compilation
npm test           # Run all tests (vitest)
npm run typecheck  # Type checking only
```

## License

MIT Β© 2026 Prismer.AI

## Keywords

agent

Package last updated on 12 Mar 2026
