
memori-js: The SQL-Native AI Memory Fabric for JavaScript & TypeScript, powered by sqlite-vec and Google GenAI.
memori-js is not just a vector database wrapper. It is an active memory layer that lives inside your application, automatically managing context for your AI agents. It bridges the gap between your LLM and long-term storage without the complexity of building manual RAG (Retrieval-Augmented Generation) pipelines.
Inspired by the memorilabs.ai Python library.
Breaking change: the embedding option `googleApiKey` has been renamed to `apiKey`, and it can be supplied via the `MEMORI_API_KEY` environment variable.

If you are building an AI app today, you usually have to wire the retrieval pipeline yourself: generate embeddings, chunk and store documents, run similarity search, and inject the results into every prompt.
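The usual do-it-yourself pipeline looks roughly like this. Everything below is a toy sketch with in-memory stand-ins; `embed` here is a hypothetical placeholder for a real embedding-model call, not part of any library:

```typescript
// Toy manual RAG pipeline: every step is yours to build and maintain.
type Doc = { text: string; vector: number[] };
const store: Doc[] = [];

async function embed(text: string): Promise<number[]> {
  // Stand-in for a real embedding-model call.
  return [text.length, text.split(" ").length];
}

function similarity(a: number[], b: number[]): number {
  // Toy metric: negative Euclidean distance (higher = closer).
  return -Math.hypot(a[0] - b[0], a[1] - b[1]);
}

async function addDocument(text: string): Promise<void> {
  store.push({ text, vector: await embed(text) }); // 1. chunk + embed + store
}

async function retrieve(query: string, k: number): Promise<string[]> {
  const qv = await embed(query); // 2. embed the query
  return [...store]
    .sort((x, y) => similarity(y.vector, qv) - similarity(x.vector, qv))
    .slice(0, k)                 // 3. similarity search
    .map((d) => d.text);
}

async function buildPrompt(question: string): Promise<string> {
  const context = (await retrieve(question, 3)).join("\n");
  return `Context:\n${context}\n\nQuestion: ${question}`; // 4. inject context
}
```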
With memori-js, you just do this:
// 1 line to register memory
memori.llm.register(client);
// Call your LLM as normal
await client.chat.completions.create({ ... });
| Feature | Standard Vector DB | 🧠 Memori-JS |
|---|---|---|
| Setup | Requires Docker, API keys, or cloud infrastructure. | Zero-Config. Creates a local memori.db SQLite file instantly. |
| Scalability | Manual migration needed. | Pluggable. Scale from local SQLite to Postgres/Supabase seamlessly. |
| Integration | You write the RAG pipeline logic manually. | Auto-Augmentation. Patches the LLM client to inject memory automatically. |
| Complexity | High (Embeddings, Chunking, Retrieval). | Low. Handles embedding generation and retrieval internally. |
Vector search runs on sqlite-vec locally or pgvector in the cloud.

npm install memori-js
# or
bun add memori-js
import { Memori } from "memori-js";
const memori = new Memori({
apiKey: process.env.MEMORI_API_KEY, // Default is Google GenAI
});
import { Memori, PostgresVecStore } from "memori-js";
const memori = new Memori({
apiKey: process.env.MEMORI_API_KEY,
// Seamlessly switch to Postgres for production
vectorStore: new PostgresVecStore(process.env.DATABASE_URL!),
});
await memori.config.storage.build(); // Initializes tables if missing
Memori supports "Patching" — it wraps your existing LLM client to add memory capabilities transparently.
import { Memori } from "memori-js";
import OpenAI from "openai";
const memori = new Memori({ apiKey: process.env.MEMORI_API_KEY });
await memori.config.storage.build(); // Init DB
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
memori.llm.register(client); // Auto-detects "openai"
// 1. Teach Memory
await memori.addMemory("My name is John and I am a software engineer.");
// 2. Ask (Context is auto-injected)
const response = await client.chat.completions.create({
model: "gpt-4",
messages: [{ role: "user", content: "Who am I and what do I do?" }],
});
console.log(response.choices[0].message.content);
// Output: "You are John, a software engineer."
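Under the hood, "patching" presumably amounts to wrapping the client's completion method so relevant memories are prepended before each call. A minimal sketch of that idea, with a hypothetical `recall` callback standing in for memory search (this is an illustration, not memori-js's actual implementation):

```typescript
// Hypothetical illustration of client patching: wrap create() so recalled
// memories are injected as a system message before every completion.
type Message = { role: string; content: string };
type CreateArgs = { model: string; messages: Message[] };

function patchClient(
  client: { create: (args: CreateArgs) => Promise<string> },
  recall: (query: string) => string[],
): void {
  const original = client.create.bind(client);
  client.create = async (args: CreateArgs) => {
    // Use the latest user message as the retrieval query.
    const lastUser = [...args.messages].reverse().find((m) => m.role === "user");
    const memories = lastUser ? recall(lastUser.content) : [];
    const augmented: Message[] = [
      { role: "system", content: `Relevant memories:\n${memories.join("\n")}` },
      ...args.messages,
    ];
    return original({ ...args, messages: augmented });
  };
}
```

The caller's code is unchanged; only the wrapped method sees the augmented message list.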
import { Memori } from "memori-js";
import { GoogleGenAI } from "@google/genai";
const memori = new Memori({ apiKey: process.env.MEMORI_API_KEY });
await memori.config.storage.build(); // Init DB
const client = new GoogleGenAI({ apiKey: process.env.GOOGLE_API_KEY });
memori.llm.register(client); // Auto-detects "google"
// 1. Teach Memory
await memori.addMemory("My name is John and I am a software engineer.");
// 2. Ask (Context is auto-injected)
const result = await client.models.generateContent({
model: "gemini-2.5-flash",
contents: [
{
role: "user",
parts: [{ text: "Who am I and what do I do?" }],
},
],
});
// Response text is directly available or via candidates
console.log(result.text || result.candidates?.[0]?.content?.parts?.[0]?.text);
// Output: "You are John, a software engineer."
import { Memori } from "memori-js";
import Anthropic from "@anthropic-ai/sdk";
const memori = new Memori({ apiKey: process.env.MEMORI_API_KEY });
await memori.config.storage.build(); // Init DB
const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
memori.llm.register(client); // Auto-detects "anthropic"
// 1. Teach Memory
await memori.addMemory("My name is John and I am a software engineer.");
// 2. Ask (Context is auto-injected)
const response = await client.messages.create({
model: "claude-3-opus-20240229",
max_tokens: 1000,
messages: [{ role: "user", content: "Who am I and what do I do?" }],
});
console.log(response.content[0].text);
// Output: "You are John, a software engineer."
memori-js uses Arktype for ultra-fast runtime validation and strict typing.
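In practice, runtime validation means a malformed config fails fast at construction time rather than deep inside a later call. An illustrative hand-rolled sketch of the idea in plain TypeScript (memori-js's actual schema uses Arktype and is not reproduced here):

```typescript
// Illustrative config validation -- a hand-rolled stand-in for an Arktype schema.
type MemoriConfig = { apiKey: string; dbPath?: string };

function validateConfig(input: unknown): MemoriConfig {
  const cfg = input as Partial<MemoriConfig> | null;
  if (typeof cfg?.apiKey !== "string" || cfg.apiKey.length === 0) {
    throw new TypeError("apiKey must be a non-empty string");
  }
  if (cfg.dbPath !== undefined && typeof cfg.dbPath !== "string") {
    throw new TypeError("dbPath must be a string when provided");
  }
  return { apiKey: cfg.apiKey, dbPath: cfg.dbPath };
}
```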
import { ConsoleLogger } from "memori-js";
const memori = new Memori({
apiKey: "...",
dbPath: "./custom-memory.db",
logger: new ConsoleLogger(), // Or pass your own Pino/Winston wrapper
});
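The logger interface isn't spelled out here; assuming it exposes the usual level methods that `ConsoleLogger` suggests, a custom wrapper might look like this (the `Logger` shape below is a guess, not the library's published type):

```typescript
// Hypothetical logger wrapper: adapt your own logging stack to a
// level-method shape (debug/info/warn/error).
interface Logger {
  debug(msg: string): void;
  info(msg: string): void;
  warn(msg: string): void;
  error(msg: string): void;
}

class PrefixedLogger implements Logger {
  constructor(
    private prefix: string,
    private sink: (line: string) => void = console.log,
  ) {}
  private write(level: string, msg: string): void {
    this.sink(`[${this.prefix}] ${level.toUpperCase()}: ${msg}`);
  }
  debug(msg: string) { this.write("debug", msg); }
  info(msg: string) { this.write("info", msg); }
  warn(msg: string) { this.write("warn", msg); }
  error(msg: string) { this.write("error", msg); }
}
```

A Pino or Winston instance could be adapted the same way, forwarding each method to the underlying logger.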
For deep customization (custom embeddings, vector stores, and more), check out the Master Configuration Guide.
For multi-user apps (chatbots, agents), you can isolate memory by User ID and Agent ID.
// Define context for the current operation
memori.attribution("user-123", "agent-sales");
// All subsequent operations are scoped to this user
await memori.addMemory("I like apples."); // Stored with metadata
// Search is filtered automatically
const results = await memori.search("What do I like?");
// returns "I like apples." ONLY for user-123
CLaRa is an advanced optimization pipeline that compresses memories before storage and "reasons" about queries before search.
Compression condenses stored memories into compact key facts (e.g. `favorite_color: blue`), saving roughly 40% of tokens, while reasoning distills a query down to its key entities (e.g. "Project Chimera status, deadlines") before search.

const memori = new Memori({
// ...
clara: {
enableCompression: true,
enableReasoning: true,
// Optional: Use a dedicated fast model for compression (Groq/Ollama)
compressor: {
generate: async (prompt) => {
/* Call Llama 3 / Gemma 2 */
},
},
},
});
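A local compressor could, for instance, call an Ollama server's generate endpoint. This is a sketch under stated assumptions: it presumes Ollama is running on its default port with a `llama3` model pulled, and the function name is ours, not the library's:

```typescript
// Hypothetical compressor backed by a local Ollama server.
// POST /api/generate is Ollama's non-streaming completion endpoint
// when called with stream: false.
async function ollamaCompressor(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

Plugging `ollamaCompressor` in as `clara.compressor.generate` would keep compression off your main model's token budget.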
Read the full CLaRa Release Notes for benchmarks and implementation details.
Most "Memory" libraries are just complex wrappers around vector stores. memori-js takes a different approach: Memory should be invisible.
As a developer, you shouldn't care how the relevant context is found, only that your agent has it. By pushing this logic down into the infrastructure layer (SQLite/Postgres) and the client layer (Patching), we allow you to build complex, stateful agents with simple, stateless code.
MIT