@mem0/vercel-ai-provider
The Mem0 AI SDK Provider is a community-maintained library developed by Mem0 to integrate with the Vercel AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality. With Mem0, language model conversations gain memory, enabling more contextualized and personalized responses based on past interactions.
Discover more of Mem0 on GitHub. Explore the Mem0 Documentation to gain deeper control and flexibility in managing your memories.
For detailed information on using the Vercel AI SDK, refer to Vercel's API Reference and Documentation.
```bash
npm install @mem0/vercel-ai-provider
```
Obtain your Mem0 API Key from the Mem0 dashboard.
Initialize the Mem0 Client:
```typescript
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0({
  provider: "openai",
  mem0ApiKey: "m0-xxx",
  apiKey: "openai-api-key",
  config: {
    compatibility: "strict",
    // Additional model-specific configuration options can be added here.
  },
});
```
By default, the `openai` provider is used, so specifying it is optional:

```typescript
const mem0 = createMem0();
```

For better security, consider setting `MEM0_API_KEY` and `OPENAI_API_KEY` as environment variables.
```typescript
import { LanguageModelV1Prompt } from "ai";
import { addMemories } from "@mem0/vercel-ai-provider";

const messages: LanguageModelV1Prompt = [
  {
    role: "user",
    content: [
      { type: "text", text: "I love red cars." },
      { type: "text", text: "I like Toyota Cars." },
      { type: "text", text: "I prefer SUVs." },
    ],
  },
];

await addMemories(messages, { user_id: "borat" });
```
These memories are now stored in your profile. You can view and manage them on the Mem0 Dashboard.
For standalone features such as `addMemories` and `retrieveMemories`, you must either set `MEM0_API_KEY` as an environment variable or pass it directly in the function call.
Example:

```typescript
await addMemories(messages, { user_id: "borat", mem0ApiKey: "m0-xxx", org_id: "org_xx", project_id: "proj_xx" });
await retrieveMemories(prompt, { user_id: "borat", mem0ApiKey: "m0-xxx", org_id: "org_xx", project_id: "proj_xx" });
await getMemories(prompt, { user_id: "borat", mem0ApiKey: "m0-xxx", org_id: "org_xx", project_id: "proj_xx" });
```
`retrieveMemories` enriches the prompt with relevant memories from your profile, while `getMemories` returns the memories as an array that can be used for further processing.
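`getMemories` is the right choice when you want to shape the memory context yourself. A minimal post-processing sketch follows; the `memory` text field on each item is an assumption here, so check the Mem0 docs for the authoritative response shape:

```typescript
// Hypothetical shape of one item returned by getMemories; verify the
// actual fields against the Mem0 API documentation.
type MemoryItem = { memory: string };

// Turn the raw memory array into a numbered plain-text block that can
// be injected into any prompt or system message template.
function formatMemories(memories: MemoryItem[]): string {
  return memories.map((m, i) => `${i + 1}. ${m.memory}`).join("\n");
}

// Usage (requires MEM0_API_KEY to be set):
// const raw = await getMemories("Suggest me a good car to buy.", { user_id: "borat" });
// console.log(formatMemories(raw));
```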
```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0("gpt-4-turbo", {
    user_id: "borat",
  }),
  prompt: "Suggest me a good car to buy!",
});
```
```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { retrieveMemories } from "@mem0/vercel-ai-provider";

const prompt = "Suggest me a good car to buy.";
const memories = await retrieveMemories(prompt, { user_id: "borat" });

const { text } = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: prompt,
  system: memories,
});
```
```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0("gpt-4-turbo", {
    user_id: "borat",
  }),
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Suggest me a good car to buy." },
        { type: "text", text: "Why is it better than the other cars for me?" },
        { type: "text", text: "Give options for every price range." },
      ],
    },
  ],
});
```
```typescript
import { generateText, LanguageModelV1Prompt } from "ai";
import { openai } from "@ai-sdk/openai";
import { retrieveMemories } from "@mem0/vercel-ai-provider";

// New format using the system parameter for memory context
const messages: LanguageModelV1Prompt = [
  {
    role: "user",
    content: [
      { type: "text", text: "Suggest me a good car to buy." },
      { type: "text", text: "Why is it better than the other cars for me?" },
      { type: "text", text: "Give options for every price range." },
    ],
  },
];

const memories = await retrieveMemories(messages, { user_id: "borat" });

const { text } = await generateText({
  model: openai("gpt-4-turbo"),
  messages: messages,
  system: memories,
});
```
```typescript
import { streamText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0();

const { textStream } = await streamText({
  model: mem0("gpt-4-turbo", {
    user_id: "borat",
  }),
  prompt:
    "Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.",
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```
- `createMem0()`: Initializes a new Mem0 provider instance with optional configuration.
- `retrieveMemories()`: Enriches prompts with relevant memories.
- `addMemories()`: Adds memories to your profile.
- `getMemories()`: Gets memories from your profile in array format.

```typescript
const mem0 = createMem0({
  config: {
    ...
    // Additional model-specific configuration options can be added here.
  },
});
```
Pass a consistent `user_id` identifier for consistent memory retrieval. We also support `agent_id`, `app_id`, and `run_id`; refer to the Mem0 docs for details.
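As a sketch of how these identifiers compose, a small helper can assemble the scoping options passed to `addMemories` or `retrieveMemories`. The identifier values below are illustrative, not real Mem0 IDs:

```typescript
// Scoping options accepted by the memory functions; field names follow
// the Mem0 docs, all values here are hypothetical.
type MemoryScope = {
  user_id: string;
  agent_id?: string;
  app_id?: string;
  run_id?: string;
};

// Build a per-user scope, optionally narrowed to an agent, app, or run.
function buildScope(user_id: string, extras: Partial<MemoryScope> = {}): MemoryScope {
  return { user_id, ...extras };
}

const scope = buildScope("borat", { agent_id: "car-shopping-agent", run_id: "session-42" });
// Usage (requires MEM0_API_KEY): await addMemories(messages, scope);
```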