
Memobase is a user profile-based memory system designed to bring long-term user memory to your Generative AI (GenAI) applications. Whether you're building virtual companions, educational tools, or personalized assistants, Memobase empowers your AI to remember, understand, and evolve with your users.
Memobase can provide you with structured profiles of users. Check out the result (compared with mem0) from a 900-turn real-world chat:
{
  "basic_info": {
    "language_spoken": ["English", "Korean"],
    "name": "오*영"
  },
  "demographics": {
    "marital_status": "married"
  },
  "education": {
    "notes": "Had an English teacher who emphasized capitalization rules during school days",
    "major": "국어국문학과 (Korean Language and Literature)"
  },
  "interest": {
    "games": "User is interested in Cyberpunk 2077 and wants to create a game better than it",
    "youtube_channels": "Kurzgesagt",
    ...
  },
  "psychological": {...},
  "work": {"working_industry": ..., "title": ...},
  ...
}
🎯 Memory for User, not Agent
Define and control exactly what user information your AI captures.
SOTA
Check out the performance on public benchmarks against mem0, langmem, zep, and others.
Time-aware Memory
Memobase offers more than user profiles; it also records user events. User events are essential for answering time-related questions. See how we improve temporal memory far beyond other memory solutions.
🖼️ Controllable Memory
Among all types of memory, only some may enhance your product experience. Memobase offers flexible configuration so you can design the profile.
Easy Integration
Minimal code changes to integrate with your existing LLM stack via the API or the Python/Node/Go SDKs.
⚡️ Batch-Process
Memobase gives every user a buffer to batch-process chats after the conversation. Fast & cheap.
Production Ready
Memobase is built with FastAPI, Postgres, and Redis, and supports request caching, auth, telemetry, and more. Fully dockerized.
How does Memobase work?
[!NOTE]
Try Memobase-Playground, an open-source, full-stack AI chatbot with long-term memory (live demo).
To get started, you need:
- A project URL: http://localhost:8019 for a local server, or the cloud endpoint https://api.memobase.dev
- A project token: secret for a local server, or your cloud key sk-proj-xxxxxx
- The Python SDK installed: pip install memobase
[!TIP]
You can just run this equivalent quickstart script.
Or you can keep things super easy by using the OpenAI SDK with Memobase, or Ollama with Memobase.
from memobase import MemoBaseClient, ChatBlob

# Fill in your own project URL and token (see the setup above).
PROJECT_URL = "http://localhost:8019"  # cloud: https://api.memobase.dev
PROJECT_TOKEN = "secret"               # cloud: sk-proj-xxxxxx

mb = MemoBaseClient(
    project_url=PROJECT_URL,
    api_key=PROJECT_TOKEN,
)
assert mb.ping()

uid = mb.add_user({"any_key": "any_value"})
mb.update_user(uid, {"any_key": "any_value2"})
u = mb.get_user(uid)
print(u)

# mb.delete(uid)
In Memobase, all types of data are blobs attached to a user, and you can insert, get, and delete them:
messages = [
    {
        "role": "user",
        "content": "Hello, I'm Gus",
    },
    {
        "role": "assistant",
        "content": "Hi, nice to meet you, Gus!",
    },
]
bid = u.insert(ChatBlob(messages=messages))
print(u.get(bid))  # not found once you flush the memory.
# u.delete(bid)
By default, Memobase removes blobs once they're processed. This means that, apart from the extracted memory, your data is not stored with Memobase. You can persist the blobs by adjusting the configuration file.
u.flush(sync=True)
By default, Memobase flushes the buffer asynchronously. You can set sync=True to wait for the buffer to be processed.
And what will you get?
print(u.profile(need_json=True))
# results
{
  "basic_info": {
    "name": {
      "content": "Gus",
      "id": ...,
      "created_at": ...
    }
  }
}
u.profile() will return structured profiles learned from this user, including topic, sub_topic, and content. As you insert more blobs, the profile becomes richer.
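For example, here is a minimal sketch of iterating over the returned profiles and printing their fields (attribute names as used later in this README):

```python
# Assumes `u` is the user object from the quickstart above.
for p in u.profile():
    # Each profile entry carries a topic, a sub_topic, and the learned content.
    print(f"{p.topic}::{p.sub_topic}: {p.content}")
```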
In Memobase, we don't memoize users on the hot path. We use buffer zones for recently inserted blobs. When the buffer zone becomes too large (e.g., 1024 tokens) or stays idle for an extended period (e.g., 1 hour), Memobase flushes the entire buffer into memory. Alternatively, you can call flush() to decide manually when to flush, such as when a chat session is closed in your app.
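For instance, here is a minimal sketch of flushing when a session ends, using only the client calls shown in this README (the session callbacks themselves are illustrative):

```python
from memobase import MemoBaseClient, ChatBlob

# Assumes a local server with the default token (adjust for your deployment).
mb = MemoBaseClient(project_url="http://localhost:8019", api_key="secret")
u = mb.get_user(mb.add_user({"app": "demo"}))

def on_turn(user_msg: str, assistant_msg: str):
    # Each turn lands in the user's buffer zone as a ChatBlob.
    u.insert(ChatBlob(messages=[
        {"role": "user", "content": user_msg},
        {"role": "assistant", "content": assistant_msg},
    ]))

def on_session_close():
    # Flush the buffer into long-term memory once the chat session ends;
    # sync=True waits for processing to finish.
    u.flush(sync=True)

on_turn("I just moved to Berlin", "Nice! How are you finding the city?")
on_session_close()
```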
Memobase has a context API that packs everything you need into a simple string, which you can insert into your prompt directly:
print(u.context(max_token_size=500, prefer_topics=["basic_info"]))
Something like:
<memory>
# Below is the user profile:
- basic_info::name: Gus
...
# Below is the latest events of the user:
2025/02/24 04:25PM:
- work::meetings: Scheduled a meeting with John.
...
</memory>
Please provide your answer using the information within the <memory> tag at the appropriate time.
Check out the detailed params here.
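For instance, a minimal sketch of dropping that context into a chat completion call (the OpenAI client usage and model name here are illustrative assumptions, not part of Memobase):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Pack the user's memory into the system prompt.
memory_context = u.context(max_token_size=500, prefer_topics=["basic_info"])
system_prompt = (
    "You are a helpful assistant.\n"
    f"{memory_context}\n"
    "Please provide your answer using the information within the <memory> tag "
    "at the appropriate time."
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat model works; this one is just an example
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Any suggestions for my next meeting?"},
    ],
)
print(resp.choices[0].message.content)
```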
You can personalize your AI by placing profiles into your prompts (e.g., the system prompt).
PROFILES = "\n".join([p.describe for p in u.profile()])
print(PROFILES)
# basic_info: name - Gus
# basic_info: age - 25
# ...
# interest: foods - Mexican cuisine
# psychological: goals - Build something that maybe useful
# ...
Too much information is hidden in the conversations between users and AI; that's why you need a new way of tracking data to record user preferences and behavior.
profiles = u.profile()

def under_age_30(p):
    return p.sub_topic == "age" and int(p.content) < 30

def love_cat(p):
    return p.topic == "interest" and p.sub_topic == "pets" and "cat" in p.content

is_user_under_30 = len([p for p in profiles if under_age_30(p)]) > 0
is_user_love_cat = len([p for p in profiles if love_cat(p)]) > 0
...
Not everyone is looking for Grammarly; it's always nice to sell something your users might actually want.
def pick_an_ad(profiles):
    work_titles = [p for p in profiles if p.topic == "work" and p.sub_topic == "title"]
    if not len(work_titles):
        return None
    wt = work_titles[0].content
    if wt == "Software Engineer":
        return "Deep Learning Stuff"
    elif wt == "some job":
        return "some ads"
    ...
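A possible way to wire this in (a sketch; the ad strings above are placeholders):

```python
# Pick an ad based on the user's learned work title, if any.
ad = pick_an_ad(u.profile())
if ad is not None:
    print(f"Showing ad: {ad}")
```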
For detailed usage instructions, visit the documentation.
Star Memobase on GitHub to show your support and receive instant notifications!
Join the community for support and discussions:
Join our Discord
Follow us on Twitter
Or just email us ❤️
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.