This example demonstrates how to create a chatbot with long-term memory using the AIGNE Framework. The chatbot can remember user information across conversations and recall details shared in previous interactions.
The example uses two powerful AFS (Agentic File System) modules:
AFSHistory: Automatically records all conversations for context
UserProfileMemory: Intelligently extracts and stores user information (name, interests, location, etc.)
Agentic File System (AFS) is a virtual file system abstraction that provides AI agents with unified access to various storage backends. For comprehensive documentation, see AFS Documentation.
Prerequisites
Node.js (>=20.0) and npm installed on your machine
An OpenAI API key for interacting with OpenAI's services
pnpm (optional, only needed when running the example from source code)
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key

# First conversation - introduce yourself
npx -y @aigne/example-afs-memory --input "I'm Bob, and I like blue color"
# Response: Nice to meet you, Bob — I've saved that your favorite color is blue...

# Second conversation - the bot remembers you!
npx -y @aigne/example-afs-memory --input "Tell me all the info you know about me"
# Response: Here's what I currently have stored about you:
# * Name: Bob
# * Interests / favorite color: blue

# Run in interactive chat mode
npx -y @aigne/example-afs-memory --chat
To run from source instead, change into the example directory of the aigne-framework repository and install its dependencies:
cd aigne-framework/examples/afs-memory
pnpm install
Setup Environment Variables
Set up your OpenAI API key in the .env.local file:
OPENAI_API_KEY="" # Set your OpenAI API key here
Using Different Models
You can use different AI models by setting the MODEL environment variable along with the corresponding API key. The framework supports multiple providers:
OpenAI: MODEL="openai:gpt-4.1" with OPENAI_API_KEY
Anthropic: MODEL="anthropic:claude-3-7-sonnet-latest" with ANTHROPIC_API_KEY
Google Gemini: MODEL="gemini:gemini-2.0-flash" with GEMINI_API_KEY
AWS Bedrock: MODEL="bedrock:us.amazon.nova-premier-v1:0" with AWS credentials
DeepSeek: MODEL="deepseek:deepseek-chat" with DEEPSEEK_API_KEY
OpenRouter: MODEL="openrouter:openai/gpt-4o" with OPEN_ROUTER_API_KEY
xAI: MODEL="xai:grok-2-latest" with XAI_API_KEY
Ollama: MODEL="ollama:llama3.2" with OLLAMA_DEFAULT_BASE_URL
For detailed configuration examples, please refer to the .env.local.example file in this directory.
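As a quick illustration, a .env.local that switches this example to Anthropic could contain the following (the key value is a placeholder):

MODEL="anthropic:claude-3-7-sonnet-latest"
ANTHROPIC_API_KEY="" # Set your Anthropic API key here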
Run the Example
# Run in interactive mode
pnpm start
# Run with a single message
pnpm start --input "I'm Bob, and I like blue color"
Example Conversation Flow
# First conversation - introduce yourself
$ pnpm start --input "I'm Bob, and I like blue color"
Response: Nice to meet you, Bob — I've saved that your favorite color is blue.
If you'd like, I can remember more about you (location, hobbies, projects, etc.)...
# Second conversation - the bot remembers you!
$ pnpm start --input "Tell me all info about me you known"
Response: Here's what I currently have stored about you:
* Name: Bob
* Interests / favorite color: blue
Would you like to add or update anything (location, hobbies, projects, family, etc.)?
How Memory Works
This example uses two complementary memory modules that work together to provide intelligent, personalized conversations:
1. AFSHistory Module - Conversation Context
Purpose: Records every conversation turn to provide recent context.
How it works:
Automatically saves each user message and AI response pair
Stores conversations with timestamps and unique IDs
Enables the AI to reference recent conversations
Example:
# First conversation
$ pnpm start --input "I'm Bob, and I like blue color"
Response: Nice to meet you, Bob — I've saved that your favorite color is blue...
# The conversation is automatically saved to history.sqlite3
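The real AFSHistory module ships with the framework and is not reproduced here, but conceptually it behaves like this minimal TypeScript sketch (the better-sqlite3 dependency, table layout, and helper names are assumptions made purely for illustration):

import Database from "better-sqlite3"; // assumed dependency, used only for this sketch
import { randomUUID } from "node:crypto";

// Open (or create) the same kind of on-disk store the example uses.
const db = new Database("history.sqlite3");
db.exec(`CREATE TABLE IF NOT EXISTS history (
  id TEXT PRIMARY KEY,       -- unique id per turn
  created_at TEXT NOT NULL,  -- ISO timestamp
  role TEXT NOT NULL,        -- "user" or "assistant"
  content TEXT NOT NULL
)`);

// Persist one side of a conversation turn.
export function recordTurn(role: "user" | "assistant", content: string): void {
  db.prepare("INSERT INTO history (id, created_at, role, content) VALUES (?, ?, ?, ?)")
    .run(randomUUID(), new Date().toISOString(), role, content);
}

// Fetch the latest turns in chronological order so they can be replayed as context.
export function recentTurns(limit = 10): { role: string; content: string }[] {
  const rows = db
    .prepare("SELECT role, content FROM history ORDER BY created_at DESC LIMIT ?")
    .all(limit) as { role: string; content: string }[];
  return rows.reverse();
}

Each run of the example can then append its user/assistant pair and read the recent turns back before calling the model.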
2. UserProfileMemory Module - User Information Extraction
Purpose: Intelligently extracts and stores structured user information from conversations.
How it works:
Listens to conversation history events
Analyzes each conversation using AI to identify user information
Stores the extracted information as a structured JSON profile in user_profile.sqlite3
Updates incrementally using JSON Patch operations
What it remembers:
Name and personal details
Location (country, city, address)
Interests and hobbies
Family members and relationships
Projects and work
Languages spoken
Birthday and other personal info
Example:
# After Bob introduces himself, the profile is automatically created:
{
  "name": [{ "name": "Bob" }],
  "interests": [{ "content": "blue color" }]
}
# In the next conversation, the bot can recall this information:
$ pnpm start --input "Tell me all info about me you known"
Response: Here's what I currently have stored about you:
* Name: Bob
* Interests / favorite color: blue
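The "JSON Patch operations" mentioned above mean the profile is never rewritten wholesale; each newly extracted fact becomes a small patch against the stored document. For a hypothetical follow-up message such as "I also enjoy hiking", the update could be expressed roughly like this (illustrative only; the exact paths depend on the stored profile schema):

[
  { "op": "add", "path": "/interests/-", "value": { "content": "hiking" } }
]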
3. Memory Injection - How the AI Uses Memory
When you send a message, the system automatically:
Step 1: Inject User Profile into System Prompt
You are a friendly chatbot
<related-memories>
User Profile Memory: This contains structured information about the user...
- name:
    - name: Bob
  interests:
    - content: blue color
</related-memories>
Step 2: Add Recent Conversation History
[{"role":"system","content":"You are a friendly chatbot... [profile injected here]"},{"role":"user","content":"I'm Bob and I like blue color"},{"role":"assistant","content":"Nice to meet you, Bob..."},{"role":"user","content":"Tell me all info about me you known"}]
Step 3: AI Generates Personalized Response
The AI can now:
Address the user by name
Reference their interests
Provide personalized recommendations
Maintain context across sessions
Key Design Benefits
Automatic: No manual profile management needed
Intelligent: AI determines what's important to remember
Incremental: Profile updates gradually over time
Persistent: Memory survives across multiple conversations
Structured: Information is organized in a consistent format