
LLM Memory is a Ruby gem that gives large language models (LLMs) such as ChatGPT memory through in-context learning. This enables better integration with systems such as Rails and web services, behind a friendlier, more abstract interface whose names are borrowed from the brain.
Wernicke's area in the brain is involved in the comprehension of written and spoken language.
The hippocampus plays an important role in consolidating information from short-term memory to long-term memory.
Broca's area is also known as the motor speech area.
These names map onto the gem's main classes: Wernicke loads documents, Hippocampus stores and retrieves them as memories, and Broca produces the response.
Install the gem and add it to the application's Gemfile by executing:
$ bundle add llm_memory
If bundler is not being used to manage dependencies, install the gem by executing:
$ gem install llm_memory
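If you prefer to edit the Gemfile yourself, the equivalent is to add the declaration below and run bundle install:

gem "llm_memory"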
Set the OPENAI_ACCESS_TOKEN and REDISCLOUD_URL environment variables, or configure the values in an initializer:
LlmMemory.configure do |c|
c.openai_access_token = "xxxxx"
c.redis_url = "redis://xxxx:6379"
end
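In a Rails application, a natural home for this block is an initializer. The sketch below assumes a conventional file path and reads the values from the environment variables mentioned above so that secrets stay out of the code; adjust the Redis default to your setup.

# config/initializers/llm_memory.rb (illustrative path)
LlmMemory.configure do |c|
  c.openai_access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
  c.redis_url = ENV.fetch("REDISCLOUD_URL", "redis://localhost:6379")
end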
To use LLM Memory, follow these steps:
If you want to use pgvector instead of Redisearch, you can use the plugin; please check the plugin section below and adjust setup steps 2 and 3 accordingly.
For the details of each class, please refer to the API reference document.
docs = LlmMemory::Wernicke.load(:file, "/tmp/a_directory")
# docs is just an array of hashes.
# You don't have to use the load method; you can
# build your own hashes, each with content and optional metadata:
# docs = [{
#   content: "Hi there",
#   metadata: {
#     file_name: "a.txt",
#     timestamp: "20201231235959"
#   }
# }, ...]
hippocampus = LlmMemory::Hippocampus.new
res = hippocampus.memorize(docs)
query_str = "What is my name?"
related_docs = hippocampus.query(query_str, limit: 3)
# [{
#   vector_score: "0.192698478699",
#   content: "My name is Mike",
#   metadata: { ... }
# }, ...]
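If you want to drop weak matches before building the prompt, you can filter on vector_score yourself. This is plain Ruby over the structure above; the 0.25 threshold is arbitrary, and it assumes the score is a distance where lower means more similar, so verify the semantics against your store.

close_docs = related_docs.select { |doc| doc[:vector_score].to_f < 0.25 }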
# ERB
prompt = <<-TEMPLATE
Context information is below.
---------------------
<% related_docs.each do |doc| %>
<%= doc[:content] %>
file: <%= doc[:metadata][:file_name] %>
<% end %>
---------------------
Given the context information and not prior knowledge,
answer the question: <%= query_str %>
TEMPLATE
broca = LlmMemory::Broca.new(prompt: prompt, model: 'gpt-3.5-turbo')
message = broca.respond(query_str: query_str, related_docs: related_docs)
...
query_str2 = "How are you?"
related_docs = hippocampus.query(query_str2, limit: 3)
message2 = broca.respond(query_str: query_str2, related_docs: related_docs)
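For repeated use, for example inside a Rails service object, the flow above can be wrapped in a small helper. The sketch below uses only the calls shown in this README; the class name and structure are illustrative and not part of llm_memory.

# Illustrative wrapper around the llm_memory classes used above.
class MemoryChat
  def initialize(prompt:, model: "gpt-3.5-turbo")
    @hippocampus = LlmMemory::Hippocampus.new
    @broca = LlmMemory::Broca.new(prompt: prompt, model: model)
  end

  # Store documents (an array of { content:, metadata: } hashes).
  def memorize(docs)
    @hippocampus.memorize(docs)
  end

  # Retrieve related documents and ask the LLM to answer with that context.
  def ask(question, limit: 3)
    related_docs = @hippocampus.query(question, limit: limit)
    @broca.respond(query_str: question, related_docs: related_docs)
  end
end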
The table below lists the plugins available for llm_memory. The aim is to keep the core of llm_memory lightweight while allowing easy extensibility through plugins.
Install the plugin and update the relevant method call. For example, if you want to use pgvector:
$ bundle add llm_memory_pgvector
Then, pass :pgvector as the store instead of the default :redis:
# may need to require it, depending on the project
# require "llm_memory_pgvector"
hippocampus = LlmMemory::Hippocampus.new(store: :pgvector)
Please refer to the links for the details.
| Plugin Name | Type | Module | Link |
| --- | --- | --- | --- |
| llm_memory_gmail_loader | Loader | Wernicke | link |
| llm_memory_pgvector | Store | Hippocampus | link |
After checking out the repo, run bin/setup to install dependencies. Then, run rake spec to run the tests. You can also run bin/console for an interactive prompt that will allow you to experiment.
To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and the created tag, and push the .gem file to rubygems.org.
Bug reports and pull requests are welcome on GitHub at https://github.com/shohey1226/llm_memory. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the code of conduct.
The gem is available as open source under the terms of the MIT License.
Everyone interacting in the LlmMemory project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the code of conduct.