# Bee Agent Framework

Open-source framework for building, deploying, and serving powerful agentic workflows at scale.
The Bee Agent Framework makes it easy to build scalable agent-based workflows with your model of choice. Our default agent, Bee, is designed to perform robustly with Llama 3.1, and we're actively working on optimizing its performance with other popular LLMs.
Our goal is to empower developers to adopt the latest open-source and proprietary models with minimal changes to their current agent implementation.
## Key Features
- 🤖 AI agents: Use our powerful Bee agent or build your own.
- 🛠️ Tools: Use our built-in tools or create your own in JavaScript/Python.
- 👩‍💻 Code interpreter: Run code safely in a sandbox container.
- 💾 Memory: Multiple strategies to optimize token spend.
- ⏸️ Serialization: Handle complex agentic workflows and easily pause/resume them without losing state.
- 🔍 Traceability: Get full visibility of your agent's inner workings, log all running events, and use our MLflow integration (coming soon) to debug performance.
- 🎛️ Production-level control: Caching and error handling built in.
- 🚧 (Coming soon) API: Configure and deploy your agents with a production-hardened API.
- 🚧 (Coming soon) Chat UI: Serve your agent to users in a delightful GUI with built-in transparency, explainability, and user controls.
- ... more on our Roadmap
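The traceability feature above centers on observing update events emitted while an agent runs. As a minimal standalone sketch of that observe/update pattern (not the framework's implementation; the `AgentUpdate` shape and event names here are illustrative assumptions):

```typescript
// Minimal standalone sketch of the observe/update pattern behind traceability.
// NOT the framework's implementation; AgentUpdate and the event names are
// illustrative assumptions.
import { EventEmitter } from "node:events";

interface AgentUpdate {
  key: string; // which part of the agent produced the update (e.g. "thought")
  value: string; // the emitted content
}

class ObservableAgent {
  private emitter = new EventEmitter();

  // Let callers attach listeners before the run starts.
  observe(setup: (emitter: EventEmitter) => void): this {
    setup(this.emitter);
    return this;
  }

  run(prompt: string): string {
    // Emit intermediate updates so observers get full visibility.
    const thought: AgentUpdate = { key: "thought", value: `Considering: ${prompt}` };
    const answer: AgentUpdate = { key: "final_answer", value: "42" };
    this.emitter.emit("update", thought);
    this.emitter.emit("update", answer);
    return answer.value;
  }
}

const seen: AgentUpdate[] = [];
const result = new ObservableAgent()
  .observe((emitter) => {
    emitter.on("update", (update: AgentUpdate) => seen.push(update));
  })
  .run("What is the answer?");
// seen now holds both intermediate updates; result is the final answer.
```

A logger or MLflow exporter would simply be another listener attached in `observe`.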
## Get started with Bee

### Installation
```shell
npm install bee-agent-framework
```

or

```shell
yarn add bee-agent-framework
```
### Example
```ts
import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory";
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch";
import { OpenMeteoTool } from "bee-agent-framework/tools/weather/openMeteo";

const llm = new OllamaChatLLM();
const agent = new BeeAgent({
  llm,
  memory: new TokenMemory({ llm }),
  tools: [new DuckDuckGoSearchTool(), new OpenMeteoTool()],
});

const response = await agent
  .run({ prompt: "What's the current weather in Las Vegas?" })
  .observe((emitter) => {
    emitter.on("update", async ({ data, update, meta }) => {
      console.log(`Agent (${update.key}) 🤖 : `, update.value);
    });
  });

console.log(`Agent 🤖 : `, response.result.text);
```
To run this example, make sure you have Ollama installed and the llama3.1 model pulled.
➡️ See a more advanced example.

➡️ All examples can be found in the examples directory.

➡️ To run an arbitrary example, use the command `yarn start -- examples/agents/bee.ts` (just pass the appropriate path to the desired example).
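The `TokenMemory` used in the example above manages token spend across the conversation. As a standalone sketch of one such strategy (a rough illustration of the idea, not the framework's actual algorithm; the 4-characters-per-token estimate is a crude assumption):

```typescript
// Standalone sketch of a token-budget memory strategy: keep the most recent
// messages while the estimated token count stays under a budget.
// NOT the framework's implementation; the token estimate is a rough heuristic.
interface Message {
  role: "user" | "assistant";
  text: string;
}

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // crude heuristic, not a real tokenizer
}

function trimToBudget(messages: Message[], maxTokens: number): Message[] {
  const kept: Message[] = [];
  let used = 0;
  // Walk from newest to oldest, keeping messages until the budget is spent.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i].text);
    if (used + cost > maxTokens) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}

const history: Message[] = [
  { role: "user", text: "a".repeat(400) }, // ~100 tokens
  { role: "assistant", text: "b".repeat(200) }, // ~50 tokens
  { role: "user", text: "c".repeat(40) }, // ~10 tokens
];
// With an 80-token budget, only the newest two messages fit.
const trimmed = trimToBudget(history, 80);
```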
## Local Installation (Python Interpreter + Interactive CLI)
> [!NOTE]
> `yarn` should be installed via Corepack (tutorial).

> [!NOTE]
> To make any asset available to the local code interpreter, place it in the following directory: `./examples/tmp/local`

> [!NOTE]
> A Docker distribution with Compose support is required; the following are supported:
>
> - Docker
> - Rancher - macOS users must use VZ instead of QEMU
> - Podman - requires Compose and a rootful machine (if your current machine is rootless, please create a new one)
1. Clone the repository:
   ```shell
   git clone git@github.com:i-am-bee/bee-agent-framework
   ```
2. Install dependencies:
   ```shell
   yarn install
   ```
3. Create `.env` (from `.env.template`) and fill in missing values (if any).
4. Start the code interpreter:
   ```shell
   yarn run infra:start-code-interpreter
   ```
5. Start the agent:
   ```shell
   yarn run start:bee
   ```
   (it runs the ./examples/agents/bee.ts file).
## 🛠️ Tools
| Name | Description |
| --- | --- |
| PythonTool | Run arbitrary Python code in the remote environment. |
| WikipediaTool | Search for data on Wikipedia. |
| GoogleSearchTool | Search for data on Google using Custom Search Engine. |
| DuckDuckGoTool | Search for data on DuckDuckGo. |
| SQLTool | Execute SQL queries against relational databases. Instructions. |
| CustomTool | Run your own Python function in the remote environment. |
| LLMTool | Use an LLM to process input data. |
| DynamicTool | Construct to create dynamic tools. |
| ArXivTool | Retrieve research articles published on arXiv. |
| WebCrawlerTool | Retrieve content of an arbitrary website. |
| OpenMeteoTool | Retrieve current, previous, or upcoming weather for a given destination. |
| ➕ Request | |
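`DynamicTool`, listed above, is a construct for creating tools on the fly. A minimal standalone sketch of that idea, wrapping a plain function together with a name and a description the LLM can read (the `Tool` interface and `dynamicTool` helper here are illustrative assumptions, not the framework's actual API):

```typescript
// Standalone sketch of a DynamicTool-style construct: wrap a plain function,
// plus metadata the LLM can read, into a uniform tool interface.
// NOT the framework's API; Tool and dynamicTool are illustrative assumptions.
interface Tool<TInput, TOutput> {
  name: string;
  description: string;
  run(input: TInput): Promise<TOutput>;
}

function dynamicTool<TInput, TOutput>(
  name: string,
  description: string,
  handler: (input: TInput) => TOutput | Promise<TOutput>,
): Tool<TInput, TOutput> {
  return {
    name,
    description,
    async run(input) {
      // Normalize sync and async handlers behind one async interface.
      return handler(input);
    },
  };
}

// Usage: turn an ordinary function into a tool an agent could call.
const adder = dynamicTool(
  "adder",
  "Adds two numbers and returns their sum.",
  ({ a, b }: { a: number; b: number }) => a + b,
);

const sum = await adder.run({ a: 2, b: 3 }); // resolves to 5
```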
## Adapters (LLM inference providers)
## 📦 Modules

The source directory (`src`) provides numerous modules that one can use.
| Name | Description |
| --- | --- |
| agents | Base classes defining the common interface for agents. |
| llms | Base classes defining the common interface for text inference (standard or chat). |
| template | Prompt templating system based on Mustache with various improvements. |
| memory | Various types of memories to use with an agent. |
| tools | Tools that an agent can use. |
| cache | Preset of different caching approaches that can be used together with tools. |
| errors | Base framework error classes used by each module. |
| adapters | Concrete implementations of given modules for different environments. |
| logger | Core component for logging all actions within the framework. |
| serializer | Core component for serializing/deserializing modules into a serialized format. |
| version | Constants representing the framework (e.g., the latest version). |
| internals | Modules used by other modules within the framework. |
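The cache module pairs caching approaches with tools. As a standalone sketch of the underlying idea (illustrative only, not the framework's API): memoize a tool call by a key derived from its input so repeated calls skip re-execution.

```typescript
// Standalone sketch of result caching for tool calls: memoize by a key derived
// from the input. NOT the framework's API; `cached` is an illustrative helper,
// and JSON.stringify as a cache key assumes JSON-serializable inputs.
function cached<TInput, TOutput>(
  fn: (input: TInput) => Promise<TOutput>,
): (input: TInput) => Promise<TOutput> {
  const store = new Map<string, TOutput>();
  return async (input) => {
    const key = JSON.stringify(input); // assumes JSON-serializable inputs
    if (store.has(key)) return store.get(key)!; // cache hit: skip execution
    const value = await fn(input);
    store.set(key, value);
    return value;
  };
}

// Usage: the underlying function runs once per distinct input.
let calls = 0;
const slowSquare = async (n: number) => {
  calls++;
  return n * n;
};
const fastSquare = cached(slowSquare);

const a = await fastSquare(4); // computed: 16
const b = await fastSquare(4); // served from the cache; calls is still 1
```

A production cache would also bound its size and expire entries; this sketch omits both for brevity.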
For a more in-depth explanation, see the docs.
## Tutorials

🚧 Coming soon 🚧
## Roadmap
## Contribution guidelines

The Bee Agent Framework is an open-source project and we ❤️ contributions.

If you'd like to contribute to Bee, please take a look at our contribution guidelines.
## Bugs

We use GitHub Issues to manage our public bugs. We keep a close eye on them, so before filing a new issue, please check that it hasn't already been logged.
## Code of conduct

This project and everyone participating in it is governed by the Code of Conduct. By participating, you are expected to uphold this code. Please read the full text so that you understand which actions will and will not be tolerated.
## Legal notice

All content in these repositories, including code, has been provided by IBM under the associated open source software license, and IBM is under no obligation to provide enhancements, updates, or support. IBM developers produced this code as an open source project (not as an IBM product), and IBM makes no assertions as to its level of quality or security and will not be maintaining this code going forward.
## Contributors

Special thanks to our contributors for helping us improve the Bee Agent Framework.