# AgentAmi

AgentAmi is a flexible agentic framework built on LangGraph, designed to scale to large numbers of tools and intelligently select the most relevant ones for a given user query. By exposing only the selected tools to the model, it significantly reduces prompt token usage.
It supports:
- Dynamic tool selection via an inbuilt runtime RAG, with the option to easily replace it with your own `tool_selector`.
- An inbuilt pruner that limits context length and improves performance; no configuration is needed.
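To give an intuition for what runtime-RAG tool selection does, here is a toy, self-contained sketch: tool descriptions are indexed once, then ranked against each query. Bag-of-words cosine similarity stands in for real embeddings, and the tool names and descriptions are made up for illustration; this is not AgentAmi's actual implementation.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words term-count vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical tool catalog; in AgentAmi these come from your tools' metadata.
tool_descriptions = {
    "get_weather": "fetch the current weather forecast for a city",
    "send_email": "send an email message to a recipient",
    "search_docs": "search internal documentation for a keyword",
}

# Index descriptions once up front (AgentAmi does this on first invocation).
index = {name: embed(desc) for name, desc in tool_descriptions.items()}

def select_tools(query: str, top_k: int) -> list:
    # Rank tools by similarity to the query and keep the top_k names.
    q = embed(query)
    ranked = sorted(index, key=lambda n: cosine(q, index[n]), reverse=True)
    return ranked[:top_k]

print(select_tools("what is the weather forecast in Paris", top_k=1))
# → ['get_weather']
```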
## Quick start

Refer to `main.py` for a complete usage example.

```bash
pip install agentami
```
```python
from agentami import AgentAmi
from langchain.chat_models import ChatOpenAI
from langgraph.checkpoint.memory import InMemorySaver

tools = [...]  # your LangChain tools

agent = AgentAmi(
    model=ChatOpenAI(model="gpt-4o"),
    tools=tools,
    checkpointer=InMemorySaver(),
    tool_selector=...,   # optional: your own tool-selection function
    top_k=...,           # optional: number of tools selected per query
    context_size=...,    # optional: context limit used by the pruner
    disable_pruner=...,  # optional: turn off the inbuilt pruner
    prompt_template=...  # optional: custom prompt template
)
agent_ami = agent.graph
```
## Things you should be aware of

- The first run takes extra time, as dependencies are installed (the models used by the internal `tool_selector`).
- Your first `agent_ami.invoke()` or `agent_ami.astream()` may be slow if you have hundreds of tools, because each `AgentAmi()` object initialises a vector store and embeds the tool descriptions at runtime.
- Subsequent prompts respond at normal speed.
- Check out the `ROADMAP.md` file for planned features.
## How to integrate your own tool selector?

Write a function that accepts `(query: str, top_k: int)` and returns `List[str]` (a list of tool names):
```python
from typing import List

def my_own_tool_selector(query: str, top_k: int) -> List[str]:
    return ["tool1", "tool2", "tool3"]
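A slightly more realistic selector might rank tools by keyword overlap between the query and each tool's description. Everything here is illustrative and not part of agentami: `TOOL_DESCRIPTIONS`, the tool names, and `keyword_tool_selector` are placeholder examples you would replace with your own logic.

```python
from typing import List

# Hypothetical catalog mapping tool names to descriptions; in practice,
# build this from your own tools' metadata.
TOOL_DESCRIPTIONS = {
    "calculator": "evaluate arithmetic expressions",
    "web_search": "search the web for recent information",
    "file_reader": "read the contents of a local file",
}

def keyword_tool_selector(query: str, top_k: int) -> List[str]:
    # Score each tool by how many of its description words appear in the query.
    words = set(query.lower().split())
    scores = {
        name: len(words & set(desc.lower().split()))
        for name, desc in TOOL_DESCRIPTIONS.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

You could then pass `tool_selector=keyword_tool_selector` when constructing `AgentAmi`.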