# funcchain

```bash
pip install funcchain
```
## Introduction
`funcchain` is the most pythonic way of writing cognitive systems. Leveraging Pydantic models as output schemas, combined with LangChain in the backend, allows for a seamless integration of LLMs into your apps.
It utilizes OpenAI Functions or LlamaCpp grammars (json-schema mode) for efficient structured output.
Under the hood, it compiles the funcchain syntax into LangChain runnables, so you can easily invoke, stream, or batch process your pipelines.

## Simple Demo
```python
from funcchain import chain
from pydantic import BaseModel


class Recipe(BaseModel):
    ingredients: list[str]
    instructions: list[str]
    duration: int


def generate_recipe(topic: str) -> Recipe:
    """
    Generate a recipe for a given topic.
    """
    return chain()


recipe = generate_recipe("christmas dinner")

print(recipe.ingredients)
```
## Complex Structured Output
```python
from pydantic import BaseModel, Field
from funcchain import chain


class Item(BaseModel):
    name: str = Field(description="Name of the item")
    description: str = Field(description="Description of the item")
    keywords: list[str] = Field(description="Keywords for the item")


class ShoppingList(BaseModel):
    items: list[Item]
    store: str = Field(description="The store to buy the items from")


class TodoList(BaseModel):
    todos: list[Item]
    urgency: int = Field(description="The urgency of all tasks (1-10)")


def extract_list(user_input: str) -> TodoList | ShoppingList:
    """
    The user input is either a shopping List or a todo list.
    """
    return chain()


lst = extract_list(
    input("Enter your list: ")
)

match lst:
    case ShoppingList(items=items, store=store):
        print("Here is your Shopping List: ")
        for item in items:
            print(f"{item.name}: {item.description}")
        print(f"You need to go to: {store}")

    case TodoList(todos=todos, urgency=urgency):
        print("Here is your Todo List: ")
        for item in todos:
            print(f"{item.name}: {item.description}")
        print(f"Urgency: {urgency}")
```
## Vision Models
```python
from funcchain import Image, chain, settings
from pydantic import BaseModel, Field

settings.llm = "openai/gpt-4-vision-preview"


class AnalysisResult(BaseModel):
    """The result of an image analysis."""

    theme: str = Field(description="The theme of the image")
    description: str = Field(description="A description of the image")
    objects: list[str] = Field(description="A list of objects found in the image")


def analyse_image(image: Image) -> AnalysisResult:
    """
    Analyse the image and extract its
    theme, description and objects.
    """
    return chain()


result = analyse_image(Image.open("examples/assets/old_chinese_temple.jpg"))

print("Theme:", result.theme)
print("Description:", result.description)
for obj in result.objects:
    print("Found this object:", obj)
```
## Seamless local model support
```python
from pydantic import BaseModel, Field
from funcchain import chain, settings

settings.llm = "ollama/openchat"


class SentimentAnalysis(BaseModel):
    analysis: str
    sentiment: bool = Field(description="True for Happy, False for Sad")


def analyze(text: str) -> SentimentAnalysis:
    """
    Determines the sentiment of the text.
    """
    return chain()


poem = analyze("I really like when my dog does a trick!")

print(poem.analysis)
```
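Since the model is selected through `settings.llm`, switching back to a hosted provider is a one-line change. A minimal sketch, assuming an OpenAI API key is available in your environment (the model name and function below are illustrative, not taken from the examples above):

```python
from funcchain import chain, settings

# Swap the local Ollama model for a hosted OpenAI model; the
# "provider/model" string follows the same pattern as above.
settings.llm = "openai/gpt-4-turbo"


def summarize(text: str) -> str:
    """
    Summarize the text in one sentence.
    """
    return chain()


print(summarize("funcchain turns typed Python functions into LLM calls."))
```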
## Features
- pythonic
- easy swap between OpenAI and local models
- dynamic output types (pydantic models, or primitives)
- vision LLM support
- langchain_core as backend
- jinja templating for prompts
- reliable structured output
- auto retry parsing
- LangSmith support
- sync, async, streaming, parallel, fallbacks
- GGUF download from Hugging Face
- type hints for all functions and mypy support
- chat router component
- composable with LangChain LCEL
- easy error handling
- enum and literal support (see the sketch below)
- custom parsing types
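Return types are not limited to Pydantic models. As a minimal sketch of the primitive and literal support listed above (the function name and labels are made up for illustration, and a default model is assumed to be configured):

```python
from typing import Literal

from funcchain import chain


# Hypothetical example: the allowed labels are constrained through the
# Literal return type, the same way a Pydantic model would constrain fields.
def classify_sentiment(text: str) -> Literal["happy", "sad", "neutral"]:
    """
    Classify the sentiment of the text.
    """
    return chain()


print(classify_sentiment("I really like when my dog does a trick!"))
```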
## Documentation
Check out the docs here.
It is also highly recommended to try and run the examples in the `./examples` folder.
## Contribution
You want to contribute? Thanks, that's great!
For more information, check out the Contributing Guide.
Please run the dev setup to get started:
```bash
git clone https://github.com/shroominic/funcchain.git && cd funcchain

./dev_setup.sh
```