IntelliBricks provides a streamlined set of tools for developing AI-powered applications. It simplifies complex tasks such as interacting with LLMs, training machine learning models, and implementing Retrieval Augmented Generation (RAG). Focus on building your application logic, not wrestling with boilerplate. IntelliBricks empowers you to build intelligent applications faster and more efficiently.
The Python-first Framework for Agentic & LLM-Powered Applications
Stop wrestling with AI boilerplate. Start building intelligence.
IntelliBricks is the Python-first toolkit for crafting AI applications with ease. Focus on your intelligent logic, not framework complexity.
IntelliBricks solves common AI development pain points.
Start in Seconds:
pip install intellibricks
IntelliBricks is built around three core modules, designed for power and seamless integration:
Interact with Language Models in pure Python.
Key Features:
Synapses: Connect to Google Gemini, OpenAI, Groq, and more with one line of code.
from intellibricks.llms import Synapse
synapse = Synapse.of("google/genai/gemini-pro-experimental")
completion = synapse.complete("Write a poem about Python.") # ChatCompletion[RawResponse]
print(completion.text)
Structured Outputs: Define data models with Python classes using msgspec.Struct.
import msgspec
from typing import Annotated, Sequence
from intellibricks.llms import Synapse
class Summary(msgspec.Struct, frozen=True):
    title: Annotated[str, msgspec.Meta(title="Title", description="Summary Title")]
    key_points: Annotated[Sequence[str], msgspec.Meta(title="Key Points")]
synapse = Synapse.of("google/genai/gemini-pro-experimental")
prompt = "Summarize quantum computing article: [...]"
completion = synapse.complete(prompt, response_model=Summary) # ChatCompletion[Summary]
print(completion.parsed.title)
print(completion.parsed.key_points)
Chain of Thought: Structured reasoning with ChainOfThought for observability.
from intellibricks.llms import Synapse, ChainOfThought
import msgspec
class Response(msgspec.Struct):
    """Just to show you can combine ChainOfThought with other structured classes too."""
    response: str

synapse = Synapse.of("google/genai/gemini-pro-experimental")
cot_response = synapse.complete(
    "Solve riddle: Cities, no houses...",
    response_model=ChainOfThought[Response],  # You can use ChainOfThought[str] too!
)
for step in cot_response.parsed.steps:
    print(f"Step {step.step_number}: {step.explanation}")
print(cot_response.parsed.final_answer)  # Response
Langfuse Observability: Built-in integration for tracing and debugging.
from intellibricks.llms import Synapse
from langfuse import Langfuse
synapse = Synapse.of(..., langfuse=Langfuse())
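Once a Langfuse client is attached, completions made through that Synapse are traced for debugging. A minimal sketch, assuming the usual LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, and LANGFUSE_HOST environment variables are set so Langfuse() can authenticate:
from intellibricks.llms import Synapse
from langfuse import Langfuse

# Langfuse() reads its credentials from the LANGFUSE_* environment variables.
synapse = Synapse.of("google/genai/gemini-pro-experimental", langfuse=Langfuse())
completion = synapse.complete("Explain tracing in one sentence.")
print(completion.text)  # the call above should appear as a trace in your Langfuse project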
Craft agents to perform complex tasks.
Key Features:
Agent Class: Define tasks, instructions, and connect to Synapses.
from intellibricks.agents import Agent
from intellibricks.llms import Synapse
synapse = Synapse.of("google/genai/gemini-pro-experimental")
agent = Agent(
    task="Creative Title Generation",
    instructions=["Intriguing fantasy story titles."],
    metadata={"name": "TitleGen", "description": "Title Agent"},
    synapse=synapse,
)
agent_response = agent.run("Knight discovers dragon egg.") # AgentResponse[RawResponse]
print(f"Agent suggests: {agent_response.text}")
Tool Calling: Equip agents with tools for real-world interaction.
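The exact tool-calling API is not spelled out in this README, so the sketch below is illustrative only: it assumes Agent accepts a tools argument of plain Python callables, and the get_weather helper is hypothetical.
from intellibricks.agents import Agent
from intellibricks.llms import Synapse

def get_weather(city: str) -> str:
    """Hypothetical tool: return a short weather report for a city."""
    return f"It is sunny in {city}."

agent = Agent(
    task="Travel Assistance",
    instructions=["Answer travel questions, calling tools when needed."],
    metadata={"name": "TravelBot", "description": "Weather-aware travel agent"},
    synapse=Synapse.of("google/genai/gemini-pro-experimental"),
    tools=[get_weather],  # assumed parameter name; check the IntelliBricks docs
)
agent_response = agent.run("Should I pack an umbrella for Lisbon?")
print(agent_response.text)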
Instant APIs: Turn agents into REST APIs with FastAPI/Litestar.
from intellibricks.agents import Agent
from intellibricks.llms import Synapse
import uvicorn
agent = Agent(..., synapse=Synapse.of(...))
app = agent.fastapi_app  # WIP; please open an issue if you find any bugs!
uvicorn.run(app, host="0.0.0.0", port=8000)
IntelliBricks is different. It's Python First.
Getting structured data from LLMs is critical. Here's how IntelliBricks compares to other frameworks:
IntelliBricks:
import msgspec
from intellibricks.llms import Synapse
class Summary(msgspec.Struct, frozen=True):
    title: str
    key_points: list[str]
synapse = Synapse.of("google/genai/gemini-pro-experimental")
completion = synapse.complete(
    "Summarize article: [...]",
    response_model=Summary,
)  # ChatCompletion[Summary]
print(completion.parsed) # Summary object
LangChain:
from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field
from typing import Optional
class Joke(BaseModel):
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: Optional[int] = Field(default=None, description="Rating 1-10")
llm = ChatOpenAI(model="gpt-4o-mini")
structured_llm = llm.with_structured_output(Joke)
joke = structured_llm.invoke(
    "Tell me a joke about cats"
)  # Dict[Unknown, Unknown] | BaseModel
print(joke) # Joke object directly
LangChain uses .with_structured_output() and Pydantic classes. While functional, it relies on Pydantic for validation and returns the Pydantic object directly via .invoke(), losing direct access to completion metadata (usage, time, etc.).
LlamaIndex:
from llama_index.llms.openai import OpenAI
from pydantic import BaseModel, Field
from datetime import datetime
import json
class Invoice(BaseModel):
    invoice_id: str = Field(...)
    date: datetime = Field(...)
    line_items: list = Field(...)

llm = OpenAI(model="gpt-4o")
sllm = llm.as_structured_llm(output_cls=Invoice)
response = sllm.complete("...")  # CompletionResponse
Here is what LlamaIndex returns:
class CompletionResponse(BaseModel):
    """
    Completion response.

    Fields:
        text: Text content of the response if not streaming, or if streaming,
            the current extent of streamed text.
        additional_kwargs: Additional information on the response (i.e. token
            counts, function calling information).
        raw: Optional raw JSON that was parsed to populate text, if relevant.
        delta: New text that just streamed in (only relevant when streaming).
    """

    text: str
    additional_kwargs: dict = Field(default_factory=dict)
    raw: Optional[Any] = None  # Could be anything and could be None too. Nice!
    logprobs: Optional[List[List[LogProb]]] = None
    delta: Optional[str] = None
IntelliBricks Advantage:
IntelliBricks uses msgspec for high-performance serialization, outperforming Pydantic. synapse.complete() returns ChatCompletion[RawResponse | T] objects, providing not just the parsed data but also full completion details (usage, timing, etc.).
Examples adapted from the LangChain docs and the LlamaIndex docs. IntelliBricks offers a more streamlined and efficient Python-centric approach.
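As a rough illustration of that last point: completion.text and completion.parsed come straight from the examples above, while the usage attribute below is an assumption about where token metadata lives; check the ChatCompletion definition for the exact field names.
import msgspec
from intellibricks.llms import Synapse

class Summary(msgspec.Struct, frozen=True):
    title: str
    key_points: list[str]

synapse = Synapse.of("google/genai/gemini-pro-experimental")
completion = synapse.complete("Summarize article: [...]", response_model=Summary)
print(completion.parsed.title)  # typed access to the structured output
print(completion.text)          # raw completion text is still available
print(completion.usage)         # assumed attribute for token usage; verify against ChatCompletion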
Build intelligent applications, the Python way.
pip install intellibricks
Let's build the future of intelligent applications, together!