intellibricks

IntelliBricks provides a streamlined set of tools for developing AI-powered applications. It simplifies complex tasks such as interacting with LLMs, training machine learning models, and implementing Retrieval Augmented Generation (RAG). Focus on building your application logic, not wrestling with boilerplate. IntelliBricks empowers you to build intelligent applications faster and more efficiently.

  • Version 0.8.3
  • Source
  • PyPI
  • Socket score

Maintainers: 1


Start Building Intelligence

The Python-first Framework for Agentic & LLM-Powered Applications


Stop wrestling with AI boilerplate. Start building intelligence.

IntelliBricks is the Python-first toolkit for crafting AI applications with ease. Focus on your intelligent logic, not framework complexity.

Quick Overview

Imagine this:

  • Pythonic AI: Write clean, intuitive Python – IntelliBricks handles the AI plumbing.
  • Structured Outputs, Instantly: msgspec.Struct classes define your data, and IntelliBricks returns LLM responses parsed into them.
  • Agents that Understand: Build autonomous agents with clear tasks, instructions, and your knowledge.
  • APIs in Minutes: Deploy agents as REST APIs with FastAPI or Litestar, effortlessly.
  • Context-Aware by Default: Seamless RAG integration for informed, intelligent agents.

IntelliBricks solves AI development pain points:

  • Complexity? Gone. Streamlined, Python-first approach.
  • Framework Chaos? Controlled. Predictable, structured outputs with Python types.
  • Boilerplate? Banished. Focus on intelligence, predictability, and observability. No more time spent wiring up the framework.

Start in Seconds:

pip install intellibricks

Core Modules: Your AI Building Blocks

IntelliBricks is built around a few core modules, designed for power and seamless integration:

🧠 LLMs Module: Integrate easily with AI providers

Interact with Language Models in pure Python.

Key Features:

  • Synapses: Connect to Google Gemini, OpenAI, Groq, and more with one line of code.

    from intellibricks.llms import Synapse
    
    synapse = Synapse.of("google/genai/gemini-pro-experimental")
    completion = synapse.complete("Write a poem about Python.") # ChatCompletion[RawResponse]
    print(completion.text)
    
  • Structured Outputs: Define data models with Python classes using msgspec.Struct.

    import msgspec
    from typing import Annotated, Sequence
    from intellibricks.llms import Synapse
    
    class Summary(msgspec.Struct, frozen=True):
        title: Annotated[str, msgspec.Meta(title="Title", description="Summary Title")]
        key_points: Annotated[Sequence[str], msgspec.Meta(title="Key Points")]
    
    synapse = Synapse.of("google/genai/gemini-pro-experimental")
    prompt = "Summarize quantum computing article: [...]"
    completion = synapse.complete(prompt, response_model=Summary) # ChatCompletion[Summary]
    
    print(completion.parsed.title)
    print(completion.parsed.key_points)
    
  • Chain of Thought: Structured reasoning with ChainOfThought for observability.

    from intellibricks.llms import Synapse, ChainOfThought
    import msgspec
    
    class Response(msgspec.Struct):
        response: str
        """just to show you can combine ChainOfThought and other structured classes too"""
    
    synapse = Synapse.of("google/genai/gemini-pro-experimental")
    cot_response = synapse.complete(
        "Solve riddle: Cities, no houses...",
        response_model=ChainOfThought[Response] # You can use ChainOfThought[str] too!
    ) 
    
    for step in cot_response.parsed.steps:
        print(f"Step {step.step_number}: {step.explanation}")
    
    print(cot_response.parsed.final_answer) # Response
    
  • Langfuse Observability: Built-in integration for tracing and debugging.

    from intellibricks.llms import Synapse
    from langfuse import Langfuse
    
    synapse = Synapse.of("google/genai/gemini-pro-experimental", langfuse=Langfuse())  # Langfuse reads credentials from env vars
    

🤖 Agents Module: Build Autonomous Intelligence

Craft agents to perform complex tasks.

Key Features:

  • Agent Class: Define tasks, instructions, and connect to Synapses.

    from intellibricks.agents import Agent
    from intellibricks.llms import Synapse
    
    synapse = Synapse.of("google/genai/gemini-pro-experimental")
    agent = Agent(
        task="Creative Title Generation",
        instructions=["Intriguing fantasy story titles."],
        metadata={"name": "TitleGen", "description": "Title Agent"},
        synapse=synapse,
    )
    
    agent_response = agent.run("Knight discovers dragon egg.") # AgentResponse[RawResponse]
    print(f"Agent suggests: {agent_response.text}")
    
  • Tool Calling: Equip agents with tools for real-world interaction (a hedged sketch below).

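    The exact tool API isn't shown on this page, so here is a minimal sketch, assuming Agent accepts plain Python callables through a tools parameter (the parameter name and calling convention are assumptions; check the IntelliBricks docs for the real API):

    from intellibricks.agents import Agent
    from intellibricks.llms import Synapse

    def get_weather(city: str) -> str:
        """Stub tool: return a canned weather report for a city."""
        return f"Sunny and 22°C in {city}"

    synapse = Synapse.of("google/genai/gemini-pro-experimental")
    agent = Agent(
        task="Travel Assistant",
        instructions=["Answer travel questions; call tools when helpful."],
        metadata={"name": "TravelBot", "description": "Weather-aware agent"},
        synapse=synapse,
        tools=[get_weather],  # assumed parameter name
    )

    agent_response = agent.run("What's the weather in Lisbon?")
    print(agent_response.text)
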
  • Instant APIs: Turn agents into REST APIs with FastAPI/Litestar.

    from intellibricks.agents import Agent
    from intellibricks.llms import Synapse
    import uvicorn
    
    synapse = Synapse.of("google/genai/gemini-pro-experimental")
    agent = Agent(
        task="Creative Title Generation",
        instructions=["Intriguing fantasy story titles."],
        metadata={"name": "TitleGen", "description": "Title Agent"},
        synapse=synapse,
    )
    app = agent.fastapi_app # WIP, any bugs open an issue please!
    uvicorn.run(app, host="0.0.0.0", port=8000)
    
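    Once the server is running, the agent can be called over HTTP. A minimal sketch using only the standard library (the route and payload shape are assumptions; since the app is FastAPI-based, the generated docs at /docs should list the real endpoints):

    import json
    import urllib.request

    payload = json.dumps({"input": "Knight discovers dragon egg."}).encode()
    req = urllib.request.Request(
        "http://localhost:8000/agents/run",  # hypothetical route
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode())
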

🏆 Why IntelliBricks? Python Purity & Power.

IntelliBricks is different. It's Python First.

  • 🐍 Idiomatic Python: Clean, modern Python – no framework jargon.
  • Simplicity & Clarity: Intuitive API, less boilerplate.
  • 🧱 Structured Outputs, Core Strength: Define Python classes, get structured data.
  • 🧠 Focus on Intelligence: Build smart apps, not infrastructure headaches.

Structured Outputs: IntelliBricks vs. LangChain & LlamaIndex

Getting structured data from LLMs is critical. Here's how IntelliBricks compares to other frameworks:

IntelliBricks:

import msgspec
from intellibricks.llms import Synapse

class Summary(msgspec.Struct, frozen=True):
    title: str
    key_points: list[str]

synapse = Synapse.of("google/genai/gemini-pro-experimental")
completion = synapse.complete(
    "Summarize article: [...]",
    response_model=Summary
) # ChatCompletion[Summary]

print(completion.parsed) # Summary object

LangChain:

from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field
from typing import Optional

class Joke(BaseModel):
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: Optional[int] = Field(default=None, description="Rating 1-10")

llm = ChatOpenAI(model="gpt-4o-mini")
structured_llm = llm.with_structured_output(Joke)
joke = structured_llm.invoke(
    "Tell me a joke about cats"
) # Dict[Unknown, Unknown] | BaseModel

print(joke) # Joke object directly

LangChain uses .with_structured_output() and Pydantic classes. While functional, it relies on Pydantic for validation and returns the Pydantic object directly from .invoke(), losing direct access to completion metadata (usage, timing, etc.).

LlamaIndex:

from llama_index.llms.openai import OpenAI
from pydantic import BaseModel, Field
from datetime import datetime

class Invoice(BaseModel):
    invoice_id: str = Field(...)
    date: datetime = Field(...)
    line_items: list = Field(...)

llm = OpenAI(model="gpt-4o")
sllm = llm.as_structured_llm(output_cls=Invoice)
response = sllm.complete("...") # CompletionResponse

Here is what LlamaIndex returns:

class CompletionResponse(BaseModel):
    """
    Completion response.

    Fields:
        text: Text content of the response if not streaming, or if streaming,
            the current extent of streamed text.
        additional_kwargs: Additional information on the response(i.e. token
            counts, function calling information).
        raw: Optional raw JSON that was parsed to populate text, if relevant.
        delta: New text that just streamed in (only relevant when streaming).
    """

    text: str
    additional_kwargs: dict = Field(default_factory=dict)
    raw: Optional[Any] = None # Could be anything and could be None too. Nice!
    logprobs: Optional[List[List[LogProb]]] = None
    delta: Optional[str] = None

IntelliBricks Advantage:

  • Python-First Purity: Clean, idiomatic Python.
  • Simpler Syntax: More direct and intuitive structured output definition.
  • Blazing Fast: Leverages msgspec for high-performance serialization; msgspec benchmarks well ahead of Pydantic for encoding and decoding.
  • Comprehensive Responses: synapse.complete() returns ChatCompletion[RawResponse | T] objects, providing not just the parsed data but also full completion details (usage, timing, etc.), as sketched below.

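A concrete sketch of that last point (the metadata attribute names used here, usage and elapsed_time, are assumptions; check the ChatCompletion reference for the exact fields):

import msgspec
from intellibricks.llms import Synapse

class Summary(msgspec.Struct, frozen=True):
    title: str
    key_points: list[str]

synapse = Synapse.of("google/genai/gemini-pro-experimental")
completion = synapse.complete("Summarize article: [...]", response_model=Summary)

print(completion.parsed)        # parsed Summary instance
print(completion.usage)         # token usage (assumed attribute name)
print(completion.elapsed_time)  # timing details (assumed attribute name)
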
Examples adapted from LangChain docs and LlamaIndex docs. IntelliBricks offers a more streamlined and efficient Python-centric approach.


🚀 Join the IntelliBricks Revolution!

Build intelligent applications, the Python way.

  • Get Started: pip install intellibricks
  • Explore: Dive into the documentation.
  • Contribute: It's community-driven!
  • Connect: Share feedback and ideas!

Let's build the future of intelligent applications, together!
