Mirascope is a powerful, flexible, and user-friendly library that simplifies the process of working with LLMs through a unified interface that works across various supported providers, including OpenAI, Anthropic, Mistral, Gemini, Groq, Cohere, LiteLLM, Azure AI, Vertex AI, and Bedrock.
Whether you're generating text, extracting structured information, or developing complex AI-driven agent systems, Mirascope provides the tools you need to streamline your development process and create powerful, robust applications.
30 Second Quickstart
Install Mirascope, specifying the provider(s) you intend to use, and set your API key:
pip install "mirascope[openai]"
export OPENAI_API_KEY=XXXXX
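If you plan to use more than one provider, you can install multiple extras in a single command (shown here for OpenAI and Anthropic as an example):

pip install "mirascope[openai,anthropic]"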
Make your first call to an LLM to extract the title and author of a book from unstructured text:
from mirascope.core import openai
from pydantic import BaseModel


class Book(BaseModel):
    title: str
    author: str


@openai.call("gpt-4o-mini", response_model=Book)
def extract_book(text: str) -> str:
    return f"Extract {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
assert isinstance(book, Book)
print(book)
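Because the interface is unified across providers, targeting a different provider is largely a matter of swapping the decorator. Below is a minimal sketch of the same extraction using Anthropic; it assumes you've installed "mirascope[anthropic]", set ANTHROPIC_API_KEY, and have access to the named Claude model:

from mirascope.core import anthropic
from pydantic import BaseModel


class Book(BaseModel):
    title: str
    author: str


# Same prompt function and response model; only the provider decorator changes.
@anthropic.call("claude-3-5-sonnet-20240620", response_model=Book)
def extract_book(text: str) -> str:
    return f"Extract {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
assert isinstance(book, Book)
print(book)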
Tutorials
Check out our quickstart tutorial and many other tutorials for an interactive way to get started with Mirascope.
Usage
For a complete guide on how to use all of the various features Mirascope has to offer, read through our Learn documentation.
Versioning
Mirascope uses Semantic Versioning.
License
This project is licensed under the terms of the MIT License.