
πŸ¦› Chonkie ✨


The no-nonsense ultra-light and lightning-fast chunking library that's ready to CHONK your texts!

Installation β€’ Usage β€’ Pipeline β€’ Chunkers β€’ Integrations β€’ Benchmarks

Tired of making your gazillionth chunker? Sick of the overhead of large libraries? Want to chunk your texts quickly and efficiently? Chonkie the mighty hippo is here to help!

πŸš€ Feature-rich: All the CHONKs you'd ever need
✨ Easy to use: Install, Import, CHONK
⚑ Fast: CHONK at the speed of light! zooooom
πŸͺΆ Light-weight: No bloat, just CHONK
🌏 Wide support: CHONKie integrates with your favorite tokenizer, embedding model and APIs!
πŸ’¬ Multilingual: Out-of-the-box support for 56 languages
☁️ Cloud-Ready: CHONK locally or in the Chonkie Cloud
πŸ¦› Cute CHONK mascot: psst it's a pygmy hippo btw
❀️ Moto Moto's favorite Python library

Chonkie is a chunking library that "just works" ✨

Installation

To install chonkie, run:

pip install chonkie

Chonkie follows the rule of minimum installs. Have a favorite chunker? Read our docs to install only what you need. Don't want to think about it? Simply install all (not recommended for production environments):

pip install chonkie[all]

Basic Usage

Here's a basic example to get you started:

# First import the chunker you want from Chonkie
from chonkie import RecursiveChunker

# Initialize the chunker
chunker = RecursiveChunker()

# Chunk some text
chunks = chunker("Chonkie is the goodest boi! My favorite chunking hippo hehe.")

# Access chunks
for chunk in chunks:
    print(f"Chunk: {chunk.text}")
    print(f"Tokens: {chunk.token_count}")

Check out more usage examples in the docs!

The Chonkie Pipeline

Chonkie processes text using a pipeline approach to transform raw documents into refined, usable chunks. This allows for flexibility and efficiency in handling different chunking strategies. We call this pipeline CHOMP (short for 'CHOnkie's Multi-step Pipeline').

Here's a conceptual overview of the pipeline, as illustrated in the diagram:

πŸ€– CHOMP pipeline diagram

The main stages are:

  • πŸ“„ Document: The starting point – your input text data. It can be in any format!
  • πŸ‘¨β€πŸ³ Chef: This stage handles initial text preprocessing. It might involve cleaning, normalization, or other preparatory steps to get the text ready for chunking. While this is optional, it is recommended to use the Chef stage to clean your text before chunking.
  • πŸ¦› Chunker: The core component you select (e.g., RecursiveChunker, SentenceChunker). It applies its specific logic to split the preprocessed text into initial chunks based on the chosen strategy and parameters.
  • 🏭 Refinery: After initial chunking, the Refinery performs post-processing. This can include merging small chunks based on overlap, adding embeddings, or adding additional context to the chunks. It helps ensure the quality and consistency of the output. You can have multiple Refineries to apply different post-processing steps.
  • πŸ€— Friends: The pipeline produces the final results, which can be exported for storage or ingested into your vector database. Chonkie offers Porters to export the chunks and Handshakes to ingest them into your vector database.
    • 🐴 Porters: Porters can save the chunks to a file or a database. Currently, only JSON is supported for exporting the chunks.
    • 🀝 Handshakes: Handshakes provide a unified interface for ingesting the chunks into your preferred vector databases.

This modular pipeline allows Chonkie to be both powerful and easy to configure for various text chunking needs.
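The CHOMP stages above can be sketched as plain function composition. This is a conceptual sketch only, not Chonkie's actual API: `chef_clean`, `chunk_fixed`, and `refine_merge` are hypothetical stand-ins for the Chef, Chunker, and Refinery stages.

```python
# Conceptual sketch of the CHOMP flow (Document -> Chef -> Chunker -> Refinery).
# These helpers are hypothetical illustrations, NOT Chonkie's real API.

def chef_clean(text: str) -> str:
    """Chef stage: normalize whitespace before chunking."""
    return " ".join(text.split())

def chunk_fixed(text: str, size: int = 20) -> list[str]:
    """Chunker stage: naive fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def refine_merge(chunks: list[str], min_len: int = 10) -> list[str]:
    """Refinery stage: merge chunks shorter than min_len into the previous one."""
    merged: list[str] = []
    for chunk in chunks:
        if merged and len(chunk) < min_len:
            merged[-1] += chunk
        else:
            merged.append(chunk)
    return merged

document = "Chonkie   processes text  in stages: clean, chunk, refine."
chunks = refine_merge(chunk_fixed(chef_clean(document)))
print(chunks)
```

In the real pipeline each stage is a configurable component (and a Porter or Handshake handles the final chunks), but the data flow is the same: text in, refined chunks out.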

Chunkers

Chonkie provides several chunkers to help you split your text efficiently for RAG applications. Here's a quick overview of the available chunkers:

| Name | Alias | Description |
|------|-------|-------------|
| TokenChunker | token | Splits text into fixed-size token chunks. |
| SentenceChunker | sentence | Splits text into chunks based on sentences. |
| RecursiveChunker | recursive | Splits text hierarchically using customizable rules to create semantically meaningful chunks. |
| SemanticChunker | semantic | Splits text into chunks based on semantic similarity. Inspired by the work of Greg Kamradt. |
| SDPMChunker | sdpm | Splits text using a Semantic Double-Pass Merge approach. |
| LateChunker | late | Embeds text and then splits it to have better chunk embeddings. |
| CodeChunker | code | Splits code into structurally meaningful chunks. |
| NeuralChunker | neural | Splits text using a neural model. |
| SlumberChunker | slumber | Splits text using an LLM to find semantically meaningful chunks. Also known as "AgenticChunker". |

More on these methods and the approaches taken inside the docs
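As a rough illustration of the simplest strategy in the table (fixed-size chunking with overlap, Γ  la TokenChunker), here is a toy loop over whitespace "tokens". This sketches the idea only and is not Chonkie's implementation: the real chunkers count actual tokenizer tokens and return chunk objects with metadata such as token counts.

```python
# Toy illustration of fixed-size chunking with overlap, using whitespace
# "tokens". This is NOT Chonkie's implementation, just the core idea.

def toy_token_chunk(text: str, chunk_size: int = 4, overlap: int = 1) -> list[str]:
    tokens = text.split()
    step = chunk_size - overlap  # how far the window advances each chunk
    return [
        " ".join(tokens[i:i + chunk_size])
        for i in range(0, len(tokens), step)
    ]

chunks = toy_token_chunk("one two three four five six seven eight",
                         chunk_size=4, overlap=1)
for c in chunks:
    print(c)
```

Note how each chunk repeats the last token of the previous one; that overlap preserves context across chunk boundaries, which is why it is a common default in RAG pipelines.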

Integrations

Chonkie boasts 19+ integrations across tokenizers, embedding providers, LLMs, porters, and vector databases, ensuring it fits seamlessly into your existing workflow.

πŸͺ“ Slice 'n' Dice! Chonkie supports 5+ ways to tokenize!

Choose from supported tokenizers or provide your own custom token counting function. Flexibility first!

| Name | Description | Optional Install |
|------|-------------|------------------|
| character | Basic character-level tokenizer. Default tokenizer. | default |
| word | Basic word-level tokenizer. | default |
| tokenizers | Load any tokenizer from the Hugging Face tokenizers library. | default |
| tiktoken | Use OpenAI's tiktoken library (e.g., for gpt-4). | chonkie[tiktoken] |
| transformers | Load tokenizers via AutoTokenizer from HF transformers. | chonkie[transformers] |

default indicates that the feature is available with the default pip install chonkie.

To use a custom token counter, you can pass in any function that takes a string and returns an integer! Something like this:

def custom_token_counter(text: str) -> int:
    return len(text)

chunker = RecursiveChunker(tokenizer_or_token_counter=custom_token_counter)

You can use this to extend Chonkie to support any tokenization scheme you want!

🧠 Embed like a boss! Chonkie links up with 7+ embedding pals!

Seamlessly works with various embedding model providers. Bring your favorite embeddings to the CHONK party! Use AutoEmbeddings to load models easily.

| Provider / Alias | Class | Description | Optional Install |
|------------------|-------|-------------|------------------|
| model2vec | Model2VecEmbeddings | Use Model2Vec models. | chonkie[model2vec] |
| sentence-transformers | SentenceTransformerEmbeddings | Use any sentence-transformers model. | chonkie[st] |
| openai | OpenAIEmbeddings | Use OpenAI's embedding API. | chonkie[openai] |
| cohere | CohereEmbeddings | Use Cohere's embedding API. | chonkie[cohere] |
| gemini | GeminiEmbeddings | Use Google's Gemini embedding API. | chonkie[gemini] |
| jina | JinaEmbeddings | Use Jina AI's embedding API. | chonkie[jina] |
| voyageai | VoyageAIEmbeddings | Use Voyage AI's embedding API. | chonkie[voyageai] |
πŸ§žβ€β™‚οΈ Power Up with Genies! Chonkie supports 2+ LLM providers!

Genies provide interfaces to interact with Large Language Models (LLMs) for advanced chunking strategies or other tasks within the pipeline.

| Genie Name | Class | Description | Optional Install |
|------------|-------|-------------|------------------|
| gemini | GeminiGenie | Interact with Google Gemini APIs. | chonkie[gemini] |
| openai | OpenAIGenie | Interact with OpenAI APIs. | chonkie[openai] |

You can also use the OpenAIGenie to interact with any LLM provider that supports the OpenAI API format, by simply changing the model, base_url, and api_key parameters. For example, here's how to use the OpenAIGenie to interact with the Llama-4-Maverick model via OpenRouter:

from chonkie import OpenAIGenie

genie = OpenAIGenie(model="meta-llama/llama-4-maverick",
                    base_url="https://openrouter.ai/api/v1",
                    api_key="your_api_key")
🐴 Exporting CHONKs! Chonkie supports 1+ Porter!

Porters help you save your chunks easily.

| Porter Name | Class | Description | Optional Install |
|-------------|-------|-------------|------------------|
| json | JSONPorter | Export chunks to a JSON file. | default |
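Conceptually, a Porter serializes chunk data into an external format. Below is a minimal stdlib-only sketch of JSON export; JSONPorter's actual interface is documented in Chonkie's docs, and the dict shape used here is an assumption for illustration, not JSONPorter's schema.

```python
import json

# Minimal sketch of what a JSON porter does: serialize chunk records to JSON
# and read them back. Field names here are illustrative assumptions.
chunks = [
    {"text": "Chonkie is the goodest boi!", "token_count": 7},
    {"text": "My favorite chunking hippo hehe.", "token_count": 8},
]

payload = json.dumps(chunks, indent=2)   # what would be written to a .json file
restored = json.loads(payload)           # what a downstream consumer reads back
print(restored[0]["text"])
```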
🀝 Shake hands with your DB! Chonkie connects with 4+ vector stores!

Handshakes provide a unified interface to ingest chunks directly into your favorite vector databases.

| Handshake Name | Class | Description | Optional Install |
|----------------|-------|-------------|------------------|
| chroma | ChromaHandshake | Ingest chunks into ChromaDB. | chonkie[chroma] |
| qdrant | QdrantHandshake | Ingest chunks into Qdrant. | chonkie[qdrant] |
| pgvector | PgvectorHandshake | Ingest chunks into PostgreSQL with pgvector. | chonkie[pgvector] |
| turbopuffer | TurbopufferHandshake | Ingest chunks into Turbopuffer. | chonkie[turbopuffer] |
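The "unified interface" idea behind Handshakes can be sketched as a small shared protocol: every backend exposes the same ingestion method, so swapping vector stores does not change the calling code. The class and method names below are hypothetical stand-ins, not Chonkie's actual API.

```python
# Sketch of a unified ingestion interface, in the spirit of Handshakes.
# Class and method names are hypothetical, NOT Chonkie's actual API.

class InMemoryHandshake:
    """Toy 'vector store' backend that just collects chunks in a list."""

    def __init__(self) -> None:
        self.store: list[str] = []

    def write(self, chunks: list[str]) -> int:
        self.store.extend(chunks)
        return len(chunks)

def ingest(handshake, chunks: list[str]) -> int:
    # Calling code depends only on the shared write() interface,
    # so any backend with the same method works here.
    return handshake.write(chunks)

hs = InMemoryHandshake()
count = ingest(hs, ["chunk one", "chunk two"])
print(count)
```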

With Chonkie's wide range of integrations, you can easily plug it into your existing infrastructure and start CHONKING!

Benchmarks

"I may be smol hippo, but I pack a big punch!" πŸ¦›

Chonkie is not just cute, it's also fast and efficient! Here's how it stacks up against the competition:

Size πŸ“¦

  • Default Install: 15MB (vs 80-171MB for alternatives)
  • With Semantic: Still 10x lighter than the closest competition!

Speed ⚑

  • Token Chunking: 33x faster than the slowest alternative
  • Sentence Chunking: Almost 2x faster than competitors
  • Semantic Chunking: Up to 2.5x faster than others

Check out our detailed benchmarks to see how Chonkie races past the competition! πŸƒβ€β™‚οΈπŸ’¨

Contributing

Want to help grow Chonkie? Check out CONTRIBUTING.md to get started! Whether you're fixing bugs, adding features, or improving docs, every contribution helps make Chonkie a better CHONK for everyone.

Remember: No contribution is too small for this tiny hippo! πŸ¦›

Acknowledgements

Chonkie would like to CHONK its way through a special thanks to all the users and contributors who have helped make this library what it is today! Your feedback, issue reports, and improvements have helped make Chonkie the CHONKIEST it can be.

And of course, special thanks to Moto Moto for endorsing Chonkie with his famous quote:

"I like them big, I like them chonkie." ~ Moto Moto

Citation

If you use Chonkie in your research, please cite it as follows:

@software{chonkie2025,
  author = {Minhas, Bhavnick and Nigam, Shreyash},
  title = {Chonkie: A no-nonsense, fast, lightweight, and efficient text chunking library},
  year = {2025},
  publisher = {GitHub},
  howpublished = {\url{https://github.com/chonkie-inc/chonkie}},
}
