# ollama-haystack
## Table of Contents

- [Installation](#installation)
- [License](#license)
- [Testing](#testing)
## Installation

```console
pip install ollama-haystack
```
## License

`ollama-haystack` is distributed under the terms of the Apache-2.0 license.
## Testing

To run the tests, first start a Docker container running Ollama and pull a model for the integration tests. It's recommended to use the smallest model possible for testing purposes; see https://ollama.ai/library for the models Ollama supports.

```console
docker run -d -p 11434:11434 --name ollama ollama/ollama:latest
docker exec ollama ollama pull <your model here>
```
Then run the tests:

```console
hatch run test
```

The default model used here is `orca-mini` for generation and `nomic-embed-text` for embeddings.
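For example, assuming the container name `ollama` from the setup above, the two default models could be pulled before running the test suite:

```console
# Pull the default generation and embedding models into the running container
docker exec ollama ollama pull orca-mini
docker exec ollama ollama pull nomic-embed-text
```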