
Open source MLOps infrastructure deployment on Public Cloud providers
Open source MLOps: Open source tools for different stages in an MLOps lifecycle.
Public Cloud Providers: Supporting all major cloud providers, including AWS, GCP, Azure, and Oracle Cloud
mlinfra is the Swiss Army knife for deploying MLOps tooling anywhere. It aims to make MLOps infrastructure deployment easy and accessible to all ML teams by decoupling the IaC logic for creating MLOps stacks, which is usually tied to other frameworks.
Contribute to the project by opening an issue, or join the roadmap and design discussions on Discord. The complete roadmap will be released soon!
🚀 Installation
Requirements
mlinfra requires the following to run:

- terraform >= 1.10.2 installed on the system.
mlinfra can be installed by creating a python virtual environment and installing the mlinfra pip package:

```shell
python -m venv .venv
source .venv/bin/activate
pip install mlinfra
```
Copy a deployment config from the examples folder, set your AWS account ID in the config file, configure your AWS credentials, and deploy the configuration:

```shell
mlinfra terraform apply --config-file <path-to-your-config>
```
For more information, read the mlinfra user guide
Deployment Config
mlinfra deploys infrastructure using a declarative approach. It requires resources to be defined in a yaml file with the following format:
```yaml
name: aws-mlops-stack
provider:
  name: aws
  account-id: xxxxxxxxx
  region: eu-central-1
deployment:
  type: cloud_vm
stack:
  data_versioning:
    - lakefs
  experiment_tracker:
    - mlflow
  orchestrator:
    - zenml
  model_inference:
    - bentoml
  monitoring:
    - nannyML
  alerting:
    - mlflow
```
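As a sketch of how such a config could be sanity-checked before handing it to the CLI, the snippet below validates the structure of the example above in pure Python. The key names mirror the yaml spec, but the validation helper itself is illustrative and not part of the mlinfra API:

```python
# Minimal sanity check for a deployment config dict (illustrative only,
# not part of mlinfra). Key names are taken from the yaml example above.
REQUIRED_TOP_LEVEL = {"name", "provider", "deployment", "stack"}

def validate_config(config: dict) -> list[str]:
    """Return a list of problems found; an empty list means the config looks sane."""
    problems = []
    missing = REQUIRED_TOP_LEVEL - config.keys()
    if missing:
        problems.append(f"missing top-level keys: {sorted(missing)}")
    provider = config.get("provider", {})
    if "name" not in provider:
        problems.append("provider.name is required (e.g. aws)")
    # Each stack layer should map to a non-empty list of tools.
    for layer, tools in config.get("stack", {}).items():
        if not isinstance(tools, list) or not tools:
            problems.append(f"stack layer '{layer}' should list at least one tool")
    return problems

config = {
    "name": "aws-mlops-stack",
    "provider": {"name": "aws", "account-id": "xxxxxxxxx", "region": "eu-central-1"},
    "deployment": {"type": "cloud_vm"},
    "stack": {"experiment_tracker": ["mlflow"], "orchestrator": ["zenml"]},
}
print(validate_config(config))  # []
```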
For examples, check out the documentation.
NOTE: This is a minimal spec for the AWS cloud with custom applications. Other stacks, such as feature_store, event streamers, loggers, or cost dashboards, can be added via community requests. For more information, please check out the docs.
Supported Providers
The core purpose is to build for all cloud and deployment platforms out there. Any user should be able to simply change the cloud provider or runtime environment (whether Linux or Windows) and deploy the same tools.
mlinfra will support the following providers:
Local machine (for development)
Cloud Providers (for production-ready deployments)
Supported deployment types
When deploying on managed cloud providers, users can deploy their infrastructure on top of either:
Supported MLOps Tools
mlinfra intends to support as many MLOps tools as possible, deployable standalone or in high-availability mode, across the different layers of an MLOps stack:
data_ingestion
data_versioning
data_processing
vector_database
experiment_tracker
orchestrator
model_inference
monitoring
alerting
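When building tooling around mlinfra configs, the layer names above can be modeled as a simple enumeration. The enum below is a hypothetical helper (not part of mlinfra itself); only the layer names are taken from the list above:

```python
from enum import Enum

class StackLayer(str, Enum):
    """Hypothetical helper enumerating the stack layer names listed above."""
    DATA_INGESTION = "data_ingestion"
    DATA_VERSIONING = "data_versioning"
    DATA_PROCESSING = "data_processing"
    VECTOR_DATABASE = "vector_database"
    EXPERIMENT_TRACKER = "experiment_tracker"
    ORCHESTRATOR = "orchestrator"
    MODEL_INFERENCE = "model_inference"
    MONITORING = "monitoring"
    ALERTING = "alerting"

# A config's stack keys can then be checked against the known layers:
known = {layer.value for layer in StackLayer}
unknown = {"experiment_tracker", "cost_dashboard"} - known
print(unknown)  # {'cost_dashboard'}
```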
Development
- This project relies on terraform for the IaC code and on python to glue it all together.
- To get started, install terraform and python.
- You can install the required python packages by running `uv sync`.
- You can run any of the available examples from the examples/ folder by running the following command in the root directory:

```shell
python src/mlinfra/cli/cli.py terraform <action> --config-file examples/<deployment-type>/<file>.yaml
```

where `<action>` corresponds to terraform actions such as `plan`, `apply`, and `destroy`.
For more information on the different components of the project and how they work together, please refer to the project's Engineering Wiki (https://mlinfra.io/user_guide/).
Contributions
- Contributions are welcome! Help us onboard all of the available MLOps tools on the currently supported cloud providers.
- For major changes, please open an issue first to discuss what you would like to change. A team member will get back to you soon.
- For information on the general development workflow, see the contribution guide.
License
The mlinfra library is distributed under the Apache-2.0 license.