
BoTorch
BoTorch is a library for Bayesian Optimization built on PyTorch.
BoTorch is currently in beta and under active development!
The primary audience for hands-on use of BoTorch is researchers and sophisticated practitioners in Bayesian Optimization and AI. We recommend using BoTorch as a low-level API for implementing new algorithms for Ax. Ax has been designed to be an easy-to-use platform for end-users, while at the same time being flexible enough for Bayesian Optimization researchers to plug into for handling feature transformations, (meta-)data management, storage, etc. We recommend that end-users who are not actively doing research on Bayesian Optimization simply use Ax.
Installation
The latest release of BoTorch is easily installed via pip:
pip install botorch
Note: If you are installing into a Conda environment, make sure the pip being used is the one from that environment. On a Unix-based OS, you can check with which pip.
BoTorch stopped publishing an official Anaconda package to the pytorch channel after the 0.12 release. However, the package is still published to the conda-forge channel, and you can install it via
conda install botorch -c gpytorch -c conda-forge
If you would like to try our bleeding edge features (and don't mind potentially
running into the occasional bug here or there), you can install the latest
development version directly from GitHub. You may also want to install the
current gpytorch and linear_operator development versions:
pip install --upgrade git+https://github.com/cornellius-gp/linear_operator.git
pip install --upgrade git+https://github.com/cornellius-gp/gpytorch.git
pip install --upgrade git+https://github.com/meta-pytorch/botorch.git
If you want to contribute to BoTorch, you will want to install it in editable mode so that changes to the files are reflected in your local install. If you also want the current gpytorch and linear_operator development versions, install them as shown above before proceeding.
git clone https://github.com/meta-pytorch/botorch.git
cd botorch
pip install -e .
To also install the development and tutorial dependencies, use the dev and tutorials extras instead:
git clone https://github.com/meta-pytorch/botorch.git
cd botorch
pip install -e ".[dev, tutorials]"
dev: Specifies tools necessary for development (testing, linting, docs building; see Contributing below).
tutorials: Also installs all packages necessary for running the tutorial notebooks.
If you only need the development tools, pip install -e ".[dev]" is sufficient.
Here's a quick rundown of the main components of a Bayesian optimization loop. For more details see our Documentation and the Tutorials.
Fit a Gaussian Process (GP) model to the data:
import torch
from botorch.models import SingleTaskGP
from botorch.models.transforms import Normalize
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood
# Double precision is highly recommended for GPs.
# See https://github.com/meta-pytorch/botorch/discussions/1444
train_X = torch.rand(10, 2, dtype=torch.double) * 2
Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True) # explicit output dimension
Y += 0.1 * torch.rand_like(Y)
gp = SingleTaskGP(
train_X=train_X,
train_Y=Y,
input_transform=Normalize(d=2),
)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)
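As an aside (a minimal sketch, not part of the original example), the fitted model can be queried for posterior predictions at new inputs via its posterior method; test_X below is an arbitrary illustrative input:
# Illustrative only: query the fitted GP at hypothetical test points.
with torch.no_grad():
    test_X = torch.rand(5, 2, dtype=torch.double) * 2
    posterior = gp.posterior(test_X)
    mean, variance = posterior.mean, posterior.variance  # each of shape 5 x 1
Next, construct an acquisition function: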
from botorch.acquisition import LogExpectedImprovement
logEI = LogExpectedImprovement(model=gp, best_f=Y.max())
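As a side note (illustrative, not from the README): analytic acquisition functions such as LogExpectedImprovement are evaluated on candidate sets of shape batch_shape x q x d with q = 1, for example:
# Illustrative only: evaluate the acquisition at 3 candidate points (q=1 each).
X_cand = torch.rand(3, 1, 2, dtype=torch.double)
log_ei_values = logEI(X_cand)  # shape: 3
Then optimize the acquisition function to obtain the next candidate to evaluate: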
from botorch.optim import optimize_acqf
bounds = torch.stack([torch.zeros(2), torch.ones(2)]).to(torch.double)
candidate, acq_value = optimize_acqf(
logEI, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)
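The snippet above produces a single candidate. In practice you would evaluate it, append the observation to the training data, refit the model, and repeat. Below is a minimal sketch of that outer loop, assuming a toy objective consistent with the synthetic data above (not part of the original example):
# Illustrative sketch of an outer Bayesian optimization loop.
def objective(X):
    # Toy objective matching the synthetic data generated earlier.
    return 1 - (X - 0.5).norm(dim=-1, keepdim=True)

for _ in range(5):
    gp = SingleTaskGP(train_X=train_X, train_Y=Y, input_transform=Normalize(d=2))
    mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
    fit_gpytorch_mll(mll)
    logEI = LogExpectedImprovement(model=gp, best_f=Y.max())
    candidate, _ = optimize_acqf(
        logEI, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
    )
    train_X = torch.cat([train_X, candidate])  # candidate has shape 1 x 2
    Y = torch.cat([Y, objective(candidate)])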
If you use BoTorch, please cite the following paper:
@inproceedings{balandat2020botorch,
  title = {{BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization}},
  author = {Balandat, Maximilian and Karrer, Brian and Jiang, Daniel R. and Daulton, Samuel and Letham, Benjamin and Wilson, Andrew Gordon and Bakshy, Eytan},
  booktitle = {Advances in Neural Information Processing Systems 33},
  year = {2020},
  url = {http://arxiv.org/abs/1910.06403},
}
See here for an incomplete selection of peer-reviewed papers that build off of BoTorch.
See the CONTRIBUTING file for how to help out.
BoTorch is MIT licensed, as found in the LICENSE file.