Brax is a fast and fully differentiable physics engine used for research and
development of robotics, human perception, materials science, reinforcement
learning, and other simulation-heavy applications.
Brax is written in JAX and is designed for use on acceleration hardware. It is
both efficient for single-device simulation and scalable to massively parallel
simulation on multiple devices, without the need for pesky datacenters.
Brax simulates environments at millions of physics steps per second on TPU, and
includes a suite of learning algorithms that train agents in seconds to minutes.
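For example, because environments are pure JAX functions, many copies can be
stepped in parallel with `jax.vmap`. A minimal sketch, where the environment
name, backend, and batch size are illustrative choices:

```python
import jax
import jax.numpy as jnp
from brax import envs

# Load a bundled environment; 'ant' and the positional backend are
# illustrative choices.
env = envs.get_environment('ant', backend='positional')

# vmap turns the single-env reset/step into batched versions; jit compiles
# them for the accelerator.
reset = jax.jit(jax.vmap(env.reset))
step = jax.jit(jax.vmap(env.step))

rngs = jax.random.split(jax.random.PRNGKey(0), 2048)  # one RNG per copy
state = reset(rngs)
state = step(state, jnp.zeros((2048, env.action_size)))  # zero actions
```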
## One API, Four Pipelines
Brax offers four distinct physics pipelines that are easy to swap:

- MuJoCo XLA (MJX): a JAX reimplementation of the MuJoCo physics engine.
- Generalized: dynamics in generalized coordinates, similar to MuJoCo.
- Positional: dynamics in maximal coordinates using Position Based Dynamics.
- Spring: fast, cheap simulation using simple spring and impulse constraints.
These pipelines share the same API and can run side-by-side within the same
simulation. This makes Brax well suited for experiments in transfer learning
and closing the gap between simulation and the real world.
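As a minimal sketch of the shared API, the same `init`/`step` calls drive any
pipeline; the toy MJCF scene below is illustrative:

```python
import jax
import jax.numpy as jnp
from brax import positional, spring
from brax.io import mjcf

# A toy scene: one free-floating sphere under gravity (illustrative).
ball = """
<mujoco>
  <worldbody>
    <body name="ball" pos="0 0 1">
      <joint type="free"/>
      <geom type="sphere" size="0.1" mass="1"/>
    </body>
  </worldbody>
</mujoco>
"""
sys = mjcf.loads(ball)

# The same init/step calls drive either pipeline.
for pipeline in (positional.pipeline, spring.pipeline):
    state = jax.jit(pipeline.init)(sys, sys.init_q, jnp.zeros(sys.qd_size()))
    state = jax.jit(pipeline.step)(sys, state, jnp.zeros(sys.act_size()))
```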
## Quickstart: Colab in the Cloud
Explore Brax quickly through a series of Colab notebooks:
- Brax Basics introduces the Brax API and shows how to simulate basic physics primitives.
- Brax Training introduces Brax's training algorithms and lets you train your own policies directly within the Colab. It also demonstrates loading and saving policies.
- Brax Training with MuJoCo XLA - MJX demonstrates training in Brax using the MJX physics simulator.
- Brax Training with PyTorch on GPU demonstrates how Brax can be used in other ML frameworks for fast training, in this case PyTorch.
## Using Brax Locally
To install Brax from PyPI:

```sh
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install brax
```
You may also install from Conda or Mamba:

```sh
conda install -c conda-forge brax  # s/conda/mamba for mamba
```
Alternatively, to install Brax from source, clone this repo, `cd` to it, and then:

```sh
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install -e .
```
To train a model:

```sh
learn
```
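The training algorithms can also be called directly from Python. A minimal
sketch using the PPO agent, where the environment and hyperparameters are
illustrative rather than tuned:

```python
from brax import envs
from brax.training.agents.ppo import train as ppo

env = envs.get_environment('ant', backend='positional')

# Returns a policy factory, trained parameters, and training metrics.
make_inference_fn, params, metrics = ppo.train(
    environment=env,
    num_timesteps=1_000_000,
    episode_length=1000,
    num_envs=1024,
    batch_size=512,
    seed=0,
)
```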
Training on NVIDIA GPUs is supported, but you must first install CUDA, cuDNN,
and JAX with GPU support.
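For example, recent JAX releases can pull in CUDA libraries through pip; check
the JAX installation docs for the command matching your driver (the extra below
assumes CUDA 12):

```sh
pip install --upgrade "jax[cuda12]"
```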
## Learn More
For a deep dive into Brax's design and performance characteristics, please see
our paper, [Brax - A Differentiable Physics Engine for Large Scale Rigid Body Simulation](https://arxiv.org/abs/2106.13281),
which appeared in the Datasets and Benchmarks Track at NeurIPS 2021.
## Citing Brax
If you would like to reference Brax in a publication, please use:
```
@software{brax2021github,
  author = {C. Daniel Freeman and Erik Frey and Anton Raichuk and Sertan Girgin and Igor Mordatch and Olivier Bachem},
  title = {Brax - A Differentiable Physics Engine for Large Scale Rigid Body Simulation},
  url = {http://github.com/google/brax},
  version = {0.12.1},
  year = {2021},
}
```
## Acknowledgements
Brax has come a long way since its original publication. We offer gratitude and
effusive praise to the following people:
- Manu Orsini and Nikola Momchev who provided a major refactor of Brax's
training algorithms to make them more accessible and reusable.
- Erwin Coumans who has graciously offered advice and mentorship, and many
useful references from Tiny Differentiable Simulator.
- Baruch Tabanpour, a colleague who helped launch Brax v2 and overhauled the contact library.
- Shixiang Shane Gu and Hiroki Furuta, who contributed BIG-Gym, Braxlines, and a scene composer to Brax.
- Our awesome open source collaborators and contributors. Thank you!