Brax is a differentiable physics engine that simulates environments made up of rigid bodies, joints, and actuators. Brax is written in JAX and is designed for use on acceleration hardware. It is both efficient for single-device simulation, and scalable to massively parallel simulation on multiple devices, without the need for pesky datacenters.
*Some policies trained via Brax, which simulates these environments at millions of physics steps per second on TPU.*
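As a quick taste of the API, here is a minimal sketch of stepping a batch of simulations in parallel. It assumes the `brax.envs` interface (`envs.create`, `env.reset`, `env.step`); the environment name and batch size are arbitrary illustrative choices.

```python
import jax
import jax.numpy as jnp
from brax import envs

# A minimal sketch, assuming the brax.envs API; 'ant' and the batch size of 128
# are arbitrary illustrative choices.
env = envs.create(env_name='ant')

# JIT-compile reset and step, and vectorize them across a batch of simulations.
reset_fn = jax.jit(jax.vmap(env.reset))
step_fn = jax.jit(jax.vmap(env.step))

rngs = jax.random.split(jax.random.PRNGKey(0), 128)
states = reset_fn(rngs)

# Step all 128 simulations at once; on a GPU/TPU the batch runs in parallel.
actions = jnp.zeros((128, env.action_size))
states = step_fn(states, actions)
print(states.reward.shape)  # (128,)
```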
Brax also includes a suite of learning algorithms that train agents in seconds to minutes:
- Baseline learning algorithms such as PPO, SAC, ARS, and evolutionary strategies.
- Learning algorithms that leverage the differentiability of the simulator, such as analytic policy gradients (see the sketch below).
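To illustrate the second point, the sketch below differentiates a short rollout's total reward with respect to an action directly through the physics step using `jax.grad`. It again assumes the `brax.envs` interface; the environment, horizon, and zero action are illustrative only.

```python
import jax
import jax.numpy as jnp
from brax import envs

# A minimal sketch of differentiating through the simulator, assuming the
# brax.envs API. The environment name and 10-step horizon are arbitrary choices.
env = envs.create(env_name='ant')
init_state = env.reset(rng=jax.random.PRNGKey(0))

def rollout_reward(action):
  """Total reward from repeating a single action over a short horizon."""
  def body(state, _):
    state = env.step(state, action)
    return state, state.reward
  _, rewards = jax.lax.scan(body, init_state, None, length=10)
  return jnp.sum(rewards)

# Because every physics step is written in JAX, gradients flow through it.
action = jnp.zeros(env.action_size)
d_reward_d_action = jax.grad(rollout_reward)(action)
```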
Explore Brax easily and quickly through a series of colab notebooks:
- Brax Basics introduces the Brax API, and shows how to simulate basic physics primitives.
- Brax Environments shows how to operate and visualize Brax environments. It also demonstrates how to convert Brax environments to Gym environments and how to use Brax with other ML frameworks such as PyTorch (see the sketch after this list).
- Brax Training with TPU introduces Brax's training algorithms, and lets you train your own policies directly within the colab. It also demonstrates loading and saving policies.
- Brax Training with PyTorch on GPU demonstrates how Brax can be used in other ML frameworks for fast training, in this case PyTorch.
- Brax Multi-Agent measures Brax's performance on multi-agent simulation, with many bodies in the environment at once.
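As a companion to the Brax Environments colab, here is a hedged sketch of exposing a Brax environment through the Gym interface. The `GymWrapper` class and its module path are assumptions based on that colab and may differ across Brax versions; the sketch also assumes the classic Gym API in which `step` returns a 4-tuple.

```python
from brax import envs
# Assumption: a Gym wrapper lives under brax.envs.wrappers; the exact module
# path and constructor arguments may differ between Brax versions.
from brax.envs import wrappers

env = envs.create(env_name='halfcheetah')  # arbitrary choice of environment
gym_env = wrappers.GymWrapper(env, seed=0)

obs = gym_env.reset()
obs, reward, done, info = gym_env.step(gym_env.action_space.sample())
```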
To install Brax from PyPI:
```
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install brax
```
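A quick sanity check that the install worked (a minimal sketch; the choice of environment is arbitrary):

```python
from brax import envs

env = envs.create(env_name='ant')
print(env.observation_size, env.action_size)
```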
Alternatively, to install Brax from source, clone this repo, `cd` to it, and then:
```
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install -e .
```
To train a model:
```
learn
```
Training on an NVIDIA GPU is supported, but you must first install CUDA, cuDNN, and JAX with GPU support.
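Once a CUDA-enabled JAX is installed, a quick way to confirm that JAX actually sees the GPU (standard JAX calls, not Brax-specific):

```python
import jax

# Should report a GPU backend and list GPU devices if the CUDA install worked.
print(jax.default_backend())
print(jax.devices())
```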
For a deep dive into Brax's design and performance characteristics, please see our paper, *Brax -- A Differentiable Physics Engine for Large Scale Rigid Body Simulation*, which appeared in the Datasets and Benchmarks Track at NeurIPS 2021.
If you would like to reference Brax in a publication, please use:
```
@software{brax2021github,
  author = {C. Daniel Freeman and Erik Frey and Anton Raichuk and Sertan Girgin and Igor Mordatch and Olivier Bachem},
  title = {Brax - A Differentiable Physics Engine for Large Scale Rigid Body Simulation},
  url = {https://github.com/google/brax},
  version = {0.0.13},
  year = {2021},
}
```
Brax has come a long way since its original publication. We offer gratitude and effusive praise to the following people:
- Manu Orsini and Nikola Momchev who provided a major refactor of Brax's training algorithms to make them more accessible and reusable.
- Erwin Coumans who has graciously offered advice and mentorship, and many useful references from Tiny Differentiable Simulator.
- Baruch Tabanpour, a colleague who is making Brax much more reliable and feature-complete.
- Shixiang Shane Gu and Hiroki Furuta, who contributed BIG-Gym and Braxlines, and a scene composer to Brax.
- Our awesome open source collaborators and contributors. Thank you!