mjlab combines Isaac Lab's proven API with best-in-class MuJoCo physics to provide lightweight, modular abstractions for RL robotics research and sim-to-real deployment.
```bash
uvx --from mjlab --with "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp" demo
```
⚠️ **BETA PREVIEW**: This project is in beta. Expect breaking changes and missing features.
- Familiar APIs: If you know Isaac Lab or MuJoCo, you already know mjlab
- Instant Feedback: Fast startup and kernel caching. Drop a breakpoint anywhere and debug immediately
- Massively Parallel: MuJoCo Warp enables efficient GPU-accelerated simulation at scale
- Zero Friction: Pure Python, minimal dependencies. Just `uv run` and go
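If you have used Isaac Lab's manager-based workflow, the shape of the API should feel familiar. As a purely illustrative sketch (the gymnasium registration, import path, and step signature here are assumptions, not documented mjlab API; only the task ID comes from the training commands below):

```python
# Hypothetical sketch, not confirmed mjlab API: assumes mjlab mirrors
# Isaac Lab's gymnasium task registration.
import gymnasium as gym
import mjlab.tasks  # assumed: importing the package registers Mjlab-* tasks

env = gym.make("Mjlab-Velocity-Flat-Unitree-G1")
obs, info = env.reset()
# Step with a random action; real usage would feed a policy's output.
obs, reward, terminated, truncated, info = env.step(env.action_space.sample())
env.close()
```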
- Why mjlab? - When to use mjlab (and when to use Isaac Lab, Newton, etc.)
- Migration Guide - Moving from Isaac Lab
- FAQ & Troubleshooting - Common questions and answers
Install uv if you haven't already:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
Run the demo directly:
```bash
uvx --from mjlab --with "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp" demo
```
Clone the repository:
```bash
git clone git@github.com:mujocolab/mjlab.git && cd mjlab
```
Then either:
- Run commands directly (recommended for development):

  ```bash
  uv run demo
  ```
- Install as an editable package (if you need to import mjlab elsewhere):

  ```bash
  uv pip install -e . "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp"
  ```
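After the editable install, a quick generic Python check (nothing mjlab-specific) that the package resolves to your clone:

```python
# Verify the editable install points at the local checkout.
import mjlab
print(mjlab.__file__)  # expect a path inside your cloned repository
```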
Train a Unitree G1 humanoid to follow velocity commands on flat terrain:
```bash
MUJOCO_GL=egl uv run train \
  Mjlab-Velocity-Flat-Unitree-G1 \
  --env.scene.num-envs 4096
```
```bash
# NOTE: You can evaluate a policy while your training is still
# in progress. This will grab the latest checkpoint from wandb.
uv run play \
  --task Mjlab-Velocity-Flat-Unitree-G1-Play \
  --wandb-run-path your-org/mjlab/run-id
```
Train a Unitree G1 to mimic reference motions. mjlab uses WandB to manage reference motion datasets:
- Create a registry collection in your WandB workspace named `Motions`
- Set your WandB entity:

  ```bash
  export WANDB_ENTITY=your-organization-name
  ```
- Process and upload motion files:

  ```bash
  MUJOCO_GL=egl uv run scripts/tracking/csv_to_npz.py \
    --input-file /path/to/motion.csv \
    --output-name motion_name \
    --input-fps 30 \
    --output-fps 50 \
    --render  # Optional: generates preview video
  ```
Note: For detailed motion preprocessing instructions, see the BeyondMimic documentation.
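The `--input-fps`/`--output-fps` pair implies the script resamples the source trajectory onto the training timestep. Here is a minimal sketch of that idea (assumed behavior, not the actual `csv_to_npz.py` implementation; quaternion channels would need spherical interpolation rather than the linear interpolation shown):

```python
# Illustrative sketch of fps resampling implied by --input-fps/--output-fps.
# Assumed behavior only; the real csv_to_npz.py may differ.
import numpy as np

def resample(traj: np.ndarray, in_fps: float, out_fps: float) -> np.ndarray:
    """Linearly interpolate a (T, D) trajectory from in_fps to out_fps."""
    t_in = np.arange(traj.shape[0]) / in_fps
    t_out = np.arange(0.0, t_in[-1], 1.0 / out_fps)
    return np.stack(
        [np.interp(t_out, t_in, traj[:, d]) for d in range(traj.shape[1])],
        axis=1,
    )

motion_30fps = np.zeros((90, 29))              # 3 s of 29-DoF poses at 30 fps
motion_50fps = resample(motion_30fps, 30, 50)  # -> (149, 29) at 50 fps
```

With a motion uploaded to the registry, training pulls it in via the `--registry-name` flag: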
```bash
MUJOCO_GL=egl uv run train \
  Mjlab-Tracking-Flat-Unitree-G1 \
  --registry-name your-org/motions/motion-name \
  --env.scene.num-envs 4096
```
```bash
uv run play \
  --task Mjlab-Tracking-Flat-Unitree-G1-Play \
  --wandb-run-path your-org/mjlab/run-id
```
```bash
make test

# Install pre-commit hook.
uvx pre-commit install

# Format manually.
make format
```
mjlab is licensed under the Apache License, Version 2.0.
The `third_party/` directory contains files from external projects, each with its own license:
- `isaaclab/` — Selected files from NVIDIA Isaac Lab (BSD-3-Clause)
When distributing or modifying mjlab, comply with:
- The Apache-2.0 license for mjlab's original code
- The respective licenses in `third_party/` for those files
See individual `LICENSE` files for complete terms.
mjlab wouldn’t exist without the excellent work of the Isaac Lab team, whose API design and abstractions mjlab builds upon.
Thanks to the MuJoCo Warp team — especially Erik Frey and Taylor Howell — for answering our questions, giving helpful feedback, and implementing countless features based on our requests.