OpenTrack

GALBOT · Tsinghua

📃Paper | 🏠Website

This repository is the official implementation of OpenTrack, an open-source humanoid motion tracking codebase that uses MuJoCo for simulation and supports multi-GPU parallel training.

News 🚩

[November 30, 2025] LAFAN1 generalist v1 released. Now you can track cartwheels, kung fu, fall-and-get-up, and many other motions with a single policy.

[September 19, 2025] Simple Domain Randomization released.

[September 19, 2025] Tracking codebase released.

TODOs

  • Release motion tracking codebase
  • Release simple domain randomization
  • Release pretrained LAFAN1 generalist v1 checkpoints
  • Release DAgger code
  • Release AnyAdapter
  • Release more pretrained checkpoints
  • Release real-world deployment code

Prepare

  1. Clone the repository:

    git clone git@github.com:GalaxyGeneralRobotics/OpenTrack.git
  2. Create a virtual environment and install dependencies:

    conda create -n any2track python=3.12
    conda activate any2track
    # Install torch for converting Brax/JAX checkpoints to PyTorch. The CPU build is sufficient, but any variant you prefer works.
    pip install torch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 --index-url https://download.pytorch.org/whl/cpu
    pip install -r requirements.txt
    export MJX_SKIP_MENAGERIE_CLONE=1 
  3. Download the mocap data and put it under data/mocap/. Thanks to LocoMuJoCo for providing the retargeted LAFAN1 motions! A quick layout check is sketched after the tree below.

    The file structure should look like this:

    data/
    |-- xmls
    |   |-- ...
    |-- mocap
        |-- lafan1
            |-- UnitreeG1
                |-- dance1_subject1.npz
                |-- ...
    

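To confirm the clips are where the training scripts expect them, a check along these lines can help. This is a minimal sketch; only the directory path from the tree above is assumed, not any of the .npz contents.

    # Sanity-check the mocap layout shown above.
    from pathlib import Path

    mocap_dir = Path("data/mocap/lafan1/UnitreeG1")
    clips = sorted(mocap_dir.glob("*.npz"))
    if not clips:
        raise FileNotFoundError(f"No .npz motion clips found under {mocap_dir}")
    print(f"Found {len(clips)} clips, e.g. {clips[0].name}")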
Usage

Play pretrained checkpoints

  1. Download the pretrained checkpoints and configs (see the checkpoints and configs links) and put them under experiments/.

  2. Run the evaluation script:

    # your_exp_name=<timestamp>_<exp_name>
    python play_policy.py --exp_name <your_exp_name> [--use_viewer] [--use_renderer] [--play_ref_motion]

As of November 30, 2025, we have open-sourced a LAFAN1 generalist model distilled with DAgger from four teacher policies. This checkpoint was trained with simple domain randomization (DR). You can try deploying it on a Unitree G1 robot with your own deployment code, since our real-robot deployment pipeline is not yet open-sourced.
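If you write your own evaluation or deployment loop, the control flow is essentially: read an observation, run the converted PyTorch policy, and apply the predicted joint targets. The snippet below only illustrates that shape; the network architecture, observation size, and action meaning are illustrative assumptions, not the repository's actual interface.

    # Hypothetical inference step with a converted policy. The tiny MLP below is only a
    # placeholder for the network produced by brax2torch.py; sizes and layout are assumptions.
    import torch
    import torch.nn as nn

    obs_dim, act_dim = 93, 29  # illustrative sizes, not the repository's actual dimensions
    policy = nn.Sequential(nn.Linear(obs_dim, 256), nn.ELU(), nn.Linear(256, act_dim))
    policy.eval()

    obs = torch.zeros(1, obs_dim)  # one observation from the simulator or robot
    with torch.no_grad():
        action = policy(obs)  # joint position targets (assumed) to send to the controller
    print(action.shape)  # torch.Size([1, 29])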

Train from scratch

  1. Train the model

    # Train on a flat terrain:
    python train_policy.py --exp_name flat_terrain --terrain_type flat_terrain
    # Train on a rough terrain:
    python generate_terrain.py # generate height-field variants with Perlin noise (sketched after these steps)
    python train_policy.py --exp_name rough_terrain --terrain_type rough_terrain
    
    # Debug mode (quick test training run without logging):
    python train_policy.py --exp_name debug 
  2. Evaluate the model. First, convert the Brax checkpoint to PyTorch (a conceptual sketch of this conversion is shown after these steps):

    # your_exp_name=<timestamp>_<exp_name>
    python brax2torch.py --exp_name <your_exp_name>

    Next, run the evaluation script:

    # your_exp_name=<timestamp>_<exp_name>
    python play_policy.py --exp_name <your_exp_name> [--use_viewer] [--use_renderer] [--play_ref_motion]
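For context on the rough-terrain option, generate_terrain.py builds MuJoCo height fields from Perlin-style noise. The snippet below is a self-contained sketch of that idea using octaves of bilinearly interpolated value noise as a stand-in for true Perlin gradient noise; it is not the script's actual implementation.

    # Sketch: a rough-terrain height field from octaves of smooth value noise
    # (a stand-in for Perlin noise; generate_terrain.py may differ).
    import numpy as np

    def value_noise(size, cells, rng):
        """Bilinearly interpolated random lattice values on a size x size grid."""
        lattice = rng.uniform(-1.0, 1.0, (cells + 1, cells + 1))
        xs = np.linspace(0, cells, size, endpoint=False)
        x0 = xs.astype(int)
        t = xs - x0
        cols = (1 - t)[None, :] * lattice[:, x0] + t[None, :] * lattice[:, x0 + 1]
        return (1 - t)[:, None] * cols[x0, :] + t[:, None] * cols[x0 + 1, :]

    def heightfield(size=128, octaves=4, seed=0):
        rng = np.random.default_rng(seed)
        h = np.zeros((size, size))
        for i in range(octaves):
            h += (0.5 ** i) * value_noise(size, 4 * 2 ** i, rng)
        h -= h.min()
        return h / h.max()  # normalized to [0, 1]; the MJCF hfield size sets the real elevation

    print(heightfield().shape)  # (128, 128)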
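The Brax-to-PyTorch step conceptually walks the JAX parameter pytree and copies every array into a torch tensor. Below is a minimal illustration of that idea on a toy pytree; it is not the actual checkpoint handling in brax2torch.py.

    # Sketch: copy a JAX parameter pytree into PyTorch tensors.
    # The toy pytree is illustrative; brax2torch.py handles the real checkpoint format.
    import numpy as np
    import torch
    import jax.numpy as jnp
    from jax import tree_util

    jax_params = {"dense_0": {"kernel": jnp.ones((4, 8)), "bias": jnp.zeros(8)}}

    torch_params = tree_util.tree_map(lambda x: torch.tensor(np.asarray(x)), jax_params)
    print(torch_params["dense_0"]["kernel"].shape)  # torch.Size([4, 8])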

Acknowledgement

This repository is built upon jax, brax, loco-mujoco, and mujoco_playground.

If you find this repository helpful, please cite our work:

@article{zhang2025track,
  title={Track Any Motions under Any Disturbances},
  author={Zhikai Zhang and Jun Guo and Chao Chen and Jilong Wang and Chenghuai Lin and Yunrui Lian and Han Xue and Zhenrong Wang and Maoqi Liu and Huaping Liu and He Wang and Li Yi},
  journal={arXiv preprint arXiv:2509.13833},
  year={2025}
}
