
Neural Atlas Graphs
for Dynamic Scene Decomposition and Editing

Jan Philipp Schneider1,2, Pratik Singh Bisht1, Ilya Chugunov2, Andreas Kolb1, Michael Moeller1,3, Felix Heide2,4

1University of Siegen     2Princeton University     3Lamarr Institute     4Torc Robotics

🎉 NeurIPS 2025 (spotlight) 🎉

"Neural Atlas Graphs enable high-quality dynamic scene decomposition and intuitive 2D appearance editing, with use-cases in autonomous driving and videography."


[Figure: ground-truth frames (top) and NAG-edited frames (bottom) at times 002, 005, 015, and 025, with the decomposed objects and edit texture in the final column]
Dynamic Scene Editing in Waymo S-203. Ground Truth (top) vs. NAG Edits (bottom) across four frames. The final column shows the texture source (bottom) and removed objects (top). Note the realistic, consistent blending of the foreground car and its shadow into the edited scene.

[Figure: ground-truth frames (top) and two edited variants (bottom rows) at times 05, 15, 30, and 45]
Seamless Texture Transfer and Propagation. We utilize image generation models to create new textures for the black swan (top). The NAG effectively projects textures (white, rainbow) onto the dynamic 3D object, ensuring robust temporal coherence throughout the sequence (bottom rows).

This repository contains the official implementation of Neural Atlas Graphs for Dynamic Scene Decomposition and Editing, a novel hybrid scene representation for learning editable high-resolution dynamic scenes. Neural Atlas Graphs (NAG) integrate the editability of neural atlases with the complex spatial reasoning of scene graphs, where each graph node is a view-dependent neural atlas. This allows for both intuitive 2D appearance editing and consistent 3D ordering and positioning of scene elements.
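Conceptually, rendering a NAG composites the per-node atlas layers in depth order, which is what yields the consistent 3D ordering of scene elements. The following is a minimal NumPy sketch of back-to-front alpha compositing of layered images; it illustrates the idea only and is not the repository's implementation.

```python
import numpy as np

def composite_layers(layers):
    """Composite RGBA layers back-to-front (first layer = farthest)."""
    h, w = layers[0].shape[:2]
    out = np.zeros((h, w, 3))
    for layer in layers:
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        out = alpha * rgb + (1.0 - alpha) * out  # standard "over" blend
    return out

# Two constant layers: an opaque red background and a half-transparent
# green foreground.
bg = np.zeros((4, 4, 4)); bg[..., 0] = 1.0; bg[..., 3] = 1.0
fg = np.zeros((4, 4, 4)); fg[..., 1] = 1.0; fg[..., 3] = 0.5
img = composite_layers([bg, fg])  # each pixel blends to [0.5, 0.5, 0.0]
```

Reordering the list changes which element occludes which, mirroring how the graph's node ordering controls occlusion in the scene.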

🛠️ Installation

Please refer to the installation instructions for setting up the repository, its dependencies and data.

🚀 NAG Training

With the Python environment set up, you can train a NAG model with:

python nag/scripts/run_nag.py --config-path [path-to-config]

For more details, please refer to the training instructions.

During and after training, the model is evaluated to render outputs for all frames, compute metrics, and produce scene decompositions for every object.
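As an illustration of the kind of per-frame image-quality metric such an evaluation computes, PSNR between a rendered and a ground-truth frame can be calculated as follows (a self-contained sketch, not the repository's metric code):

```python
import math

def psnr(pred, target, max_val=1.0):
    """Peak signal-to-noise ratio between two equally-sized images,
    given as flat lists of floats in [0, max_val]."""
    mse = sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

pred = [0.5] * 192    # an 8x8 RGB frame, flattened
target = [0.6] * 192
print(round(psnr(pred, target), 2))  # mse = 0.01 -> 20.0 dB
```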

🔁 Reproducibility

We are committed to full reproducibility of the results presented in our paper. All configuration files and training procedures are provided in this repository. We provide detailed instructions on how to reproduce our experiments in the reproducibility document. Further, we provide the datasets and an explanation of how to set them up in our datasets setup document.

In the future, we plan to provide further scripts to convert additional Waymo segments and DAVIS sequences into the formats we use. Create a GitHub issue if you are interested in this or have any questions.

📝 Working with NAGs

We provide a Jupyter Notebook showcasing how to load a pre-trained NAG model, decompose scenes into objects, and perform texture editing. This notebook serves as a practical guide for utilizing the capabilities of Neural Atlas Graphs.
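The texture edits shown in the notebook can be thought of as replacing an atlas image and resampling it through the per-frame UV mapping, so the new texture follows the object over time. The toy NumPy sketch below illustrates this lookup; the function name and nearest-neighbor sampling are purely illustrative, not the notebook's actual API.

```python
import numpy as np

def sample_atlas(atlas, uv):
    """Nearest-neighbor lookup of atlas texels at per-pixel UV coords in [0, 1]."""
    h, w = atlas.shape[:2]
    ys = np.clip((uv[..., 1] * (h - 1)).round().astype(int), 0, h - 1)
    xs = np.clip((uv[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
    return atlas[ys, xs]

# A 2x2 RGB atlas; "editing" means swapping this image while the UVs stay fixed,
# so the edit propagates to every frame that samples the atlas.
atlas = np.array([[[1., 0., 0.], [0., 1., 0.]],
                  [[0., 0., 1.], [1., 1., 0.]]])
uv = np.zeros((2, 2, 2))
uv[0, 1] = [1.0, 0.0]           # pixel (0, 1) reads the top-right texel
frame = sample_atlas(atlas, uv)
```

Because every frame shares the same atlas, painting into the atlas once edits the whole sequence coherently, which is the core of the temporal consistency shown above.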

🧠 NAG Code Structure

To briefly outline the code structure of our repository, we provide a high-level overview of the main components and their locations within the codebase.

Our model training and evaluations are encapsulated in a dedicated runner, which holds instances of the model, dataset, and all other training-related components. The runner can be created to train a new model, or to load an existing one for further evaluation. As we rely on PyTorch Lightning for training, we implemented a callback class that controls training progress and handles the evaluation of the model. General tools and utility functions live in a dedicated tools library, which needs to be included via git submodules.
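The runner/callback split described above follows a common pattern: the runner drives the training loop while callbacks hook into it for evaluation. The plain-Python sketch below shows the pattern in miniature; all class and method names here are hypothetical and do not correspond to the repository's actual classes.

```python
class EvalCallback:
    """Collects per-epoch evaluation results, mimicking a Lightning-style callback."""
    def __init__(self):
        self.history = []

    def on_epoch_end(self, epoch, metrics):
        self.history.append((epoch, metrics))

class Runner:
    """Holds training components and invokes registered callbacks each epoch."""
    def __init__(self, callbacks):
        self.callbacks = callbacks

    def fit(self, num_epochs):
        for epoch in range(num_epochs):
            metrics = {"psnr": 20.0 + epoch}  # placeholder for real evaluation
            for cb in self.callbacks:
                cb.on_epoch_end(epoch, metrics)

cb = EvalCallback()
Runner([cb]).fit(3)  # cb.history now holds one entry per epoch
```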

NAG Core Components

Further, we briefly point out the location of the NAG core components within the repository.

There is, of course, much more to explore and explain, so feel free to open issues or discussions on GitHub if you have any questions regarding the code structure or implementation details.

📜 Citation

If you find our work useful in your research, please consider citing our paper:

@inproceedings{Schneider2025NAG,
  author    = {Jan Philipp Schneider and
              Pratik Singh Bisht and
              Ilya Chugunov and
              Andreas Kolb and
              Michael Moeller and
              Felix Heide},
  title     = {Neural Atlas Graphs for Dynamic Scene Decomposition and Editing},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  volume    = {38},
  url       = {https://neurips.cc/virtual/2025/poster/115926},
}

Thanks for your interest in our work! We hope you find Neural Atlas Graphs as exciting and useful as we do. If you have any questions, suggestions, or feedback, please don't hesitate to reach out via GitHub issues or discussions.