TinyNav /ˈtaɪni ˈnævi/: A lightweight, hackable system to guide your robots anywhere. Maintained by Uniflex AI.
| Unitree GO2 | LeKiwi |
|---|---|
| go2_robot_720p.mp4 | lewiki_720p_small.mp4 |

| Navigation with 3D Gaussian Splatting |
|---|
| 3dgs.mp4 |
- **3D Gaussian Splatting (3DGS) Map Representation**: Provides high-quality visualization and an intuitive map editor, making it easy to inspect map details and place target POIs with precision.
- **ESDF-based Obstacle Avoidance**: Enables more human-like navigation. Robots not only avoid obstacles but also keep a safe distance, improving path quality.
- **Localization Benchmark**: Adds a benchmark for map-based localization, allowing clear and quantitative evaluation of improvements across versions.
- **CUDA Graph Optimization**: Reduces inference overhead and achieves >20 Hz on Jetson Nano, lowering latency for real-time closed-loop navigation.
- **Simplified First-Time Setup**: The `postStartCommand` in the dev container now auto-generates platform-specific models, reducing errors and making setup more user-friendly.
- **Expanded CI Testing**: Broader continuous integration coverage ensures higher build stability and code quality.
- **Map Storage with KV Database**: Maps are now stored using `shelve`, resulting in shorter code and better performance.
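Key-value map storage with the standard-library `shelve` module works like a persistent dictionary. The sketch below illustrates the idea only; the key names and value layout are assumptions for illustration, not TinyNav's actual map schema:

```python
import os
import shelve
import tempfile

# Illustrative only: the "keyframe/N" keys and the stored dicts are
# assumptions, not TinyNav's actual on-disk map format.
path = os.path.join(tempfile.mkdtemp(), "map_db")

with shelve.open(path) as db:
    # Store per-keyframe data under string keys, like a dict on disk.
    db["keyframe/0"] = {"pose": [0.0, 0.0, 0.0], "landmarks": [1, 2, 3]}
    db["keyframe/1"] = {"pose": [1.0, 0.0, 0.0], "landmarks": [2, 3, 4]}

# Reopening the shelf restores the persisted entries.
with shelve.open(path) as db:
    restored = db["keyframe/1"]["pose"]
```

Because `shelve` pickles values transparently, the persistence layer shrinks to ordinary dictionary reads and writes, which is consistent with the "shorter code" claim above.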
- Over 50 pull requests merged since the last release, delivering numerous fixes and stability improvements.
- Implemented map-based navigation with relocalization and global planning.
- Added support for Unitree robots.
- Added support for the LeKiwi platform.
- Upgraded stereo depth model for a better speed–accuracy balance.
- Tuned Intel® RealSense™ exposure strategy, optimized for robotics tasks.
- Added Gazebo simulation environment.
- CI: Docker image build & push pipeline.
- Used Numba JIT to speed up key operations while keeping the code simple and maintainable.
- Adopted asyncio for concurrent model inference.
- Added gravity correction when velocity is zero.
- Mount /etc/localtime by default so ROS bag files use local time in their names.
- Optimized trajectory generation.
- Various bug fixes and stability improvements.
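Concurrent model inference with `asyncio`, as mentioned in the notes above, typically means offloading blocking inference calls to an executor so they overlap instead of running back-to-back. A minimal sketch with stand-in functions (`run_stereo` and `run_features` are hypothetical placeholders, not TinyNav's APIs):

```python
import asyncio
import time

# Hypothetical stand-ins for blocking model-inference calls;
# TinyNav's real inference functions are not shown here.
def run_stereo(frame):
    time.sleep(0.05)  # simulate inference latency
    return ("depth", frame)

def run_features(frame):
    time.sleep(0.05)
    return ("features", frame)

async def infer_frame(frame):
    loop = asyncio.get_running_loop()
    # run_in_executor moves each blocking call onto a worker thread,
    # so both inferences run concurrently under the event loop.
    depth, feats = await asyncio.gather(
        loop.run_in_executor(None, run_stereo, frame),
        loop.run_in_executor(None, run_features, frame),
    )
    return depth, feats

result = asyncio.run(infer_frame(0))
```

With two 50 ms calls overlapped this way, a frame's total inference time approaches the slower call rather than the sum of both.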
We aim to make the system:
- Compact (~2000 LOC) for clarity and ease of use.
- Supportive of fast prototyping and creative applications.
- Open to community participation and maintenance.
- Designed to be reliable across diverse scenes and datasets.
- Continuously tested for consistent performance in real-world conditions.
- Targeting out-of-the-box support for various robot types.
- Initial focus: LeKiwi wheeled robot, Unitree GO2.
- Flexible architecture for future robot integration.
- Compute support starts with Jetson Orin and desktop GPUs.
- Planning support for cost-effective platforms like RK3588.
- Aiming for broader accessibility and deployment options.
The repository is organized as follows:
- `tinynav/core/`: Core Python modules for perception, mapping, planning, and control:
  - `perception_node.py`: Processes sensor data for localization and perception.
  - `map_node.py`: Builds and maintains the environment map.
  - `planning_node.py`: Computes paths and trajectories using map and perception data.
  - `control_node.py`: Sends control commands to actuate the robot.
  - Supporting modules: `driver_node.py`, `math_utils.py`, `models_trt.py`, `stereo_engine.py`.
- `tinynav/cpp/`: C++ backend components and bindings for performance-critical operations.
- `tinynav/models/`: Pretrained models and conversion scripts for perception and feature extraction.
- `scripts/`: Shell scripts for launching demos, managing Docker containers, and recording datasets.
Before you begin, make sure you have the following installed:
- git and git-lfs (for cloning and handling large files)
- Docker
Platform-specific requirements:
- For x86_64 (PC): NVIDIA Container Toolkit (for GPU support)
- For Jetson Orin: JetPack SDK version 6.2 or higher
1. Check the environment

   ```bash
   git clone https://github.com/UniflexAI/tinynav.git
   cd tinynav
   bash scripts/check_env.sh
   ```

   Follow the instructions to fix any environment issues until you see:

   ```
   ✓ Docker is installed.
   ✓ Docker daemon is running and accessible.
   ✓ NVIDIA runtime is available in Docker.
   ✓ Git LFS is installed.
   ✓ devcontainer.json patched for your x86 platform.
   ```
2. Open the project in VS Code

   - Launch VS Code and open the `tinynav` folder.
   - Install the Dev Containers extension if prompted.
   - Reopen the folder inside the container.
3. Run the example

   Once inside the container, start the demo:

   ```bash
   bash /tinynav/scripts/run_rosbag_examples.sh
   ```

   You should see an RViz window displaying the live planning process.

The script automates the entire demo workflow:

- Plays the dataset: streams a recorded dataset from this ROS bag.
- Runs the TinyNav pipeline:
  - `perception_node.py`: Performs localization and builds the local map.
  - `planning_node.py`: Computes the robot's optimal path.
  - RViz: Visualizes the robot's state and planned trajectory in real time.
✨ With these steps, you'll have the full TinyNav system up and running in minutes.
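The ESDF-based obstacle avoidance highlighted in the release notes scores positions by their distance to the nearest obstacle, so the planner can penalize paths that merely graze obstacles. A brute-force sketch of that idea; the grid, resolution, safety distance, and cost shape are illustrative assumptions, not TinyNav's actual planner parameters:

```python
import numpy as np

# Illustrative occupancy grid: 1 = obstacle, 0 = free.
grid = np.zeros((20, 20))
grid[8:12, 8:12] = 1.0  # a square obstacle in the middle

def distance_field(grid, resolution=0.1):
    """Brute-force distance (meters) from each cell to the nearest
    obstacle cell: the free-space half of an ESDF."""
    obs = np.argwhere(grid > 0)
    ys, xs = np.mgrid[0:grid.shape[0], 0:grid.shape[1]]
    cells = np.stack([ys.ravel(), xs.ravel()], axis=1)
    # Pairwise distances cell -> obstacle, reduced to the minimum.
    d = np.linalg.norm(cells[:, None, :] - obs[None, :, :], axis=2).min(axis=1)
    return d.reshape(grid.shape) * resolution

def clearance_cost(dist, safe_dist=0.5):
    """Zero cost beyond safe_dist; quadratic penalty as clearance shrinks,
    which is what pushes trajectories to keep a margin from obstacles."""
    return np.maximum(0.0, safe_dist - dist) ** 2

field = distance_field(grid)
cost = clearance_cost(field)
```

Cells inside the obstacle have zero clearance and maximal cost, while cells farther than the safety distance cost nothing, so the planner trades path length against clearance rather than hugging obstacle boundaries. Real implementations compute the field incrementally instead of by brute force.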
TinyNav supports Dev Containers for a consistent and reproducible development experience.
- Open the `tinynav` folder in Visual Studio Code.
- Ensure the Dev Containers extension is installed.
- VS Code will automatically start the container and open a terminal inside it.
If you prefer the command line:
```bash
# Install the Dev Containers CLI
npm install -g @devcontainers/cli

# Start the Dev Container
devcontainer up --workspace-folder .

# Open a shell inside the container
devcontainer exec --workspace-folder . bash
```

After entering the development container, set up the Python environment:

```bash
uv venv --system-site-packages
uv sync
```

This will create a virtual environment and install all required dependencies.
- Highly optimized NN models: support real-time perception processing at 30 fps or higher.
- Map module enhancements: improve consistency and accuracy of mapping and localization.
- End-to-end trajectory planning: deliver robust, safe trajectories with integrated semantic information.
| Language | files | blank | comment | code |
|---|---:|---:|---:|---:|
| Python | 11 | 328 | 154 | 1959 |
| C++ | 3 | 49 | 32 | 292 |
| Markdown | 2 | 76 | 6 | 167 |
| Bourne Shell | 8 | 9 | 8 | 109 |
| Dockerfile | 1 | 12 | 10 | 46 |
| TOML | 1 | 6 | 0 | 33 |
| JSON | 1 | 4 | 0 | 25 |
| CMake | 1 | 4 | 0 | 16 |
| XML | 1 | 0 | 0 | 13 |
| **SUM** | **29** | **488** | **210** | **2660** |
We are a small, dedicated team with experience working on various robots and headsets.
Thanks goes to these wonderful people (emoji key):
| YANG Zhenfei 💻 | junlinp 💻 | heyixuan-DM 💻 | xinghan li 💻 |
|---|---|---|---|
This project follows the all-contributors specification. Contributions of any kind welcome!
Thanks to our sponsor(s) for supporting the development of this project:
DeepMirror - https://www.deepmirror.com/
