UniflexAI/tinynav


tinynav logo

TinyNav /ˈtaɪni ˈnævi/: A lightweight, hackable system to guide your robots anywhere. Maintained by Uniflex AI.


Demo videos:

  • Unitree GO2: go2_robot_720p.mp4
  • LeKiwi: lewiki_720p_small.mp4
  • Navigation with 3D Gaussian Splatting: 3dgs.mp4

[v0.2] What's Changed

🚀 Features

  • 3D Gaussian Splatting (3DGS) Map Representation
    Provides high-quality visualization and an intuitive map editor, making it easy to inspect map details and place target POIs with precision.

  • ESDF-based Obstacle Avoidance
    Enables more human-like navigation. Robots not only avoid obstacles but also keep a safe distance, improving path quality.

  • Localization Benchmark
    Adds a benchmark for map-based localization, allowing clear and quantitative evaluation of improvements across versions.

  • CUDA Graph Optimization
    Reduces inference overhead and achieves >20Hz on Jetson Nano, lowering latency for real-time closed-loop navigation.
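As a toy illustration of the idea behind the ESDF-based avoidance above (not TinyNav's actual implementation), the sketch below computes a 4-connected distance field over an occupancy grid with multi-source BFS, then derives a cost that penalizes cells inside a safety margin. The `esdf` and `obstacle_cost` names and the grid layout are hypothetical.

```python
from collections import deque

def esdf(grid):
    """Distance field via multi-source BFS.

    grid: 2D list, 1 = obstacle, 0 = free. Returns per-cell distance
    (in cells, 4-connected) to the nearest obstacle.
    """
    rows, cols = len(grid), len(grid[0])
    dist = [[float("inf")] * cols for _ in range(rows)]
    q = deque()
    for r in range(rows):            # seed the BFS with every obstacle cell
        for c in range(cols):
            if grid[r][c] == 1:
                dist[r][c] = 0
                q.append((r, c))
    while q:                         # expand outward, one ring at a time
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] > dist[r][c] + 1:
                dist[nr][nc] = dist[r][c] + 1
                q.append((nr, nc))
    return dist

def obstacle_cost(d, safe_dist=3.0):
    """Penalty that decays to zero once a cell is farther than safe_dist."""
    return 0.0 if d >= safe_dist else (safe_dist - d) ** 2

grid = [
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
d = esdf(grid)
```

A planner that adds `obstacle_cost` to its path cost will steer around obstacles with margin rather than grazing them, which is the "keep a safe distance" behavior described above.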

🔧 Improvements

  • Simplified First-Time Setup
    The postStartCommand hook in the dev container now auto-generates platform-specific models, reducing errors and making setup more user-friendly.

  • Expanded CI Testing
    Broader continuous integration coverage ensures higher build stability and code quality.

  • Map Storage with KV Database
    Maps are now stored using shelve, resulting in shorter code and better performance.
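For reference, shelve from the Python standard library behaves like a persistent dict whose values are pickled transparently, which is why it shortens map-storage code. The keyframe schema below is a made-up example, not TinyNav's actual map format.

```python
import os
import shelve
import tempfile

# Hypothetical map entry: keyframe id -> pose and landmark list.
map_path = os.path.join(tempfile.mkdtemp(), "map_db")

with shelve.open(map_path) as db:   # opens (or creates) the KV store
    db["keyframe_0"] = {"pose": [0.0, 0.0, 0.0], "landmarks": [[1.0, 2.0, 0.5]]}
    db["keyframe_1"] = {"pose": [0.5, 0.0, 0.1], "landmarks": []}

with shelve.open(map_path) as db:   # values are unpickled on access
    restored = db["keyframe_0"]["pose"]
    num_keyframes = len(db)
```

Each entry is written through to disk, so a map survives process restarts without any explicit serialization code.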

๐Ÿž Bug Fixes

  • Over 50 pull requests merged since the last release, delivering numerous fixes and stability improvements.

[v0.1] What's Changed

🚀 Features

  • Implemented map-based navigation with relocalization and global planning.
  • Added support for Unitree robots.
  • Added support for the LeKiwi platform.
  • Upgraded the stereo depth model for a better speed–accuracy balance.
  • Tuned the Intel® RealSense™ exposure strategy, optimized for robotics tasks.
  • Added a Gazebo simulation environment.
  • CI: Docker image build & push pipeline.
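Global planning on an occupancy grid is commonly done with A*; the minimal sketch below is a generic textbook version with hypothetical names, not TinyNav's planner, showing a shortest 4-connected path found around a wall.

```python
import heapq

def astar(grid, start, goal):
    """Grid A* with a Manhattan heuristic; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), start)]        # heap of (f-score, cell)
    came, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                   # walk parents back to the start
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt], came[nxt] = ng, cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None

grid = [
    [0, 1, 0],   # 1 = obstacle: a wall splits the left and right columns
    [0, 1, 0],
    [0, 0, 0],
]
path = astar(grid, (0, 0), (0, 2))
```

The returned path detours through the bottom row, the only opening past the wall.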

🔧 Improvements

  • Used Numba JIT to speed up key operations while keeping the code simple and maintainable.
  • Adopted asyncio for concurrent model inference.
  • Added gravity correction when velocity is zero.
  • Mount /etc/localtime by default so ROS bag files use local time in their names.
  • Optimized trajectory generation.
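The asyncio pattern for concurrent inference looks roughly like this; `infer` is a hypothetical stand-in for a real awaitable inference call (e.g. stereo depth and feature extraction), and the 0.1 s sleeps are placeholders for compute.

```python
import asyncio
import time

async def infer(name, seconds):
    # Placeholder for an awaitable model-inference call.
    await asyncio.sleep(seconds)
    return name

async def main():
    t0 = time.monotonic()
    # gather() runs both coroutines concurrently on one event loop,
    # so total wall time is ~max(0.1, 0.1), not 0.1 + 0.1.
    results = await asyncio.gather(infer("depth", 0.1), infer("features", 0.1))
    elapsed = time.monotonic() - t0
    return results, elapsed

results, elapsed = asyncio.run(main())
```

Because the calls overlap instead of running back to back, the loop stays responsive without threads or extra processes.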

๐Ÿž BugFix

  • Various bug fixes and stability improvements.

Highlights (Our Design Goals)

We aim to make the system:

Tiny

  • Compact (~2000 LOC) for clarity and ease of use.
  • Supports fast prototyping and creative applications.
  • Encourages community participation and maintenance.

Robust

  • Designed to be reliable across diverse scenes and datasets.
  • Ongoing testing for consistent performance in real-world conditions.

Multiple Robot Platforms

  • Targeting out-of-the-box support for various robot types.
  • Initial focus: LeKiwi wheeled robot, Unitree GO2.
  • Flexible architecture for future robot integration.

Multiple Chip Platforms

  • Compute support starts with Jetson Orin and Desktop.
  • Planning support for cost-effective platforms like RK3588.
  • Aims for broader accessibility and deployment options.

Project Structure

The repository is organized as follows:

  • tinynav/core/
    Core Python modules for perception, mapping, planning, and control:

    • perception_node.py โ€“ Processes sensor data for localization and perception.
    • map_node.py โ€“ Builds and maintains the environment map.
    • planning_node.py โ€“ Computes paths and trajectories using map and perception data.
    • control_node.py โ€“ Sends control commands to actuate the robot.
    • Supporting modules:
      • driver_node.py, math_utils.py, models_trt.py, stereo_engine.py.
  • tinynav/cpp/
    C++ backend components and bindings for performance-critical operations.

  • tinynav/models/
    Pretrained models and conversion scripts for perception and feature extraction.

  • scripts/
    Shell scripts for launching demos, managing Docker containers, and recording datasets.


Getting Started

Prerequisites

Before you begin, make sure you have the following installed:

  • git and git-lfs (for cloning and handling large files)
  • Docker

Platform-specific requirements:

🚀 Quick Start

  1. Check the environment

    git clone https://github.com/UniflexAI/tinynav.git
    cd tinynav
    bash scripts/check_env.sh

    Follow the instructions to fix any environment issues until you see:

    ✅ Docker is installed.
    ✅ Docker daemon is running and accessible.
    ✅ NVIDIA runtime is available in Docker.
    ✅ Git LFS is installed.
    ✅ devcontainer.json patched for your x86 platform.
  2. Open the project in VS Code

    • Launch VS Code and open the tinynav folder.
    • Install the Dev Containers extension if prompted.
    • Reopen the folder inside the container.
  3. Run the example
    Once inside the container, start the demo:

    bash /tinynav/scripts/run_rosbag_examples.sh

    You should see an RViz window displaying the live planning process:

    (RViz screenshot)


📜 What run_rosbag_examples.sh Does

The script automates the entire demo workflow:

  1. Plays the dataset
    Streams a recorded dataset from a ROS bag.

  2. Runs TinyNav pipeline

    • perception_node.py → Performs localization and builds the local map.
    • planning_node.py → Computes the robot's optimal path.
    • RViz → Visualizes the robot's state and planned trajectory in real time.
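Conceptually, the hand-off between the two nodes is a producer/consumer pipeline: perception publishes pose estimates, planning consumes them and emits trajectories. The queue-based sketch below is only a toy analogy (in the real system, ROS topics carry the messages, and all names here are made up).

```python
import queue
import threading

# Toy stand-in for the perception -> planning hand-off.
pose_queue = queue.Queue()

def perception():
    for t in range(3):                      # pretend we processed 3 frames
        pose_queue.put({"t": t, "pose": (t * 0.1, 0.0)})
    pose_queue.put(None)                    # end-of-stream marker

def planning(out):
    # Consume pose messages until the stream ends, emitting a (fake)
    # remaining-distance-to-goal value per frame.
    while (msg := pose_queue.get()) is not None:
        out.append({"t": msg["t"], "goal_dist": 1.0 - msg["pose"][0]})

trajectories = []
t1 = threading.Thread(target=perception)
t2 = threading.Thread(target=planning, args=(trajectories,))
t1.start(); t2.start(); t1.join(); t2.join()
```

The decoupling is the point: each stage runs at its own rate, and the queue (or topic) absorbs the timing differences between them.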

✨ With these steps, you'll have the full TinyNav system up and running in minutes.

Developer Guide

Using Dev Containers

TinyNav supports Dev Containers for a consistent and reproducible development experience.

Using VS Code

  1. Open the tinynav folder in Visual Studio Code.
  2. Ensure the Dev Containers extension is installed.
  3. VS Code will automatically start the container and open a terminal inside it.

Using the Dev Container CLI

If you prefer the command line:

    # Install the Dev Containers CLI
    npm install -g @devcontainers/cli

    # Start the Dev Container
    devcontainer up --workspace-folder .

    # Open a shell inside the container
    devcontainer exec --workspace-folder . bash

First-Time Setup (Inside the Dev Container)

After entering the development container, set up the Python environment:

    uv venv --system-site-packages
    uv sync

This will create a virtual environment and install all required dependencies.

Next Steps

  • Highly optimized NN models: support real-time perception processing at >= 30 fps.
  • Map module enhancements: improve consistency and accuracy for mapping and localization.
  • End-to-end trajectory planning: deliver robust and safe trajectories with integrated semantic information.

📊 Lines of Code

-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          11            328            154           1959
C++                              3             49             32            292
Markdown                         2             76              6            167
Bourne Shell                     8              9              8            109
Dockerfile                       1             12             10             46
TOML                             1              6              0             33
JSON                             1              4              0             25
CMake                            1              4              0             16
XML                              1              0              0             13
-------------------------------------------------------------------------------
SUM:                            29            488            210           2660
-------------------------------------------------------------------------------

Team

We are a small, dedicated team with experience working on various robots and headsets.

Contributors ✨

Thanks goes to these wonderful people (emoji key):

  • YANG Zhenfei 💻
  • junlinp 💻
  • heyixuan-DM 💻
  • xinghan li 💻

This project follows the all-contributors specification. Contributions of any kind welcome!

Sponsors โค๏ธ

Thanks to our sponsor(s) for supporting the development of this project:

DeepMirror - https://www.deepmirror.com/
