edgeai-tensorlab

Edge AI model training, quantization, compilation/benchmark & Model Zoo


Release 11.1

Updated to match the latest edgeai-tidl-tools release. For details about features and bug fixes in TIDL, please see the edgeai-tidl-tools documentation.

Models that are not enabled by default in edgeai-benchmark (model_shortlist=100) have been removed from edgeai-modelzoo as well.

Example config files for models (a _config.yaml file for each model) have been made more consistent - they now mostly contain only parameters related to the model properties.

More details are in the Release Notes


Cloning this repository

This repository is an aggregation of multiple repositories using git subtree. Due to the large number of commits, git clone can be slow, so we recommend cloning only to the required depth. For example, to clone the main branch with a shallow depth of 1:

git clone --depth 1 https://github.com/TexasInstruments/edgeai-tensorlab.git

If you have done a shallow clone and later need the full history, you can convert it into a full clone by fetching the entire history of commits:

git fetch --unshallow

Or, you can incrementally deepen the history:

git fetch --depth=<new_depth>
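
If you only need one branch, the shallow clone can also be combined with git's --single-branch option. This is a small sketch using standard git options, with the main branch shown as an example:

git clone --depth 1 --single-branch --branch main https://github.com/TexasInstruments/edgeai-tensorlab.git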

How to get started

Want to use Edge AI on TI's MPU devices but don't know where to start? We have multiple solutions to help you develop and deploy models.

Develop your model

EDGE-AI-STUDIO - easy-to-use GUI tools

  • Model Composer: Capture images, annotate them, train and compile models using a GUI.
  • Model Analyzer: Use our hosted Jupyter notebooks to try model compilation online.

edgeai-modelmaker - a command-line tool that supports the Bring Your Own Data (BYOD) development flow

  • Use EDGE-AI-STUDIO Model Composer (the GUI tool above) to collect and annotate data and create a dataset.
  • Export the dataset onto your machine.
  • Use edgeai-modelmaker to train a model using the dataset; it allows you to tweak more parameters than the GUI tool supports (see the example invocation after this list).
  • It is fully customizable, so you can look at how models and tasks are integrated and even add your own models or tasks.
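
A rough sketch of a command-line run is shown below. The script name, target device and config file are illustrative assumptions; check the edgeai-modelmaker documentation for the exact invocation and supported devices.

cd edgeai-modelmaker
# illustrative only: the actual run script, device name and config path may differ
./run_modelmaker.sh <target_device> <path_to_config>.yaml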

edgeai-modelzoo - for advanced users

  • Navigate to edgeai-modelzoo to see example models, their documentation and performance benchmarks.
  • Browse the repositories that were used to train those models and try training your own model with one of them.

Deploy your model



Components

Each subcomponent has detailed documentation; navigate into the sub-folders in the browser to see it. Here is a high-level overview.
  • Model Zoo / Models collection: edgeai-modelzoo
    Provides a collection of pretrained models, their documentation & benchmark information.
  • Model compilation & benchmarking: edgeai-benchmark
    Wrapper on top of edgeai-tidl-tools for easy model compilation and speed/accuracy benchmarking.
    - Bring your own model and compile, benchmark and generate artifacts for deployment on SDK with camera, inference and display (using edgeai-gst-apps)
    - Comprehends the inference pipeline, including dataset loading, pre-processing and post-processing
    - Benchmarking of accuracy and latency with large datasets
    - Post training quantization
    - Docker for easy development environment setup
  • Model training tools: edgeai-modeloptimization
    Model optimization tools for improved model training and for training TIDL-friendly models.
    - Model surgery: modifies models with minimal loss in accuracy to make them suitable for TI devices (replaces unsupported operators)
    - QAT: Quantization Aware Training to improve accuracy with fixed-point quantization
    - Model pruning/sparsity: induces sparsity during training (only applicable for specific devices; in development)
    - Does not support Tensorflow
  • Model training code: edgeai-torchvision, edgeai-mmdetection, edgeai-mmdetection3d, edgeai-hf-transformers, edgeai-mmpose, edgeai-tensorvision
    Training repositories for various tasks.
    - Provides extensions of popular training repositories (such as mmdetection and torchvision) with lite versions of models
    - Does not support Tensorflow
  • End-to-end model development (datasets, training & compilation): edgeai-modelmaker
    Beginner-friendly, command-line, integrated environment for training & compilation.
    - Bring your own data, select a model, perform training and generate artifacts for deployment on SDK
    - Backend tool for Model Composer (early availability of features compared to Model Composer)
    - Does not support a Bring Your Own Model workflow
  • Example datasets: edgeai-datasets
    Example datasets used in edgeai-modelmaker.

Deprecations

  • Model training code: edgeai-yolox is being deprecated. Use edgeai-mmpose for keypoint detection and edgeai-mmdetection for object detection.


Tech Reports

Technical documentation can be found in each repository. Here we have a collection of technical reports & tutorials that give a high-level overview of various topics - see Edge AI Tech Reports.



Acknowledgements

This umbrella repository uses and modifies several source repositories. The following table can be used to navigate to the original source repositories and see their contents & contributors. A sketch for comparing a sub-directory against its upstream source follows the table.

Sub-repository/Sub-directory Original source repository
edgeai-hf-transformers https://github.com/huggingface/transformers
edgeai-mmdeploy https://github.com/open-mmlab/mmdeploy
edgeai-mmdetection https://github.com/open-mmlab/mmdetection
edgeai-mmdetection3d https://github.com/open-mmlab/mmdetection3d
edgeai-mmpose https://github.com/open-mmlab/mmpose
edgeai-torchvision https://github.com/pytorch/vision
edgeai-yolox https://github.com/Megvii-BaseDetection/YOLOX
edgeai-benchmark NA
edgeai-modelzoo NA
edgeai-modelmaker NA
edgeai-modeloptimization NA
edgeai-tensorvision NA
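
As an illustration, a sub-directory can be compared against its upstream source with standard git commands. This is a sketch using the edgeai-hf-transformers sub-directory; the remote name is arbitrary and the upstream branch is assumed to be main.

# add the upstream project as an extra remote (the remote name is arbitrary)
git remote add upstream-transformers https://github.com/huggingface/transformers.git
# fetch its main branch (assumed default branch) into FETCH_HEAD
git fetch upstream-transformers main
# diff the fetched upstream tree against the local subtree directory
git diff FETCH_HEAD HEAD:edgeai-hf-transformers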

