[FRONTEND][TFLITE] TFLite frontend ported from Relay #18685
srkreddy1238 wants to merge 8 commits into apache:main
Conversation
Summary of Changes (Gemini Code Assist): This pull request integrates a TensorFlow Lite (TFLite) frontend into the TVM Relax framework, allowing for the conversion and execution of TFLite models. The implementation includes a specialized FlexBuffer parser to handle TFLite's custom options and is validated by a broad range of tests covering common TensorFlow operations and popular Keras classification models. This lays the groundwork for broader TFLite model support in Relax.
Code Review
This pull request introduces the TFLite frontend for Relax, enabling conversion of TFLite graphs into Relax graphs. It includes a Flexbuffer decoder for custom options and a comprehensive set of tests for various TensorFlow operations and Keras applications. While the overall structure is good, there are several critical issues related to missing imports, incorrect handling of data types, and potential bugs in the Flexbuffer deserialization and quantized operations that need to be addressed before merging. Additionally, some security and maintainability concerns were identified.
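The FlexBuffer format mentioned above stores its root value at the end of the buffer: the last byte gives the root's byte width, and the byte before it packs the type tag (upper 6 bits) with the value's bit width (lower 2 bits). As a rough sketch of the deserialization this PR's decoder has to do, here is a minimal reader for the simplest case, an inline scalar int, assuming the standard FlexBuffer layout (the PR's decoder covers the full type range):

```python
def decode_flexbuffer_scalar_int(buf: bytes) -> int:
    """Decode a FlexBuffer whose root is an inline signed integer.

    Layout (little-endian): [value bytes][packed type][root byte width]
    """
    root_width = buf[-1]          # bytes occupied by the root value
    packed_type = buf[-2]
    fb_type = packed_type >> 2    # upper 6 bits: FlexBuffer type tag
    if fb_type != 1:              # 1 == FBT_INT in the FlexBuffers spec
        raise ValueError(f"expected an int root, got type tag {fb_type}")
    value = buf[-2 - root_width:-2]
    return int.from_bytes(value, "little", signed=True)

# The 3-byte buffer below encodes the int 42: value byte 0x2a, packed
# type 0x04 (FBT_INT with 8-bit width), root byte width 1.
print(decode_flexbuffer_scalar_int(b"\x2a\x04\x01"))  # 42
```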
import ssl

ssl._create_default_https_context = ssl._create_unverified_context
        # Limiting the tests for CI
        (keras_app.Xception, (1, 299, 299, 3)),
        # (keras_app.VGG16, (1, 224, 224, 3)),
        # (keras_app.VGG19, (1, 224, 224, 3)),
        (keras_app.ResNet50, (1, 224, 224, 3)),
        # (keras_app.ResNet50V2, (1, 224, 224, 3)),
        # (keras_app.ResNet101, (1, 224, 224, 3)),
        # (keras_app.ResNet101V2, (1, 224, 224, 3)),
        # (keras_app.ResNet152, (1, 224, 224, 3)),
        # (keras_app.ResNet152V2, (1, 224, 224, 3)),
        (keras_app.InceptionResNetV2, (1, 299, 299, 3)),
        # (keras_app.MobileNet, (1, 224, 224, 3)),
        (keras_app.MobileNetV2, (1, 224, 224, 3)),
        (keras_app.DenseNet121, (1, 224, 224, 3)),
        # (keras_app.DenseNet169, (1, 224, 224, 3)),
        # (keras_app.DenseNet201, (1, 224, 224, 3)),
        (keras_app.NASNetMobile, (1, 224, 224, 3)),
        # (keras_app.NASNetLarge, (1, 331, 331, 3)),
        (keras_app.EfficientNetB0, (1, 224, 224, 3)),
        # (keras_app.EfficientNetB1, (1, 240, 240, 3)),
        # (keras_app.EfficientNetB2, (1, 260, 260, 3)),
        # (keras_app.EfficientNetB3, (1, 300, 300, 3)),
        # (keras_app.EfficientNetB4, (1, 380, 380, 3)),
        # (keras_app.EfficientNetB5, (1, 456, 456, 3)),
        # (keras_app.EfficientNetB6, (1, 528, 528, 3)),
        # (keras_app.EfficientNetB7, (1, 600, 600, 3)),
        (keras_app.EfficientNetV2B0, (1, 224, 224, 3)),
        # (keras_app.EfficientNetV2B1, (1, 240, 240, 3)),
        # (keras_app.EfficientNetV2B2, (1, 260, 260, 3)),
        # (keras_app.EfficientNetV2B3, (1, 300, 300, 3)),
        # (keras_app.EfficientNetV2S, (1, 384, 384, 3)),
        # (keras_app.EfficientNetV2M, (1, 480, 480, 3)),
        # (keras_app.EfficientNetV2L, (1, 480, 480, 3)),
        (keras_app.ConvNeXtTiny, (1, 224, 224, 3)),
        # (keras_app.ConvNeXtSmall, (1, 224, 224, 3)),
        # (keras_app.ConvNeXtBase, (1, 224, 224, 3)),
        # (keras_app.ConvNeXtLarge, (1, 224, 224, 3)),
        # (keras_app.ConvNeXtXLarge, (1, 224, 224, 3)),
    ],
Many network tests are commented out with the note "Limiting the tests for CI". While this might be necessary for CI resource constraints, it means a significant portion of the TFLite frontend's functionality for these Keras models is not being tested. This could lead to regressions or undetected issues. Consider enabling these tests for local development or in a less constrained CI environment, or adding a tracking issue to re-enable them.
    return mod


def verify(TestClass, expected=None):
Please avoid e2e tests to reduce CI pressure. Structural equality is better. Nightly might be okay.
Do we have a test marker now to skip these while in CI?
You can put nightly tests in tests/python/nightly.
Instead of duplicating tests under tests/python/nightly, I managed it by calling the tflite tests from tests/scripts/task_python_nightly.sh with a different environment.
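Driving the same test file with a different environment can be done with a small gate inside the test module. A sketch, assuming a hypothetical `TVM_TEST_NIGHTLY` variable (the actual flag exported by task_python_nightly.sh may differ):

```python
import os

def nightly_enabled() -> bool:
    # The nightly driver script (e.g. tests/scripts/task_python_nightly.sh)
    # would export this flag; regular CI would leave it unset.
    return os.environ.get("TVM_TEST_NIGHTLY", "0") == "1"

def model_params(all_models, ci_models):
    # Run the full Keras model list nightly, the trimmed list in CI,
    # instead of commenting entries out.
    return all_models if nightly_enabled() else ci_models
```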
import ssl

ssl._create_default_https_context = ssl._create_unverified_context
This is for the network tests, where Keras tries to download the model weights.
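Rather than replacing the default HTTPS context process-wide, one alternative is an explicit unverified context passed only to the call that needs it. A sketch (the `urlopen` usage is illustrative; since Keras's own downloader does not take a context argument, the global patch may still be the pragmatic choice in this test file):

```python
import ssl
import urllib.request

def insecure_context() -> ssl.SSLContext:
    # Explicitly unverified context, scoped to a single call instead of
    # monkeypatching ssl._create_default_https_context globally.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx

# Illustrative usage for a direct download:
# urllib.request.urlopen(weights_url, context=insecure_context())
```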
    def __init__(self, model, subgraph, exp_tab, ctx):
        try:
            from tflite.ActivationFunctionType import ActivationFunctionType
I find it a bit annoying that the codebase has so many duplicate import statements. I’d prefer consolidating imports at the top level.
The frontends do not import the corresponding frameworks globally. I tried to clean this up by removing the exception handling and dropping unnecessary imports within the op handlers.
Verified for the entire range of classification nets. Quantization is disabled at the moment. There exist a few unsupported ops in the conversion maps which need to be mapped in future as the Relax op inventory grows.
Force-pushed 03918e8 to 8e93e22 (Compare)
Force-pushed 8e93e22 to 34d07ae (Compare)