The app looks like below in the emulator. It displays the result with the highest confidence.
[Mobile_App.mp4](https://user-images.githubusercontent.com/89344987/210223484-44402679-ed00-46f1-bd63-ace5d8d21cf4.mp4)

### Week 13 (26<sup>th</sup> December to 2<sup>nd</sup> January)
Now that the mobile app is finalized, I tried to convert the vgg_16 model I trained earlier to a `.tflite` model. This caused some unexpected issues: when I ran the code snippet that converts the model to `.tflite`, Colab ran out of RAM. I was using the free tier of Colab, which provides 12 GB of memory. I looked for solutions in online forums and tried limiting GPU memory growth, converting from a saved model, converting after freeing up memory by deleting some data, and reducing the batch size, as suggested in these threads: [solution 01](https://github.com/tensorflow/models/issues/1817), [solution 02](https://github.com/tensorflow/tensorflow/issues/40760). None of them worked. I also could not find any project that had converted a vgg_16 model to a `.tflite` model, and the [official TensorFlow Lite documentation](https://www.tensorflow.org/lite/guide/ops_compatibility) mentions that certain types of models cannot be converted to the `.tflite` format. So I decided this model may not be usable in my project. I then tried to train and convert an [InceptionV3](https://keras.io/api/applications/inceptionv3/) model, which also couldn't be converted. Finally, I came across some online tutorials on converting a [MobileNetV3](https://paperswithcode.com/method/mobilenetv3) model for use on edge devices. I switched to that model, successfully converted it to `.tflite`, and integrated it into my mobile app. The accuracy is still pretty low, so the remaining weeks will be spent fine-tuning it.
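
For reference, here is a minimal sketch of the conversion path that eventually worked with MobileNetV3. The model construction and the `saved_model_dir` / `model.tflite` paths are illustrative placeholders rather than the exact code from my notebook; in the project, the model was fine-tuned on my own dataset before conversion:

```python
import tensorflow as tf

# Placeholder model: a stock MobileNetV3 backbone stands in for the
# fine-tuned model used in the project.
model = tf.keras.applications.MobileNetV3Large(weights="imagenet")

# Export as a SavedModel first; converting from a SavedModel was one of
# the memory-saving workarounds suggested in the forum threads above.
tf.saved_model.save(model, "saved_model_dir")

# Convert the SavedModel to TensorFlow Lite.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
tflite_model = converter.convert()

# Write the .tflite flatbuffer out so it can be bundled with the mobile app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```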
