Dig-Draw

This project demonstrates how to use TensorFlow Mobile on Android for handwritten digits classification from MNIST.

*Note: Since I did not want to do the training part, I skipped Steps 1 and 2.

How to build

Requirement

  • Python 3.6, TensorFlow 1.8.0
  • Android Studio 3.0, Gradle 4.1

Step 1. Training

Step 2. Model optimization

Step 3. Build Android app

Copy the mnist_optimized.pb generated in Step 2 to /android/app/src/main/assets, then build and run the app. (In this repo the model is already in the assets folder, so you can skip the copy.)

The Classifier creates a TensorFlowInferenceInterface from mnist_optimized.pb. TensorFlowInferenceInterface provides an interface for inference and performance summarization, and is included in the following library:

implementation "org.tensorflow:tensorflow-android:1.8.0"

FYI: How is TensorFlowInferenceInterface used?

Step 1 { Code — Initialization }

Initialize the TensorFlowInferenceInterface.

private static final String MODEL_PATH = "file:///android_asset/mnist_optimized.pb";
c.inferenceInterface = new TensorFlowInferenceInterface(assetManager, MODEL_PATH);

Step 2 { Code — Model Definitions }

The names of the input and output for the model come from our mnist.py training script:


private static final String INPUT_NAME = "input";
private static final String OUTPUT_NAME = "output";
  • To find the names of your input and output nodes, import your TensorFlow model into TensorBoard and inspect it there.

Step 3 {Code — Running the Classifier via TensorFlowInferenceInterface}

We feed in the pixel data, run the classifier, then fetch the outputs. Those outputs are then scanned to find the one with the highest confidence (above a specified threshold), which is shown to the user.

  • Copy the input data into TensorFlow.
inferenceInterface.feed(inputName, pixels, new long[]{inputSize * inputSize});
  • Run the inference call.
inferenceInterface.run(outputNames);
  • Copy the output Tensor back into the output array.
inferenceInterface.fetch(outputName, outputs);
  • Find the best classifications.
for (int i = 0; i < outputs.length; ++i) {
    <snip> 
}
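The elided loop above picks the best classification from the fetched outputs. A minimal, self-contained sketch of that last step (the `bestClass` helper, the 0.1f threshold, and the sample confidence array are illustrative assumptions, not from the repo's code):

```java
public class BestClassification {
    // Hypothetical helper: scans the confidence array fetched from the
    // model and returns the index of the highest score above `threshold`,
    // or -1 if no class clears it.
    static int bestClass(float[] outputs, float threshold) {
        int best = -1;
        float bestConfidence = threshold;
        for (int i = 0; i < outputs.length; ++i) {
            if (outputs[i] > bestConfidence) {
                bestConfidence = outputs[i];
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Example confidences for digits 0-9; digit 7 has the top score here.
        float[] outputs = {0.01f, 0.00f, 0.02f, 0.00f, 0.00f,
                           0.01f, 0.00f, 0.93f, 0.02f, 0.01f};
        System.out.println("best digit: " + bestClass(outputs, 0.1f)); // prints "best digit: 7"
    }
}
```

In the actual Classifier this index would map back to the digit label shown to the user.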

Credits