Tensorflow-Deployment

HitCount PRs Welcome

Deploying TensorFlow models with TensorFlow Serving and Docker, together with a simple Flask web application that serves as an interface for getting predictions from the served TensorFlow model.

Frameworks:

Flask

Flask is a micro web framework written in Python. It is classified as a microframework because it does not require particular tools or libraries. It has no database abstraction layer, form validation, or any other components where pre-existing third-party libraries provide common functions.
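To show the role Flask plays in this project, here is a minimal sketch of an upload page that hands an image to a prediction helper and displays the result. The route, the inline template, and the `get_prediction` stub are illustrative assumptions, not this repository's actual code; in the real app the helper would call TensorFlow Serving (see the sketch in the next section).

```python
# Minimal Flask front end (illustrative sketch, not this repository's app).
from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """
<h1>Image Classifier</h1>
<form method="post" enctype="multipart/form-data">
  <input type="file" name="image">
  <input type="submit" value="Predict">
</form>
{% if label %}<p>Prediction: {{ label }}</p>{% endif %}
"""

def get_prediction(image_bytes):
    # Placeholder: the real helper would preprocess the image and query the
    # TensorFlow Serving REST endpoint (see the next sketch).
    return "cat"

@app.route("/", methods=["GET", "POST"])
def index():
    label = None
    if request.method == "POST" and "image" in request.files:
        label = get_prediction(request.files["image"].read())
    return render_template_string(PAGE, label=label)

if __name__ == "__main__":
    app.run(debug=True)
```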

TensorFlow Serving

TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.
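To make the serving flow concrete, the sketch below shows how a client (here, the `get_prediction` helper used by the Flask app) might call a model served by the TensorFlow Serving Docker container over its REST API. The model name `cats_vs_dogs`, the 224x224 input size, port 8501, and the single-sigmoid-output assumption are placeholders for illustration, not details taken from this repository.

```python
# Illustrative client for a TensorFlow Serving REST endpoint.
# Assumptions: the container maps port 8501, the model is served under the
# name "cats_vs_dogs", and it expects 224x224 RGB inputs scaled to [0, 1].
import io

import numpy as np
import requests
from PIL import Image

SERVING_URL = "http://localhost:8501/v1/models/cats_vs_dogs:predict"

def get_prediction(image_bytes):
    # Preprocess the uploaded image into a batch of one example.
    image = Image.open(io.BytesIO(image_bytes)).convert("RGB").resize((224, 224))
    batch = (np.asarray(image, dtype=np.float32) / 255.0)[np.newaxis, ...]

    # TensorFlow Serving's REST predict API takes {"instances": [...]}
    # and returns {"predictions": [...]}.
    response = requests.post(SERVING_URL, json={"instances": batch.tolist()})
    response.raise_for_status()
    prediction = response.json()["predictions"][0]
    score = prediction[0] if isinstance(prediction, list) else prediction

    # Assumes a single sigmoid output: > 0.5 means "dog", otherwise "cat".
    return "dog" if score > 0.5 else "cat"
```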

Working Diagram:

Required Setup:

  • Python3 with TensorFlow, Flask, and Flask-Bootstrap
  • Docker with TensorFlow Serving (see the model-export sketch below)
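TensorFlow Serving loads models from a versioned SavedModel directory, so before the Docker container can serve anything, the trained model has to be exported in that layout. The sketch below uses a placeholder Keras model and paths purely for illustration; the docker command in the comments is the standard way of running the official tensorflow/serving image, not necessarily the exact invocation used here.

```python
# Export a trained Keras model in the versioned layout TensorFlow Serving expects:
#   models/cats_vs_dogs/1/saved_model.pb (+ variables/, assets/)
# The model architecture and paths are placeholders for illustration.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

export_path = "models/cats_vs_dogs/1"  # "1" is the model version directory
tf.saved_model.save(model, export_path)

# The TensorFlow Serving container would then typically be started with
# something like (standard usage of the official image, not repo-specific):
#   docker run -p 8501:8501 \
#     -v "$(pwd)/models/cats_vs_dogs:/models/cats_vs_dogs" \
#     -e MODEL_NAME=cats_vs_dogs tensorflow/serving
```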

Output:

Trying with a cat's image

Cat Prediction

Also trying with a dog's image

Dog Prediction
