State-of-the-art architecture for Plant Disease Detection using Deep Learning.

ResTS: Residual Deep Interpretable Architecture for Plant Disease Detection


This is the source code of the ResTS architecture described in the paper:

We propose an architecture named ResTS (Residual Teacher/Student) that can be used as both a visualization and a classification technique for the diagnosis of plant diseases. ResTS is a tertiary adaptation of the formerly suggested Teacher/Student architecture. It is grounded in a Convolutional Neural Network (CNN) structure that comprises two classifiers (ResTeacher and ResStudent) and a decoder. The architecture trains both classifiers in a reciprocal mode, and the representation conveyed between ResTeacher and ResStudent is used as a proxy to visualize the dominant areas in the image for categorization. The proposed ResTS structure (F1 score: 0.991) surpasses the Teacher/Student architecture (F1 score: 0.972) owing to the residual connections introduced in all components, and it yields finer visualizations of disease symptoms. All test results are obtained on the PlantVillage dataset, which comprises 54,306 images of 14 crop species.

Architecture
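
The block below is a minimal tf.keras sketch of the three components described above (ResTeacher, decoder, ResStudent) and how they are wired together. The layer widths, input size, number of classes and losses are illustrative assumptions, not the published configuration; the exact backbones and training objectives are described in the paper.

    # Minimal sketch of the ResTS layout (illustrative sizes, not the published model).
    import tensorflow as tf
    from tensorflow.keras import layers, Model

    NUM_CLASSES = 38             # PlantVillage class count (assumption)
    INPUT_SHAPE = (224, 224, 3)  # input resolution (assumption)

    def residual_block(x, filters):
        """3x3 conv block with a 1x1 projection shortcut."""
        shortcut = layers.Conv2D(filters, 1, padding="same")(x)
        y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        y = layers.Conv2D(filters, 3, padding="same")(y)
        return layers.Activation("relu")(layers.Add()([y, shortcut]))

    inputs = layers.Input(INPUT_SHAPE)

    # ResTeacher: residual classifier on the input image.
    t = residual_block(inputs, 32)
    t = layers.MaxPooling2D()(t)
    t = residual_block(t, 64)
    teacher_out = layers.Dense(NUM_CLASSES, activation="softmax", name="resteacher")(
        layers.GlobalAveragePooling2D()(t))

    # Decoder: maps ResTeacher features back to an image-like tensor; this tensor is
    # the representation conveyed to ResStudent and the proxy used for visualization.
    d = layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(t)
    decoded = layers.Conv2D(3, 3, padding="same", activation="sigmoid", name="decoder")(d)

    # ResStudent: classifies the decoded representation, forcing it to retain the
    # disease-relevant regions of the input.
    s = residual_block(decoded, 32)
    s = layers.MaxPooling2D()(s)
    s = residual_block(s, 64)
    student_out = layers.Dense(NUM_CLASSES, activation="softmax", name="resstudent")(
        layers.GlobalAveragePooling2D()(s))

    rests = Model(inputs, [teacher_out, student_out, decoded])
    rests.compile(optimizer="adam",
                  loss={"resteacher": "categorical_crossentropy",
                        "resstudent": "categorical_crossentropy",
                        "decoder": "mse"})  # losses are illustrative assumptions
    rests.summary()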

Cite this paper as:

  1. D. Shah, V. Trivedi, V. Sheth, A. Shah, U. Chauhan, ResTS: Residual Deep Interpretable Architecture for Plant Disease Detection, Information Processing in Agriculture (2021), doi: https://doi.org/10.1016/j.inpa.2021.06.001

The dataset used for this paper can be downloaded from https://github.com/spMohanty/PlantVillage-Dataset/tree/master/raw/segmented. In this paper, we used the segmented version of the PlantVillage dataset, in which the leaves appear on a black background.

Working video of the WebApp:

Working of Flask+React app

Prerequisites

To run the code, the following packages are required (a quick version check is sketched after the list):

  • tensorflow==2.4.1
  • Keras==2.4.3
  • matplotlib==3.2.2
  • OpenCV==4.x
  • Pillow==7.0.0
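
As a quick sanity check, the installed versions can be printed from Python; this is only a minimal sketch, and the expected values are the pinned versions listed above.

    # Print the installed versions of the required packages.
    import tensorflow as tf
    import keras
    import matplotlib
    import cv2   # OpenCV
    import PIL   # Pillow

    print("tensorflow :", tf.__version__)         # expected 2.4.1
    print("keras      :", keras.__version__)      # expected 2.4.3
    print("matplotlib :", matplotlib.__version__) # expected 3.2.2
    print("opencv     :", cv2.__version__)        # expected 4.x
    print("pillow     :", PIL.__version__)        # expected 7.0.0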

Usage

  1. Install the required packages.
  2. Download the model.
  3. Copy the model into ./model.
  4. Copy input images with a black background into ./images.
  5. Run the following command from the project folder: "python visualization.py" (a sketch of this step is shown below).
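
For orientation, the sketch below shows roughly what step 5 does, under the assumption that the saved model in ./model loads with tf.keras and exposes the decoded (visualization) tensor as one of its outputs. The output folder, file pattern and 224x224 input size are illustrative and not the exact contents of visualization.py.

    # Hypothetical sketch of the visualization step (not the actual visualization.py).
    import glob
    import os
    import cv2
    import numpy as np
    import tensorflow as tf

    os.makedirs("./output", exist_ok=True)           # illustrative output folder
    model = tf.keras.models.load_model("./model")    # model copied in step 3

    for path in glob.glob("./images/*.jpg"):         # black-background inputs (step 4)
        img = cv2.imread(path)
        img = cv2.resize(img, (224, 224)).astype("float32") / 255.0  # size is an assumption
        outputs = model.predict(img[np.newaxis, ...])
        # Assume the last output is the decoded representation that highlights
        # the symptomatic regions used for classification.
        decoded = outputs[-1][0] if isinstance(outputs, list) else outputs[0]
        out_path = os.path.join("./output", os.path.basename(path))
        cv2.imwrite(out_path, (decoded * 255).astype("uint8"))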

The visualization method can also be used in an interactive environment via "test_visualization.ipynb".