A full-stack AI application for identifying plant species. It features a modern React + Vite frontend with a beautiful UI and a FastAPI + TensorFlow Lite backend for highly optimized, real-time inference using the MobileNet-based AIY Plants model.
- Real-time Recognition: Upload any plant image and get instant identification.
- Optimized AI: Migrated to TensorFlow Lite (TFLite) for 90% smaller package size and faster inference.
- Interactive UI: Drag-and-drop uploads, live previews, and detailed results card.
- Result Insights (an illustrative result shape is sketched after this list):
  - Scientific Name & Common Name
  - Confidence Score
  - Health Status Check
  - Detailed Description
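As an illustration only, a single result covering these fields might look like the Python dictionary below; the key names and sample values are assumptions for this sketch, not the actual API contract returned by the backend.

```python
# Illustrative result shape only -- keys and values are assumptions, not the real API contract.
example_result = {
    "scientific_name": "Ficus lyrata",
    "common_name": "Fiddle-leaf Fig",
    "confidence": 0.93,
    "health_status": "Healthy",
    "description": "A broadleaf evergreen commonly grown as a houseplant.",
}
```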
- Frontend: React, Vite, Tailwind CSS, Lucide React
- Backend: FastAPI, Python 3.9+, Uvicorn
- AI/ML: TensorFlow Lite, NumPy, Pillow
- Node.js (v18+)
- Python (v3.9+)
```bash
git clone https://github.com/dineshingale/Plant-Recognizer.git
cd Plant-Recognizer
```

Set up the Python server.
```bash
# Create virtual environment
python -m venv venv

# Activate (Windows)
.\venv\Scripts\activate

# Activate (Mac/Linux)
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt
```

Since we use a TFLite optimized model, you must generate/download it first.
```bash
# Run this script once to download and convert the model
python convert_model.py
```

This will generate a `plants.tflite` file in your root directory.
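For reference, here is a minimal sketch of what a conversion script like `convert_model.py` might do, assuming the AIY Plants V1 classifier is pulled from TensorFlow Hub and converted with the standard `TFLiteConverter` API; the Hub URL, wrapper model, and quantization settings below are assumptions, not necessarily the script's actual contents.

```python
# Hypothetical sketch of a TF Hub -> TFLite conversion (details are assumptions).
import tensorflow as tf
import tensorflow_hub as hub

MODEL_URL = "https://tfhub.dev/google/aiy/vision/classifier/plants_V1/1"

# Wrap the Hub classifier in a Keras model with the expected 224x224 RGB input.
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(224, 224, 3)),
    hub.KerasLayer(MODEL_URL),
])

# Convert to TFLite with default weight quantization for a smaller, faster model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("plants.tflite", "wb") as f:
    f.write(tflite_model)
```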
```bash
python server.py
```

The API will be available at `http://localhost:8000`.
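Once the server is up, you can sanity-check it with a quick request. The `/predict` endpoint name and the `file` upload field used here are assumptions based on a typical FastAPI upload handler, so adjust them to match `server.py`.

```python
# Hypothetical sanity check against the running API (endpoint and field names are assumed).
import requests

with open("sample_leaf.jpg", "rb") as f:
    resp = requests.post("http://localhost:8000/predict", files={"file": f})

print(resp.status_code)
print(resp.json())  # expected to contain the fields listed under Result Insights
```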
Launch the React Client interface.
```bash
# Open a new terminal and navigate to Client
cd Client

# Install dependencies
npm install

# Run Development Server
npm run dev
```

The app will open at `http://localhost:5173`.
- Ensure both Server (port 8000) and Client (port 5173) are running.
- Open `http://localhost:5173`.
- To identify a plant:
  - Drag & drop an image onto the upload zone, or click to browse your files.
  - Click the Recognize Plant button.
  - View the detailed results, including the confidence score and species info.
```
Plant-Recognizer/
├── Client/              # React Frontend
│   ├── src/
│   │   ├── components/  # UI Components
│   │   ├── App.jsx      # Main Logic
│   │   └── main.jsx     # Entry Point
│   └── package.json
├── src/                 # Legacy CLI Script
├── convert_model.py     # Script to generate TFLite model
├── plants.tflite        # Optimized Model File (Generated)
├── server.py            # FastAPI Backend Server
├── requirements.txt     # Python Dependencies
└── run.py               # Helper Script
```
This project uses the Google AIY Plants V1 model, converted to TFLite:
- Source: TensorFlow Hub
- Optimization: Quantized/Float16 TFLite format for rapid CPU inference.
- Input: 224x224 RGB Images (see the inference sketch below)
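For reference, a minimal sketch of how the generated `plants.tflite` file can be loaded and queried with the TFLite interpreter; the preprocessing (resize to 224x224, scale to [0, 1]) and the sample file name are assumptions, not necessarily what `server.py` does.

```python
# Hypothetical TFLite inference sketch (preprocessing and paths are assumptions).
import numpy as np
import tensorflow as tf
from PIL import Image

# Load the converted model and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="plants.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Preprocess: 224x224 RGB, float32, scaled to [0, 1] (assumed normalization).
img = Image.open("sample_leaf.jpg").convert("RGB").resize((224, 224))
x = np.expand_dims(np.asarray(img, dtype=np.float32) / 255.0, axis=0)

# Run inference and report the highest-probability class index.
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
probs = interpreter.get_tensor(output_details[0]["index"])[0]
print("Top class index:", int(np.argmax(probs)))
```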
This project uses a robust Jenkins pipeline running on Docker.
- Dockerized Environment: Tests run inside a self-contained Docker container (Python 3.9 + Node 20 + Chrome).
- Self-Contained E2E Testing: Backend and frontend start locally inside the container for integration testing.
- Artifact Extraction: Test reports (`junit`) and screenshots are safely extracted from Docker to Windows.
- Auto-Merge: Merges to `main` ONLY if tests pass.
- Failsafe: Prevents broken code from reaching production.
To simulate the CI environment:
```bash
# Build the image
docker build -t plant-test .

# Run the full pipeline script
docker run --rm -v %CD%:/app plant-test
```

We use Selenium with Pytest for End-to-End (E2E) testing; a minimal test sketch appears after the notes below.
```bash
# Run tests manually
python -m pytest tests/test_app.py
```

- Headless Mode: Enabled by default in CI, disabled locally for debugging.
- Artifacts: Screenshots of failures are saved to the project root.
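Here is a minimal sketch of what such an E2E test might look like, assuming Chrome driven by Selenium 4 and a client already running on `http://localhost:5173`; the `CI` environment toggle and the page-title assertion are illustrative assumptions, not the actual contents of `tests/test_app.py`.

```python
# Hypothetical E2E test sketch in the style of tests/test_app.py (details are assumptions).
import os

import pytest
from selenium import webdriver
from selenium.webdriver.chrome.options import Options


@pytest.fixture
def driver():
    options = Options()
    # Headless in CI, visible locally for debugging (the CI variable is an assumption).
    if os.environ.get("CI"):
        options.add_argument("--headless=new")
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()


def test_home_page_loads(driver):
    driver.get("http://localhost:5173")
    assert "plant" in driver.title.lower()  # assumed: the page title mentions "plant"
```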
Built with 💚 using React & Python
