
🤖 Intent Classification Chatbot

Intent detection using BERT, served with FastAPI and wrapped in a Flask web application.
This project uses a fine-tuned BERT model, an encoder-only transformer (no decoder), to detect user intent from text inputs (e.g., chatbot queries). A lightweight FastAPI service classifies input sentences into predefined intent categories, and a Flask web application provides the chatbot interface.

🌐 Access the Chatbot Application from HERE

🤗 Access the Huggingface pretrained Model from HERE

🔧 Features

  • ✅ Pretrained BERT fine-tuned on CLINC150
  • 🧠 Real-time intent classification from natural text
  • 🌐 REST API using Flask
  • 🤖 Easy to integrate with chatbots, voice assistants, or NLP systems

📊 Dataset: CLINC150

The project uses the CLINC150 dataset, a benchmark dataset for intent classification in task-oriented dialogue systems.

🧾 Overview

  • Total intents: 150 unique user intents
  • Domains: 10 real-world domains (e.g., banking, travel, weather, small talk)
  • Examples: ~22,500 utterances
  • Language: English
  • Out-of-scope (OOS): Includes OOS examples to test robustness
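
For quick inspection, CLINC150 is published on the Hugging Face Hub as clinc_oos. The sketch below assumes the datasets library; the configuration name used here ("small") is an assumption, since this README does not say which variant the model was trained on.

```python
# Sketch (assumption): loading CLINC150 from the Hugging Face Hub.
# The "small" config is one of the published variants ("small", "imbalanced", "plus");
# the project does not state which one it uses.
from datasets import load_dataset

clinc = load_dataset("clinc_oos", "small")
print(clinc)                                        # DatasetDict with train/validation/test splits
print(clinc["train"][0])                            # {'text': ..., 'intent': ...}
print(clinc["train"].features["intent"].names[:5])  # first few intent label names
```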

📁 Dataset Splits

Split        Examples
Train        7,000
Validation   3,000
Test         5,500

📦 Source

🚀 Example

Request

{
  "text": "I want to book a flight"
}

Response

{
  "intent": "book_flight"
}
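
The README does not document the exact endpoint path or port, so the client sketch below uses an assumed http://localhost:8000/predict URL; adapt it to wherever the FastAPI server is running.

```python
# Hypothetical client for the intent API; the URL and route are assumptions.
import requests

API_URL = "http://localhost:8000/predict"  # assumed FastAPI host, port, and route

resp = requests.post(API_URL, json={"text": "I want to book a flight"}, timeout=10)
resp.raise_for_status()
print(resp.json())  # expected shape: {"intent": "book_flight"}
```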

⚙️ Tech Stack

  • Backend: FastAPI, PyTorch, Transformers
  • Frontend: HTML, CSS, JavaScript
  • Model: BERT-base fine-tuned on CLINC150
  • Deployment: Docker, Hugging Face Spaces
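
For local experimentation outside the API, the fine-tuned model can in principle be loaded directly with the Transformers library. The Hub repository id below is a placeholder (the real one is behind the "HERE" link above), so treat this as a sketch rather than the project's documented workflow.

```python
# Sketch of direct inference with the fine-tuned BERT classifier.
# "<hf-username>/<intent-bert-model>" is a placeholder repository id.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "<hf-username>/<intent-bert-model>"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("I want to book a flight", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])  # e.g. "book_flight"
```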

🧑🏻‍💻 Usage

  1. Build the Docker image from the Dockerfile and run it as a container to set up the environment.
  2. Load the pretrained intent BERT model and tokenizer by executing model/api/api.py.
  3. Start the FastAPI server by running model/api/start_server.py.
  4. (Optional) Test the FastAPI endpoint by running model/api/test.py.
  5. Run the Flask web application by executing src/main.py (a sketch of how the Flask app might forward requests to the FastAPI server follows this list).
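
The repository's actual src/main.py is not reproduced here; the sketch below only illustrates the architecture described above, with a Flask route forwarding user text to the FastAPI model server. Route names, port numbers, and payload shape are assumptions.

```python
# Hypothetical Flask front end forwarding requests to the FastAPI model server.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
FASTAPI_URL = "http://localhost:8000/predict"  # assumed model-server endpoint

@app.route("/classify", methods=["POST"])
def classify():
    text = request.get_json(force=True).get("text", "")
    model_resp = requests.post(FASTAPI_URL, json={"text": text}, timeout=10)
    model_resp.raise_for_status()
    return jsonify(model_resp.json())  # e.g. {"intent": "book_flight"}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```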

🙋🏻‍♂️ Author

Saher Muhamed
