A knowledge-based chatbot designed for seamless website integration, leveraging business information from text files to provide automated customer support. This chatbot is ideal for small businesses, e-commerce platforms, and customer service teams who need an efficient way to handle inquiries 24/7. It is particularly useful for automating responses to common questions, reducing response times, and improving user experience.
Why Choose This Chatbot?
- Unique Feature: QuéAI uses Natural Language Processing to understand free-form customer queries and recognize varied phrasings, slang, and context, enabling natural, seamless conversations.
- Key Benefits: Offers Contact Management, Feedback Collection, Multilingual Support, and Sales Assistance.
- Distinction: Unlike standard chatbots, it supports multiple Large Language Models (LLMs) including Ollama (local), OpenAI, Anthropic, and Google, allowing flexibility in performance and cost based on your needs.
Features

- Knowledge Management: Store business information in `.txt` files in the `knowledge/` directory.
- LLM Providers: Supports Ollama, OpenAI, Anthropic, and Google Generative AI, configurable via the Admin UI or environment variables.
- Semantic Search: Uses sentence transformers and ChromaDB to retrieve the most relevant content for each query (see the sketch after this list).
- Admin UI: Web interface (`/admin`) to manage knowledge, configure the LLM provider and model, customize widget appearance, and handle backups.
- Docker Support: Deploy with Docker and Docker Compose.
- WordPress Integration: Experimental plugin structure in `wordpress/chatbot/`.
- Monitoring: Basic Prometheus setup included.
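To make the semantic-search feature concrete, here is a minimal, self-contained sketch of the idea: index the `knowledge/*.txt` files with sentence-transformer embeddings through ChromaDB and retrieve the passages closest to a customer question. It is an illustration only, not the project's implementation; the collection name, embedding model, and one-document-per-file indexing are assumptions (see `knowledge_loader.py` and `chatbot.py` for the real logic).

```python
# Illustrative sketch only - not the project's actual retrieval code.
from pathlib import Path

import chromadb
from chromadb.utils import embedding_functions

# Sentence-transformer embeddings; the model name here is an assumption.
embed = embedding_functions.SentenceTransformerEmbeddingFunction(
    model_name="all-MiniLM-L6-v2"
)

client = chromadb.Client()  # in-memory client; a real setup may persist the index
collection = client.create_collection(name="knowledge_demo", embedding_function=embed)

# Index one document per .txt file; production code might split files into chunks.
for path in Path("knowledge").glob("*.txt"):
    collection.add(ids=[path.stem], documents=[path.read_text(encoding="utf-8")])

# Retrieve the passages most relevant to a customer question.
results = collection.query(
    query_texts=["What services do you offer?"],
    n_results=min(3, collection.count()),
)
for doc_id, doc in zip(results["ids"][0], results["documents"][0]):
    print(doc_id, "->", doc[:120])
```

The retrieved snippets then serve as context for whichever LLM provider is configured when it generates the final answer.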
Installation with Docker

- Clone Repository:
  ```bash
  git clone https://github.com/KazKozDev/ChatBot.git
  cd ChatBot
  ```
- Configure Environment (optional): Create a `.env` file for API keys (see `settings_manager.py` for variables such as `CHATBOT_PROVIDER`, `CHATBOT_MODEL_NAME`, and `CHATBOT_API_KEY`). Defaults to Ollama if not set. A sample `.env` is shown after this list.
- Build and Run:
  ```bash
  docker-compose up --build -d
  ```
  The server is typically available at `http://localhost:8000`.
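For reference, a `.env` file could look like the following. The variable names come from `settings_manager.py` as noted above; the values are placeholders (the exact provider strings accepted are defined in `settings_manager.py`), and the key can be omitted entirely when you stick with the local Ollama default.

```
# Placeholder values - adjust to your provider and model.
# Provider can be ollama, openai, anthropic, or google.
CHATBOT_PROVIDER=openai
# The model name below is only an example; use any model your provider offers.
CHATBOT_MODEL_NAME=gpt-4o-mini
# Not required when using local Ollama.
CHATBOT_API_KEY=sk-your-key-here
```

Restart the container (or server) after changing these values so they take effect.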
Manual Installation

- Prerequisites: Python 3.9+, Ollama (if you plan to run models locally), terminal access.
- Clone Repository:
  ```bash
  git clone https://github.com/KazKozDev/ChatBot.git
  cd ChatBot
  ```
- Virtual Environment:
  ```bash
  python3 -m venv venv
  source venv/bin/activate  # Windows: venv\Scripts\activate
  ```
- Install Dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Configure Environment (optional): See step 2 of the Docker installation.
- Prepare Knowledge Base: Add `.txt` files to the `knowledge/` directory, one per topic (e.g., `about.txt`, `services.txt`). An `example.txt` is created on first run if the directory is empty.
- Run Server:
  ```bash
  uvicorn server:app --host 0.0.0.0 --port 8000 --reload
  ```
  Note: `--reload` is for development; remove it for production. Ollama models are pulled automatically if needed. Once the server is up, you can verify it with the snippet below.
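Once the server is running (via Docker or manually), you can give it a quick smoke test. The root URL and the `/chat` path are taken from this README, but the request body used below (a JSON object with a `message` field) is an assumption; check `server.py` for the actual schema.

```python
# Quick smoke test for a locally running chatbot server.
# The /chat payload shape ("message") is an assumption - adjust to server.py.
import requests

BASE = "http://localhost:8000"

resp = requests.get(BASE, timeout=10)
print("GET / ->", resp.status_code)

chat = requests.post(
    BASE + "/chat",
    json={"message": "What are your opening hours?"},
    timeout=60,
)
print("POST /chat ->", chat.status_code)
print(chat.text[:300])  # print the raw body, since the response schema may differ
```

If the requests succeed, the API is reachable and you can move on to embedding the widget.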
Website Integration

Add this script before the `</body>` tag of your HTML page:

```html
<script>
  (function() {
    const serverUrl = 'http://localhost:8000'; // Replace with your server URL
    const script = document.createElement('script');
    script.src = serverUrl + '/static/chat-widget.js';
    script.async = true;
    document.head.appendChild(script);
    script.onload = function() {
      ChatWidget.init({
        botName: 'Assistant',
        apiUrl: serverUrl + '/chat'
      });
    };
  })();
</script>
```

Note: Update `http://localhost:8000` to your server's URL.
Admin UI

Open `/admin` (e.g., `http://localhost:8000/admin`) to:
- Configure LLM provider and model, input API keys.
- Manage knowledge files (view, edit, delete).
- Customize chat widget appearance.
- Handle backups.
Requirements

- Python 3.9+
- Docker & Docker Compose (for the Docker setup)
- Ollama (for local LLMs)
- Dependencies listed in `requirements.txt`
- ~2 GB+ RAM (for local LLMs)
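If you want to sanity-check these requirements before installing, the short script below verifies the Python version and probes a local Ollama server; it assumes Ollama's default API port (11434) and is only relevant if you use the local provider.

```python
# Optional pre-install sanity check: Python version and local Ollama availability.
import sys
import urllib.request

assert sys.version_info >= (3, 9), "Python 3.9+ is required"
print("Python", sys.version.split()[0], "OK")

# Ollama serves a local HTTP API on port 11434 by default; skip this part
# if you plan to use OpenAI, Anthropic, or Google instead.
try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=3) as resp:
        print("Ollama reachable, HTTP", resp.status)
except OSError:
    print("Ollama not reachable (fine if you use a hosted provider)")
```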
Project Structure

```
ChatBot/
├── knowledge/            # Knowledge base .txt files
├── static/               # Static assets (JS, CSS)
├── templates/            # HTML templates (widget, Admin UI)
│   └── admin/            # Admin UI templates & files
├── wordpress/            # Experimental WordPress plugin
├── .env                  # Environment variables (optional)
├── backup_manager.py     # Backup/restore logic
├── chatbot.py            # Chatbot logic, LLM integration
├── docker-compose.yml    # Docker configuration
├── Dockerfile            # Docker build instructions
├── knowledge_loader.py   # Knowledge file processing
├── requirements.txt      # Python dependencies
├── server.py             # FastAPI server, API, Admin UI
└── settings_manager.py   # Settings management
```
Contributing

Refer to `PROJECT_PLAN.md` for areas that need improvement. Follow the existing code patterns and update the documentation alongside your changes.
If you like this project, please give it a star ⭐
For questions, feedback, or support, reach out to: