This project aims to develop a chatbot dedicated to customer support. The chatbot's role is to respond quickly and efficiently to users' questions about the KYRA platform.
The project is structured as follows:
```
Generative Chat/
├── backend
│   ├── Dockerfile
│   ├── requirements.txt
│   └── src
│       ├── controller.py
│       ├── main.py
│       ├── service.py
│       └── utils
│           ├── extract_data.py
│           └── segment_data.py
├── frontend
│   ├── Dockerfile
│   ├── requirements.txt
│   └── src
│       ├── main.py
│       └── kyra.png
└── docker-compose.yml
```
- **Backend:**
  - Manages core functionalities, including data extraction, service logic, and the chatbot API.
  - Main directories:
    - `src/`: Contains the main application logic.
    - `utils/`: Includes utility scripts for data handling and processing.
  - Key files:
    - `controller.py`: Manages routes and logic for the API.
    - `main.py`: The entry point for the backend service.
    - `service.py`: Contains the core logic of the chatbot service.
    - `Dockerfile`: Used for containerizing the backend service.
    - `requirements.txt`: Lists the dependencies required to run the backend.
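The controller/service split described above can be sketched as follows. This is a hypothetical illustration only: the class and method names below are assumed for the example and are not taken from the actual `controller.py` or `service.py`.

```python
class ChatService:
    """Sketch of service.py: core chatbot logic that turns a question into an answer."""

    def answer(self, question: str) -> str:
        # In the real service this would call the LLM with retrieved context;
        # here we just echo the question to keep the sketch self-contained.
        return f"Echo: {question}"


class ChatController:
    """Sketch of controller.py: the API layer that validates input and delegates."""

    def __init__(self, service: ChatService):
        self.service = service

    def handle(self, payload: dict) -> dict:
        # Validate the incoming request body before touching the service.
        question = payload.get("question", "").strip()
        if not question:
            return {"error": "question is required"}
        return {"answer": self.service.answer(question)}
```

Keeping the controller thin like this means the service logic can be tested without spinning up the API.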
- **Frontend:**
  - Manages the user interface, interacting with the backend to provide responses.
  - Main directories:
    - `src/`: Contains the frontend logic.
  - Key files:
    - `main.py`: The entry point for the frontend service.
    - `kyra.png`: A static image used in the frontend.
    - `Dockerfile`: Used for containerizing the frontend service.
    - `requirements.txt`: Lists the dependencies required to run the frontend.
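The frontend's interaction with the backend can be sketched as a small HTTP helper. Note the assumptions: the `/chat` endpoint path and the `{"question": ...}` / `{"answer": ...}` JSON shapes are hypothetical; check the actual routes in `backend/src/controller.py`.

```python
import json
import urllib.request


def build_chat_request(question: str,
                       base_url: str = "http://localhost:8000") -> urllib.request.Request:
    """Build a POST request carrying the user's question to the backend.

    The /chat path and payload shape are assumptions for this sketch.
    """
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        base_url + "/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )


def ask_backend(question: str) -> str:
    """Send the question to the backend and return the generated answer."""
    with urllib.request.urlopen(build_chat_request(question)) as resp:
        return json.loads(resp.read())["answer"]
```

The frontend `main.py` would call something like `ask_backend` each time the user submits a message, then render the returned answer.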
- **docker-compose.yml:**
  - Orchestrates the setup of the backend and frontend services.
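A minimal sketch of what this orchestration might look like, assuming the service names and the ports mentioned later in this README (backend on 8000, frontend on 8501); the real `docker-compose.yml` may differ:

```yaml
# Hypothetical sketch, not the actual docker-compose.yml.
services:
  backend:
    build: ./backend        # uses backend/Dockerfile
    ports:
      - "8000:8000"
  frontend:
    build: ./frontend       # uses frontend/Dockerfile
    ports:
      - "8501:8501"
    depends_on:
      - backend             # start the API before the UI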
To set up the project locally, follow these steps:
- **Clone the repository:**

  ```sh
  git clone https://gitlab.data-tricks.net/dt-solutions/trainee-codebase/2024/data-science-generative-chat-for-customer-support.git
  cd generative-chat
  ```
- **Build and start the Docker containers:**

  Run the following command to build and start both backend and frontend services:

  ```sh
  docker-compose up --build
  ```

  This command builds the Docker images as specified in the `Dockerfile` for both the backend and frontend services and starts the containers.
- **Access the application:**

  Once the containers are up and running, you can access the frontend interface at `http://localhost:8501`. The backend API should also be available at `http://localhost:8000`.
- **Stop the application:**

  To stop the running services, press `Ctrl+C` in the terminal where `docker-compose` is running.
- **Enter a Customer Message:**
  - Type your question or message about KYRA in the text area labeled "Ask a question about KYRA."
- **Trigger the Response:**
  - Press the Enter key to send the message and trigger the LLM to generate a response.
- **View the Response:**
  - The generated response is displayed in the interface, along with the previous conversation history.
- **Maintain Context:**
  - Use the conversation history to maintain context and continuity in customer interactions.
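Maintaining context across turns can be sketched as a small history container that the frontend keeps between messages. This is an illustrative design, not the project's actual implementation; the role/content message shape and the `max_turns` cap are assumptions.

```python
class Conversation:
    """Sketch of keeping chat history so each new question carries context."""

    def __init__(self):
        self.history: list[dict] = []

    def add(self, role: str, content: str) -> None:
        """Record one turn, e.g. role='user' or role='assistant'."""
        self.history.append({"role": role, "content": content})

    def context(self, max_turns: int = 10) -> list[dict]:
        # Send only the most recent turns to the LLM to bound prompt size.
        return self.history[-max_turns:]
```

On each submission the frontend would call `add("user", ...)`, pass `context()` along with the new question to the backend, then `add("assistant", ...)` with the reply.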