This repository contains a medical chatbot powered by the Medical-Llama3-8B model. The chatbot is designed to assist users with medical queries using a conversational retrieval chain.
## Features

- Utilizes the `Medical-Llama3-8B` model for medical information.
- Incorporates document embeddings and a vector store for improved information retrieval (see the retrieval-chain sketch after this list).
- Provides a user-friendly interface with Gradio.
- Customizable model parameters through the Gradio interface.
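The retrieval pipeline can be pictured with the minimal sketch below. It assumes LangChain-style components (`HuggingFaceEmbeddings`, `FAISS`, `ConversationalRetrievalChain`); the sample documents, embedding model, and the `ruslanmv/Medical-Llama3-8B` model ID are illustrative assumptions, and the actual notebook may wire these pieces differently.

```python
# Minimal sketch of a conversational retrieval chain, assuming LangChain-style components.
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.llms import HuggingFacePipeline
from langchain_community.vectorstores import FAISS
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from transformers import pipeline

# 1. Embed reference documents and index them in a vector store.
medical_docs = ["Anemia is a condition in which the blood lacks healthy red blood cells."]
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
vector_store = FAISS.from_texts(medical_docs, embeddings)

# 2. Wrap the language model (assumed here to be Medical-Llama3-8B) as a LangChain LLM.
generator = pipeline("text-generation", model="ruslanmv/Medical-Llama3-8B",
                     device_map="auto", max_new_tokens=256)
llm = HuggingFacePipeline(pipeline=generator)

# 3. Build a conversational retrieval chain that keeps chat history between turns.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=vector_store.as_retriever(search_kwargs={"k": 3}),
    memory=memory,
)

print(chain.invoke({"question": "What are common symptoms of anemia?"})["answer"])
```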
## Prerequisites

- Python 3.7 or higher
- A CUDA-compatible GPU, needed to run the model efficiently (a quick check is sketched below)
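Before loading the 8B model, it is worth confirming that PyTorch can actually see a GPU; assuming PyTorch is already installed, a quick check could look like this:

```python
# Sanity check that a CUDA-capable GPU is visible to PyTorch.
import torch

if torch.cuda.is_available():
    print("CUDA device:", torch.cuda.get_device_name(0))
else:
    print("No CUDA device found; an 8B model will be very slow on CPU.")
```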
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/EchoSingh/medical-chatbot.git
  cd medical-chatbot
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```

- Install the `bitsandbytes`, `transformers`, and `accelerate` packages (a model-loading sketch follows this list):

  ```bash
  pip install --upgrade bitsandbytes transformers accelerate -i https://pypi.org/simple/
  ```
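These three packages are typically used together to load a model of this size in 4-bit precision so it fits on a single GPU. The sketch below is illustrative only; the model ID `ruslanmv/Medical-Llama3-8B` and the quantization settings are assumptions, not taken from the notebook.

```python
# Illustrative 4-bit loading sketch; model ID and settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "ruslanmv/Medical-Llama3-8B"  # assumed Hugging Face checkpoint

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4 bits (bitsandbytes)
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,  # compute in fp16 on the GPU
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                     # accelerate handles device placement
)
```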
## Usage

- Open the `Biomistral` notebook.
- Run the Jupyter notebook cells.
- Open the provided link to access the Gradio interface.
Here's an example of how to use the chatbot:

- Type your question in the provided textbox.
- Adjust parameters such as `Max New Tokens`, `Temperature`, and `Context Length` as needed (a Gradio sketch follows this list).
- Read the chatbot's response in the scrollable output box.
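A stripped-down version of such an interface could be built as follows. `answer_query` is a hypothetical stand-in for the real call into the model and retrieval chain, and the slider ranges are illustrative defaults rather than the values used in the notebook.

```python
# Hypothetical Gradio front end exposing the parameters listed above.
import gradio as gr

def answer_query(question, max_new_tokens, temperature, context_length):
    # Stand-in for the real call into the retrieval chain / model.
    return f"(model response to: {question})"

demo = gr.Interface(
    fn=answer_query,
    inputs=[
        gr.Textbox(label="Question", placeholder="Ask a medical question..."),
        gr.Slider(32, 1024, value=256, step=32, label="Max New Tokens"),
        gr.Slider(0.1, 1.5, value=0.7, step=0.05, label="Temperature"),
        gr.Slider(512, 8192, value=2048, step=256, label="Context Length"),
    ],
    outputs=gr.Textbox(label="Answer", lines=12),
    title="Medical Chatbot",
)

demo.launch(share=True)  # prints the link mentioned in the Usage section
```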
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgements

- The `Biomistral` model
- Gradio, for providing a user-friendly interface for machine learning models