This is a simple Python-based chatbot using the Ollama API and the Qwen 3 1.7B model. The chatbot maintains conversation history and streams responses in real time.
- Interactive command-line chat interface.
- Uses Qwen 3 1.7B model via Ollama.
- Streams responses as the model generates text.
- Maintains full conversation history.
- Easy exit with the `exit` or `quit` commands.
- Python 3.8 or higher
- Ollama Python SDK
- Internet connection (to pull the model; inference then runs locally via Ollama)
- Clone the repository:

  ```bash
  git clone https://github.com/behshadrhp/ShadBot.git
  cd ShadBot
  ```

- Create a virtual environment (optional but recommended):
  ```bash
  python -m venv venv
  source venv/bin/activate   # Linux/Mac
  venv\Scripts\activate      # Windows
  ```

- Install dependencies:
  ```bash
  pip install ollama
  ```

- Ensure Ollama CLI is installed and the model `qwen3:1.7b` is available:
  ```bash
  # install Ollama on Linux
  curl -fsSL https://ollama.com/install.sh | sh

  # install the model
  ollama pull qwen3:1.7b

  # list installed models
  ollama list
  ```
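If you prefer to verify the setup from Python, the following sketch (not part of the repository) asks the local Ollama server for the model's metadata; it only uses `ollama.show` and `ollama.ResponseError` from the Ollama Python SDK and assumes the Ollama server is already running:

```python
import ollama

# Illustrative check, not part of main.py: ollama.show raises ResponseError
# if the requested model has not been pulled yet.
try:
    ollama.show("qwen3:1.7b")
    print("qwen3:1.7b is available.")
except ollama.ResponseError as err:
    print(f"Model not found ({err.error}); run `ollama pull qwen3:1.7b` first.")
```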
Run the chatbot script:

```bash
python main.py
```

Then interact in the terminal:

```text
You: Hello!
Bot: Hi there! How can I help you today?
```
To exit the chatbot, type `exit` or `quit`.
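For illustration only (the variable names here are assumptions rather than the project's exact code), the exit check can be as simple as a case-insensitive comparison inside the chat loop:

```python
# Minimal sketch of the exit check; the real model call is replaced by a placeholder.
while True:
    user_input = input("You: ").strip()
    if user_input.lower() in ("exit", "quit"):
        print("Goodbye!")
        break
    print(f"Bot: (placeholder reply to) {user_input}")
```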
- `history`: Stores all messages exchanged between the user and the bot.
- `ai_model`: The model used (`qwen3:1.7b`).
- Streaming: Messages from the bot are streamed in real time for a responsive experience.
- Conversation memory: Every user and bot message is appended to `history` to maintain context (see the sketch after this list).
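A minimal sketch of how such a message history can be kept (the `remember` helper is hypothetical and introduced only for illustration; the dictionary format is the one the Ollama SDK expects for chat messages):

```python
# Hypothetical helper illustrating the conversation-memory idea.
history = []

def remember(role: str, content: str) -> None:
    """Append one message, in Ollama's chat format, to the shared history."""
    history.append({"role": role, "content": content})

remember("user", "Hello!")
remember("assistant", "Hi there! How can I help you today?")
print(history)  # the full context passed to ollama.chat on the next turn
```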
```python
response = ollama.chat(
    stream=True,
    model=ai_model,
    messages=history,
    think=False,
)
```

`stream=True` enables real-time streaming, and `think=False` disables the model's intermediate "thinking" output.
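Putting the pieces together, here is a hedged sketch of how the streamed chunks can be consumed and the reply appended back to the history (it assumes a locally running Ollama server with `qwen3:1.7b` pulled; variable names are illustrative):

```python
import ollama

history = [{"role": "user", "content": "Hello!"}]

# Stream the reply; each chunk carries an incremental piece of text
# under chunk["message"]["content"].
response = ollama.chat(
    stream=True,
    model="qwen3:1.7b",
    messages=history,
    think=False,
)

reply = ""
print("Bot: ", end="", flush=True)
for chunk in response:
    piece = chunk["message"]["content"]
    print(piece, end="", flush=True)
    reply += piece
print()

# Append the assembled reply so the next turn has the full context.
history.append({"role": "assistant", "content": reply})
```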
- Fork the repository.
- Create a branch (`git checkout -b feature-name`).
- Commit your changes (`git commit -m 'Add some feature'`).
- Push to the branch (`git push origin feature-name`).
- Open a pull request.
This project is licensed under the MIT License.
- Make sure your Ollama CLI is set up correctly before running the script.
- For large conversations, consider limiting `history` to improve performance (see the sketch after this list).
- This chatbot runs in the terminal and does not include a GUI by default.
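One possible way to cap the history is sketched below (the `trim_history` helper is hypothetical and not part of the project; keeping only the most recent messages bounds the prompt size at the cost of older context):

```python
from typing import Dict, List

# Hypothetical helper: keep only the most recent messages to bound prompt size.
def trim_history(history: List[Dict[str, str]], max_messages: int = 20) -> List[Dict[str, str]]:
    return history[-max_messages:]

# Example: trim before each call to ollama.chat.
history = [{"role": "user", "content": f"message {i}"} for i in range(50)]
history = trim_history(history)
print(len(history))  # 20
```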