This repository contains a Python chatbot project with a Streamlit web interface for conversations. It lets you interact with conversational AI models such as Llama through an easy-to-use chat UI.
streamlit_app.py
- Streamlit web application providing the chatbot UI

src/model_interaction.py
- Python module for calling Llama via Ollama
- Uses Ollama, which lets you run open-source large language models, such as Llama 2, locally
- The Llama server runs locally to power chatbot conversations
- Python 3.7+
- Ollama
- A Llama model installed and running locally through Ollama
- Streamlit
pip install streamlit
To install Ollama, see the installation instructions on the Ollama GitHub page.
- Start the Llama server with Ollama
ollama run llama2
Instead of llama2, you can substitute any other model supported by Ollama.
- Run the Streamlit app
streamlit run streamlit_app.py
- Chat with the bot at http://localhost:8501!
model_interaction.py
- Makes API calls to localhost Ollama server
- Parses the Llama response payload
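The call into the local Ollama server can be sketched like this. It is a minimal, illustrative sketch using only the standard library: the `/api/generate` endpoint and default port 11434 are Ollama's documented defaults, but the function names here are hypothetical and may not match the repository's actual code.

```python
import json
import urllib.request

# Ollama's default local HTTP endpoint for text generation
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "llama2") -> dict:
    """Build the JSON body for a single, non-streaming generate request."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_llama(prompt: str, model: str = "llama2") -> str:
    """Send a prompt to the local Ollama server and return the reply text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    # Ollama returns the generated text in the "response" field
    return body["response"]
```

With `stream` set to `False`, Ollama returns one JSON object for the whole reply; streaming responses instead arrive as one JSON object per line.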
streamlit_app.py
- Initializes Streamlit app
- Manages chat message state in session_state
- Displays chat UI using Streamlit components
- Calls model_interaction to get bot responses
- Displays user messages and bot responses in the chat history