
Local AI Chatbot with Llama3, Ollama & Streamlit

This repository contains the code for a simple web application built with Streamlit, which uses Ollama to run the Llama 3 model for generating AI responses in a chat-like interface.
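At its core, the app is a Streamlit chat loop that keeps the conversation in session state and forwards it to the local Llama 3 model through Ollama. The snippet below is an illustrative sketch rather than the exact code in chatbot.py; it assumes the ollama Python client and Streamlit's chat elements:

# Illustrative sketch of a Streamlit + Ollama chat app (not necessarily the exact repository code)
import ollama
import streamlit as st

st.title("Local AI Chatbot with Llama 3")

# Keep the conversation across Streamlit reruns
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the chat history so far
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

# Read the next user prompt
if prompt := st.chat_input("Ask me anything"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # Send the full history to the local Llama 3 model via Ollama
    response = ollama.chat(model="llama3", messages=st.session_state.messages)
    reply = response["message"]["content"]

    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)

Because the whole message history is passed to ollama.chat on every turn, the model keeps context for follow-up questions.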

Prerequisites

  1. Python 3.8 or above
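
Since the app uses Ollama to run Llama 3 locally, you will also need Ollama installed and running, with the model pulled, for example:

ollama pull llama3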

App Demo

(demo recording of the chat interface)

Steps to run the application

1. Clone the repository to your local machine:

git clone https://github.com/krisograbek/ollama-chatbot-st.git

2. Navigate to the project directory:

cd ollama-chatbot-st

3. Create a virtual environment and activate it:

On macOS and Linux:

python3 -m venv myenv
source myenv/bin/activate

On Windows:

python -m venv myenv
.\myenv\Scripts\activate

3a. Upgrade pip (optional but recommended):

pip install --upgrade pip
4. Install the necessary Python packages:

pip install -r requirements.txt

5. Run the Streamlit application:

streamlit run chatbot.py

Open a web browser and navigate to http://localhost:8501 to interact with the application.
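
If port 8501 is already in use, Streamlit serves on the next free port and prints the exact URL in the terminal; you can also pin the port explicitly, for example:

streamlit run chatbot.py --server.port 8501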

License

This project is open source, under the terms of the MIT License.
