A custom wrapper utility for a local deployment of the open-source Llama 3 8B LLM, served via Ollama
Report Bug
·
Request Feature
This project provides the source code for a CLI to interact with a local Ollama LLM installed on the University of Dayton ACM chapter's server. We currently run llama3:8b as our model of choice, which strikes a balance between capability and efficiency.
All users on the server have access to the CLI tool, and active development in the form of issues, forks, and commits by any club member is welcome!
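Under the hood, the CLI is a thin layer over Ollama's local HTTP API. As a rough illustration of a single chat turn with llama3:8b (a minimal sketch assuming Ollama's default port 11434 and the `requests` package, not the code shipped in this repo):

```python
# Minimal sketch of one chat turn against a local Ollama server.
# Assumes Ollama's default port (11434) and the `requests` package;
# illustrative only, not the repo's actual implementation.
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"

def ask(prompt: str, model: str = "llama3:8b") -> str:
    """Send one user message to the local model and return its reply."""
    response = requests.post(
        OLLAMA_URL,
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["message"]["content"]

if __name__ == "__main__":
    print(ask("Summarize what ACM stands for in one sentence."))
```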
To get a local copy up and running, follow the steps below.
You will need the following prerequisite before installing the CLI:
- Set up an instance of Ollama within a Docker container.
  ```sh
  docker pull ollama/ollama
  docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
  docker exec -it ollama ollama pull llama3:8b
  # Optionally, set a restart policy so the ollama container restarts unless stopped by a user.
  docker update --restart unless-stopped ollama
  ```
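Once the container is running, you can confirm that Ollama is reachable and that llama3:8b was pulled by querying its local API. A quick sanity check (assuming the default port mapping above and the `requests` package):

```python
# Quick sanity check that the Ollama container is reachable and llama3:8b is available.
# Assumes the default port mapping above and the `requests` package.
import requests

tags = requests.get("http://localhost:11434/api/tags", timeout=10).json()
models = [m["name"] for m in tags.get("models", [])]
print("Available models:", models)
assert any(name.startswith("llama3:8b") for name in models), "llama3:8b has not been pulled"
```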
- Clone the repo (on Debian filesystem)
  ```sh
  mkdir -p /opt/ollama-chat
  cd /opt/ollama-chat
  git clone https://github.com/acm-udayton/acm-ollama-chat.git .
  ```
- Install Python packages in a virtual environment
  ```sh
  sudo python3 -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  ```
- Restrict file permissions
  ```sh
  sudo chown -R <your_username>:<your_username> /opt/ollama-chat/
  ```
- Make the script executable as a command
  ```sh
  chmod +x /opt/ollama-chat/ollama_chat.sh
  sudo ln -s /opt/ollama-chat/ollama_chat.sh /usr/local/bin/ollama-chat
  ```
- Update the source code from the repo
  ```sh
  git checkout <branch-name>
  git pull
  ```
- Update and install Python packages in your virtual environment
  ```sh
  source .venv/bin/activate
  pip install -r requirements.txt
  ```
- Restrict file permissions
  ```sh
  sudo chown -R <your_username>:<your_username> /opt/ollama-chat/
  ```
- Make the script executable as a command
  ```sh
  chmod +x /opt/ollama-chat/ollama_chat.sh
  ```
- Clone the repo (on Debian filesystem)
  ```sh
  mkdir -p ~/ollama-chat
  cd ~/ollama-chat
  git clone https://github.com/acm-udayton/acm-ollama-chat.git .
  ```
- Install Python packages in a virtual environment
  ```sh
  python3 -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  ```
- Ensure correct file permissions
  ```sh
  chown -R <your_username>:<your_username> ~/ollama-chat/
  ```
- Make the script executable as a command
  ```sh
  chmod +x ~/ollama-chat/ollama_chat.sh
  ```
To start a conversation via Ollama Chat, log into your account on the University of Dayton ACM server and run the command that matches your installation:

```sh
# System-wide installation:
ollama-chat

# User installation:
~/ollama-chat/ollama_chat.sh
```
When you're done, type `exit` or `quit` to end the Ollama Chat session.
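Internally, the chat command is essentially a read-eval-print loop around the Ollama API. The sketch below shows how the `exit`/`quit` handling and conversation history might fit together (illustrative only; the repo's actual implementation may differ):

```python
# Illustrative sketch of the interactive loop and its exit/quit handling;
# the actual ollama_chat implementation may differ.
import requests

def chat_loop(model: str = "llama3:8b") -> None:
    history = []  # keep the whole conversation so the model retains context
    while True:
        try:
            user_input = input("you> ").strip()
        except (EOFError, KeyboardInterrupt):
            break  # Ctrl-D / Ctrl-C also end the session
        if user_input.lower() in {"exit", "quit"}:
            break
        history.append({"role": "user", "content": user_input})
        reply = requests.post(
            "http://localhost:11434/api/chat",
            json={"model": model, "messages": history, "stream": False},
            timeout=120,
        ).json()["message"]
        history.append(reply)  # include the assistant's reply in later turns
        print(f"llama3> {reply['content']}")

if __name__ == "__main__":
    chat_loop()
```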
- Implement a "help" menu. (View issue)
- Add username to set context. (View issue)
- Implement RAG for custom knowledge base. (View issue)
- Basic RAG setup. (View issue)
- Add RAG per-chat customization options. (View issue)
- Add RAG per-user customization options. (View issue)
- Update Documentation for RAG configuration. (View issue)
See the open issues for a full list of proposed features (and known issues).
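For contributors picking up the RAG roadmap items above, the retrieval step will likely take this shape: embed the knowledge-base snippets and the user's question, then prepend the most similar snippets to the prompt. A rough sketch against Ollama's embeddings endpoint (assuming an embedding model such as `nomic-embed-text` has been pulled; this is a design illustration, not code that exists in the repo yet):

```python
# Rough sketch of the retrieval half of RAG against a local Ollama server.
# Assumes an embedding model (e.g. nomic-embed-text) has been pulled into Ollama;
# illustrative of the roadmap items above, not code that exists in this repo.
import math
import requests

EMBED_URL = "http://localhost:11434/api/embeddings"

def embed(text: str, model: str = "nomic-embed-text") -> list[float]:
    resp = requests.post(EMBED_URL, json={"model": model, "prompt": text}, timeout=60)
    resp.raise_for_status()
    return resp.json()["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def top_k(question: str, snippets: list[str], k: int = 3) -> list[str]:
    """Return the k knowledge-base snippets most similar to the question."""
    q_vec = embed(question)
    return sorted(snippets, key=lambda s: cosine(q_vec, embed(s)), reverse=True)[:k]

# The selected snippets would then be prepended to the chat prompt as context.
```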
PLEASE NOTE:
Only contributions by ACM @ UD members (past and present) will be accepted to this repository!
If that doesn't include you, feel free to fork the repo and make your own changes.
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Test on a local/development installation on the ACM server
- Open a Pull Request
Distributed under the MIT License. See LICENSE for more information.
Questions? Contact Joseph Lefkovitz, Vice President - ACM at University of Dayton
Project Link: https://github.com/acm-udayton/acm-ollama-chat
