

Ollama Chat

A custom wrapper utility for a local deployment of the open-source Llama 3 8B LLM, served through Ollama

Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact

About The Project

[Ollama Chat screenshot]

This project provides the source code for a CLI used to interact with a local Ollama LLM installed on the University of Dayton ACM chapter's server. We currently run llama3:8b as our model of choice, as it balances capability against resource usage.

All users on the server have access to the CLI tool, and active development in the form of issues, forks, and commits by any club member is welcome!
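
Under the hood, a wrapper like this one talks to the HTTP API that the Ollama container exposes on port 11434. As a rough sketch of the kind of request involved (assuming the default port; the exact calls the CLI makes may differ), a single non-streaming chat turn looks like:

    # Send one chat message to the local Ollama API and read the reply
    # from the "message.content" field of the JSON response.
    curl http://localhost:11434/api/chat \
      -d '{
        "model": "llama3:8b",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": false
      }'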

(back to top)

Built With

  • Python
  • Docker

(back to top)

Getting Started

Follow the steps below to get a copy of Ollama Chat up and running, either globally on the ACM server or locally for development.

Prerequisites (already completed on ACM server)

Ollama Chat requires a running Ollama instance serving the llama3:8b model. This is already set up on the ACM server; to replicate it elsewhere:

  • Set up an instance of Ollama within a Docker container (a quick verification follows these commands).
    docker pull ollama/ollama
    docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
    docker exec -it ollama ollama pull llama3:8b
    
    # Optionally, set a restart policy so ollama restarts unless stopped by a user.
    docker update --restart unless-stopped ollama
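
To confirm the container is up and the model is available, you can list the pulled models from inside the container and ping the API from the host:

    # Should list llama3:8b among the installed models.
    docker exec -it ollama ollama list
    
    # Should return a JSON listing of models if the API is reachable.
    curl http://localhost:11434/api/tags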

Installation

Fresh global install (not needed on ACM server)

  1. Clone the repo (on a Debian filesystem; /opt requires root, so create the directory with sudo and hand it to your user)
    sudo mkdir -p /opt/ollama-chat
    sudo chown <your_username>:<your_username> /opt/ollama-chat
    cd /opt/ollama-chat
    git clone https://github.com/acm-udayton/acm-ollama-chat.git .
  2. Install Python packages in a virtual environment
    python3 -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt
  3. Restrict file permissions
    sudo chown -R <your_username>:<your_username> /opt/ollama-chat/
  4. Make the script executable as a command
    chmod +x /opt/ollama-chat/ollama_chat.sh
    sudo ln -s /opt/ollama-chat/ollama_chat.sh /usr/local/bin/ollama-chat
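
After step 4, the command should resolve from any directory. A quick sanity check:

    # Should print /usr/local/bin/ollama-chat.
    which ollama-chat
    
    # Launch the CLI; type 'exit' or 'quit' to leave.
    ollama-chat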

Update an existing global installation (you should be the owner of the initial install)

  1. Update the source code from the repo
    cd /opt/ollama-chat
    git fetch origin
    git checkout <branch-name>
    git pull
  2. Update and install Python packages in your virtual environment
    source .venv/bin/activate
    pip install --upgrade -r requirements.txt
  3. Restrict file permissions
    sudo chown -R <your_username>:<your_username> /opt/ollama-chat/
  4. Make the script executable as a command
    chmod +x /opt/ollama-chat/ollama_chat.sh
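
Taken together, a routine update run from the installation directory looks like this (assuming your branch tracks origin):

    cd /opt/ollama-chat
    git fetch origin
    git checkout <branch-name>
    git pull
    source .venv/bin/activate
    pip install --upgrade -r requirements.txt
    chmod +x ollama_chat.sh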

Local/development installation

  1. Clone the repo (on Debian filesystem)
    mkdir -p ~/ollama-chat
    cd ~/ollama-chat
    git clone https://github.com/acm-udayton/acm-ollama-chat.git .
  2. Install Python packages in a virtual environment
    python3 -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt
  3. Ensure correct file permissions
    chown -R <your_username>:<your_username> ~/ollama-chat/
  4. Make the script executable as a command
    chmod +x ~/ollama-chat/ollama_chat.sh
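
Optionally, you can make the development copy quicker to launch with a shell alias (the alias name here is just an example):

    # Add a convenience alias to your shell profile and reload it.
    echo "alias ollama-chat-dev='~/ollama-chat/ollama_chat.sh'" >> ~/.bashrc
    source ~/.bashrc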

(back to top)

Usage

To start a conversation via Ollama Chat, log into your account on the University of Dayton ACM server and run the command for your desired installation:

Global

ollama-chat

Local/development

~/ollama-chat/ollama_chat.sh

When you're done, use 'exit' or 'quit' to end the Ollama Chat instance.

(back to top)

Roadmap


See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

PLEASE NOTE: Only contributions by ACM @ UD members (past and present) will be accepted to this repository!
If that doesn't include you, feel free to fork the repo and make your own changes.

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Test on a local/development installation on the ACM server
  6. Open a Pull Request

(back to top)

Top contributors:

[Contributor avatars via contrib.rocks]

License

Distributed under the MIT License. See LICENSE for more information.

(back to top)

Contact

Questions? Contact Joseph Lefkovitz, Vice President - ACM at University of Dayton

Project Link: https://github.com/acm-udayton/acm-ollama-chat

(back to top)
