Linkyo75/OllamaLocalInterface

DeepSeek Chat Interface

A web interface for interacting with DeepSeek models through Ollama.

Prerequisites

  1. Install Ollama

    • Visit Ollama's website
    • Follow the installation instructions for your operating system
  2. Pull the DeepSeek model. The example below uses the 7B-parameter variant; be mindful of storage space (deepseek-r1:7b uses about 4.7 GB)

    ollama pull deepseek-r1:7b
  3. Start Ollama

    ollama serve
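
Once the model is pulled, it should show up in `ollama list`, and the server answers on port 11434. As a quick sanity check before pulling (a sketch, not part of the project; the `df` line is just one way to confirm you have the ~4.7 GB free that the 7B model needs):

```shell
# Check free space before pulling (the 7B model needs roughly 4.7 GB).
df -h "$HOME" | tail -1

# With Ollama installed and serving, these confirm the setup (run manually):
#   ollama list                      # deepseek-r1:7b should be listed
#   curl -s http://localhost:11434/  # replies "Ollama is running"
```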

Using the Interface

  1. Make sure Ollama is running on your machine
  2. Open the chat interface at https://Linkyo75.github.io/DeepSeekInterface
  3. Select your preferred model from the dropdown
  4. Start chatting!
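
Under the hood, a chat page like this typically posts to Ollama's local REST API. A minimal sketch of the kind of request the interface would send (the prompt text is made up; `/api/chat` is Ollama's standard chat endpoint):

```shell
# Build the JSON body for a single-turn chat with the 7B model.
PAYLOAD='{"model":"deepseek-r1:7b","messages":[{"role":"user","content":"Hello!"}],"stream":false}'
echo "$PAYLOAD"

# With Ollama running locally, send it like this:
#   curl -s http://localhost:11434/api/chat -d "$PAYLOAD"
```

Setting `"stream":false` returns one complete JSON response instead of a stream of partial tokens.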

Troubleshooting

If you're experiencing issues:

  1. Ensure Ollama is running (ollama serve)
  2. Check that you've pulled the DeepSeek model
  3. Verify that port 11434 is accessible
  4. Check your browser's console for any error messages

Development

To run locally:

git clone https://github.com/Linkyo75/DeepSeekInterface.git
cd DeepSeekInterface
npm install
npm run dev

License

MIT
