A web interface for interacting with DeepSeek models through Ollama.
Install Ollama
- Visit Ollama's website (https://ollama.com)
- Follow the installation instructions for your operating system
Pull the DeepSeek model (the example below pulls the 7B-parameter variant). Be mindful of storage space: deepseek-r1:7b uses about 4.7 GB.

```shell
ollama pull deepseek-r1:7b
```
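Before pulling, it can help to confirm you have the space. A quick sketch (assuming the default model location, `~/.ollama`, which may not exist before the first pull):

```shell
# Check free disk space before pulling (deepseek-r1:7b needs ~4.7 GB).
# Ollama stores models under ~/.ollama by default; if that directory
# doesn't exist yet, fall back to checking the home filesystem.
df -h ~/.ollama 2>/dev/null || df -h ~
```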
Start Ollama

```shell
ollama serve
```
- Make sure Ollama is running on your machine
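One way to confirm the server is actually up is to query Ollama's HTTP API directly; the `/api/tags` endpoint lists the models you have pulled:

```shell
# Probe Ollama's HTTP API on its default port (11434).
# /api/tags lists pulled models; -f makes curl fail on connection errors.
if curl -sf http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is not reachable on port 11434"
fi
```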
- Open the chat interface at https://Linkyo75.github.io/DeepSeekInterface
- Select your preferred model from the dropdown
- Start chatting!
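Under the hood, a chat interface like this typically posts to Ollama's `/api/chat` endpoint. An equivalent request from the command line looks like this (a sketch based on Ollama's documented API, not this repo's exact code):

```shell
# Send one chat turn to the local Ollama server and print the JSON response.
# "stream": false returns a single JSON object instead of a token stream.
curl -s http://localhost:11434/api/chat -d '{
  "model": "deepseek-r1:7b",
  "messages": [{"role": "user", "content": "Hello!"}],
  "stream": false
}' || echo "request failed (is Ollama running?)"
```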
If you're experiencing issues:
- Ensure Ollama is running (`ollama serve`)
- Check that you've pulled the DeepSeek model
- Verify that port 11434 is accessible
- Check your browser's console for any error messages
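One failure mode worth noting when using the hosted page: the browser calls your local Ollama cross-origin, and Ollama only accepts origins allowed by its `OLLAMA_ORIGINS` environment variable. If the browser console shows CORS errors, allowing the page's origin when starting the server is one fix (a sketch; adjust the origin to match the URL you actually load the interface from):

```shell
# Allow the hosted page's origin to call the local Ollama API (CORS).
# OLLAMA_ORIGINS is Ollama's documented env var for allowed origins.
OLLAMA_ORIGINS="https://linkyo75.github.io" ollama serve
```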
To run locally:

```shell
git clone https://github.com/Linkyo75/DeepSeekInterface.git
cd DeepSeekInterface
npm install
npm run dev
```

License: MIT