
Fix the Ollama Endpoint 404 Error#162

Merged
RyanNg1403 merged 3 commits into main from fix/fix-ollama-url
Aug 1, 2025
Conversation

@RyanNg1403
Collaborator

Fix Ollama 404 Connection Errors with Incorrect URL Endpoints

Summary

Resolves #159 - Fixes persistent 404 errors when connecting to Ollama API by correcting URL endpoint handling for both LLM and embedding services.

Problem

Users experienced 404 "page not found" errors when setting OLLAMA_BASE_URL=http://localhost:11434. The root cause was inconsistent URL endpoint handling across different services:

  • LLM services need OpenAI-compatible /v1 endpoints (e.g., /v1/chat/completions)
  • Embedding services need native Ollama /api endpoints (e.g., /api/embeddings)
  • Multiple code paths had different default URL patterns
  • YAML config expansion bypassed URL normalization logic
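The mismatch described above can be illustrated with a small sketch (assumed values for illustration only, not the project's actual code). Ollama serves OpenAI-compatible routes under `/v1` and native routes under `/api`, so a base URL normalized for one family produces 404s for the other:

```typescript
// Illustration of the endpoint mismatch behind the 404s.
// Ollama exposes two route families from the same base URL:
const base = "http://localhost:11434";

// Correct targets for each service:
const chatUrl = `${base}/v1/chat/completions`; // LLM (OpenAI-compatible route)
const embedUrl = `${base}/api/embeddings`;     // embeddings (native Ollama route)

// The bug: code paths that reused one already-normalized base for both
// services built URLs like `${base}/v1/api/embeddings`, which Ollama
// answers with 404 "page not found".
const brokenEmbedUrl = `${base}/v1/api/embeddings`;
```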

Changes Made

1. Fixed LLM Service URL Handling (factory.ts)

  • ✅ Handle direct config URLs from YAML expansion (llmConfig.baseURL)
  • ✅ Handle environment variable fallbacks (env.OLLAMA_BASE_URL)
  • ✅ Ensure /v1 suffix for OpenAI-compatible endpoints
  • ✅ Preserve user flexibility for custom endpoints

2. Fixed Embedding Service URL Handling (service-initializer.ts)

  • ✅ Remove /v1 suffix when present for embedding fallback
  • ✅ Ensure proper base URL for /api/embeddings endpoint
  • ✅ Maintain correct defaults for embedding services
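The embedding-side fallback can be sketched the same way. `normalizeEmbeddingBaseUrl` is a hypothetical helper name for illustration; the actual logic lives in service-initializer.ts:

```typescript
// Hypothetical sketch of the embedding URL fallback (assumed helper name).
// Embeddings use Ollama's native /api/embeddings endpoint, so a /v1 suffix
// inherited from the LLM configuration must be stripped before use.
function normalizeEmbeddingBaseUrl(raw?: string): string {
  const base = (raw ?? "http://localhost:11434").replace(/\/+$/, "");
  return base.endsWith("/v1") ? base.slice(0, -"/v1".length) : base;
}
```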

Test Cases Now Working

| Configuration | LLM Endpoint | Embedding Endpoint |
| --- | --- | --- |
| `OLLAMA_BASE_URL=http://localhost:11434` | `http://localhost:11434/v1` | `http://localhost:11434` |
| `OLLAMA_BASE_URL=http://localhost:11434/v1` | `http://localhost:11434/v1` | `http://localhost:11434` |
| `OLLAMA_BASE_URL=http://localhost:11434/api` | `http://localhost:11434/api` | `http://localhost:11434/api` |
| No URL set (default) | `http://localhost:11434/v1` | `http://localhost:11434` |

Impact

  • 🐛 Fixes: Persistent 404 errors with standard Ollama configuration
  • Maintains: Full backward compatibility with existing configurations
  • Improves: User experience with out-of-the-box Ollama setup
  • 🔧 Preserves: Advanced user flexibility for custom endpoints

Testing

  • ✅ Code compiles and passes type checking
  • ✅ Linting passes
  • ✅ Build process completes successfully

Fixes #159

@RyanNg1403 RyanNg1403 merged commit 40b55dd into main Aug 1, 2025
5 checks passed
@RyanNg1403 RyanNg1403 deleted the fix/fix-ollama-url branch August 1, 2025 16:16
@RyanNg1403 RyanNg1403 mentioned this pull request Aug 6, 2025
Ptah-CT pushed a commit to DerAuctor/ct-cipher that referenced this pull request Oct 2, 2025


Development

Successfully merging this pull request may close these issues.

[BUG] Cipher fails to connect to Ollama API - 404 errors

1 participant