
Non-local Ollama server fails to connect #770

Closed
@mando222

Description

Describe the bug

In the output I keep getting `app-dev_1 | WARN Constants Failed to get Ollama models: fetch failed` messages over and over. My setup is a bit different from most of the posts I see here, as I am running an Ollama server on my local network. The URL is set in the .env.local file, and it appears to be picked up by the UI, since the UI shows a value set, as you can see in the screenshot below.

Screenshot 2024-12-16 at 10 09 10 AM

This seems similar to #721, but the fix that worked there is not working for me.

I know the Ollama server is running and its ports are open, since I also run an Open WebUI frontend from the same box I am trying to run bolt.diy on. In other words, the route is tested every time I run a query in Open WebUI.

Additionally, the model select dropdown is not populated with any options, which is what I would expect if the connection were fully failing.

My architecture is pretty simple: Ollama server on the local network -> web server for bolt.diy on a different box on the same local network.
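To rule bolt.diy out entirely, it can help to probe the Ollama API directly from the bolt.diy box. The sketch below is a minimal reachability check, assuming Ollama's standard `GET /api/tags` endpoint and default port 11434; the host address is a placeholder for your own server.

```python
import json
import urllib.error
import urllib.request


def list_ollama_models(base_url: str, timeout: float = 5.0):
    """Return model names from an Ollama server, or None if unreachable."""
    url = base_url.rstrip("/") + "/api/tags"  # standard Ollama list-models endpoint
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        # Covers connection refused, timeouts, and DNS failures
        return None


if __name__ == "__main__":
    # Replace with your Ollama server's address (placeholder host)
    models = list_ollama_models("http://192.168.1.50:11434")
    if models is None:
        print("Ollama server not reachable from this box")
    else:
        print("Available models:", models)
```

If this script fails from the bolt.diy box while Open WebUI works from elsewhere, the problem is network-level (firewall, `OLLAMA_HOST` binding to 127.0.0.1 only, etc.) rather than a bolt.diy bug.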

Link to the Bolt URL that caused the error

Not publicly accessible

Steps to reproduce

  1. Install bolt.diy on a different box from the Ollama server
  2. Configure the URL in the .env.local file
  3. Start the bolt.diy server
  4. Navigate to the UI; it should be broken
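For step 2, the relevant .env.local entry looks roughly like the fragment below. The variable name `OLLAMA_API_BASE_URL` is taken from bolt.diy's .env.example and the host/port are placeholders; double-check both against your own copy of the repo.

```shell
# .env.local (sketch; host and port are placeholders for your setup)
OLLAMA_API_BASE_URL=http://192.168.1.50:11434
```

Note that if bolt.diy runs inside Docker, `localhost` in this URL would refer to the container itself, so a LAN address or `host.docker.internal` is needed to reach a server on another box.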

Expected behavior

I expected the bolt.diy frontend to be able to find the Ollama server.

Screen Recording / Screenshot

Screenshot 2024-12-16 at 10 09 10 AM

Platform

  • OS: Ubuntu
  • Browser: Brave
  • Version: bolt@0.0.1 dev

Provider Used

No response

Model Used

No response

Additional context

No response
