
Unable to connect to Ollama models hosted on a different computer within the same network #400

@gpchit2025

Description


Describe the bug
Even when the base URL of the computer hosting the Ollama models is correctly configured in the settings, the application still attempts to connect to the model using 127.0.0.1 (localhost) instead of the specified remote address.
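The symptom suggests the configured base URL never reaches the request layer, so a hard-coded localhost default wins. A minimal sketch of how a settings value should override the default (names and the fallback address are hypothetical, not the app's actual code):

```python
from typing import Optional

# Ollama's conventional local endpoint; used only when nothing is configured.
DEFAULT_BASE_URL = "http://127.0.0.1:11434"

def resolve_base_url(configured: Optional[str]) -> str:
    """Prefer the user-configured address; fall back to localhost only
    when the setting is missing or blank."""
    if configured and configured.strip():
        return configured.strip().rstrip("/")
    return DEFAULT_BASE_URL
```

If the app instead reads the setting after the client is constructed, or normalizes an empty string to the default too aggressively, every request would go to 127.0.0.1 exactly as reported here.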

To Reproduce
Steps to reproduce the behavior:

  1. Configure the remote address of the Ollama LLM models in Settings
  2. Select the remote model on the Summary page (or Mind Map page)
  3. Click the AI Summary button (or AI Mind Map)
  4. See error

Expected behavior
Application connects to the remote address as configured in the settings
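To rule out the server side, the remote Ollama instance can be checked independently of the application via its `/api/tags` endpoint; a minimal sketch (the address you pass in is whatever you configured in Settings):

```python
import urllib.request

def check_ollama(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at base_url's /api/tags endpoint."""
    url = base_url.rstrip("/") + "/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, timeout, or HTTP error: server not reachable.
        return False
```

If this check succeeds from the client machine but the app still hits 127.0.0.1, the bug is in the app's settings handling. Note also that Ollama binds to 127.0.0.1 by default, so the hosting machine may need `OLLAMA_HOST=0.0.0.0` set before it is reachable over the LAN at all.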

Screenshots
(two screenshots attached)

Desktop (please complete the following information):

  • OS: Windows 11
  • Model: Ollama Qwen3
  • Version: 1.6.8

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working)
