Describe the bug
Even when the base URL of the machine hosting the Ollama models is correctly configured in the settings, the application still attempts to connect to the model at 127.0.0.1 (localhost) instead of the configured remote address.
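For illustration, a hypothetical sketch of the kind of fallback that would produce this behavior: if the app never reads the configured value and an environment variable (here the assumed name `OLLAMA_BASE_URL`) is unset, a shell-style default silently routes every request to localhost. The variable name and endpoint path are assumptions, not taken from the app's code.

```shell
# Hypothetical: when the configured base URL is never read,
# a default like this sends every request to 127.0.0.1.
base_url="${OLLAMA_BASE_URL:-http://127.0.0.1:11434}"
echo "Connecting to: $base_url/api/generate"
```

With `OLLAMA_BASE_URL` unset, this prints the localhost URL, matching the observed behavior even though a remote address is configured in the settings UI.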
To Reproduce
Steps to reproduce the behavior:
- Configure the remote address of the Ollama LLM models in the settings
- Select the remote model on the Summary page (or Mind Map page)
- Click the AI Summary button (or AI Mind Map button)
- See error
Expected behavior
The application connects to the remote address configured in the settings.
Desktop (please complete the following information):
- OS: Windows 11
- Model: Ollama Qwen3
- Version: 1.6.8