
[Bug] Ollama can't be connected #2854

Open · 3 of 17 tasks
hellangleZ opened this issue Jun 12, 2024 · 9 comments
Labels
🐛 Bug Something isn't working | 缺陷 ollama Relative to Ollama Provider and ollama models

Comments

@hellangleZ

📦 Environment

  • Official
  • Official Preview
  • Vercel / Zeabur / Sealos
  • Docker
  • Other

📌 Version

0.162.22 / latest

💻 Operating System

  • Windows
  • macOS
  • Ubuntu
  • Other Linux
  • iOS
  • Android
  • Other

🌐 Browser

  • Chrome
  • Edge
  • Safari
  • Firefox
  • Other

🐛 Bug Description

[screenshot]

I configured it following the guidance, but it still couldn't connect to the local Ollama.

[screenshot]

[screenshot]

I tried curl, and it works at 127.0.0.1:

[screenshot: curl output]

ollama list:

[screenshot: ollama list output]
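
For reference, a typical local connectivity check (the exact command is only visible in the screenshot; this sketch assumes Ollama's default port 11434):

```bash
# List locally installed models over the Ollama HTTP API.
# A JSON response here confirms the server is reachable at 127.0.0.1.
curl http://127.0.0.1:11434/api/tags
```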

📷 Recurrence Steps

No response

🚦 Expected Behavior

No response

📝 Additional Information

No response

@hellangleZ hellangleZ added the 🐛 Bug Something isn't working | 缺陷 label Jun 12, 2024
@lobehubbot
Member

👀 @hellangleZ

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@mozillo

mozillo commented Jun 12, 2024

#2855 CORS error solution
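
The fix usually discussed for this class of error is to let Ollama accept cross-origin requests via the OLLAMA_ORIGINS environment variable. A sketch, not a quote from #2855; tighten `*` to your actual origin in production:

```bash
# Restart Ollama with cross-origin requests allowed.
# "*" accepts any origin; prefer your exact LobeChat URL if possible.
OLLAMA_ORIGINS="*" ollama serve
```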

@hellangleZ
Author

> #2855 CORS error solution

@mozillo Would you please check my configuration? For example:

[screenshot]

@745854919
Copy link

I have the same problem. Using a reverse-proxy domain name it connects in the browser, but that is never as safe and convenient as connecting directly over the intranet.


@9Somboon

Same problem.

@ShravanSunder

Still having the same problem after setting CORS.

@ktneely

ktneely commented Jul 11, 2024

I'm seeing this as well, and it appears that the platform is not honoring the Ollama settings. On a new install, I have $PROXY_URL and $OLLAMA_MODEL_LIST specified in the .env file and referenced by docker-compose.yml, just as for other inference servers. OpenAI, as an example, works fine, but the Ollama config shows defaults instead of the environment settings.

If I manually configure Ollama in the browser, it works.
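
A minimal sketch of the setup described above, using the variable names from the comment (check the project docs for the exact keys, e.g. whether it is PROXY_URL or OLLAMA_PROXY_URL; the image and port follow lobe-chat defaults):

```yaml
# docker-compose.yml — sketch only; env keys mirror the comment above
services:
  lobe-chat:
    image: lobehub/lobe-chat
    env_file: .env            # holds PROXY_URL and OLLAMA_MODEL_LIST
    ports:
      - "3210:3210"           # lobe-chat's default web port
```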

@oatmealm

oatmealm commented Aug 9, 2024

Same problem and nothing helps: localhost, 127.0.0.1, host.docker.internal:11434, network_mode: host... it won't connect.
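
For later readers, one hedged sketch of reaching a host-local Ollama from inside the container on Linux, where host.docker.internal only resolves if you map it explicitly (and Ollama must not be bound to loopback only):

```yaml
# docker-compose.yml fragment — assumes Ollama listens on the host at :11434;
# you may also need OLLAMA_HOST=0.0.0.0 on the host so it isn't loopback-only.
services:
  lobe-chat:
    extra_hosts:
      - "host.docker.internal:host-gateway"  # Docker 20.10+: alias for the host
    environment:
      - OLLAMA_PROXY_URL=http://host.docker.internal:11434
```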

@arvinxx arvinxx added the ollama Relative to Ollama Provider and ollama models label Sep 18, 2024
Projects
Status: Roadmap - Chat 1.x
Development

No branches or pull requests

9 participants