
[FEATURE] support for lm studio #1276

Open
pechaut78 opened this issue Nov 24, 2023 · 11 comments
Labels
enhancement (New feature or request), good first issue (Good for newcomers)

Comments

@pechaut78

LM Studio is super easy to set up, and simpler than LocalAI.

It mimics the OpenAI API. LangChain supports it by passing a local base URL.

It would be wonderful to do the same thing with Flowise.

@HenryHengZJ HenryHengZJ added the enhancement New feature or request label Nov 27, 2023
@HenryHengZJ HenryHengZJ added the good first issue Good for newcomers label Dec 15, 2023
@jeffthorness

I'm with you and frankly a little puzzled why this isn't already supported.

@SphaeroX

@dev: just load the OpenAI LLM like this (Python; the import and placeholder api_key are added here, since the client requires a key even though LM Studio ignores it):

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

@KennyVaneetvelde

KennyVaneetvelde commented Feb 2, 2024

(screenshot of the working configuration)
Works perfectly fine like this for me. I am using LM Studio, and I just have the API key set to "none" in the credentials. Note that for you the address will probably just be localhost:1234, but since I am running the Docker container, I needed to use the IP address of the vEthernet adapter to reach LM Studio.
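For reference, the setup above works because LM Studio's local server exposes an OpenAI-compatible REST API, so it can be sanity-checked outside Flowise with a plain HTTP request. A minimal sketch using only the Python standard library; the model name, default port, and the `ask` helper are illustrative assumptions, not part of the original comment:

```python
import json
import urllib.request


def build_chat_request(base_url, prompt, model="local-model", api_key="none"):
    """Build an OpenAI-style chat completions request for LM Studio's server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # LM Studio does not validate the key, but strict OpenAI clients
            # require the header; "none" mirrors the credential used above.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


def ask(base_url, prompt):
    """Send the request; requires LM Studio's local server to be running."""
    req = build_chat_request(base_url, prompt)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

With the server running, `ask("http://localhost:1234/v1", "Hello!")` should return the model's reply; when calling from inside Docker, substitute the host address as described in the comment above.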

@pechaut78
Author

pechaut78 commented Feb 2, 2024 via email

@shiloh92

shiloh92 commented Feb 8, 2024

Is there a website where they share workflows?

@ArkMaster123

@KennyVaneetvelde I have tried this but it didn't work. I actually have a Flowise instance on DO but LM Studio on my local laptop. Do I have to have both locally for this to work? I would love a DO deployment solution! Thanks in advance.

@new4u

new4u commented Mar 24, 2024

In Docker, there are several ways to let an application inside a container reach the host machine's localhost. A common one:

Use the special DNS name: Docker gives containers a special DNS name, host.docker.internal, which points to the host machine's IP address. You can use this name inside the container to reach services running on the host.

So I replaced
http://localhost:1234/v1
with
http://host.docker.internal:1234/v1

It works!
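The substitution described above can be captured in a tiny helper, sketched here in Python; the function name is hypothetical and 1234 is LM Studio's default port:

```python
def lm_studio_base_url(client_in_docker: bool, port: int = 1234) -> str:
    """Pick the LM Studio base URL depending on where the client runs.

    host.docker.internal resolves to the host machine from inside a
    container (on Docker Desktop); on the host itself, localhost works.
    """
    host = "host.docker.internal" if client_in_docker else "localhost"
    return f"http://{host}:{port}/v1"
```

`lm_studio_base_url(True)` yields the URL that worked here; `lm_studio_base_url(False)` gives the plain localhost form.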

@RicardoFernandez-UY

@KennyVaneetvelde Hi, sounds great!
Where did you get the docker image? I wasn't able to find one in dockerhub, and I wouldn't know how to build it.
Thank you!!

@rachdeg

rachdeg commented May 3, 2024

In Docker, there are several ways to let an application inside a container reach the host machine's localhost. A common one:

Use the special DNS name: Docker gives containers a special DNS name, host.docker.internal, which points to the host machine's IP address. You can use this name inside the container to reach services running on the host.

So I replaced http://localhost:1234/v1 with http://host.docker.internal:1234/v1

It works!

I tried this but unfortunately it didn't work. What did work was changing the localhost URL to http://172.17.0.1:1234/v1, and that worked like a charm! Note that I'm running Flowise in Docker and LM Studio locally.
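A likely explanation: host.docker.internal is provided out of the box by Docker Desktop but not by Docker Engine on Linux (there it needs `--add-host=host.docker.internal:host-gateway`), while 172.17.0.1 is the default gateway of the Linux Docker bridge, i.e. the host as seen from the container. A hypothetical helper that probes the usual candidates in order and returns the first one whose LM Studio port accepts a TCP connection:

```python
import socket

# Candidate order: Docker Desktop's host alias, the default Linux bridge
# gateway, then plain localhost (when not running inside a container).
CANDIDATE_HOSTS = ["host.docker.internal", "172.17.0.1", "localhost"]


def first_reachable(hosts=CANDIDATE_HOSTS, port=1234, timeout=1.0):
    """Return the first host where `port` accepts a TCP connection, else None."""
    for host in hosts:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return host
        except OSError:  # DNS failure, connection refused, or timeout
            continue
    return None
```

Running `first_reachable()` inside the container picks the working host, from which the base URL can be built as `http://<host>:1234/v1`.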

@jan-wijman

Hi, I tried the idea from "KennyVaneetvelde commented on Feb 2", but the chat window is not streaming the message. The reply of the LLM appears all at once, although in LM Studio I can see the response building up.
What could be wrong in my Flowise diagram?
(screenshot of the Flowise diagram)

@edvinPL

edvinPL commented Sep 2, 2024

Can you make this work by having LM Studio locally but Flowise on Render, and somehow connecting the two?
