[FEATURE] support for lm studio #1276
Comments
I'm with you and frankly a little puzzled why this isn't already supported.
@dev: just load the OpenAI LLM like this (Python).
Excellent!! Thanks a lot.
Best regards,
CHAUT Pierre-Emmanuel
From: KennyVaneetvelde, Friday, 2 February 2024, 15:18 (quoted reply):
(Screenshot: https://github.com/FlowiseAI/Flowise/assets/48944754/d356c9a8-6ab2-488c-b88b-0898e114f14c)
Works perfectly fine like this for me. I am using LM Studio, and I just have the API key set to "none" in the credentials. Note that the address for you will probably just be localhost:1234, but I am running the Docker container, so to reach LM Studio I needed to use the IP address of the vEthernet adapter.
Is there a website where they share workflows?
@KennyVaneetvelde I have tried this but it didn't work. I actually have a Flowise instance on DO but LM Studio on my local laptop. Do I have to have both locally for this to work? Would love a DO deployment solution! Thanks in advance.
In Docker, there are a few common ways to let an application inside a container reach the host machine's localhost. One is to use a special DNS name: Docker gives containers the name host.docker.internal, which points to the host machine's IP address, so you can use that name inside the container to reach services on the host. So that's what I used. It works!
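The host.docker.internal approach above can be sketched as a small helper that picks the right base URL depending on where Flowise runs (the host names and LM Studio's default port 1234 are assumptions):

```python
# Sketch: resolve the OpenAI-compatible base URL for LM Studio's local server.
def lm_studio_base_url(in_docker: bool = False, port: int = 1234) -> str:
    """Return the base URL for LM Studio's OpenAI-compatible endpoint."""
    # From inside a Docker container, `localhost` refers to the container
    # itself; `host.docker.internal` resolves to the host machine instead.
    host = "host.docker.internal" if in_docker else "localhost"
    return f"http://{host}:{port}/v1"

print(lm_studio_base_url(in_docker=True))
# http://host.docker.internal:1234/v1
```

Note that on Linux, host.docker.internal is not available by default; the Docker bridge gateway address (such as the 172.17.0.1 mentioned in a later comment) is a common alternative.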
@KennyVaneetvelde Hi, sounds great!
I tried this but unfortunately it didn't work. What did work was changing the localhost URL to http://172.17.0.1:1234/v1, which worked like a charm! Note that I'm running Flowise in Docker and LM Studio locally.
Hi, I tried the idea from "KennyVaneetvelde commented on Feb 2", but the chat window is not streaming the message: the LLM's reply appears all at once, although in LM Studio I can see the response building up.
Can you make this work by having LM Studio locally but Flowise on Render, and somehow connect the two?
LM Studio is super easy to set up, and simpler than LocalAI.
It mimics the OpenAI API. LangChain supports it by passing a local base path.
Would be wonderful to do the same thing with Flowise.