-
Hey, some people have experimented with this; see #142. It's already possible to get something working using custom templates and LiteLLM, but when I tried, the results were mixed. With some more work and fiddling, things could probably be improved. Many thanks!
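As a hedged sketch of the custom-template approach mentioned above: FIM-capable code models generally expect sentinel tokens wrapping the prefix and suffix. The tokens, template, and model alias below are assumptions for illustration (StarCoder-style sentinels; check your model's documentation), showing roughly what a request body for an OpenAI-compatible `/v1/completions` endpoint (such as one exposed by a LiteLLM proxy) might look like:

```python
# Sketch: assembling a FIM (fill-in-the-middle) prompt for an
# OpenAI-compatible /v1/completions endpoint, e.g. a LiteLLM proxy.
# The sentinel tokens are StarCoder-style and model-specific --
# an assumption here, not a universal format.
import json

FIM_TEMPLATE = "<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

def build_fim_request(prefix: str, suffix: str,
                      model: str = "my-fim-model") -> str:
    """Return a JSON body for a raw completions call.

    `model` is a hypothetical alias you would configure in the proxy.
    """
    payload = {
        "model": model,
        "prompt": FIM_TEMPLATE.format(prefix=prefix, suffix=suffix),
        "max_tokens": 64,
        # Stop sequences keep the model from re-emitting sentinel tokens.
        "stop": ["<fim_prefix>", "<fim_suffix>"],
    }
    return json.dumps(payload)

# Usage: complete the body of a function given the code after it.
body = build_fim_request("def add(a, b):\n    return ",
                         "\n\nprint(add(1, 2))")
```

The editor-side template ("custom templates" above) is what produces the `prefix`/`suffix` split from the cursor position; this snippet only covers the request-shaping half.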
-
How do I use a provider like LiteLLM, or anything else compatible with the OpenAI/Anthropic APIs, to enable FIM non-locally? Has no one done this before?