ollama-integration: rename to ai-local-backend-integration
Any OpenAI-compatible backend is supported, so it is better to maintain
support for them in a single script and keep separate scripts only for
the backends that aren't compatible.
This also makes it clearer to users, who might otherwise wrongly assume
that only one backend is supported.
This change also appends the OpenAI endpoint path to the base URL
automatically, since all compatible backends already support it.
Users therefore only need to provide the local address.
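
A minimal sketch of the behavior described above, assuming the standard
OpenAI-compatible chat path /v1/chat/completions; the function and
variable names here are illustrative, not the script's actual API. The
user supplies only a local address such as http://localhost:11434 and
the endpoint path is appended automatically.

import json
import urllib.request

# Assumed OpenAI-compatible chat completions path, served by Ollama,
# llama.cpp's server, and other compatible backends.
OPENAI_CHAT_PATH = "/v1/chat/completions"

def build_endpoint(base_url: str) -> str:
    """Normalize the user-supplied base URL and append the OpenAI path."""
    return base_url.rstrip("/") + OPENAI_CHAT_PATH

def chat(base_url: str, model: str, prompt: str) -> str:
    """Send a minimal chat completion request to the local backend."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        build_endpoint(base_url),
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example: the same call works against any OpenAI-compatible backend
# listening locally; only the base address differs.
# print(chat("http://localhost:11434", "llama3", "Hello"))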
"description" : "This script provides integration for an OpenAI-compatible local AI backend like <a href=\"https://github.com/ollama/ollama\">Ollama</a> or <a href=\"https://github.com/ggerganov/llama.cpp\">llama-cpp</a>."