Commit 912c3d1

ollama-integration: rename to ai-local-backend-integration
Any OpenAI-compatible backend is supported, so it is better to maintain them in a single script and keep separate scripts only for backends that are not compatible. This is also clearer for users, who might otherwise wrongly assume that only one backend is supported. The script now also appends the OpenAI endpoint path to the base URL automatically, since all compatible backends already serve it; users therefore only need to provide the local address.
1 parent becbcb5 commit 912c3d1
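The endpoint handling described in the commit message can be sketched in plain JavaScript (the language embedded in QML). The `buildEndpoint` helper name is illustrative, not part of the script, and the trailing-slash trimming is an extra nicety beyond the simple concatenation the script itself performs:

```javascript
// Append the OpenAI-compatible chat endpoint to a bare base URL,
// tolerating a trailing slash the user may have typed.
function buildEndpoint(baseUrl) {
    return baseUrl.replace(/\/+$/, "") + "/v1/chat/completions";
}

console.log(buildEndpoint("http://127.0.0.1:11434"));
// -> http://127.0.0.1:11434/v1/chat/completions
```

Because every compatible backend serves the same `/v1/chat/completions` path, only the host address varies between backends.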

File tree

3 files changed: +18 −16 lines changed


ollama-integration/ollama-integration.qml renamed to ai-local-backend-integration/ai-local-backend-integration.qml

Lines changed: 9 additions & 7 deletions
@@ -2,9 +2,11 @@ import QtQml 2.0
 import QOwnNotesTypes 1.0
 
 /**
-  * This script provides integration for a local Ollama backend
+  * This script provides integration for a local AI backend
   * See: https://github.com/ollama/ollama
+  *      https://github.com/ggerganov/llama.cpp
   * List of models: https://github.com/ollama/ollama?tab=readme-ov-file#model-library
+  *                 https://github.com/ggerganov/llama.cpp#description
   * OpenAPI endpoint: https://ollama.com/blog/openai-compatibility or https://github.com/ollama/ollama/blob/main/docs/openai.md
   */
 Script {
@@ -16,9 +18,9 @@ Script {
         {
             "identifier": "baseUrl",
             "name": "API base URL",
-            "description": "The chat base URL of the Ollama API.",
+            "description": "The chat base URL of the server API.",
             "type": "string",
-            "default": "http://127.0.0.1:11434/v1/chat/completions",
+            "default": "http://127.0.0.1:11434",
         },
         {
             "identifier": "models",
@@ -36,10 +38,10 @@ Script {
     function openAiBackendsHook() {
         return [
             {
-                "id": "ollama",
-                "name": "Ollama",
-                "baseUrl": baseUrl,
-                "apiKey": "ollama",
+                "id": "local-ai",
+                "name": "Local AI",
+                "baseUrl": baseUrl + "/v1/chat/completions",
+                "apiKey": "local-ai",
                 "models": models.split(",")
             },
         ];
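Since all OpenAI-compatible backends accept the same request shape at the chat-completions endpoint, the script can treat them interchangeably. An illustrative request body for that endpoint (the model name is an example; use one of the models configured in the script's `models` setting):

{
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "Hello"}
    ]
}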
ai-local-backend-integration/info.json

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
+{
+    "name": "Local AI backend integration",
+    "identifier": "ai-local-backend-integration",
+    "script": "ai-local-backend-integration.qml",
+    "version": "0.0.2",
+    "minAppVersion": "24.6.2",
+    "authors": ["@pbek"],
+    "description" : "This script provides integration for an OpenAI-compatible local AI backend like <a href=\"https://github.com/ollama/ollama\">Ollama</a> or <a href=\"https://github.com/ggerganov/llama.cpp\">llama-cpp</a>."
+}

ollama-integration/info.json

Lines changed: 0 additions & 9 deletions
This file was deleted.
