`docs/remote_inference_blueprints/ollama_connector_chat_blueprint.md`

This is an AI connector blueprint for Ollama or any other local/self-hosted LLM.

## 1. Add connector endpoint to trusted URLs

Adjust the regex to match your local IP address.

```json
PUT /_cluster/settings
{
  ...
}
```

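The body of this request is elided in this excerpt. As a hedged sketch of what the full call typically looks like, it sets the ML Commons setting `plugins.ml_commons.trusted_connector_endpoints_regex`; the pattern below assumes Ollama listening on its default port `11434` at `127.0.0.1` and must be adjusted to your own host and port:

```json
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.trusted_connector_endpoints_regex": [
      "^http://127\\.0\\.0\\.1:11434/.*$"
    ]
  }
}
```
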
## 3. Create the connector

In a local setup, `openAI_key` might not be needed. You can either set it to a dummy value or remove it entirely; if you remove it, also update the `Authorization` header in the `actions`.

```json
POST /_plugins/_ml/connectors/_create
{
  ...
}
```

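The full connector body is likewise elided above. Below is a minimal sketch of a chat connector, assuming Ollama exposes an OpenAI-compatible endpoint at `/v1/chat/completions` on `localhost:11434`; the name, model, endpoint, and dummy credential are placeholders to adjust:

```json
POST /_plugins/_ml/connectors/_create
{
  "name": "Ollama chat connector",
  "description": "Connector to a local Ollama OpenAI-compatible chat endpoint",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "endpoint": "localhost:11434",
    "model": "llama3"
  },
  "credential": {
    "openAI_key": "unused"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "http://${parameters.endpoint}/v1/chat/completions",
      "headers": {
        "Authorization": "Bearer ${credential.openAI_key}"
      },
      "request_body": "{ \"model\": \"${parameters.model}\", \"messages\": ${parameters.messages} }"
    }
  ]
}
```

If you drop `credential.openAI_key`, remove or adjust the `Authorization` header accordingly, as noted above.
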
### Sample response

Take note of the `connector_id`; it is used when registering the model.

```json
{
  "connector_id": "Keq5FpkB72uHgF272LWj"
}
```

## 4. Register the model

```json
POST /_plugins/_ml/models/_register
{
  "name": "Local LLM Model",
  "function_name": "remote",
  "description": "Ollama model",
  "connector_id": "Keq5FpkB72uHgF272LWj"
}
```

### Sample response

Take note of the `model_id`; it will be needed in the steps that follow.

```json
{
  "task_id": "oEdPqZQBQwAL8-GOCJbw",
  "status": "CREATED",
  "model_id": "oUdPqZQBQwAL8-GOCZYL"
}
```
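
If your cluster registers the model asynchronously and the response only contains a `task_id`, you can look up the task to obtain the `model_id`; a sketch using the ML Commons tasks API and the `task_id` from the sample above:

```json
GET /_plugins/_ml/tasks/oEdPqZQBQwAL8-GOCJbw
```
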
## 5. Deploy the model
Use `model_id` in place of the `<MODEL_ID>` placeholder.

```json
POST /_plugins/_ml/models/<MODEL_ID>/_deploy
```

### Sample response

```json
POST /_plugins/_ml/models/WWQI44MBbzI2oUKAvNUt/_deploy
{
  "node_ids": ["4PLK7KJWReyX0oWKnBA8nA"]
}
```
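
To verify the deployment, you can fetch the model and check that its state is `DEPLOYED`; a sketch using the ML Commons get-model API (replace the placeholder with your `model_id`):

```json
GET /_plugins/_ml/models/<MODEL_ID>
```
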

## 6. Corresponding Predict request example

Notice how you have to create the whole message structure, not just the message to send.
Use `model_id` in place of the `<MODEL_ID>` placeholder.

```json
POST /_plugins/_ml/models/<MODEL_ID>/_predict
{
  ...
}
```
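
The request body is cut off in this excerpt. A hedged sketch of a chat-style predict call, assuming the connector's `request_body` maps `${parameters.messages}` to an OpenAI-style `messages` array (the roles and contents below are purely illustrative):

```json
POST /_plugins/_ml/models/<MODEL_ID>/_predict
{
  "parameters": {
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Hello! Which model are you running?"
      }
    ]
  }
}
```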