Commit 41c1a11

Improved doc and added a missing step

Signed-off-by: Carlos Rolo <carlos.rolo@netapp.com>

1 parent 52f5e95


docs/remote_inference_blueprints/ollama_connector_chat_blueprint.md

Lines changed: 49 additions & 3 deletions
@@ -4,6 +4,8 @@ This is an AI connector blueprint for Ollama or any other local/self-hosted LLM

## 1. Add connector endpoint to trusted URLs

+Adjust the regex to match your local IP.
+
```json
PUT /_cluster/settings
{
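The hunk above cuts off before the settings body. For reference, a trusted-endpoints entry for a local Ollama instance (default port 11434) might look roughly like the following; the IP pattern is a placeholder and should be adjusted to your own network:

```json
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.trusted_connector_endpoints_regex": [
      "^http://192\\.168\\.1\\.\\d+:11434/.*$"
    ]
  }
}
```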
@@ -28,6 +30,8 @@ PUT /_cluster/settings

## 3. Create the connector

+In a local setting, `openAI_key` might not be needed. In that case, you can either set it to an arbitrary value or, if you remove it, update the `Authorization` header in the `actions` accordingly.
+
```json
POST /_plugins/_ml/connectors/_create
{
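The connector body is truncated in the hunk above. As a minimal sketch of how `openAI_key` and the `Authorization` header interact, a connector for a local Ollama endpoint could look along these lines; the endpoint, model name, and key value are illustrative placeholders:

```json
POST /_plugins/_ml/connectors/_create
{
  "name": "Ollama chat connector",
  "description": "Connector to a local Ollama instance",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "endpoint": "192.168.1.10:11434",
    "model": "llama3"
  },
  "credential": {
    "openAI_key": "placeholder"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "http://${parameters.endpoint}/v1/chat/completions",
      "headers": {
        "Authorization": "Bearer ${credential.openAI_key}"
      },
      "request_body": "{ \"model\": \"${parameters.model}\", \"messages\": ${parameters.messages} }"
    }
  ]
}
```

If the `credential` block is dropped entirely, the `Authorization` header must be removed or replaced as well, since `${credential.openAI_key}` would no longer resolve.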
@@ -60,16 +64,58 @@ POST /_plugins/_ml/connectors/_create

```json
{
-  "connector_id": "DUFXiofepXVT9_cf1h0s_"
+  "connector_id": "Keq5FpkB72uHgF272LWj"
+}
+```
+
+## 4. Register the model
+
+```json
+POST /_plugins/_ml/models/_register
+{
+  "name": "Local LLM Model",
+  "function_name": "remote",
+  "description": "Ollama model",
+  "connector_id": "Keq5FpkB72uHgF272LWj"
+}
+```
+
+### Sample response
+
+Take note of the `model_id`; it will be needed in the following steps.
+
+```json
+{
+  "task_id": "oEdPqZQBQwAL8-GOCJbw",
+  "status": "CREATED",
+  "model_id": "oUdPqZQBQwAL8-GOCZYL"
+}
+```
+
+## 5. Deploy the model
+
+Use `model_id` in place of the `<MODEL_ID>` placeholder.
+
+```json
+POST /_plugins/_ml/models/<MODEL_ID>/_deploy
+```
+
+### Sample response
+
+```json
+POST /_plugins/_ml/models/WWQI44MBbzI2oUKAvNUt/_deploy
+{
+  "node_ids": ["4PLK7KJWReyX0oWKnBA8nA"]
}
```

-### 4. Corresponding Predict request example
+### 6. Corresponding Predict request example

Notice how you have to create the whole message structure, not just the message to send.
+Use `model_id` in place of the `<MODEL_ID>` placeholder.

```json
-POST /_plugins/_ml/models/<ENTER MODEL ID HERE>/_predict
+POST /_plugins/_ml/models/<MODEL_ID>/_predict
{
  "parameters": {
    "messages": [
