2 changes: 1 addition & 1 deletion website/docs/components/models/index.md
@@ -36,7 +36,7 @@ The model type is inferred based on the model source and files. For more detail,
Spice supports a variety of features for large language models (LLMs):

- **Custom Tools**: Provide models with tools to interact with the Spice runtime. See [Tools](/docs/features/large-language-models/tools).
- **System Prompts**: Customize system prompts and override defaults for [`v1/chat/completion`](/docs/api/HTTP/post-chat-completions). See [Parameter Overrides](/docs/features/large-language-models/parameter_overrides).
- **System Prompts**: Declaratively define system prompts and default values for [`v1/chat/completion`](/docs/api/HTTP/post-chat-completions) parameters. See [Parameter Overrides](/docs/features/large-language-models/parameter_overrides). Use Jinja templating to parameterize system prompts per request; see [Parameterized Prompts](/docs/features/large-language-models/parameterized_prompts).
- **Memory**: Provide LLMs with memory persistence tools to store and retrieve information across conversations. See [Memory](/docs/features/large-language-models/memory).
- **Vector Search**: Perform advanced vector-based searches using embeddings. See [Vector Search](/docs/features/search/vector-search).
- **Evals**: Evaluate, track, compare, and improve language model performance for specific tasks. See [Evals](/docs/features/large-language-models/evals).
@@ -0,0 +1,57 @@
---
title: 'System Prompt parameterization'
sidebar_label: 'Parameterized Prompts'
description: 'Learn how to update system prompts for each request with Jinja-styled templating.'
sidebar_position: 7
pagination_prev: null
pagination_next: null
tags:
- models
- parameters
- overrides
- configuration
---

Spice supports defining system prompts for Large Language Models (LLMs) in the [spicepod](/docs/features/large-language-models/parameter_overrides#system_prompt).

**Example**:
```yaml
models:
  - name: advice
    from: openai:gpt-4o
    params:
      system_prompt: |
        Write everything in Haiku like a pirate from Australia
```
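Conceptually, a configured `system_prompt` is sent ahead of the user's messages as the leading `system`-role message, following the OpenAI-style chat-completions convention. A simplified sketch of that mapping (a hypothetical helper for illustration, not Spice's actual internals):

```python
# Sketch: how a spicepod system_prompt maps onto an OpenAI-style chat
# payload -- it is prepended as the first, "system"-role message.
# Hypothetical helper for illustration; not Spice's implementation.
SYSTEM_PROMPT = "Write everything in Haiku like a pirate from Australia"

def build_messages(user_content: str) -> list[dict]:
    # The configured prompt leads; the caller's message follows.
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_content},
    ]

print(build_messages("Where should I visit in San Francisco?"))
```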

Beyond static prompts, system prompts can use Jinja syntax so that they are altered on each [v1/chat/completion](/docs/api/HTTP/post-chat-completions) request. This involves three steps:
1. Add `parameterized_prompt: enabled` to the model.
2. Use Jinja syntax in the `system_prompt` parameter for the model in the spicepod.
```yaml
models:
  - name: advice
    from: openai:gpt-4o
    params:
      parameterized_prompt: enabled
      system_prompt: |
        Write everything in {{ form }} like a {{ user.character }} from {{ user.country }}
```

3. Provide the required variables in [v1/chat/completion](/docs/api/HTTP/post-chat-completions) via the `.metadata` field.
```bash
curl -X POST http://localhost:8090/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "advice",
    "messages": [
      {"role": "user", "content": "Where should I visit in San Francisco?"}
    ],
    "metadata": {
      "form": "haiku",
      "user": {
        "character": "pirate",
        "country": "australia"
      }
    }
  }'
```
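At request time, the runtime renders the template with the values from `metadata`, including nested lookups like `user.country`. A minimal stdlib-only Python sketch of that substitution (Spice uses real Jinja templating; this illustration handles only plain variable lookups such as `{{ form }}` or `{{ user.country }}`):

```python
import re

def render_prompt(template: str, metadata: dict) -> str:
    """Substitute {{ dotted.path }} placeholders with values from metadata.

    Illustration only: handles simple variable lookups, not full Jinja
    (no filters, conditionals, or loops).
    """
    def lookup(match: re.Match) -> str:
        value = metadata
        for key in match.group(1).split("."):
            value = value[key]  # walk nested dicts, e.g. user -> country
        return str(value)

    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", lookup, template)

template = "Write everything in {{ form }} like a {{ user.character }} from {{ user.country }}"
metadata = {"form": "haiku", "user": {"character": "pirate", "country": "australia"}}
print(render_prompt(template, metadata))
# → Write everything in haiku like a pirate from australia
```

With the example request above, the model would therefore receive the rendered prompt rather than the raw template.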