Commit 3f1286c

russellbminpeter authored and committed

[Docs] Note that alternative structured output backends are supported (vllm-project#19426)

Signed-off-by: Russell Bryant <rbryant@redhat.com>
Signed-off-by: minpeter <kali2005611@gmail.com>

Parent: 4951cd4

File tree: 1 file changed (+1, -8)

docs/usage/v1_guide.md (1 addition, 8 deletions)
```diff
@@ -54,7 +54,7 @@ This living user guide outlines a few known **important changes and limitations**
 | **FP8 KV Cache** | <nobr>🟢 Functional on Hopper devices ([PR #15191](https://github.com/vllm-project/vllm/pull/15191))</nobr>|
 | **Spec Decode** | <nobr>🚧 WIP ([PR #13933](https://github.com/vllm-project/vllm/pull/13933))</nobr>|
 | **Prompt Logprobs with Prefix Caching** | <nobr>🟡 Planned ([RFC #13414](https://github.com/vllm-project/vllm/issues/13414))</nobr>|
-| **Structured Output Alternative Backends** | <nobr>🟡 Planned</nobr> |
+| **Structured Output Alternative Backends** | <nobr>🟢 Functional</nobr> |
 | **Embedding Models** | <nobr>🚧 WIP ([PR #16188](https://github.com/vllm-project/vllm/pull/16188))</nobr> |
 | **Mamba Models** | <nobr>🟡 Planned</nobr> |
 | **Encoder-Decoder Models** | <nobr>🟠 Delayed</nobr> |
@@ -132,13 +132,6 @@ in progress.
 - **Multimodal Models**: V1 is almost fully compatible with V0 except that interleaved modality input is not supported yet.
   See [here](https://github.com/orgs/vllm-project/projects/8) for the status of upcoming features and optimizations.
 
-#### Features to Be Supported
-
-- **Structured Output Alternative Backends**: Structured output alternative backends (outlines, guidance) support is planned. V1 currently
-  supports only the `xgrammar:no_fallback` mode, meaning that it will error out if the output schema is unsupported by xgrammar.
-  Details about the structured outputs can be found
-  [here](https://docs.vllm.ai/en/latest/features/structured_outputs.html).
-
 #### Models to Be Supported
 
 vLLM V1 currently excludes model architectures with the `SupportsV0Only` protocol,
```
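For context, the backends the removed text named (outlines, guidance, alongside the default xgrammar) are selected at serve time. The sketch below is a hypothetical invocation, not part of this commit: the flag name `--guided-decoding-backend`, the backend value `guidance`, and the model name are assumptions to be checked against the vLLM CLI reference for your version.

```shell
# Sketch (assumed flag and values): launch a vLLM server using an
# alternative structured output backend instead of the xgrammar default.
# Verify the exact flag name and accepted values with `vllm serve --help`.
vllm serve meta-llama/Llama-3.1-8B-Instruct \
    --guided-decoding-backend guidance
```

With an alternative backend configured, schemas that xgrammar cannot express no longer hard-fail as they did under the old `xgrammar:no_fallback`-only behavior described in the deleted section.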
