core : add LlamaStatus command#112

Open
danbev wants to merge 2 commits into ggml-org:master from danbev:status-command
Conversation


@danbev danbev commented Feb 1, 2026

This commit adds a new command that shows the status of the Llama plugin. It reports the status of the FIM server and the instruction server (if one is configured).

For example, if both servers are running:

:LlamaStatus
FIM model (fim_model): ✅ Ready, Instruction model (qwen_coder_model): ✅ Ready

If the server is not running:

LlamaStatus: ❌ Server not reachable

If one server is not running:

FIM model (fim_model): ✅ Ready, Instruction model: ❌ Not loaded

This commit adds a new command to show the status of the Llama plugin.
It will show the status of the FIM server and the instruction server (if
configured).

For example, if both servers are running:
```console
:LlamaStatus
FIM model: ✅ Ready, Instruction model: ✅ Ready
```
If the server is not running:
```console
LlamaStatus: ❌ Server not reachable
```

If one server is not running:
```console
FIM model: ✅ Ready, Instruction model: ❌ Not loaded
```
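The check the command performs can be sketched roughly as follows. This is a hypothetical Python sketch, not the plugin's actual Vimscript implementation: it assumes each configured server is a llama-server instance exposing a `/health` endpoint, and the `probe`/`llama_status` names and the endpoint URLs are illustrative.

```python
import json
import urllib.error
import urllib.request

def probe(endpoint: str) -> str:
    """Return a status string for one llama-server endpoint."""
    try:
        with urllib.request.urlopen(endpoint + "/health", timeout=2) as resp:
            body = json.loads(resp.read().decode())
            if body.get("status") == "ok":
                return "✅ Ready"
            # Server answered but the model is not ready yet.
            return "❌ Not loaded"
    except (urllib.error.URLError, OSError):
        return "❌ Server not reachable"

def llama_status(servers: dict) -> str:
    """Build the one-line summary in the style of :LlamaStatus."""
    return ", ".join(f"{label}: {probe(url)}" for label, url in servers.items())
```

A call such as `llama_status({"FIM model": "http://127.0.0.1:8012", "Instruction model": "http://127.0.0.1:8013"})` would then produce a line like the examples above, one entry per configured server.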
lavilao added a commit to lavilao/llama.vim that referenced this pull request Feb 9, 2026
Add fim_template configuration option to support models like Falcon-H1-Tiny
that use a custom FIM format and don't support the /infill endpoint.

- Add fim_template config option with placeholder substitution
- Use completion endpoint format when custom template is configured
- Support both Handlebars-style ({{{prefix}}}) and simple ({prefix}) placeholders
- Update documentation with Falcon-H1-Tiny example configuration

Example usage for Falcon-H1-Tiny:
  fim_template = '<|prefix|>{{{prefix}}}<|suffix|>{{{suffix}}}<|middle|>'
  endpoint_fim = 'http://localhost:8080/completion'
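The placeholder substitution described above can be sketched as below. This is a hypothetical Python illustration of the idea, not the commit's actual code; the function name and the restriction to `prefix`/`suffix` placeholders are assumptions.

```python
def render_fim_template(template: str, prefix: str, suffix: str) -> str:
    """Substitute Handlebars-style ({{{prefix}}}) and simple ({prefix})
    placeholders in a custom FIM template."""
    for key, value in (("prefix", prefix), ("suffix", suffix)):
        # Replace the triple-brace form first so the simple form
        # does not partially match inside it.
        template = template.replace("{{{%s}}}" % key, value)
        template = template.replace("{%s}" % key, value)
    return template

# With the Falcon-H1-Tiny template from the commit message:
tmpl = "<|prefix|>{{{prefix}}}<|suffix|>{{{suffix}}}<|middle|>"
prompt = render_fim_template(tmpl, "def add(a, b):\n    ", "\n")
```

The rendered prompt would then be sent to the configured `endpoint_fim` completion endpoint instead of `/infill`.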
This commit adds the name of the FIM/instruct model to the status output,
as it can be useful when the server has many models loaded.

Example output:
```console
FIM model (fim_model): ✅ Ready, Instruction model (qwen_coder_model): ✅ Ready
```
@danbev danbev marked this pull request as ready for review March 8, 2026 15:20
