Misc. bug: Incorrect model name in web chat #17666

@pwilkin

Description

Name and Version

version: 7220 (ecf74a8)

Operating systems

Linux

Which llama.cpp modules do you know to be affected?

Other (Please specify in the next section)

Command line

llama-server -m qwen3_next_80b_a30b_instruct-iq4_nl.gguf --jinja --host 0.0.0.0 --port 8357 -c 60000 --cpu-moe

Problem description & steps to reproduce

When the server runs in single-model mode, the new WebUI shows the model as "gpt-3.5-turbo".

The likely reason is that "gpt-3.5-turbo" is the default value the OpenAI client library sends as message.model, and the new code has:

	let displayedModel = $derived((): string | null => {
		if (!currentConfig.showModelInfo) return null;

		if (message.model) {
			return message.model;
		}

		return serverModel;
	});

so it always prioritizes the model from the message, even if the model is bogus.
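A possible guard would be to treat the client's placeholder as absent and fall back to the server-reported model. This is only a sketch, not the project's actual fix; the placeholder constant and the standalone helper are assumptions introduced here for illustration:

```typescript
// Assumption: "gpt-3.5-turbo" is the placeholder the OpenAI client sends
// when no model was explicitly chosen.
const PLACEHOLDER_MODEL = "gpt-3.5-turbo";

// Hypothetical helper mirroring the displayedModel logic, but ignoring
// the bogus placeholder so the server's real model name wins.
function resolveDisplayedModel(
	messageModel: string | undefined,
	serverModel: string | null,
	showModelInfo: boolean
): string | null {
	if (!showModelInfo) return null;
	if (messageModel && messageModel !== PLACEHOLDER_MODEL) {
		return messageModel;
	}
	return serverModel;
}
```

With this, a message tagged "gpt-3.5-turbo" would display the server's loaded model (e.g. the Qwen3 GGUF from the command line above) instead of the placeholder.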

First Bad Commit

ec18edf

Relevant log output

Metadata

Labels

bug: Something isn't working
low severity: Used to report low severity bugs in llama.cpp (e.g. cosmetic issues, non-critical UI glitches)
regression: A regression introduced in a new build (something that was previously working correctly)
server/webui
