Description
LocalAI version: 2.29.0 (built from source)
Environment, CPU architecture, OS, and Version: Mac (Apple T8132), macOS 15.5 (24F74)
Describe the bug
A request containing a certain function definition that is valid for the OpenAI API causes a fatal panic in LocalAI. The same prompt can be sent to OpenAI without any issues. The server crashes every time the prompt is sent, so a user can accidentally take down the whole LocalAI API server.
To Reproduce
Send the following prompt to the LocalAI API server, e.g. via curl (a sample invocation is shown after the payload); the server crashes immediately:
{
"messages": [
{
"role": "user",
"content": "Is the company xyz a customer of ours?"
}
],
"tools": [
{
"type": "function",
"function": {
"name": "CompanyHasRelation",
"description": "This function checks if and what type of customer a given company is. It lists every found entry formatted with their name, address and their type. It can also filter by a given type of relation.",
"parameters": {
"type": "object",
"properties": {
"companylist": {
"type": "array",
"items": {
"type": "object",
"properties": {
"companyname": {
"description": "The given name of the company.",
"type": "string"
},
"street": {
"description": "The given street name where the company resides.",
"type": [
"string",
"null"
]
},
"city": {
"description": "The given city where the company resides.",
"type": [
"string",
"null"
]
}
},
"additionalProperties": false,
"required": [
"companyname",
"street",
"city"
]
}
},
"filter": {
"description": "The type we should filter the list of companies by. Instead of infering the type you look if the user prompt mentions a specific type.",
"type": "string"
}
},
"required": [
"companylist",
"filter"
],
"additionalProperties": false
},
"strict": true
}
}
],
"parallel_tool_calls": false,
"model": "gemma-3-4b-it-qat",
"temperature": 0.2,
"n": 1
}
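For reference, a minimal curl invocation that triggers the crash might look like the one below. The host, port, and the request.json filename are assumptions for a default local setup; adjust them to your install:

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d @request.json

where request.json contains the payload above.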
Expected behavior
The request should succeed and return a valid JSON response containing the function call, as it does with the OpenAI API.
Logs
12:56PM DBG Chat endpoint configuration read: &{PredictionOptions:{BasicModelRequest:{Model:google_gemma-3-4b-it-qat-Q4_0.gguf} Language: Translate:false N:0 TopP:0x1400067c788 TopK:0x1400067c790 Temperature:0x14000037930 Maxtokens:0x1400067c7c8 Echo:false Batch:0 IgnoreEOS:false RepeatPenalty:0 RepeatLastN:0 Keep:0 FrequencyPenalty:0 PresencePenalty:0 TFZ:0x1400067c7c0 TypicalP:0x1400067c7b8 Seed:0x1400067c7d8 NegativePrompt: RopeFreqBase:0 RopeFreqScale:0 NegativePromptScale:0 ClipSkip:0 Tokenizer:} Name:gemma-3-4b-it-qat F16:0x1400067c728 Threads:0x1400067c768 Debug:0x14000037ba0 Roles:map[] Embeddings:0x1400067c7d1 Backend: TemplateConfig:{Chat:{{.Input }}
<start_of_turn>model
ChatMessage:<start_of_turn>{{if eq .RoleName "assistant" }}model{{else}}{{ .RoleName }}{{end}}
{{ if .FunctionCall -}}
{{ else if eq .RoleName "tool" -}}
{{ end -}}
{{ if .Content -}}
{{.Content -}}
{{ end -}}
{{ if .FunctionCall -}}
{{toJson .FunctionCall}}
{{ end -}}<end_of_turn> Completion:{{.Input}}
Edit: Functions:<start_of_turn>system
You have access to functions. If you decide to invoke any of the function(s),
you MUST put it in the format of
{"name": function name, "parameters": dictionary of argument name and its value}
You SHOULD NOT include any other text in the response if you call a function
{{range .Functions}}
{'type': 'function', 'function': {'name': '{{.Name}}', 'description': '{{.Description}}', 'parameters': {{toJson .Parameters}} }}
{{end}}
<end_of_turn>
{{.Input -}}
<start_of_turn>model
UseTokenizerTemplate:false JoinChatMessagesByCharacter:<nil> Multimodal: JinjaTemplate:false ReplyPrefix:} KnownUsecaseStrings:[FLAG_ANY FLAG_CHAT FLAG_COMPLETION] KnownUsecases:<nil> Pipeline:{TTS: LLM: Transcription: VAD:} PromptStrings:[] InputStrings:[] InputToken:[] functionCallString: functionCallNameString: ResponseFormat: ResponseFormatMap:map[] FunctionsConfig:{DisableNoAction:false GrammarConfig:{ParallelCalls:false DisableParallelNewLines:false MixedMode:false NoMixedFreeString:false NoGrammar:false Prefix: ExpectStringsAfterJSON:false PropOrder: SchemaType: GrammarTriggers:[]} NoActionFunctionName: NoActionDescriptionName: ResponseRegex:[] JSONRegexMatch:[] ArgumentRegex:[] ArgumentRegexKey: ArgumentRegexValue: ReplaceFunctionResults:[] ReplaceLLMResult:[] CaptureLLMResult:[] FunctionNameKey: FunctionArgumentsKey:} FeatureFlag:map[] LLMConfig:{SystemPrompt: TensorSplit: MainGPU: RMSNormEps:0 NGQA:0 PromptCachePath: PromptCacheAll:false PromptCacheRO:false MirostatETA:0x1400067c7b0 MirostatTAU:0x1400067c7a8 Mirostat:0x1400067c7a0 NGPULayers:0x1400067c718 MMap:0x1400067c729 MMlock:0x1400067c7d1 LowVRAM:0x1400067c7d1 Reranking:0x1400067c7d1 Grammar: StopWords:[<|im_end|> <end_of_turn> <start_of_turn>] Cutstrings:[] ExtractRegex:[] TrimSpace:[] TrimSuffix:[] ContextSize:0x1400067c398 NUMA:false LoraAdapter: LoraBase: LoraAdapters:[] LoraScales:[] LoraScale:0 NoMulMatQ:false DraftModel: NDraft:0 Quantization: LoadFormat: GPUMemoryUtilization:0 TrustRemoteCode:false EnforceEager:false SwapSpace:0 MaxModelLen:0 TensorParallelSize:0 DisableLogStatus:false DType: LimitMMPerPrompt:{LimitImagePerPrompt:0 LimitVideoPerPrompt:0 LimitAudioPerPrompt:0} MMProj:mmproj-google_gemma-3-4b-it-qat-f16.gguf FlashAttention:false NoKVOffloading:false CacheTypeK: CacheTypeV: RopeScaling: ModelType: YarnExtFactor:0 YarnAttnFactor:0 YarnBetaFast:0 YarnBetaSlow:0 CFGScale:0} Diffusers:{CUDA:false PipelineType: SchedulerType: EnableParameters: IMG2IMG:false ClipSkip:0 ClipModel: ClipSubFolder: ControlNet:} Step:0 GRPC:{Attempts:0 AttemptsSleepTime:0} TTSConfig:{Voice: AudioPath:} CUDA:false DownloadFiles:[] Description: Usage: Options:[]}
12:56PM DBG Response needs to process functions
panic: interface conversion: interface {} is []interface {}, not string
goroutine 85 [running]:
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).visit(0x140090ff900, 0x14009c0acc0, {0x1400cd57d10, 0x26}, 0x14009c0ab40)
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:65 +0x1114
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).visit(0x140090ff900, 0x14009c0ac60, {0x1400cd57ce0, 0x21}, 0x14009c0ab40)
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:150 +0xa8c
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).visit(0x140090ff900, 0x14009c0ac30, {0x14008e81d40, 0x1c}, 0x14009c0ab40)
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:168 +0x48c
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).visit(0x140090ff900, 0x14009c0abd0, {0x14009c27f60, 0x10}, 0x14009c0ab40)
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:150 +0xa8c
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).visit(0x140090ff900, 0x14009c0ab70, {0x14009c27f40, 0x6}, 0x14009c0ab40)
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:150 +0xa8c
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).visit(0x140090ff900, 0x14009c0ab40, {0x0, 0x0}, 0x14009c0ab40)
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:80 +0x1050
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).Grammar(0x140090ff900, 0x14009c0ab40, {0x1400061e0e8, 0x1, 0x1})
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:206 +0x80
github.com/mudler/LocalAI/pkg/functions/grammars.(*JSONSchemaConverter).GrammarFromBytes(0x140090ff900, {0x1400041c800, 0x388, 0x400}, {0x1400061e0e8, 0x1, 0x1})
/Users/xyz/localai/pkg/functions/grammars/json_schema.go:219 +0x9c
github.com/mudler/LocalAI/pkg/functions.JSONFunctionStructure.Grammar({{0x14009c0a780, 0x2, 0x2}, {0x0, 0x0, 0x0}, 0x0}, {0x1400061e0e8, 0x1, 0x1})
/Users/xyz/localai/pkg/functions/function_structure.go:30 +0x150
github.com/mudler/LocalAI/core/http/routes.RegisterOpenAIRoutes.ChatEndpoint.func15(0x14007776008)
/Users/xyz/localai/core/http/endpoints/openai/chat.go:280 +0xbdc
github.com/gofiber/fiber/v2.(*Ctx).Next(0x14007785008?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1031 +0x48
github.com/mudler/LocalAI/core/http/middleware.(*RequestExtractor).SetOpenAIRequest(0x14007d80e40, 0x14007776008)
/Users/xyz/localai/core/http/middleware/request.go:184 +0x328
github.com/gofiber/fiber/v2.(*Ctx).Next(0x1400040fa40?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1031 +0x48
github.com/mudler/LocalAI/core/http/routes.RegisterOpenAIRoutes.(*RequestExtractor).SetModelAndConfig.func12(0x14007776008)
/Users/xyz/localai/core/http/middleware/request.go:145 +0x37c
github.com/gofiber/fiber/v2.(*Ctx).Next(0x14000626150?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1031 +0x48
github.com/mudler/LocalAI/core/http/routes.RegisterOpenAIRoutes.(*RequestExtractor).BuildFilteredFirstAvailableDefaultModel.func11(0x14007776008)
/Users/xyz/localai/core/http/middleware/request.go:107 +0x138
github.com/gofiber/fiber/v2.(*App).next(0x1400023f408, 0x14007776008)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/router.go:145 +0x180
github.com/gofiber/fiber/v2.(*Ctx).Next(0x14000463d01?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1034 +0x5c
github.com/dave-gray101/v2keyauth.init.func1(0x140004e4cc7?)
/var/root/go/pkg/mod/github.com/dave-gray101/v2keyauth@v0.0.0-20240624150259-c45d584d25e2/config.go:50 +0x1c
github.com/dave-gray101/v2keyauth.New.func1(0x14007776008)
/var/root/go/pkg/mod/github.com/dave-gray101/v2keyauth@v0.0.0-20240624150259-c45d584d25e2/keyauth.go:63 +0xf4
github.com/gofiber/fiber/v2.(*App).next(0x1400023f408, 0x14007776008)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/router.go:145 +0x180
github.com/gofiber/fiber/v2.(*Ctx).Next(0x14007776008?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1034 +0x5c
github.com/gofiber/fiber/v2/middleware/favicon.New.func1(0x14007776008)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/middleware/favicon/favicon.go:121 +0xa0
github.com/gofiber/fiber/v2.(*Ctx).Next(0x14007776008?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1031 +0x48
github.com/mudler/LocalAI/core/http.API.LocalAIMetricsAPIMiddleware.func10(0x14007776008)
/Users/xyz/localai/core/http/endpoints/localai/metrics.go:41 +0x8c
github.com/gofiber/fiber/v2.(*Ctx).Next(0x14007776008?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1031 +0x48
github.com/gofiber/contrib/fiberzerolog.New.func1(0x14007776008)
/var/root/go/pkg/mod/github.com/gofiber/contrib/fiberzerolog@v1.0.2/zerolog.go:36 +0x90
github.com/gofiber/fiber/v2.(*App).next(0x1400023f408, 0x14007776008)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/router.go:145 +0x180
github.com/gofiber/fiber/v2.(*Ctx).Next(0x14007776008?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/ctx.go:1034 +0x5c
github.com/mudler/LocalAI/core/http.API.StripPathPrefix.func7(0x14007776008)
/Users/xyz/localai/core/http/middleware/strippathprefix.go:34 +0x1a8
github.com/gofiber/fiber/v2.(*App).next(0x1400023f408, 0x14007776008)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/router.go:145 +0x180
github.com/gofiber/fiber/v2.(*App).handler(0x1400023f408, 0x104d0343c?)
/var/root/go/pkg/mod/github.com/gofiber/fiber/v2@v2.52.5/router.go:172 +0x6c
github.com/valyala/fasthttp.(*Server).serveConn(0x1400059a488, {0x106e51ac8, 0x140001966b0})
/var/root/go/pkg/mod/github.com/valyala/fasthttp@v1.55.0/server.go:2379 +0xb34
github.com/valyala/fasthttp.(*workerPool).workerFunc(0x1400e6e0be0, 0x14000346d00)
/var/root/go/pkg/mod/github.com/valyala/fasthttp@v1.55.0/workerpool.go:224 +0x70
github.com/valyala/fasthttp.(*workerPool).getCh.func1()
/var/root/go/pkg/mod/github.com/valyala/fasthttp@v1.55.0/workerpool.go:196 +0x38
created by github.com/valyala/fasthttp.(*workerPool).getCh in goroutine 1
/var/root/go/pkg/mod/github.com/valyala/fasthttp@v1.55.0/workerpool.go:195 +0x1e4
Additional context
A simpler function definition works fine. For example, I created a basic web-search function that takes just one parameter, and it runs without any issues. Sending the prompt above to the OpenAI API also works without any issues.
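Judging from the stack trace, the panic (interface conversion: interface {} is []interface {}, not string) is raised in the grammar converter at pkg/functions/grammars/json_schema.go:65, apparently because the schema's "type" field is read as a plain string, while the failing payload uses the array form "type": ["string", "null"], which is valid JSON Schema and accepted by OpenAI. The following is only a minimal, self-contained sketch (not LocalAI code; all names are illustrative) of how a converter could read a "type" field that may be either a string or an array of strings:

// Sketch only: defensive reading of a JSON Schema "type" field that may be
// either a single string or an array such as ["string", "null"].
package main

import (
	"encoding/json"
	"fmt"
)

// schemaTypes returns the list of type names in a schema's "type" field.
func schemaTypes(schema map[string]any) ([]string, error) {
	raw, ok := schema["type"]
	if !ok {
		return nil, nil
	}
	switch t := raw.(type) {
	case string:
		// The simple form: "type": "string"
		return []string{t}, nil
	case []any:
		// The array form used by the failing payload: "type": ["string", "null"]
		out := make([]string, 0, len(t))
		for _, v := range t {
			s, ok := v.(string)
			if !ok {
				return nil, fmt.Errorf("unexpected entry in type array: %T", v)
			}
			out = append(out, s)
		}
		return out, nil
	default:
		return nil, fmt.Errorf("unexpected value for \"type\": %T", raw)
	}
}

func main() {
	// Fragment from the failing request that uses the array form.
	var schema map[string]any
	_ = json.Unmarshal([]byte(`{"type": ["string", "null"], "description": "street"}`), &schema)
	types, err := schemaTypes(schema)
	fmt.Println(types, err) // [string null] <nil>
}

With a defensive reading like this, the nullable street and city properties from the payload above would no longer hit the failing string type assertion.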