Description
When I use a KernelFunction that returns a user-defined object and it is invoked via auto function invocation,
the result of the kernel function is converted to a string with JsonSerializer.Serialize in FunctionCallsProcessor.ProcessFunctionResult.
semantic-kernel/dotnet/src/InternalUtilities/connectors/AI/FunctionCalling/FunctionCallsProcessor.cs
Lines 479 to 494 in f8592ad
public static string ProcessFunctionResult(object functionResult)
{
    if (functionResult is string stringResult)
    {
        return stringResult;
    }

    // This is an optimization to use ChatMessageContent content directly
    // without unnecessary serialization of the whole message content class.
    if (functionResult is ChatMessageContent chatMessageContent)
    {
        return chatMessageContent.ToString();
    }

    return JsonSerializer.Serialize(functionResult);
}
This matters because JsonSerializer.Serialize escapes non-ASCII characters by default.
For example, given this record:
record A(string Text);
A a = new("オダ ノブナガ");
string x = JsonSerializer.Serialize(a);
the content of x is:
{"Text":"\uFF75\uFF80\uFF9E \uFF89\uFF8C\uFF9E\uFF85\uFF76\uFF9E"}
However, Azure OpenAI gpt-4o (2024-08-06), at least, could not handle these escaped characters.
But if I use
string y = JsonSerializer.Serialize(a, new JsonSerializerOptions(JsonSerializerOptions.Default)
{
Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
});
the content of y is:
{"Text":"オダ ノブナガ"}
which the LLM can handle without issue.
Thus, IMHO, FunctionCallsProcessor.ProcessFunctionResult should by default pass new JsonSerializerOptions(JsonSerializerOptions.Default) { Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping } to JsonSerializer.Serialize.
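A minimal sketch of what that change could look like, assuming the options instance can be cached in a static field (constructing JsonSerializerOptions per call is expensive) and omitting the ChatMessageContent branch, which depends on Semantic Kernel types:

```csharp
using System.Text.Encodings.Web;
using System.Text.Json;

internal static class FunctionResultSerializationSketch
{
    // Hypothetical cached options: copies the defaults but relaxes escaping so
    // non-ASCII characters (e.g. Japanese text) are emitted verbatim instead
    // of as \uXXXX sequences, which some models handle poorly.
    private static readonly JsonSerializerOptions s_options = new(JsonSerializerOptions.Default)
    {
        Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
    };

    public static string ProcessFunctionResult(object functionResult)
    {
        // Strings are returned unchanged, as in the current implementation.
        if (functionResult is string stringResult)
        {
            return stringResult;
        }

        return JsonSerializer.Serialize(functionResult, s_options);
    }
}
```

Note that UnsafeRelaxedJsonEscaping also leaves HTML-sensitive characters like < and & unescaped, hence the "Unsafe" in its name; that should be acceptable here since the JSON is sent to the model, not embedded in HTML.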
Also, even if we pass JsonSerializerOptions to IKernelBuilderPlugins.AddFromType, it is not used for result serialization.
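As a stopgap under the current behavior: since ProcessFunctionResult returns string results unchanged, a function can serialize its own result with relaxed escaping and return the string. A sketch (the record A is from the example above; the method name is hypothetical, and in a real plugin it would carry the [KernelFunction] attribute, omitted here to avoid the Semantic Kernel dependency):

```csharp
using System.Text.Encodings.Web;
using System.Text.Json;

public record A(string Text);

public static class EscapingWorkaround
{
    // Returning a pre-serialized string means ProcessFunctionResult's
    // string passthrough skips the default (escaping) serializer entirely.
    public static string GetName()
    {
        A a = new("オダ ノブナガ");
        return JsonSerializer.Serialize(a, new JsonSerializerOptions(JsonSerializerOptions.Default)
        {
            Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
        });
    }
}
```

The returned string is {"Text":"オダ ノブナガ"}, exactly what the model should see.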