Description
Describe the bug
I am coming up with a quick fix, and have reported the problem details upstream to Vercel AI.
We recently updated "ai" to the latest version, and I think the problem is connected to that; I didn't notice it earlier.
Quick description
When using `convertToModelMessages` from the Vercel AI SDK, the function copies `providerMetadata` into `providerOptions` on text parts, causing the resulting output to be rejected by `generateText()` or `streamText()` with:

```
InvalidPromptError: The messages must be a ModelMessage[]. If you have passed a UIMessage[], you can use convertToModelMessages to convert them.
```
Quick fix
Strip the extra fields (like `providerOptions`) from the UI messages right before passing them to the AI SDK's `convertToModelMessages`.
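A minimal sketch of that quick fix, assuming a simplified UI-message shape (the real types come from the "ai" package, and `stripProviderFields` is a hypothetical helper, not part of VoltAgent):

```typescript
// Simplified stand-ins for the AI SDK's UIMessage types (illustration only).
type UIMessagePart = { type: string; text?: string; [key: string]: unknown };
type UIMessageLike = { id: string; role: string; parts: UIMessagePart[] };

// Hypothetical helper: drop providerOptions / providerMetadata from every
// part so convertToModelMessages receives clean UI messages.
function stripProviderFields(messages: UIMessageLike[]): UIMessageLike[] {
  return messages.map((message) => ({
    ...message,
    parts: message.parts.map(
      ({ providerOptions, providerMetadata, ...rest }) => rest as UIMessagePart
    ),
  }));
}

// Example: a text part polluted with providerOptions is cleaned.
const cleaned = stripProviderFields([
  {
    id: "msg-1",
    role: "assistant",
    parts: [{ type: "text", text: "hello", providerOptions: { openai: {} } }],
  },
]);
console.log("providerOptions" in cleaned[0].parts[0]); // false
```

The cleaned messages can then be handed to `convertToModelMessages` as usual.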
Steps To Reproduce
Connect to a VoltAgent chat endpoint directly, as below, and the server error appears after the first assistant response and the follow-up user message:
```tsx
import { useChat } from "@ai-sdk/react";
import {
  DefaultChatTransport,
  lastAssistantMessageIsCompleteWithToolCalls,
} from "ai";

const { messages, sendMessage, stop, status, addToolResult } = useChat({
  transport: new DefaultChatTransport({
    api: `http://localhost:${port}/agents/${agentId}/chat`,
    prepareSendMessagesRequest({ messages }) {
      // Send only the latest message to the VoltAgent endpoint.
      const input = [messages[messages.length - 1]];
      return {
        body: {
          input,
          options: {
            userId,
            conversationId,
            temperature: 0.7,
            maxSteps: 10,
          },
        },
      };
    },
  }),
  onToolCall: handleToolCall,
  onFinish: () => {
    console.log("Message completed");
  },
  onError: (error) => {
    console.error("Chat error:", error);
  },
  sendAutomaticallyWhen: lastAssistantMessageIsCompleteWithToolCalls,
});
```

Expected behavior
No errors.
Packages
- @voltagent/core
Additional Context
No response