Description
When using convertToModelMessages from the Vercel AI SDK, the function copies providerMetadata into providerOptions on ui parts, causing the resulting output to be rejected by generateText() or streamText() with:
```
InvalidPromptError: The messages must be a ModelMessage[]. If you have passed a UIMessage[], you can use convertToModelMessages to convert them.
```
Steps to Reproduce

```ts
import { convertToModelMessages, generateText } from 'ai';

const uiMessages = [
  {
    role: 'assistant',
    parts: [
      {
        type: 'text',
        text: 'Hello!',
        providerMetadata: {}, // <-- triggers the issue
      },
    ],
  },
];

const modelMessages = convertToModelMessages(uiMessages);

await generateText({
  model: 'gpt-4o-mini',
  messages: modelMessages,
});
```

Actual behavior
The generated ModelMessage[] includes:

```json
{
  "type": "text",
  "text": "Hello!",
  "providerOptions": {}
}
```
generateText() fails with AI_InvalidPromptError due to the invalid extra key.
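Until this is fixed upstream, one possible workaround is to strip empty `providerMetadata` objects (the trigger in the repro above) from UI message parts before calling `convertToModelMessages`. This is a hypothetical helper, not part of the SDK; the types are simplified for illustration:

```ts
// Hypothetical workaround: drop empty providerMetadata from UI message parts
// so convertToModelMessages has nothing to copy into providerOptions.
type UIPart = {
  type: string;
  providerMetadata?: Record<string, unknown>;
  [key: string]: unknown;
};
type UIMsg = { role: string; parts: UIPart[] };

function stripEmptyProviderMetadata(messages: UIMsg[]): UIMsg[] {
  return messages.map((message) => ({
    ...message,
    parts: message.parts.map((part) => {
      // Only remove the key when the metadata object is empty, so real
      // provider metadata is preserved.
      if (
        part.providerMetadata &&
        Object.keys(part.providerMetadata).length === 0
      ) {
        const { providerMetadata, ...rest } = part;
        return rest as UIPart;
      }
      return part;
    }),
  }));
}
```

Usage would then be `convertToModelMessages(stripEmptyProviderMetadata(uiMessages))`.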
Expected behavior
`convertToModelMessages` should not add `providerOptions` for message content. `providerMetadata` should remain untouched (consistent with how it's used in `UIMessage` and text parts).
Proposed fix
In `convertToModelMessages`, change this spread:

```ts
...(part.providerMetadata != null
  ? { providerOptions: part.providerMetadata }
  : {}),
```

to:

```ts
...(part.providerMetadata != null
  ? { providerMetadata: part.providerMetadata }
  : {}),
```

Or gate it only for parts that actually support `providerOptions` (e.g., tool-result, not text).
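The second (gating) option could look roughly like this. This is only a sketch of the idea; the allow-list contents and the helper name are illustrative assumptions, not the SDK's actual internals:

```ts
// Sketch: only copy providerMetadata into providerOptions for part types
// that are assumed to accept it (illustrative allow-list).
const PARTS_WITH_PROVIDER_OPTIONS = new Set(['tool-result']);

function providerOptionsFor(part: {
  type: string;
  providerMetadata?: Record<string, unknown>;
}): { providerOptions?: Record<string, unknown> } {
  return part.providerMetadata != null &&
    PARTS_WITH_PROVIDER_OPTIONS.has(part.type)
    ? { providerOptions: part.providerMetadata }
    : {};
}
```

Inside the conversion loop this would replace the unconditional spread, so text parts never pick up a `providerOptions` key.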
AI SDK Version
- "ai": "^5.0.76",
Code of Conduct
- I agree to follow this project's Code of Conduct