feat: Vercel AI SDK v6 with tool-calling #78
Conversation
Wow that's really cool! I'm sure @nico-martin will be interested too.
Thank you, @xenova! Hopefully I can get the internals updated to use Transformers.js v4 too when it's released :)
@nico-martin let me know if I need to change anything |
Hi @jakobhoeg, that's amazing! I had one small issue. Edit: Sorry, wrong folder. But it seems like both next-vercel-ai-sdk-v5 and next-vercel-ai-sdk-v6-tool-calling have the same problem.
Pretty solid example 👍 ! |
@nico-martin thank you for the feedback! |
@nico-martin should be fixed now |
That's strange. Yes, I use npm v10.9.2 and did a normal install. Anyways, thanks for the update!
I really like this pattern! Keeps the worker logic out of the app logic. I know WebLLM has a similar WorkerHandler and I think we should have one in Transformers.js too.
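To illustrate the kind of worker-handler pattern being praised here, below is a minimal sketch in TypeScript. The message shapes and class name are hypothetical (not the actual @browser-ai, WebLLM, or Transformers.js API): the point is that all model interaction is funneled through one dispatcher, so UI code never touches worker internals directly.

```typescript
// Hypothetical message shapes; a real handler would mirror the library's API.
type WorkerRequest =
  | { id: number; kind: "load"; modelId: string }
  | { id: number; kind: "generate"; prompt: string };

type WorkerResponse =
  | { id: number; kind: "loaded" }
  | { id: number; kind: "text"; text: string };

class WorkerHandler {
  private loadedModel: string | null = null;

  // In a real worker this would be wired to `self.onmessage`; keeping it a
  // plain method keeps the dispatch logic testable outside the browser.
  handle(req: WorkerRequest): WorkerResponse {
    switch (req.kind) {
      case "load":
        this.loadedModel = req.modelId;
        return { id: req.id, kind: "loaded" };
      case "generate":
        if (!this.loadedModel) throw new Error("model not loaded");
        // Placeholder for the actual model generate() call.
        return { id: req.id, kind: "text", text: `echo: ${req.prompt}` };
    }
  }
}
```

The app-side code only ever posts `WorkerRequest` objects and listens for `WorkerResponse` objects, which is what keeps the worker logic out of the app logic.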
This PR adds a Next.js example application showing how to use Transformers.js models that support tool-calling with the Vercel AI SDK v6 (I also renamed the old `next-vercel-ai-sdk` to `next-vercel-ai-sdk-v5`). It uses @browser-ai, which acts as a Transformers.js model provider for the Vercel AI SDK, enabling seamless tool-call management.
The internals of the provider handle incremental streaming that detects which tool is being called before the model finishes generating its arguments, similar to how native cloud-based APIs handle tool-call streaming.
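A simplified sketch of that incremental detection, assuming the model emits a JSON tool-call body token by token (this illustrates the idea only, not the provider's actual implementation):

```typescript
// Detects the tool name as soon as it appears in the partial stream,
// before the argument JSON is complete.
class StreamingToolCallDetector {
  private buffer = "";
  toolName: string | null = null;

  // Returns the tool name the first time it becomes known, else null.
  push(chunk: string): string | null {
    this.buffer += chunk;
    if (this.toolName === null) {
      // Match a partial body like: {"name": "getWeather", "arguments": ...
      const m = this.buffer.match(/"name"\s*:\s*"([^"]+)"/);
      if (m) {
        this.toolName = m[1];
        return this.toolName;
      }
    }
    return null;
  }
}
```

Knowing the tool name early lets the UI render a "calling getWeather…" state while the arguments are still streaming, which is exactly what cloud tool-call streaming enables.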
It also automatically handles different model tool-call output formats, such as XML tags (`<tool_call>`) and Python-style (`[functionName(args)]`) syntax, which improves the developer experience dramatically: you can easily switch between Qwen3, LFM2, and Granite models without introducing a bunch of boilerplate code.

On the UI side, it also allows for HITL (human-in-the-loop) tool calls by simply setting a `needsApproval` flag in the tool-call schema.

@xenova let me know if anything needs to be changed :)
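To make the format-normalization idea concrete, here is a deliberately naive sketch of parsing both tool-call syntaxes into one shape. The function name and the exact regexes are illustrative assumptions, not the provider's real parser:

```typescript
interface ToolCall {
  name: string;
  args: Record<string, unknown>;
}

function parseToolCall(raw: string): ToolCall | null {
  // XML style: <tool_call>{"name": "...", "arguments": {...}}</tool_call>
  const xml = raw.match(/<tool_call>([\s\S]*?)<\/tool_call>/);
  if (xml) {
    const parsed = JSON.parse(xml[1]);
    return { name: parsed.name, args: parsed.arguments ?? {} };
  }
  // Python style: [functionName(key="value")]
  const py = raw.match(/^\[(\w+)\((.*)\)\]$/);
  if (py) {
    const args: Record<string, unknown> = {};
    // Naive split; real parsing would need to handle nested commas/quotes.
    for (const pair of py[2].split(",").filter((s) => s.trim())) {
      const [k, v] = pair.split("=");
      args[k.trim()] = v.trim().replace(/^"|"$/g, "");
    }
    return { name: py[1], args };
  }
  return null;
}
```

Once both formats collapse into the same `ToolCall` shape, the rest of the pipeline (including any approval gating) never needs to know which model produced the call.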
Demo:
demooo.mp4