[AIT-206] Add message per token guide for LangGraph #3157
Conversation
Important: Review skipped. Auto reviews are disabled on this repository. Please check the settings in the CodeRabbit UI.
Walkthrough
This PR adds documentation and navigation for a new guide on token-by-token streaming with LangGraph. It includes navigation entries, an index tile linking to the guide, and a comprehensive tutorial document with publisher and subscriber code examples using Ably.
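Both examples presumably share an Ably Realtime channel. A minimal setup sketch (the channel name and key placeholder below are illustrative, not taken from the guide):

```typescript
// Minimal shared setup the publisher and subscriber sketches below assume.
// The API key placeholder and channel name are illustrative only.
import * as Ably from 'ably';

const realtime = new Ably.Realtime({ key: 'YOUR_ABLY_API_KEY' });

// Publisher and subscriber attach to the same channel and exchange
// 'start', 'token' and 'stop' events, each carrying a responseId.
const channel = realtime.channels.get('langgraph:responses');
```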
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~22 minutes
🚥 Pre-merge checks: ✅ 5 passed
@coderabbitai review
✅ Actions performed: Review triggered.
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@src/data/nav/aitransport.ts`:
- Around line 107-114: The nav entry named 'Vercel AI SDK token streaming -
message per response' points to a non-existent page at
'/docs/guides/ai-transport/vercel-message-per-response'; either create the
missing guide file or update the nav link to the existing Vercel guide path
'/docs/guides/ai-transport/vercel-message-per-token' (or rename the entry to
match the existing file) by editing the object in src/data/nav/aitransport.ts so
the name and link correspond to the actual MDX file.
In `@src/pages/docs/guides/ai-transport/lang-graph-message-per-token.mdx`:
- Around line 268-296: The responses Map is never cleaned up, causing unbounded
growth; in the stop handler (the channel.subscribe('stop', ...) callback where
responseId and finalText are obtained) delete the entry from responses after you
finish using finalText (e.g., call responses.delete(responseId)) so per-response
state is released when the response completes.
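A sketch of that cleanup on the subscriber side, assuming `responses` maps a `responseId` to the accumulated text and `channel` is the Ably channel from the setup sketch above (the message shape and `renderFinalText` helper are placeholders, not the guide's exact code):

```typescript
// `channel` is the Ably Realtime channel from the setup sketch above.
const responses = new Map<string, string>();

// Placeholder UI hook; the real guide renders tokens as they arrive.
const renderFinalText = (text: string) => console.log(text);

channel.subscribe('token', (message) => {
  const { responseId, token } = message.data;
  responses.set(responseId, (responses.get(responseId) ?? '') + token);
});

channel.subscribe('stop', (message) => {
  const { responseId } = message.data;
  const finalText = responses.get(responseId) ?? '';
  renderFinalText(finalText);

  // Release per-response state so the Map cannot grow without bound.
  responses.delete(responseId);
});
```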
- Around line 180-226: The module-scoped responseId causes cross-request reuse;
make responseId a local variable inside streamLangGraphResponse (declare let
responseId = null at the top of that function) so each call gets its own ID,
update where you check and set it (inside the for-await loop) and include it in
the start and token publishes as before, and guard the final stop publish so you
only call channel.publish({ name: 'stop', ... }) when responseId was captured
(i.e., if (responseId) publish stop) to avoid emitting stop for unrelated
streams.
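A sketch of the suggested scoping fix, with the LangGraph stream abstracted as an async iterable of token strings (the real guide iterates LangGraph's stream output instead; `channel` is the Ably channel from the setup sketch above):

```typescript
import { randomUUID } from 'node:crypto';

// `channel` is the Ably Realtime channel from the setup sketch above.
async function streamLangGraphResponse(tokens: AsyncIterable<string>) {
  // Local to each call, so concurrent requests never share an ID.
  let responseId: string | null = null;

  for await (const token of tokens) {
    if (!responseId) {
      // First token: mint an ID for this response and announce the stream.
      responseId = randomUUID();
      await channel.publish({ name: 'start', data: { responseId } });
    }
    await channel.publish({ name: 'token', data: { responseId, token } });
  }

  // Only emit 'stop' when this call actually produced a response,
  // so an empty stream never closes another request's response.
  if (responseId) {
    await channel.publish({ name: 'stop', data: { responseId } });
  }
}
```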
Force-pushed: 86bc5dc → 181c7d5 → 7711ba2 → f5a17a4 → 77da334 → 9b1658e → cfb1abc
Description
Follows the existing structure of the message-per-token guides, applied here to LangGraph in JS.
Review App
Checklist
Summary by CodeRabbit