feat: showcase thinking steps#98

Merged
srtaalej merged 14 commits into feat-ai-apps-thinking-steps from ale-feat-chunks
Jan 29, 2026

Conversation

@srtaalej
Contributor

@srtaalej srtaalej commented Jan 21, 2026

Type of change

  • New feature
  • Bug fix
  • Documentation

Summary

This PR showcases "chunks" in chat streams. Related: slackapi/node-slack-sdk#2467

⚠️ The chunks type is still under construction, resulting in the linter error below 😿

Demo videos 📹
dice-demo.mov
wonder-demo.mov

Requirements

  • I have ensured the changes I am contributing align with existing patterns and have tested and linted my code
  • I've read and agree to the Code of Conduct

@srtaalej srtaalej self-assigned this Jan 21, 2026
@srtaalej srtaalej requested a review from a team as a code owner January 21, 2026 22:11
@srtaalej srtaalej added the enhancement New feature or request label Jan 21, 2026
@@ -6,7 +6,7 @@ Models from [OpenAI](https://openai.com) are used and can be customized for prom

## Setup
Contributor Author

note - still deciding how to restructure README.md to make setup easier. I think the most obvious point of confusion is deciding whether to use the CLI or the terminal for setup. I think a toggle like a collapsible <details> block might make things simpler 🤔?

Member

@srtaalej Thanks for calling this out! 📚 ✨ I agree it's confusing and hope we can encourage the CLI most-

IIRC @lukegalbraithrussell had thoughts on related changes as well, and we should match those in slack-samples/bolt-python-assistant-template#42 too!

@srtaalej srtaalej marked this pull request as draft January 21, 2026 22:15
Member

@zimeg zimeg left a comment

@srtaalej Nice - I'm leaving a few quick comments now but am looking forward to testing this! 📺 ✨

input: `System: ${DEFAULT_SYSTEM_CONTENT}\n\nUser: ${llmPrompt}`,
stream: true,
// This first example shows a generated text response for the provided prompt
if (message.text !== 'Wonder a few deep thoughts.') {
Member
📝 note: This != and the cases contained were switched in the final commits of slack-samples/bolt-python-assistant-template#37 which we might want to match?
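
As context for the diff above, here is a minimal sketch of how the streamed response could be consumed. The create() arguments mirror the diff hunks in this PR, while DEFAULT_SYSTEM_CONTENT and llmPrompt are hypothetical stand-ins for the sample's own values and the event handling is purely illustrative:

import OpenAI from 'openai';

// Hypothetical stand-ins for the sample's constants, for illustration only
const DEFAULT_SYSTEM_CONTENT = 'You are a helpful assistant in a Slack thread.';
const llmPrompt = 'Roll a six-sided die.';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const stream = await openai.responses.create({
  model: 'gpt-4o-mini',
  input: `System: ${DEFAULT_SYSTEM_CONTENT}\n\nUser: ${llmPrompt}`,
  stream: true,
});

// Collect text as it streams; the Responses API emits incremental
// response.output_text.delta events whose delta field holds the next fragment.
let text = '';
for await (const event of stream) {
  if (event.type === 'response.output_text.delta') {
    text += event.delta;
  }
}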

@srtaalej srtaalej marked this pull request as ready for review January 23, 2026 21:44
Member

@zimeg zimeg left a comment

@srtaalej Sweet, I'm liking the results of a roll I prompted 🏆

I'm requesting a few changes in hopes of focusing this example mostly on tool calls, and I also want to showcase the buffer in text streaming 🤖

Tests related to type checking might need a small change too, but once we have CI passing with a prerelease I think we should look toward merging this!

Comment on lines 36 to 41
/**
* Tool definition for OpenAI API
*
* @see {@link https://platform.openai.com/docs/guides/function-calling}
*/
export const rollDiceDefinition = {
Member
🪬 note: Using jsdoc to match the expected type of the openai.responses.create arguments might be required since we're exporting from this package? I'm unsure what that type might be, but am hoping it'd address the error above!
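
For illustration, a minimal sketch of the jsdoc typing idea. The @type below assumes the openai package exposes a Tool type under its Responses namespace (the exact type name may differ), and the parameter schema shown is made up for this example rather than taken from the PR:

/**
 * Tool definition for the OpenAI Responses API.
 * NOTE: the @type is an assumption; adjust it to whatever the openai SDK actually exports.
 *
 * @type {import('openai').OpenAI.Responses.Tool}
 * @see {@link https://platform.openai.com/docs/guides/function-calling}
 */
export const rollDiceDefinition = {
  type: 'function',
  name: 'roll_dice',
  description: 'Roll a die with the given number of sides',
  strict: true,
  // Hypothetical parameter schema, for illustration only
  parameters: {
    type: 'object',
    properties: {
      sides: { type: 'number', description: 'Number of sides on the die' },
    },
    required: ['sides'],
    additionalProperties: false,
  },
};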

@srtaalej srtaalej marked this pull request as draft January 27, 2026 16:43
srtaalej and others added 2 commits January 27, 2026 11:44
Co-authored-by: Eden Zimbelman <eden.zimbelman@salesforce.com>
@srtaalej srtaalej marked this pull request as ready for review January 27, 2026 18:45
@zimeg zimeg self-requested a review January 27, 2026 18:46
Member

@mwbrooks mwbrooks left a comment

🙌🏻 Thanks for getting the thinking steps demo together! It's looking very closely aligned to the Python demo 👌🏻

✏️ I've left a few suggestions and I'll begin manual testing next!

const { channel, thread_ts } = message;
const { userId, teamId } = context;

// The first example shows detailed thinking steps similar to tool calls
Member

question: Should this comment be swapped with the else clause? The if seems to handle the scripted "Wonder a thought" example while the else handles the roll-the-dice tool call.

Member

@mwbrooks We should reword this to be more clear in both samples IMHO, but the order I think is correct. I'm curious of other changes too, but might suggest:

The first example shows a message with thinking steps that has different chunks to construct and update a plan alongside text outputs.

model: 'gpt-4o-mini',
input: `System: ${DEFAULT_SYSTEM_CONTENT}\n\nUser: ${llmPrompt}`,
stream: true,
await streamer.stop({
Member
suggestion: I think we should add the feedback blocks to this .stop (and add it to Python if it's missing).
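
For reference, a hedged sketch of that suggestion. It assumes streamer.stop() forwards a blocks argument to the stream-stopping API, and the block shape is based on Slack's AI apps feedback buttons element — verify the exact block and field names against the current Block Kit reference:

await streamer.stop({
  // Assumed: stop() accepts final blocks to append to the streamed message
  blocks: [
    {
      // Block and element types per Slack's AI apps feedback docs; double-check the names
      type: 'context_actions',
      elements: [
        {
          type: 'feedback_buttons',
          action_id: 'feedback',
          positive_button: { text: { type: 'plain_text', text: 'Good response' }, value: 'good-feedback' },
          negative_button: { text: { type: 'plain_text', text: 'Bad response' }, value: 'bad-feedback' },
        },
      ],
    },
  ],
});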

* @see {@link https://platform.openai.com/docs/guides/streaming-responses}
* @see {@link https://platform.openai.com/docs/guides/function-calling}
*/
export async function callLlm(streamer, prompts) {
Member

suggestion: Take it or leave it, but I feel callLlm is funny casing in JavaScript. Instead, I'd expect callLLM. cc @zimeg @srtaalej

Member

🐣 note: I'm open to that! But am unsure of best practices for this. As long as it's not the following-

callllm

manifest.json Outdated
"assistant_thread_started",
"message.im"
]
"bot_events": ["app_mention", "assistant_thread_context_changed", "assistant_thread_started", "message.im"]
Member
suggestion: I'd suggest putting each item of the array on its own line to match the rest of the JSON file's formatting.

Member

@mwbrooks mwbrooks left a comment

🧪 Manually testing the sample works g-g-g-g-g-great! 👏🏻

Member

@zimeg zimeg left a comment

💡 Praises to these changes so far! Thanks! ✨

Responses are generating well for me, but I left a few more comments with changes for CI checks. Some of these can be checked with this command too:

$ npm run check

But I realize that's not hooked up as a proper "test" script 🧪

The comments @mwbrooks shared are all solid finds too! Once these are finalized, let's mirror similar updates in the Python implementation 🙏

Comment on lines 4 to 7
// OpenAI LLM client
export const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});
Member

🌚 question: Is the openai client still used outside of this package, or could we remove the export here?

Contributor Author

good catch! it is not
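
A minimal sketch of the resulting change, assuming the client is only referenced within this module:

import OpenAI from 'openai';

// OpenAI LLM client, kept module-private now that nothing imports it elsewhere
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});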

srtaalej and others added 7 commits January 28, 2026 11:14
Co-authored-by: Michael Brooks <mbrooks@slack-corp.com>
Co-authored-by: Michael Brooks <mbrooks@slack-corp.com>
Co-authored-by: Michael Brooks <mbrooks@slack-corp.com>
Co-authored-by: Michael Brooks <mbrooks@slack-corp.com>
Co-authored-by: Eden Zimbelman <eden.zimbelman@salesforce.com>
Co-authored-by: Eden Zimbelman <eden.zimbelman@salesforce.com>
@srtaalej srtaalej requested a review from zimeg January 28, 2026 18:42
@srtaalej
Contributor Author

srtaalej commented Jan 28, 2026

TY for your reviews @zimeg @mwbrooks ❤️ your suggestions have been committed and this PR is ready for re-review 😸

Member

@mwbrooks mwbrooks left a comment

✅ Thanks a bunch for addressing all of our feedback @srtaalej!

🧪 Manual testing works great on my side! 🕹️

📝 I've left one final suggestion to rename callLLm to callLLM (or back to callLlm). Once you've updated that, I think we're good to merge it!

Member

@zimeg zimeg left a comment

🗣️ Nice! I'm leaving a few more polish comments but will approve this for now.

Let's make sure our checks are passing with development builds linked before merging this. One comment I left might require an upstream change beforehand, but I'm so glad this was caught in this PR!

Some file-casing conventions might be nice to update here as well, but that can be saved for a follow-up too 👾

srtaalej and others added 2 commits January 29, 2026 11:19
Co-authored-by: Eden Zimbelman <eden.zimbelman@salesforce.com>
@srtaalej
Contributor Author

thank you @zimeg for the ChatStreamer export PR! I'll update here once we get that merged ⭐ ⭐ ⭐

@mwbrooks
Member

> thank you @zimeg for the ChatStreamer export PR! I'll update here once we get that merged ⭐ ⭐ ⭐

@srtaalej I've merged slackapi/node-slack-sdk#2481 so the ChatStreamer export should now be available!

@srtaalej
Contributor Author

⚠️ Note: CI will continue to fail until the node-slack-sdk feat-ai-app-thinking-steps branch is published. To run locally:

git pull origin feat-ai-apps-thinking-steps
cd web-api
npm run build
npm link                  # register the local @slack/web-api build as a global link
cd bolt-js-assistant-template
npm link @slack/web-api   # point the template at the linked local build

@srtaalej srtaalej merged commit 038d316 into feat-ai-apps-thinking-steps Jan 29, 2026
1 of 4 checks passed
@srtaalej srtaalej deleted the ale-feat-chunks branch January 29, 2026 19:04