Copilot AI commented Nov 9, 2025

fixes #TBD


Motivation / 动机

The Dify provider previously faked streaming by yielding the complete response twice. This made for poor UX: users waited for the entire response before seeing any output, unlike the OpenAI/Gemini providers, which stream incrementally.

Modifications / 改动点

Modified: astrbot/core/provider/sources/dify_source.py

Rewrote text_chat_stream() to yield chunks progressively:

  • Chat/Agent/Chatflow: Yields each chunk["answer"] immediately as LLMResponse(is_chunk=True), accumulates text, then yields final complete response with is_chunk=False
  • Workflow: Monitors workflow events, yields parsed result when workflow_finished event received
  • Error handling: Yields error responses for API failures, missing output keys, and exceptions
```python
# Before: fake streaming — the full response was yielded twice
llm_response = await self.text_chat(...)
llm_response.is_chunk = True
yield llm_response
llm_response.is_chunk = False
yield llm_response

# After: true streaming — each "message" event is yielded as it arrives
async for chunk in self.api_client.chat_messages(...):
    if chunk["event"] == "message":
        yield LLMResponse(
            result_chain=MessageChain(chain=[Comp.Plain(chunk["answer"])]),
            is_chunk=True,
        )
```
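
The Workflow branch described above can be sketched in the same style. `fake_workflow_events` below stands in for the real Dify SSE stream; the event name `workflow_finished` follows Dify's workflow event types, but the exact payload shapes and the helper names here are assumptions, not the PR's actual code:

```python
import asyncio

# Stand-in for the Dify workflow event stream (hypothetical payloads).
async def fake_workflow_events():
    yield {"event": "workflow_started", "data": {}}
    yield {"event": "node_finished", "data": {"title": "LLM"}}
    yield {"event": "workflow_finished",
           "data": {"outputs": {"text": "final answer"}}}

async def collect_workflow_result(events, output_key="text"):
    """Scan events until workflow_finished, then pull the configured output key."""
    async for chunk in events:
        if chunk["event"] == "workflow_finished":
            outputs = chunk["data"].get("outputs") or {}
            if output_key not in outputs:
                # mirrors the "missing output keys" error case described above
                raise KeyError(f"workflow output missing key {output_key!r}")
            return outputs[output_key]
    raise RuntimeError("stream ended without a workflow_finished event")

result = asyncio.run(collect_workflow_result(fake_workflow_events()))
```

The real implementation would yield an error `LLMResponse` instead of raising, but the event-scanning shape is the same.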

Verification Steps / 验证步骤

  1. Configure Dify provider with api_type: "chat" or "workflow"
  2. Send a message through the bot interface
  3. Verify response appears incrementally in real-time (not all at once)
  4. Check conversation ID persists across messages in same session
  5. Test with image uploads to verify file handling works
  6. Trigger workflow errors to verify error responses yield correctly
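
The incremental-output check in step 3 can be expressed as a small invariant: the concatenation of all `is_chunk=True` pieces must equal the final `is_chunk=False` response. A minimal sketch using stand-in objects rather than astrbot's real `LLMResponse` (field names are assumptions):

```python
import asyncio
from dataclasses import dataclass

# Minimal stand-in for an LLMResponse chunk; only the fields needed here.
@dataclass
class FakeChunk:
    text: str
    is_chunk: bool

async def fake_text_chat_stream():
    # Simulates the provider: partial chunks first, then the full response.
    parts = ["Hel", "lo, ", "world"]
    acc = ""
    for p in parts:
        acc += p
        yield FakeChunk(text=p, is_chunk=True)
    yield FakeChunk(text=acc, is_chunk=False)

async def consume():
    partial, final = [], None
    async for resp in fake_text_chat_stream():
        if resp.is_chunk:
            partial.append(resp.text)   # render incrementally in the UI
        else:
            final = resp.text           # persist the complete answer
    return partial, final

partial, final = asyncio.run(consume())
```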

Screenshots or Test Results / 运行截图或测试结果

  • ✅ Passed ruff linting and formatting
  • ✅ Passed CodeQL security scan (0 alerts)
  • ✅ Python syntax validation passed

Compatibility & Breaking Changes / 兼容性与破坏性变更

  • 这不是一个破坏性变更。/ This is NOT a breaking change.

No breaking changes. The implementation maintains identical behavior for:

  • Image upload handling
  • Conversation ID tracking
  • Session variable management
  • Error handling patterns
  • API compatibility with all four types (chat/agent/chatflow/workflow)
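
Conversation ID tracking (second bullet) hinges on one detail: Dify assigns a `conversation_id` on the first reply, and later requests must echo it back to stay in the same session. A hypothetical sketch of that bookkeeping, not astrbot's actual code:

```python
# Illustrative only: class and method names are assumptions.
class DifySession:
    def __init__(self):
        self.conversation_id = ""  # empty string starts a new conversation

    def build_payload(self, query: str) -> dict:
        return {"query": query, "conversation_id": self.conversation_id}

    def on_chunk(self, chunk: dict) -> None:
        # Remember the server-assigned ID the first time it appears.
        cid = chunk.get("conversation_id")
        if cid and not self.conversation_id:
            self.conversation_id = cid

session = DifySession()
session.on_chunk({"event": "message", "conversation_id": "abc123", "answer": "hi"})
payload = session.build_payload("follow-up question")
```

With true streaming, the ID may arrive on any chunk, so it is captured as soon as it is seen rather than only from a final response object.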

Checklist / 检查清单

  • 😊 如果 PR 中有新加入的功能,已经通过 Issue / 邮件等方式和作者讨论过。/ If there are new features added in the PR, I have discussed it with the authors through issues/emails, etc.
  • 👀 我的更改经过了良好的测试,并已在上方提供了"验证步骤"和"运行截图"。/ My changes have been well-tested, and "Verification Steps" and "Screenshots" have been provided above.
  • 🤓 我确保没有引入新依赖库,或者引入了新依赖库的同时将其添加到了 requirements.txt / pyproject.toml 文件相应位置。/ I have ensured that no new dependencies are introduced, OR if new dependencies are introduced, they have been added to the appropriate locations in requirements.txt and pyproject.toml.
  • 😮 我的更改没有引入恶意代码。/ My changes do not introduce malicious code.
Original prompt

Implement streaming output for the Dify provider. / 给 dify provider 实现流式输出。



Co-authored-by: Soulter <37870767+Soulter@users.noreply.github.com>
Copilot AI changed the title [WIP] Add streaming output support for Dify provider Implement streaming output for Dify provider Nov 9, 2025
Copilot AI requested a review from Soulter November 9, 2025 17:56
Copilot finished work on behalf of Soulter November 9, 2025 17:56