
Conversation


@ServeurpersoCom ServeurpersoCom commented Nov 25, 2025


  • multi-transport MCP client
  • full agentic orchestrator
  • isolated, idempotent singleton initialization
  • typed SSE client
  • normalized tool-call accumulation pipeline
  • integrated reasoning, timings, previews, and turn-limit handling
  • complete UI section for MCP configuration
  • dedicated controls for relevant parameters
  • opt-in ChatService integration that does not interfere with existing flows
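The "isolated, idempotent singleton initialization" item can be sketched as follows. This is a hypothetical illustration, not the PR's actual code: the names `McpClient`, `initMcpClient`, and the `Transport` union are assumptions. The key property is that repeated or concurrent calls all resolve to the same client instance.

```typescript
// Hypothetical sketch of an idempotent singleton initializer for an MCP
// client. McpClient, initMcpClient, and Transport are illustrative names,
// not the PR's actual identifiers.

type Transport = "stdio" | "sse" | "http";

class McpClient {
  constructor(public readonly transport: Transport) {}
}

let instance: McpClient | null = null;
let initPromise: Promise<McpClient> | null = null;

// Idempotent: the first call starts initialization; every later or
// concurrent call shares the same in-flight promise or cached instance.
function initMcpClient(transport: Transport = "sse"): Promise<McpClient> {
  if (instance) return Promise.resolve(instance);
  if (!initPromise) {
    initPromise = (async () => {
      // A real client would perform the MCP connection handshake here.
      instance = new McpClient(transport);
      return instance;
    })();
  }
  return initPromise;
}
```

Guarding on the shared promise (rather than only the instance) is what makes concurrent callers safe: none of them triggers a second handshake.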

TODO: tighten coupling with the UI for structured tool-call result rendering, including integrated display components and support for sending out-of-context images (persistence/storage still to be defined).
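The "normalized tool-call accumulation pipeline" mentioned above could look something like this minimal sketch. The delta shape is an assumption modeled on OpenAI-style streamed fragments, not the PR's actual types: partial tool-call pieces arrive over SSE and are merged by index into complete calls.

```typescript
// Hypothetical sketch: merge streamed tool-call deltas into complete calls.
// ToolCallDelta/ToolCall are illustrative types (assumed, not from the PR).

interface ToolCallDelta {
  index: number;          // which tool call this fragment belongs to
  id?: string;            // set once, usually on the first fragment
  name?: string;          // set once, usually on the first fragment
  argumentsChunk?: string; // partial JSON, concatenated across fragments
}

interface ToolCall {
  id: string;
  name: string;
  arguments: string; // full JSON string once the stream ends
}

function accumulateToolCalls(deltas: ToolCallDelta[]): ToolCall[] {
  const calls = new Map<number, ToolCall>();
  for (const d of deltas) {
    const call = calls.get(d.index) ?? { id: "", name: "", arguments: "" };
    if (d.id) call.id = d.id;
    if (d.name) call.name = d.name;
    if (d.argumentsChunk) call.arguments += d.argumentsChunk;
    calls.set(d.index, call);
  }
  return [...calls.values()];
}
```

Keying by `index` instead of `id` matters because the id only arrives with the first fragment, while later fragments carry just the index and an arguments chunk.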

llama-webui-mcp-client.mp4


allozaur commented Dec 1, 2025

@ServeurpersoCom this PR needs updating after #17470

@ServeurpersoCom ServeurpersoCom marked this pull request as ready for review December 2, 2025 12:14
@github-actions github-actions bot added the server label Dec 2, 2025
