AutoGen Studio: Serve responding complete messages #6520
Why are these changes needed?
Run the Studio with serve:
autogenstudio serve --team notebooks/team.json --port 8084
Open the docs endpoint (http://127.0.0.1:8084/docs) and make a request; you will notice the response does not include the content of the messages.
With this fix, the full payload is returned:

This error is likely related to how FastAPI interacts with pydantic, since we use a somewhat complex model hierarchy:
TaskResult -> BaseChatMessage(BaseMessage, ABC) -> BaseTextChatMessage(BaseChatMessage, ABC) -> TextMessage(BaseTextChatMessage)
The serializer probably serializes through the parent class, so it cannot access the attributes declared by the child classes.
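The behavior described above can be reproduced with a minimal sketch (simplified stand-ins for the real classes, not the actual autogen models): pydantic v2 serializes a field according to its declared annotation, so subclass-only fields are dropped unless you opt into duck-typed serialization with `SerializeAsAny`.

```python
from pydantic import BaseModel, SerializeAsAny


# Simplified stand-ins for the real message hierarchy.
class BaseChatMessage(BaseModel):
    source: str


class TextMessage(BaseChatMessage):
    content: str  # declared only on the child class


class TaskResult(BaseModel):
    # Field annotated with the parent type, as in the real TaskResult.
    messages: list[BaseChatMessage]


class TaskResultFixed(BaseModel):
    # SerializeAsAny serializes each item by its runtime type instead.
    messages: list[SerializeAsAny[BaseChatMessage]]


msg = TextMessage(source="agent", content="hello")

# The subclass instance is stored intact...
broken = TaskResult(messages=[msg])
print(broken.model_dump())  # 'content' is missing from the output

# ...but serialization follows the annotation unless we opt out:
fixed = TaskResultFixed(messages=[msg])
print(fixed.model_dump())  # 'content' is preserved
```

This is why the `/docs` request showed messages without their `content`: FastAPI serializes the response through the declared (parent) model.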
PS: We have a /docs page listing the available endpoints, but it is not mentioned in our documentation; we could also surface it in the Studio Deploy view. The documentation also seems out of date: it frequently says --workflow, but the flag is now --team.
Related issue number
Opened a discussion on Discord.
Checks
[ X ] I've included any doc changes needed for https://microsoft.github.io/autogen/. See https://github.com/microsoft/autogen/blob/main/CONTRIBUTING.md to build and test documentation locally.
** As this is more of a bug fix, I believe no doc change is needed, since seeing the full LLM results is the expected behavior. **
[ X ] I've added tests (if relevant) corresponding to the changes introduced in this PR.
[ ] I've made sure all auto checks have passed.
** Checkin **