
Conversation

@r4881t (Collaborator) commented Dec 8, 2025

Why are these changes needed?

In a Group Chat, the default behavior is that all agents share the same message history. While this is suitable for most cases, some cases require the agents to have isolated views of the messages.

This pattern is most useful where (a) tool call responses can be huge, and (b) the agents don't need the implicit message history and instead rely on an external ContextVar to manage shared state/tasks.
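
A minimal usage sketch of the proposed flag (the agent setup below is illustrative; only the isolate_agent_views parameter comes from this PR, and its exact signature may still change):

```python
from autogen import AssistantAgent, GroupChat, GroupChatManager, UserProxyAgent

llm_config = {"model": "gpt-4o", "api_key": "..."}  # placeholder LLM config

researcher = AssistantAgent("researcher", llm_config=llm_config)
writer = AssistantAgent("writer", llm_config=llm_config)
user = UserProxyAgent("user", human_input_mode="NEVER", code_execution_config=False)

groupchat = GroupChat(
    agents=[user, researcher, writer],
    messages=[],
    max_round=6,
    isolate_agent_views=True,  # proposed flag: each agent keeps only its own history
)
manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)

user.initiate_chat(manager, message="Summarize the latest benchmark results.")
```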

Related issue number

Checks

@CLAassistant commented Dec 8, 2025

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
2 out of 3 committers have signed the CLA.

✅ qingyun-wu
✅ priyansh4320
❌ 0xbitmonk
You have signed the CLA already but the status is still pending? Let us recheck it.

@joggrbot (bot) commented Dec 8, 2025

📝 Documentation Analysis

All docs are up to date! 🎉


✅ Latest commit analyzed: 093af91 | Powered by Joggr

@r4881t (Collaborator, Author) commented Dec 10, 2025

I am still working on this PR.

@marklysze (Collaborator) commented:

@r4881t - I like this idea. Do you think it would be worthwhile to have the ability to filter which messages get shared? In your description you noted that tool responses can be large (agreed!) - would being able to suppress only the tool responses be helpful?

@r4881t (Collaborator, Author) commented Dec 11, 2025

> @r4881t - I like this idea. Do you think it would be worthwhile to have the ability to filter which messages get shared? In your description you noted that tool responses can be large (agreed!) - would being able to suppress only the tool responses be helpful?

Maybe. But if tool responses are suppressed, what else is there to provide meaningful context? I can think of a way where the param isolate_agent_views takes an optional Callable that is called to selectively keep messages. But in my opinion, this will be used along with ContextVars anyway, and sharing information via ContextVars is the right thing to do.
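
For reference, a sketch of what that optional Callable could look like; the MessageFilter alias and drop_tool_responses function are purely hypothetical and not part of this PR:

```python
from typing import Callable, Union

# Hypothetical: isolate_agent_views could accept either a bool or a predicate
# that decides, per message and per recipient, whether the message is shared.
MessageFilter = Callable[[dict, str], bool]

def drop_tool_responses(message: dict, recipient_name: str) -> bool:
    """Share every message except (potentially huge) tool responses."""
    return message.get("role") != "tool"

# e.g. GroupChat(..., isolate_agent_views=drop_tool_responses)
# where isolate_agent_views: Union[bool, MessageFilter]
```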

@marklysze (Collaborator) commented:

> @r4881t - I like this idea. Do you think it would be worthwhile to have the ability to filter which messages get shared? In your description you noted that tool responses can be large (agreed!) - would being able to suppress only the tool responses be helpful?
>
> Maybe. But if tool responses are suppressed, what else is there to provide meaningful context? I can think of a way where the param isolate_agent_views takes an optional Callable that is called to selectively keep messages. But in my opinion, this will be used along with ContextVars anyway, and sharing information via ContextVars is the right thing to do.

Thanks @r4881t, understood; let's leave it as is. Would you be able to add to the documentation in the Group Chat section? If time permits, a notebook example would be useful to show how this helps.

Comment on lines +1220 to +1222
if not groupchat.isolate_agent_views:
    for agent in groupchat.agents:
        self.send(intro, agent, request_reply=False, silent=True)
Review comment (Collaborator):

I don't think we need an `if not groupchat.isolate_agent_views:` case here; this should be the default behavior.

Comment on lines +1233 to +1250
# broadcast the message to all agents except the speaker (unless isolate_agent_views is True)
if not groupchat.isolate_agent_views:
    for agent in groupchat.agents:
        if agent != speaker:
            inter_reply = groupchat._run_inter_agent_guardrails(
                src_agent_name=speaker.name,
                dst_agent_name=agent.name,
                message_content=message,
            )
            if inter_reply is not None:
                replacement = (
                    {"content": inter_reply, "name": speaker.name}
                    if not isinstance(inter_reply, dict)
                    else inter_reply
                )
                self.send(replacement, agent, request_reply=False, silent=True)
            else:
                self.send(message, agent, request_reply=False, silent=True)
@priyansh4320 (Collaborator) commented Dec 13, 2025

Same here: we don't need the `if not groupchat.isolate_agent_views:` case; that is the default behavior. I suggest keeping this logic as the fallback branch for when the `if groupchat.isolate_agent_views:` case does not apply, so that it serves as the default behavior.
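
A sketch of the restructure being suggested here (and for the intro broadcast above): gate only the isolated-view behavior and keep broadcasting as the default/fallback path. The helper below is illustrative, reusing names from the diff; it is not code from the PR:

```python
def broadcast_to_group(manager, groupchat, speaker, message) -> None:
    """Suggested shape: isolated views are the special case, broadcasting the default."""
    if getattr(groupchat, "isolate_agent_views", False):
        # isolated views: the message stays in groupchat.messages for the manager only
        return
    # default behavior: broadcast to every agent except the speaker,
    # applying inter-agent guardrails before sending
    for agent in groupchat.agents:
        if agent is speaker:
            continue
        inter_reply = groupchat._run_inter_agent_guardrails(
            src_agent_name=speaker.name,
            dst_agent_name=agent.name,
            message_content=message,
        )
        if inter_reply is not None:
            replacement = (
                {"content": inter_reply, "name": speaker.name}
                if not isinstance(inter_reply, dict)
                else inter_reply
            )
            manager.send(replacement, agent, request_reply=False, silent=True)
        else:
            manager.send(message, agent, request_reply=False, silent=True)
```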

Comment on lines 48 to +53
mask_llm_config: Optional LLM configuration for masking.
isolate_agent_views: If True, agents will only maintain their own message history
    and will not receive messages from other agents. When False (default), all agents
    receive all messages. When True, messages are still stored in groupchat.messages
    for the GroupChatManager's view, but are not broadcast to other agents.
Review comment (Collaborator):

This PR is good. I have a few logical questions from a design point of view.

First, let's look at how a tool is called, at a high level:
system prompt + user prompt -> agent -> LLM reviews context -> tool choice -> tool execution -> return to agent / handoff to another agent -> end

So while this PR addresses context rot and context-window pressure, I see possibilities of context loss.
For example, consider a GroupChat with [Agent A, Agent B, Agent C] where isolated agent views are enabled, so each agent's chat history is isolated. A tool call from Agent B requires context from Agent A's output (Agent A's tool output, to be specific). Agent B's tool call is recent, but the reference is lost because it only exists in another agent's chat history.

I think this is a situation we would likely hit. How should we handle it? I suppose we can link this issue to this pull request. #2242
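
One way to handle that scenario, in line with the ContextVar approach mentioned in the PR description, is to pass cross-agent references through shared state instead of chat history. A minimal sketch; the tool names and keys are hypothetical and not part of this PR:

```python
from contextvars import ContextVar

# Hypothetical shared store that survives isolated agent views.
shared_outputs: ContextVar[dict] = ContextVar("shared_outputs", default={})

def agent_a_tool(query: str) -> str:
    """Agent A's tool: stash the large result and return only a short reference key."""
    result = f"...large output for {query}..."
    store = dict(shared_outputs.get())
    store["agent_a:last_result"] = result
    shared_outputs.set(store)
    return "result stored under key 'agent_a:last_result'"

def agent_b_tool() -> str:
    """Agent B's tool: read Agent A's output directly, without seeing A's messages."""
    return shared_outputs.get().get("agent_a:last_result", "no result available")
```

With this pattern, Agent B's isolated history only ever carries the short reference key, never the full tool output.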

@qingyun-wu (Collaborator) commented:

@r4881t Nice PR! Are you still working on it? Could you review the comments and suggested changes from @priyansh4320? Also, please run pre-commit to fix the formatting issues. Thank you!

@qingyun-wu self-requested a review on December 14, 2025 at 20:48
@qingyun-wu (Collaborator) commented Dec 14, 2025

@SirEntropy, could you help review this as well? I think it's a useful feature.

@priyansh4320 (Collaborator) left a comment:

@r4881t any updates on this PR?
