Description
Problem Statement
An agent maintains its conversation in memory today; when the Python process ends or the agent object is deleted, the conversation is lost. To manage a conversation beyond this scope, Strands offers a DIY approach where customers are required to write their own persistence logic (ref). The community has shared that they want:
- A built-in mechanism for Strands to save an ongoing conversation to a datastore.
- An extensible and pluggable interface to bring your own database persistence provider (similar to our Model Providers).
- A way to save a conversation not only for a single agent, but for a multi-agent system as well (ref).
Proposed Solution
Today, the Strands Conversation Manager helps manage a conversation in memory. To support these new use cases, we can expand its current scope to handle conversation persistence as well:
def continue_conversation(prompt, conversation_id):
    conversation_manager = DdbConversationManager()
    conversation_manager.load_conversation(conversation_id=conversation_id)
    agent = Agent(
        conversation_manager=conversation_manager
    )
    return agent(prompt)  # This request and the agent response are saved to DDB
At initialization, the Agent will load its previous conversation from the DdbConversationManager, and for each user message or LLM response, the conversation manager will store the message to a datastore.
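To make the load/store contract concrete, here is a minimal sketch of what the pluggable interface could look like. All names here (PersistentConversationManager, save_message, load_conversation, and the in-memory implementation) are hypothetical illustrations, not the actual Strands API; a real provider such as DdbConversationManager would implement the same hooks against DynamoDB.

```python
from abc import ABC, abstractmethod
from typing import Dict, List

# Hypothetical message shape, e.g. {"role": "user", "content": "..."}
Message = Dict[str, str]


class PersistentConversationManager(ABC):
    """Hypothetical base class a persistence provider would implement."""

    @abstractmethod
    def load_conversation(self, conversation_id: str) -> List[Message]:
        """Fetch stored messages so the agent can prime its context."""

    @abstractmethod
    def save_message(self, conversation_id: str, message: Message) -> None:
        """Persist one user message or assistant response as it occurs."""


class InMemoryConversationManager(PersistentConversationManager):
    """Reference implementation backed by a dict, useful for tests."""

    def __init__(self) -> None:
        self._store: Dict[str, List[Message]] = {}

    def load_conversation(self, conversation_id: str) -> List[Message]:
        # Return a copy so callers cannot mutate the stored history.
        return list(self._store.get(conversation_id, []))

    def save_message(self, conversation_id: str, message: Message) -> None:
        self._store.setdefault(conversation_id, []).append(message)
```

Swapping the backend then only means swapping the manager class passed to the Agent, mirroring how Model Providers are swapped today.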
Use Cases
- Store an agent's conversation to a datastore. This can be a new or ongoing conversation, where each user message, assistant response, and associated state is stored to the datastore.
- Load a conversation from a datastore and prime the agent's context.
- Load a conversation at a specific point to allow replaying the response generation, or branching the conversation.
- Store and load conversations for a multi-agent system (ref).
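The replay/branch use case above can be sketched as copying a prefix of a stored conversation under a new conversation id, so the agent can regenerate responses from that point without disturbing the original history. This is a minimal illustration assuming a dict-shaped store keyed by conversation id; the function name and signature are hypothetical.

```python
from typing import Dict, List

# Hypothetical message shape, e.g. {"role": "user", "content": "..."}
Message = Dict[str, str]


def branch_conversation(store: Dict[str, List[Message]],
                        source_id: str,
                        branch_id: str,
                        branch_point: int) -> List[Message]:
    """Copy the first `branch_point` messages of `source_id` into a new
    conversation `branch_id`, leaving the original history intact."""
    branched = list(store.get(source_id, []))[:branch_point]
    store[branch_id] = branched
    return branched
```

A caller would then load the branched conversation into a fresh agent and continue from the branch point, while the source conversation remains replayable in full.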
Alternative Solutions
Additional Context
No response