Replies: 2 comments 4 replies
-
🤖 Hey @vikasr111! Nice to see you back here. Hope all is well on your end. Let's see if we can sort out this memory issue together.

To implement the memory feature in your structured chat agent, you can use the `memory_prompts` parameter of `StructuredChatAgent.from_llm_and_tools`:

```python
from langchain_core.prompts.chat import HumanMessagePromptTemplate

# Create memory prompts
memory_prompts = [
    HumanMessagePromptTemplate.from_template("I remember you told me your name is Bob.")
]

# Create the agent with memory prompts
agent = StructuredChatAgent.from_llm_and_tools(
    llm=my_llm,
    tools=my_tools,
    memory_prompts=memory_prompts,
)
```

In this example, the agent has a memory of a human message saying "I remember you told me your name is Bob.", which it can use to answer subsequent queries.
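Note that the `memory_prompts` example above injects a fixed message rather than a running conversation. If you want the agent to actually remember earlier turns with this legacy API, a common pattern is to pass a `MessagesPlaceholder` as the memory prompt and attach a `ConversationBufferMemory` to the executor. A minimal sketch, assuming the `my_llm` and `my_tools` placeholders from the snippet above:

```python
from langchain.agents import AgentExecutor, StructuredChatAgent
from langchain.memory import ConversationBufferMemory
from langchain_core.prompts import MessagesPlaceholder

# Memory object that accumulates the conversation under the "chat_history" key
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# Placeholder so past messages are injected into the agent's prompt
chat_history = MessagesPlaceholder(variable_name="chat_history")

agent = StructuredChatAgent.from_llm_and_tools(
    llm=my_llm,
    tools=my_tools,
    memory_prompts=[chat_history],
    input_variables=["input", "agent_scratchpad", "chat_history"],
)

agent_executor = AgentExecutor.from_agent_and_tools(
    agent=agent,
    tools=my_tools,
    memory=memory,
    verbose=True,
)

agent_executor.invoke({"input": "My name is Bob."})
agent_executor.invoke({"input": "What is my name?"})  # answered from memory
```

This sticks to the legacy (pre-LCEL) agent and memory APIs that match the example above; the `RunnableWithMessageHistory` approach in the next reply is the newer equivalent.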
To troubleshoot the memory feature, check how these prompts are assembled; you can find more details in the `StructuredChatAgent` class in the LangChain repository. Regarding the error you're encountering when trying to return the memory from the function, could you please provide the exact error message? This will help in identifying the issue and providing a more accurate solution.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
You can use `RunnableWithMessageHistory` with `InMemoryChatMessageHistory`. For example:

```python
from langchain.agents import create_structured_chat_agent, AgentExecutor
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain import hub
from dotenv import load_dotenv

load_dotenv()

store = {}  # should be a database


def get_message_history(session_id) -> InMemoryChatMessageHistory:
    if session_id not in store:
        store[session_id] = InMemoryChatMessageHistory()
    return store[session_id]


llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash", temperature=0.0)
prompt = hub.pull("hwchase17/structured-chat-agent")

structured_chat_agent = create_structured_chat_agent(
    llm=llm,
    tools=[],
    prompt=prompt
)

structured_chat_agent_executor = AgentExecutor(
    agent=structured_chat_agent,
    tools=[],
    verbose=True
)

structured_chat_agent_executor_with_history = RunnableWithMessageHistory(
    structured_chat_agent_executor,
    get_message_history,
    input_messages_key="input",
    history_messages_key="chat_history"
)

config = {
    "configurable": {
        "session_id": 1
    }
}

structured_chat_agent_executor_with_history.invoke({"input": "I am aaron."}, config=config)
structured_chat_agent_executor_with_history.invoke({"input": "what is my name?"}, config=config)

print(store)
```

Reference: https://python.langchain.com/docs/how_to/migrate_agent/#in-langchain
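The `store` dict above lives only in process memory, which is why the `# should be a database` comment is there. One option, sketched here as an assumption rather than part of the original reply, is to back each session with `SQLChatMessageHistory` from `langchain_community` (the exact constructor arguments can differ slightly between versions):

```python
from langchain_community.chat_message_histories import SQLChatMessageHistory


def get_message_history(session_id) -> SQLChatMessageHistory:
    # One persistent history per session_id, stored in a local SQLite file;
    # swap the connection string for Postgres/MySQL in production.
    return SQLChatMessageHistory(
        session_id=str(session_id),
        connection_string="sqlite:///chat_history.db",
    )
```

The rest of the `RunnableWithMessageHistory` wiring stays the same; only the history factory changes.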
-
I have written a simple function to create and run a structured chat agent, and in this agent I am trying to implement memory. Here's the code

I am following the documentation at this link, but it's so poor and not helpful at all: https://python.langchain.com/docs/modules/agents/agent_types/structured_chat

When I try to return the memory from this function, it gives me an error. How can I troubleshoot it and implement memory properly? How do I continue the next query with the agent so that it can use the memory to answer queries?
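If the goal is a single function that builds the agent and also hands the memory back to the caller (which is where the error seems to come from), one way to structure it, building on the `RunnableWithMessageHistory` approach from the reply above, is to return both the wrapped executor and the store. The function name and the `my_llm`/`my_tools` arguments below are illustrative, not LangChain APIs:

```python
from langchain import hub
from langchain.agents import AgentExecutor, create_structured_chat_agent
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory


def build_agent_with_memory(llm, tools):
    """Return (runnable_with_history, store) so the caller can reuse both."""
    store = {}  # session_id -> InMemoryChatMessageHistory

    def get_message_history(session_id):
        if session_id not in store:
            store[session_id] = InMemoryChatMessageHistory()
        return store[session_id]

    prompt = hub.pull("hwchase17/structured-chat-agent")
    agent = create_structured_chat_agent(llm=llm, tools=tools, prompt=prompt)
    executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

    with_history = RunnableWithMessageHistory(
        executor,
        get_message_history,
        input_messages_key="input",
        history_messages_key="chat_history",
    )
    return with_history, store


# Usage: every call with the same session_id sees the earlier turns.
# agent, store = build_agent_with_memory(my_llm, my_tools)
# config = {"configurable": {"session_id": "1"}}
# agent.invoke({"input": "I am Aaron."}, config=config)
# agent.invoke({"input": "What is my name?"}, config=config)
```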