Question about OpenAIFunctionsAgent and the need for extraPromptMessages #91
Hey @orenagiv,

You don't need to add a placeholder for the memory if you use the `memory` parameter:

```dart
final memory = ConversationBufferMemory(returnMessages: true);
final agent = OpenAIFunctionsAgent.fromLLMAndTools(
  llm: chatOpenAI,
  tools: [...],
  memory: memory,
  systemChatMessage: SystemChatMessagePromptTemplate.fromTemplate('...'),
);
```

You can see how the default prompt template is constructed here: langchain_dart/packages/langchain_openai/lib/src/agents/functions.dart, lines 242 to 260 in 898d535.
If you don't use memory, the agent places its intermediary work in

I've improved the documentation in the PR: #94. Let me know if it's clear.
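For contrast, wiring the chat history in manually via `extraPromptMessages` would look roughly like the sketch below. This is an illustration only: the `MessagesPlaceholder` type and the `'history'` variable name are assumptions about the langchain_dart API, not taken from this thread, so check the package docs before copying.

```dart
// Hypothetical manual setup (API names assumed, not confirmed by this thread):
// instead of passing `memory` to the agent, inject a placeholder for the
// stored chat messages yourself through `extraPromptMessages`.
import 'package:langchain/langchain.dart';
import 'package:langchain_openai/langchain_openai.dart';

void main() {
  final chatOpenAI = ChatOpenAI(apiKey: '...');

  final agent = OpenAIFunctionsAgent.fromLLMAndTools(
    llm: chatOpenAI,
    tools: [/* your tools */],
    extraPromptMessages: [
      // Assumed placeholder: expands to the list of past chat messages
      // stored under the 'history' key at prompt-formatting time.
      const MessagesPlaceholder(variableName: 'history'),
    ],
  );
  // You would then be responsible for loading/saving the history yourself,
  // which is exactly the bookkeeping the `memory` parameter does for you.
}
```

The takeaway from the reply above is that passing `memory` directly makes this manual placeholder unnecessary.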
Hey @davidmigloz,

When using OpenAIFunctionsAgent, which uses a Chat Model behind the scenes (and not a plain LLM), is there any need for or significance to using the extraPromptMessages like so:

Or is it unnecessary, given that the Chat Model relies on the history of messages that go through the OpenAI API `messages` list property of the chatComplete() method (and not on plain text as part of the prompt)?