
Question about OpenAIFunctionAgent and the need for extraPromptMessages #91

Closed
orenagiv opened this issue Aug 6, 2023 · 1 comment · Fixed by #94
Labels: t:documentation (Improvements or additions to documentation)
orenagiv (Contributor) commented Aug 6, 2023

Hey @davidmigloz,
When using OpenAIFunctionsAgent, which uses a Chat Model behind the scenes (rather than a plain LLM), is there any need for, or significance to, passing extraPromptMessages like so:

final agent = OpenAIFunctionsAgent.fromLLMAndTools(
  llm: chatOpenAI,
  tools: [...],
  memory: memory,
  systemChatMessage: SystemChatMessagePromptTemplate.fromTemplate('...'),
  extraPromptMessages: [
    const MessagesPlaceholder(variableName: BaseMemory.defaultMemoryKey),
  ],
);

Or is it unnecessary, given that the Chat Model relies on the history of messages passed through the messages list property of the OpenAI API's chatComplete() method (rather than as plain text embedded in the prompt)?

davidmigloz (Owner) commented Aug 6, 2023

Hey @orenagiv ,

You don't need to add a placeholder for the memory if you use the OpenAIFunctionsAgent.fromLLMAndTools factory constructor, as the default prompt template already takes care of it. Since it is a chat model, make sure you set returnMessages to true in your memory. So your example should look like this:

final memory = ConversationBufferMemory(returnMessages: true);
final agent = OpenAIFunctionsAgent.fromLLMAndTools(
  llm: chatOpenAI,
  tools: [...],
  memory: memory,
  systemChatMessage: SystemChatMessagePromptTemplate.fromTemplate('...'),
);
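For completeness, a minimal end-to-end sketch of this setup might look as follows. This is illustrative only: the API key, system message, and CalculatorTool are placeholder values, and it assumes the langchain.dart AgentExecutor API for running the agent.

```dart
import 'package:langchain/langchain.dart';
import 'package:langchain_openai/langchain_openai.dart';

Future<void> main() async {
  final chatOpenAI = ChatOpenAI(apiKey: '...'); // hypothetical key

  // returnMessages: true makes the memory return chat message objects
  // instead of a flattened string, which is what chat models expect.
  final memory = ConversationBufferMemory(returnMessages: true);

  // No extraPromptMessages needed: the factory's default prompt template
  // already inserts a placeholder for each memory key.
  final agent = OpenAIFunctionsAgent.fromLLMAndTools(
    llm: chatOpenAI,
    tools: [CalculatorTool()], // illustrative tool
    memory: memory,
    systemChatMessage: SystemChatMessagePromptTemplate.fromTemplate(
      'You are a helpful assistant.',
    ),
  );

  final executor = AgentExecutor(agent: agent);
  final result = await executor.run('What is 2 + 2?');
  print(result);
}
```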

You can see how the default prompt template is constructed here:

static BasePromptTemplate createPrompt({
  final SystemChatMessagePromptTemplate systemChatMessage =
      _systemChatMessagePromptTemplate,
  final List<BaseChatMessagePromptTemplate>? extraPromptMessages,
  final BaseChatMemory? memory,
}) {
  return ChatPromptTemplate.fromPromptMessages([
    systemChatMessage,
    ...?extraPromptMessages,
    if (memory == null)
      const MessagesPlaceholder(
        variableName: BaseActionAgent.agentScratchpadInputKey,
      ),
    for (final memoryKey in memory?.memoryKeys ?? {})
      MessagesPlaceholder(variableName: memoryKey),
    const MessagePlaceholder(variableName: agentInputKey),
  ]);
}

If you don't use memory, the agent places its intermediate work in MessagesPlaceholder(variableName: BaseActionAgent.agentScratchpadInputKey). If you use memory, the agent uses the memory placeholder instead.
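In other words, the default template assembles the prompt in one of two shapes depending on whether memory is provided. The sketch below illustrates this; the exact key strings ('history', 'agent_scratchpad', 'input') are assumptions about the defaults and may differ in your version:

```dart
// With memory (e.g. ConversationBufferMemory, assuming default key 'history'):
//   [systemChatMessage,
//    ...extraPromptMessages,
//    MessagesPlaceholder(variableName: 'history'),          // conversation + agent steps
//    MessagePlaceholder(variableName: 'input')]             // the user's current input
//
// Without memory:
//   [systemChatMessage,
//    ...extraPromptMessages,
//    MessagesPlaceholder(variableName: 'agent_scratchpad'), // agent's intermediate work
//    MessagePlaceholder(variableName: 'input')]
```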

I've improved the documentation in the PR: #94

Let me know if it's clear.

@davidmigloz davidmigloz removed the t:enhancement New feature or request label Aug 6, 2023