query_prompt_template should be contained in the LLM node data by default. #11032

Closed
4 of 5 tasks
laipz8200 opened this issue Nov 24, 2024 · 1 comment · Fixed by #11053 or #11136
Labels
💪 enhancement New feature or request

Comments

laipz8200 (Member) commented Nov 24, 2024

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

1. Is this request related to a challenge you're experiencing? Tell me about your story.

Currently, after creating a new chatflow, we do not include the query_prompt_template in the LLM node data until it is edited:

[image: LLM node data of a newly created chatflow, without query_prompt_template]

We need to include this part of the data once the File variable is supported in the prompt template.

The expected structure of the data should be:

[image: expected LLM node data, including query_prompt_template]
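
For illustration, a minimal sketch of what the default node data could contain. Only query_prompt_template is the point of this issue; the surrounding keys and the {{#sys.query#}} default value are assumptions and may not match the actual schema.

```python
# Hypothetical default LLM node data for a newly created chatflow.
# Field names other than "query_prompt_template" are illustrative only.
default_llm_node_data = {
    "type": "llm",
    "title": "LLM",
    "model": {},  # provider/model settings elided
    "prompt_template": [
        {"role": "system", "text": ""},  # regular prompt messages
    ],
    # The field this issue asks to include from the start, instead of only
    # after the node is edited. Assumed default: reference the user query.
    "query_prompt_template": "{{#sys.query#}}",
}
```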

2. Additional context or comments

This issue won't cause an error after #11031 is merged, but we still need to fix it.

3. Can you help us with this feature?

  • I am interested in contributing to this feature.

laipz8200 (Member, Author) commented Nov 26, 2024

Still got blank content when I close and reopen memory:
[image: blank query prompt content after closing and reopening memory]

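A minimal sketch of the kind of guard that could prevent this, assuming dict-shaped node data and the same {{#sys.query#}} default as above (both hypothetical, not Dify's actual code):

```python
# Hypothetical guard, not the actual implementation: if the field comes
# back empty after memory is toggled, re-apply the assumed default so the
# user query is not silently dropped from the prompt.
DEFAULT_QUERY_PROMPT_TEMPLATE = "{{#sys.query#}}"  # assumed default

def ensure_query_prompt_template(node_data: dict) -> dict:
    if not node_data.get("query_prompt_template"):
        node_data["query_prompt_template"] = DEFAULT_QUERY_PROMPT_TEMPLATE
    return node_data
```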