Add an argument "negative_prompt" #549


Merged: 18 commits merged into huggingface:main on Oct 4, 2022

Conversation

shirayu
Contributor

@shirayu shirayu commented Sep 18, 2022

Add negative_prompt argument to customize unconditional input.

Example

Bouquet of Roses

Bouquet of Roses

Bouquet of Roses (negative_prompt=red rose)

Bouquet of Roses (negative_prompt=red rose)
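With this change, the argument can be passed directly to the pipeline call. A minimal sketch of the intended usage (the model id and output filename are placeholders, and running it requires downloading the model weights):

```python
from diffusers import StableDiffusionPipeline

# Load any Stable Diffusion checkpoint (model id is a placeholder).
pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4")

# negative_prompt replaces the empty string used for the unconditional input,
# steering generation away from the given concepts.
image = pipe(
    "Bouquet of Roses",
    negative_prompt="red rose",
).images[0]
image.save("bouquet_no_red.png")
```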

@shirayu shirayu changed the title Add an argument "negative_prompt" [WIP] Add an argument "negative_prompt" Sep 18, 2022
@shirayu shirayu marked this pull request as draft September 18, 2022 06:19
@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Sep 18, 2022

The documentation is not available anymore as the PR was closed or merged.

@shirayu shirayu marked this pull request as ready for review September 18, 2022 06:24
@shirayu shirayu changed the title [WIP] Add an argument "negative_prompt" Add an argument "negative_prompt" Sep 18, 2022
@OrionFive

Super useful against extra arms and wonky hands. Thanks for sharing this!

@leszekhanusz
Contributor

That's great!

Some remarks:

  • it seems there are some issues around batch_size: it is sometimes multiplied by batch_size twice, and the code is not exactly the same in all the pipelines
  • Please use TypeError instead of ValueError when the type of something is incorrect

@shirayu
Contributor Author

shirayu commented Sep 18, 2022

@leszekhanusz Thank you for the comments! I fixed them.

@shirayu
Contributor Author

shirayu commented Sep 18, 2022

@leszekhanusz Thank you! I forgot to make those changes. I fixed them.

@exo-pla-net
Contributor

Super excited for this!

@shirayu
Contributor Author

shirayu commented Sep 21, 2022

@anton-l @patrickvonplaten Could you review this?

@WASasquatch

I've been playing with this on Easy Diffusion and it's been working great; generations are so much better. This is definitely a needed feature.

@hafiidz

hafiidz commented Sep 24, 2022

@shirayu, is there a guide to getting this running in my local version while waiting for it to land in the main library?

@OrionFive

@shirayu, is there a guide to getting this running in my local version while waiting for it to land in the main library?

The easiest way is to find where your code downloads the git repo from huggingface and replace that with this branch from shirayu's fork.
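For example, one way to try the branch before it is merged is to install diffusers directly from the fork. This is a sketch; the fork URL is an assumption, though the branch name `feature/negative_prompt` is the one used in this PR:

```shell
# Install diffusers from shirayu's fork, feature/negative_prompt branch.
# (Only works while the branch still exists; it was deleted after the merge.)
pip install --upgrade "git+https://github.com/shirayu/diffusers.git@feature/negative_prompt"
```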

david-j-smith added a commit to david-j-smith/diffusers that referenced this pull request Sep 26, 2022
@patrickvonplaten
Contributor

Is this related to Dream Booth cc @patil-suraj ?

@WASasquatch

Schedulers no longer have a set_format() method

@pcuenca
Member

pcuenca commented Sep 27, 2022

Schedulers no longer have a set_format() method

Hi @WASasquatch! You are right, #651 should fix it. Sorry!

@keturn
Contributor

keturn commented Sep 29, 2022

Is this related to Dream Booth?

This is generally applicable to all the stable diffusion classifier-free guidance pipelines. I don't think there's any particular connection to DreamBooth.

@WASasquatch

WASasquatch commented Sep 29, 2022

I hope this gets added soon, so it doesn't have to be maintained for so long while they work on other things. You'd think it would be a high priority, considering it's a masterful tool for prompt engineering.

Contributor

@patrickvonplaten patrickvonplaten left a comment


Went through it again and it looks very nice! Thanks so much for working on this :-)

Added some suggestions to improve the error messages - would be nice if you could have a look and then potentially run make style once if the test fails.

@patrickvonplaten
Contributor

@patil-suraj @pcuenca - I agree with @shirayu that we should merge this soon

@patrickvonplaten
Contributor

Also we would need a test here, but I'm happy to take care of this when looking into: #551

shirayu and others added 8 commits October 4, 2022 20:14
…sion.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
…sion.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
…sion_inpaint.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
…sion_onnx.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
…sion_img2img.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
…sion_inpaint.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
…sion_onnx.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
@shirayu
Contributor Author

shirayu commented Oct 4, 2022

@patrickvonplaten Thank you for your nice updates!
I merged them and fixed the styles.

Member

@pcuenca pcuenca left a comment


The code looks good, I just suggested minor nits. I haven't been able to test it yet, I'll do it later.

I have a question about how it works, though. Instead of using an empty prompt for the unconditioned generation we use the negative prompt, is that correct? Wouldn't this make those prompts appear for low values of guidance_scale?

@shirayu
Contributor Author

shirayu commented Oct 4, 2022

@pcuenca Thank you for your suggestions. I have updated this PR.

As far as I know, guidance_scale is greater than 1 (e.g. 7.5) in almost all cases.
FYI: this article is a note about CFG values that I found earlier.

CFG 2 - 6: Let the AI take the wheel.
CFG 7 - 11: Let's collaborate, AI!
CFG 12 - 15: No, seriously, this is a good prompt. Just do what I say, AI.
CFG 16 - 20: DO WHAT I SAY OR ELSE, AI.

Replacing the blank prompt in unconditional generation is done in some applications such as AUTOMATIC1111/stable-diffusion-webui, though I can't find a paper that shows its theoretical validity.

https://github.com/AUTOMATIC1111/stable-diffusion-webui/blob/3ff0de2c594b786ef948a89efb1814c59bb42117/modules/processing.py#L354
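The interaction with guidance_scale can be seen from the classifier-free guidance formula itself: the guided prediction is `uncond + guidance_scale * (cond - uncond)`, so at guidance_scale = 1 the unconditional (negative-prompt) term cancels entirely, and for larger scales the result is pushed away from the negative prompt's prediction. A small numpy sketch, with toy vectors standing in for noise predictions:

```python
import numpy as np

def cfg(noise_uncond, noise_cond, guidance_scale):
    """Classifier-free guidance: with a negative prompt, noise_uncond is the
    prediction for the negative prompt instead of the empty prompt."""
    return noise_uncond + guidance_scale * (noise_cond - noise_uncond)

# Stand-in noise predictions (not real model outputs).
uncond = np.array([1.0, 0.0])  # negative-prompt branch
cond = np.array([0.0, 1.0])    # prompt branch

# At guidance_scale == 1 the negative prompt has no effect (terms cancel):
at_one = cfg(uncond, cond, 1.0)     # equals cond exactly
# At 7.5 the result is pushed away from the negative-prompt prediction:
at_typical = cfg(uncond, cond, 7.5)
```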

@allo-
Contributor

allo- commented Oct 4, 2022

From #699: Shouldn't there be checks (printing a warning) for the maximum number of tokens in the negative prompt as well?

@shirayu
Contributor Author

shirayu commented Oct 4, 2022

Shouldn't there be checks (printing a warning) for the maximum number of tokens in the negative prompt as well?

Good suggestion!
However, it would make this PR more complex, and the code would repeat previous operations.
So I think it should be handled in another PR after #551 is resolved.

(Related to #472)

@allo-
Contributor

allo- commented Oct 4, 2022

I guess that's the tradeoff between separate pipelines, which can easily be copied to create your own, and an architecture that can do much more but may be harder to understand at first. In #699 and the Twitter poll, it didn't look like there will be a unified pipeline soon.

But maybe the tokenizing (with the warning when there are too many tokens) could be moved to a function (maybe even into one of the utility files?) so it can be reused for both prompts and in all the pipelines.
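Such a shared helper could look roughly like this. This is a sketch, not the diffusers API: the function name, `max_length` default, warning text, and tokenizer interface are all made up for illustration; any callable returning a list of token ids would do:

```python
import logging

logger = logging.getLogger(__name__)

def encode_prompt(tokenizer, prompt, max_length=77):
    """Tokenize a (positive or negative) prompt, truncating to max_length
    and warning when tokens are dropped. Hypothetical shared helper."""
    tokens = tokenizer(prompt)  # assumed to return a list of token ids
    if len(tokens) > max_length:
        logger.warning(
            "Prompt was truncated from %d to %d tokens", len(tokens), max_length
        )
        tokens = tokens[:max_length]
    return tokens

# Usage with a stand-in whitespace "tokenizer":
toy_tokenizer = lambda text: text.split()
ids = encode_prompt(toy_tokenizer, "a b c d", max_length=3)
```

The same function would then be called for both `prompt` and `negative_prompt` in every pipeline, so the truncation warning stays consistent.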

Contributor

@patil-suraj patil-suraj left a comment


Looks good to me, thanks a lot for working on this!

@pcuenca pcuenca merged commit 5ac1f61 into huggingface:main Oct 4, 2022
@shirayu shirayu deleted the feature/negative_prompt branch October 4, 2022 15:14
prathikr pushed a commit to prathikr/diffusers that referenced this pull request Oct 26, 2022
* Add an argument "negative_prompt"

* Fix argument order

* Fix to use TypeError instead of ValueError

* Removed needless batch_size multiplying

* Fix to multiply by batch_size

* Add truncation=True for long negative prompt

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_onnx.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_onnx.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Fix styles

* Renamed ucond_tokens to uncond_tokens

* Added description about "negative_prompt"

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023