
[Safety Checker] Add Safety Checker Module #36

Merged: 6 commits into CompVis:main on Aug 22, 2022

Conversation

@patrickvonplaten (Contributor) commented Aug 19, 2022

Together with @anton-l and @patil-suraj, we've made sure that the safety_checker works correctly; see huggingface/diffusers#219.

The only thing we're not 100% sure about is whether the threshold defined on this line is correct: https://github.com/huggingface/diffusers/blob/89e9521048067acacfdcbc2b985af8f6b155cfb6/src/diffusers/pipelines/stable_diffusion/safety_checker.py#L65
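
For reference, a minimal sketch of how such a checker is typically invoked through the diffusers and transformers APIs. This is illustrative only; the helper name run_safety_checker is made up here, and the exact wiring in scripts/txt2img.py may differ:

```python
import numpy as np
from PIL import Image
from transformers import AutoFeatureExtractor
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker

safety_model_id = "CompVis/stable-diffusion-safety-checker"
feature_extractor = AutoFeatureExtractor.from_pretrained(safety_model_id)
safety_checker = StableDiffusionSafetyChecker.from_pretrained(safety_model_id)

def run_safety_checker(images: np.ndarray):
    """images: float array of shape (batch, H, W, 3) with values in [0, 1]."""
    pil_images = [Image.fromarray((img * 255).round().astype("uint8")) for img in images]
    clip_input = feature_extractor(pil_images, return_tensors="pt").pixel_values
    # The checker blacks out flagged images and returns a per-image NSFW flag.
    checked_images, has_nsfw_concept = safety_checker(images=images, clip_input=clip_input)
    return checked_images, has_nsfw_concept
```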

scripts/txt2img.py: 3 resolved review comments (outdated)
@rromb self-assigned this Aug 19, 2022
@rromb self-requested a review Aug 19, 2022
@rromb (Collaborator) left a comment

Looks good; we could consider making this optional.

@patrickvonplaten (Contributor, Author) commented

Happy to adapt it whichever way you prefer.

@pesser merged commit a6e2f3b into CompVis:main on Aug 22, 2022
@Bendito999 commented

Are there any instructions for adjusting this? Emad had mentioned on Twitter that there is a way to specifically block clowns, for example, which is a fun insight into how this works. A simple 'comment this out to turn safety off completely' note might also be good to have.
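
On the 'block clowns' point: the checker flags an image when its CLIP embedding is close to one of a fixed set of concept embeddings, so in principle extra concepts can be appended. The sketch below is purely an illustration of that mechanism, with a guessed threshold of 0.2 and no guarantee that the plain CLIP text space lines up with the checker's fine-tuned projection:

```python
import torch
from transformers import CLIPTextModelWithProjection, CLIPTokenizer
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker

checker = StableDiffusionSafetyChecker.from_pretrained("CompVis/stable-diffusion-safety-checker")
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
text_encoder = CLIPTextModelWithProjection.from_pretrained("openai/clip-vit-large-patch14")

# Embed the extra concept with the CLIP text encoder and normalize it,
# matching how the checker compares embeddings by cosine similarity.
with torch.no_grad():
    tokens = tokenizer(["a photo of a clown"], return_tensors="pt", padding=True)
    concept = text_encoder(**tokens).text_embeds
    concept = concept / concept.norm(dim=-1, keepdim=True)

# Append the new concept embedding and a guessed sensitivity threshold.
checker.concept_embeds = torch.nn.Parameter(
    torch.cat([checker.concept_embeds, concept]), requires_grad=False
)
checker.concept_embeds_weights = torch.nn.Parameter(
    torch.cat([checker.concept_embeds_weights, torch.tensor([0.2])]), requires_grad=False
)
```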

enzymezoo-code added a commit to enzymezoo-code/stable-diffusion that referenced this pull request Sep 2, 2022
@dannydeezy commented

> Are there any instructions for adjusting this? Emad had mentioned on Twitter that there is a way to specifically block clowns, for example, which is a fun insight into how this works. A simple 'comment this out to turn safety off completely' note might also be good to have.

Yes, it would be nice to be able to remove the safety checker. Some of us are adults!

@woctezuma commented

> A simple 'comment this out to turn safety off completely' note might also be good to have.

> Yes, it would be nice to be able to remove the safety checker. Some of us are adults!

You can already remove the safety checker; it is just that the removal procedure is not explained in the documentation.
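
For anyone who lands here looking for that procedure: assuming scripts/txt2img.py routes the decoded samples through a check_safety(x_image) helper that returns (checked_images, has_nsfw_concept) (the name and return shape are assumptions, so check your copy of the script), a pass-through stub is one way to skip the filter locally:

```python
# Hypothetical pass-through, assuming txt2img.py defines and calls a
# check_safety(x_image) helper returning (checked_images, has_nsfw_concept).
def check_safety(x_image):
    # Return the images unchanged and report no NSFW concepts for any of them.
    return x_image, [False] * len(x_image)
```

With the diffusers pipeline, passing safety_checker=None to StableDiffusionPipeline.from_pretrained should achieve much the same thing, at the cost of a warning.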

@aisensiy commented

I see that the checkpoint is hosted at https://huggingface.co/CompVis/stable-diffusion-safety-checker/blob/main/pytorch_model.bin, and I am wondering where this model originally comes from. There are some NSFW detectors on GitHub, but this one looks different (larger).
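
The checkpoint appears to wrap a CLIP ViT vision encoder plus a small set of fixed concept embeddings that image embeddings are compared against, which would explain why it is larger than the standalone NSFW classifiers floating around GitHub. A quick way to inspect it (attribute names as in recent diffusers releases, so treat this as a sketch):

```python
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker

checker = StableDiffusionSafetyChecker.from_pretrained("CompVis/stable-diffusion-safety-checker")

# CLIP vision backbone that produces the image embeddings
print(checker.vision_model.config.hidden_size)

# Fixed embeddings for the filtered concepts and the "special care" concepts,
# compared against image embeddings by cosine similarity at inference time
print(checker.concept_embeds.shape)
print(checker.special_care_embeds.shape)

# Total parameter count, which accounts for the checkpoint size
print(sum(p.numel() for p in checker.parameters()))
```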

@didpublishing commented

So far I've had no luck finding the right script to remove the NSFW filter. It's blocking images that are already safe. Very frustrating.

@woctezuma commented Jul 12, 2023

> So far I've had no luck finding the right script.

Maybe try this notebook.
