Add pipeline_stable_diffusion_xl_attentive_eraser #10579
hlky merged 9 commits into huggingface:main
Conversation
hlky
left a comment
Thanks @Anonym0u3. Could you run `make style`?
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Hi @hlky, I sincerely apologize for missing the code style check. I have now run `make style`.
We use […]. You can also do […]. Can we add the example output to the Hub PR/docs?
@hlky Thank you for your detailed instructions! I have successfully run `make style`.
Co-authored-by: Other Contributor <a457435687@126.com>
Hi @hlky, I’m excited to share that my paper has been accepted as an Oral presentation at AAAI 2025! To reflect this update, I’ve submitted a new PR with the relevant documentation changes. Please let me know if there are any additional adjustments needed or if you have any feedback on the code. Looking forward to your review!
logger = logging.get_logger(__name__)  # pylint: disable=invalid-name

EXAMPLE_DOC_STRING = """
Can we change this to the example from the docs?
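For reference, community pipelines are usually demonstrated by loading them through `DiffusionPipeline.from_pretrained` with `custom_pipeline`. Below is a minimal sketch of what such a doc example could look like; the base checkpoint name and dtype are assumptions, and the actual call arguments should be taken from the example in the merged docs.

```python
import torch
from diffusers import DiffusionPipeline

# Sketch only: checkpoint and dtype are illustrative assumptions, not taken from this PR.
pipeline = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    custom_pipeline="pipeline_stable_diffusion_xl_attentive_eraser",
    torch_dtype=torch.float16,
).to("cuda")
# The actual call (prompt, source image, mask, AAS settings) should follow the
# pipeline's documented example; it is intentionally omitted here.
```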
| print("AAS at denoising steps: ", self.step_idx) | ||
| print("AAS at U-Net layers: ", self.layer_idx) | ||
| print("start AAS") |
Suggested change: remove the three debug `print` statements above.
| """ | ||
| if is_cross or self.cur_step not in self.step_idx or self.cur_att_layer // 2 not in self.layer_idx: | ||
| return super().forward(q, k, v, sim, attn, is_cross, place_in_unet, num_heads, **kwargs) | ||
| # B = q.shape[0] // num_heads // 2 |
There was a problem hiding this comment.
| # B = q.shape[0] // num_heads // 2 |
    return super().forward(q, k, v, sim, attn, is_cross, place_in_unet, num_heads, **kwargs)
# B = q.shape[0] // num_heads // 2
H = int(np.sqrt(q.shape[1]))
# H = W = int(np.sqrt(q.shape[1]))

Suggested change: remove the commented-out `# H = W = int(np.sqrt(q.shape[1]))` line.
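For context on why the surviving line works (the shapes below are hypothetical, not from the PR): at this point the attention queries are flattened spatial tokens, so for a square latent feature map the sequence length is H * W with H == W.

```python
import numpy as np
import torch

q = torch.randn(16, 4096, 64)       # (batch * heads, H * W, head_dim) -- hypothetical shapes
H = W = int(np.sqrt(q.shape[1]))    # 4096 tokens -> a 64 x 64 feature map
spatial = q.reshape(q.shape[0], H, W, q.shape[2])
print(spatial.shape)                # torch.Size([16, 64, 64, 64])
```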
# checkpoint. TODO(Yiyi) - need to clean this up later
deprecation_message = "The prepare_mask_and_masked_image method is deprecated and will be removed in a future version. Please use VaeImageProcessor.preprocess instead"
deprecate(
    "prepare_mask_and_masked_image",
    "0.30.0",
    deprecation_message,
)
Suggested change: remove the deprecation message and the `deprecate(...)` call above.
This function is still used in other community examples, so it's ok to remove the deprecation message here. Alternatively, refactor to use `VaeImageProcessor.preprocess`.
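A rough sketch of the refactor mentioned above, using `VaeImageProcessor` the way the built-in SDXL inpaint pipeline does; the scale factor, image sizes, file names, and variable names are assumptions for illustration, not code from this PR.

```python
from PIL import Image
from diffusers.image_processor import VaeImageProcessor

image = Image.open("source.png").convert("RGB")   # hypothetical inputs
mask_image = Image.open("mask.png").convert("L")

image_processor = VaeImageProcessor(vae_scale_factor=8)
mask_processor = VaeImageProcessor(
    vae_scale_factor=8, do_normalize=False, do_binarize=True, do_convert_grayscale=True
)

init_image = image_processor.preprocess(image, height=1024, width=1024)  # (1, 3, 1024, 1024), in [-1, 1]
mask = mask_processor.preprocess(mask_image, height=1024, width=1024)    # (1, 1, 1024, 1024), in [0, 1]
masked_image = init_image * (mask < 0.5)  # plays the role of prepare_mask_and_masked_image's output
```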
Co-authored-by: Other Contributor <a457435687@126.com>
Hi, @hlky, I sincerely apologize for the delay caused by my oversight last week in not submitting the review request on time. As a beginner, I truly appreciate your patience and understanding throughout this process. I’ve carefully reviewed your feedback and have addressed all the suggested changes in a new commit. Please let me know if there’s anything else I need to do before this PR can be merged into the main codebase. Once again, thank you so much for your valuable feedback and guidance!
Should be good to go after CI passes. Can you run `make style` again?
Co-authored-by: Other Contributor <a457435687@126.com>
Thanks. I checked the code and ran `make style` again.
What does this PR do?
Add `pipeline_stable_diffusion_xl_attentive_eraser` (community pipeline)
#10415
@hlky
Before submitting
Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.