
Update lora_conversion_utils.py #9980


Merged (5 commits) on Dec 19, 2024

Conversation

@zhaowendao30 (Contributor) commented on Nov 21, 2024:

x-flux single-blocks LoRA load

What does this PR do?

Fixes the loading of x-flux single-block LoRAs: the single-block weights should be mapped onto the QKV attention projection, not the norm.

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@zhaowendao30 (Contributor, Author) commented:
I checked the x-flux code: the single-block LoRA should be loaded onto the QKV attention projection, not the norm; the two just happen to have the same shape.
[screenshots of the relevant x-flux single-block code]
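
To make the fix concrete, here is a minimal sketch of the remapping this implies. The key names and helper are illustrative assumptions, not the verbatim `lora_conversion_utils.py` code: x-flux trains the single-block LoRA against a fused QKV projection, while the diffusers Flux transformer exposes separate `to_q`/`to_k`/`to_v`, so the up matrix is chunked three ways and the shared down matrix is replicated. Pre-fix, these tensors landed on the norm, which merely happens to have a compatible shape.

```python
import torch

def remap_single_block_qkv(block_idx: int, down: torch.Tensor, up: torch.Tensor, new_sd: dict) -> None:
    # Illustrative sketch only: names are assumptions, not the exact
    # diffusers implementation. The fused QKV `up` matrix has shape
    # (3 * hidden, rank); chunk it into the three diffusers projections,
    # replicating the shared (rank, hidden) `down` matrix for each.
    prefix = f"transformer.single_transformer_blocks.{block_idx}.attn"
    for name, up_chunk in zip(("to_q", "to_k", "to_v"), torch.chunk(up, 3, dim=0)):
        new_sd[f"{prefix}.{name}.lora_A.weight"] = down
        new_sd[f"{prefix}.{name}.lora_B.weight"] = up_chunk
```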

@sayakpaul (Member) commented:

@raulmosa could you give this a look?

@raulmosa (Contributor) commented:

I've checked it, and it's right; it shouldn't be norm. Looks good to me, @sayakpaul.
Nice catch @zhaowendao30, thanks! =)

@sayakpaul (Member) commented:

@zhaowendao30 thanks for your contributions!

Could you also do a side-by-side comparison of the outputs with and without your changes applied? That would be very much appreciated.

@HuggingFaceDocBuilderDev commented:

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zhaowendao30 (Contributor, Author) commented:

> I've checked it, and it's right; it shouldn't be norm. Looks good to me, @sayakpaul. Nice catch @zhaowendao30, thanks! =)

No need to thank me =)

@zhaowendao30 (Contributor, Author) commented on Nov 22, 2024:

> @zhaowendao30 thanks for your contributions!
>
> Could you also do a side-by-side comparison of the outputs with and without your changes applied? That would be very much appreciated.

OK. I trained only single blocks at indices 1, 2, 3, and 4. In the first image the LoRA is loaded onto QKV, in the second it is loaded onto the norm, and the last uses no LoRA.
[three comparison images: LoRA on QKV, LoRA on norm, no LoRA]
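
For anyone who wants to reproduce this kind of A/B comparison, a hedged sketch follows. The LoRA path, prompt, and file name are placeholders, not the exact setup used here; only the model id and the diffusers API calls are standard.

```python
import torch
from diffusers import FluxPipeline

# Placeholder LoRA path and prompt -- not the author's exact checkpoint.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")
pipe.load_lora_weights("path/to/x-flux-single-blocks-lora.safetensors")

# Fix the seed so the with-fix and without-fix runs differ only in the
# LoRA key mapping, not in the sampled noise.
image = pipe(
    "a photo of a panda",
    num_inference_steps=28,
    generator=torch.Generator("cuda").manual_seed(0),
).images[0]
image.save("qkv_mapping.png")
```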

@sayakpaul (Member) commented:

@zhaowendao30 thanks, but please avoid using images of human subjects in public forums.

I will run the tests today and update the slices as needed because of the change.

@zhaowendao30 (Contributor, Author) commented:

> @zhaowendao30 thanks, but please avoid using images of human subjects in public forums.
>
> I will run the tests today and update the slices as needed because of the change.

OK, I've uploaded new images. However, this LoRA was trained on people, so the panda looks a bit strange. =)

@sayakpaul (Member) commented:

Just ran

pytest tests/lora/ -k "test_flux_xlabs"

test_flux_xlabs is passing, and test_flux_xlabs_load_lora_with_single_blocks is failing because of a hardware change, which is expected. I will update the slices in https://github.com/huggingface/diffusers/pull/9845/files.

@sayakpaul (Member) left a review:


Thanks!

@sayakpaul requested a review from @yiyixuxu on Nov 22, 2024 at 06:48.
@sayakpaul (Member) commented:

@yiyixuxu a gentle ping.

@yiyixuxu merged commit 2f7a417 into huggingface:main on Dec 19, 2024.
11 of 12 checks passed
Foundsheep pushed a commit to Foundsheep/diffusers that referenced this pull request Dec 23, 2024
x-flux single-blocks lora load

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: YiYi Xu <yixu310@gmail.com>
sayakpaul added a commit that referenced this pull request Dec 23, 2024
x-flux single-blocks lora load

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: YiYi Xu <yixu310@gmail.com>