
redundant llava_lama copies #171

Closed
4lt3r3go opened this issue Dec 18, 2024 · 3 comments

Comments

4lt3r3go commented Dec 18, 2024

There is some confusion regarding Llava_Lama... I'm sorry, it's always me being annoying, I know 😁

When using your nodes, the model is automatically downloaded and placed in the LLM folder:
ComfyUI\models\LLM\llava-llama-3-8b-text-encoder-tokenizer

However, Comfy's native nodes now ask to download one of these:

[screenshot: files offered by the native node]

which are placed in a different folder: ComfyUI/models/text_encoders

As a result, many users end up with a sort of duplicate (or am I wrong?).

Is there a way to avoid this?
The files are quite large, and having duplicates seems a bit redundant.
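Nothing in this thread settles it, but a common way to avoid keeping two copies of a large model file is a symlink from the folder the native nodes scan to the copy the wrapper nodes already downloaded. A minimal sketch, run in a throwaway directory for safety; the link name `llava_llama3.safetensors` and the file name `model.safetensors` are hypothetical stand-ins, so check what the native node actually expects before doing this in a real install:

```shell
# Sketch only: works in a temp directory so it is safe to run anywhere.
# In a real setup, replace "$ROOT" with your ComfyUI directory and the
# file names below with the actual model files on disk.
ROOT="$(mktemp -d)"
mkdir -p "$ROOT/models/LLM/llava-llama-3-8b-text-encoder-tokenizer" \
         "$ROOT/models/text_encoders"

# Stand-in for the large file the wrapper nodes downloaded:
touch "$ROOT/models/LLM/llava-llama-3-8b-text-encoder-tokenizer/model.safetensors"

# Link it into the folder the native nodes scan instead of copying it.
# (On Windows, `mklink` in an elevated cmd prompt does the same job.)
ln -s "$ROOT/models/LLM/llava-llama-3-8b-text-encoder-tokenizer/model.safetensors" \
      "$ROOT/models/text_encoders/llava_llama3.safetensors"

ls -l "$ROOT/models/text_encoders/"
```

Both folders then see the file while it exists on disk only once; note that deleting the original breaks the link, so remove or move them together.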

kijai (Owner) commented Dec 18, 2024

It's not only about the files: Comfy has its own code to load and use them, so they wouldn't be compatible anyway. Sure, I could work on supporting that, but at this point it feels like wasted effort, since eventually people will move to the native implementation anyway.

4lt3r3go (Author)

I think your approach makes more sense, though; I'll stick with your nodes for now. I admit the ability to choose different samplers is tempting... hmm, OK, I'll give it a try.

Ratinod commented Dec 19, 2024

> people will move onto the native only anyway

This transition is very painful without sageattn, BlockSwap and Torch Compile...

3 participants