Add ChromaTransformer2DModel and ChromaPipeline #2
Open
CardSorting wants to merge 1 commit into hameerabbasi:chroma from
Conversation
- Implement ChromaTransformer2DModel as a separate class from FluxTransformer2DModel
- Create dedicated ChromaPipeline without CLIP encoder (T5-only)
- Add single file converter for Chroma checkpoints
- Implement proper attention masking for Chroma requirements
- Export new classes in package structure

This addresses the feedback from DN6 about creating separate classes rather than using variants within FluxTransformer2DModel.
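The attention-masking item above can be sketched in isolation. As a minimal sketch (an assumption about the PR's approach, not its actual code): since ChromaPipeline is T5-only, the joint attention over [image tokens | text tokens] needs to ignore T5 padding positions, while image tokens are always attended. The function name `build_chroma_attention_mask` is hypothetical.

```python
def build_chroma_attention_mask(image_len, text_mask):
    """Hypothetical sketch of Chroma-style masking: build a per-token boolean
    keep-mask over the joint sequence [image tokens | text tokens].

    image_len : number of packed image/latent tokens (always attended).
    text_mask : per-token tokenizer mask for the T5 prompt, 1 for real
                tokens and 0 for padding (padding positions are dropped).
    """
    # Image tokens are never masked; text tokens follow the tokenizer mask.
    return [True] * image_len + [bool(m) for m in text_mask]


# Example: 4 image tokens, a 4-token prompt with 2 padding positions.
mask = build_chroma_attention_mask(4, [1, 1, 0, 0])
# → [True, True, True, True, True, True, False, False]
```

In a real implementation this boolean vector would be broadcast into the attention bias (e.g. converted to -inf at masked key positions) before the softmax.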
hameerabbasi pushed a commit that referenced this pull request on Jan 24, 2026
* flux2-klein
* Apply suggestions from code review
  Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
* Klein tests (#2)
* tests
* up
* tests
* up
* support step-distilled
* Apply suggestions from code review
  Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
* Apply suggestions from code review
  Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
* doc string etc
* style
* more
* copies
* klein lora training scripts (huggingface#3)
* initial commit
* initial commit
* remove remote text encoder
* initial commit
* initial commit
* initial commit
* revert
* img2img fix
* text encoder + tokenizer
* text encoder + tokenizer
* update readme
* guidance
* guidance
* guidance
* test
* test
* revert changes not needed for the non klein model
* Update examples/dreambooth/train_dreambooth_lora_flux2_klein.py
  Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
* fix guidance
* fix validation
* fix validation
* fix validation
* fix path
* space
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
* style
* Update src/diffusers/pipelines/flux2/pipeline_flux2_klein.py
* Apply style fixes
* auto pipeline
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
Co-authored-by: Linoy Tsaban <57615435+linoytsaban@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
This PR implements ChromaTransformer2DModel and ChromaPipeline as separate classes, addressing the feedback from DN6 about creating dedicated implementations rather than using variants within FluxTransformer2DModel.
Changes Made:
Core Implementation
Package Integration
- Updated __init__.py files to export the new classes (including src/diffusers/__init__.py)

Example Usage
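As a minimal usage sketch, assuming the PR's ChromaPipeline mirrors the FluxPipeline API minus the CLIP encoder: the checkpoint path, the `packed_seq_len` helper, and the default parameter values below are all illustrative assumptions, not part of the PR.

```python
def packed_seq_len(height, width, vae_scale=8, patch=2):
    """Number of packed latent tokens a Flux-style transformer sees:
    pixels are downsampled by the VAE (assumed factor 8), then latents
    are packed into patch x patch groups (assumed 2x2)."""
    return (height // vae_scale // patch) * (width // vae_scale // patch)


def generate(prompt, height=1024, width=1024):
    """Hypothetical end-to-end sketch; requires a GPU and a converted
    Chroma checkpoint, so imports are kept local to this function."""
    import torch
    from diffusers import ChromaPipeline  # exported by this PR

    # "path/to/chroma" is a placeholder for a converted checkpoint.
    pipe = ChromaPipeline.from_pretrained("path/to/chroma", torch_dtype=torch.bfloat16)
    pipe.to("cuda")
    return pipe(prompt, height=height, width=width).images[0]


# A 1024x1024 image corresponds to (1024/8/2)^2 = 4096 packed tokens.
print(packed_seq_len(1024, 1024))  # → 4096
```

The prompt is encoded only with T5 here; there is no second CLIP text-embedding path as in FluxPipeline.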
Technical Details:
This implementation provides a clean, separate architecture for Chroma while maintaining compatibility with the existing diffusers ecosystem.