Conversation

@ryanontheinside
Collaborator

Depends on #171.

Per-LoRA Merge Strategy Support

Enables independent merge strategy selection for each LoRA adapter instead of forcing a global strategy.

What Changed

  • Frontend: Added per-LoRA strategy dropdown in LoRA Manager
  • Backend: LoRAConfig.merge_mode optional field with fallback to global lora_merge_mode
  • Manager: Groups LoRAs by strategy, loads in order (permanent_merge → runtime_peft), routes runtime updates to appropriate handler
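
The optional field with global fallback can be sketched roughly as follows. This is a minimal illustration, not the project's actual code; `LoRAConfig`, `resolve_merge_mode`, and the default value are assumed names based on the description above.

```python
from dataclasses import dataclass
from typing import Optional

# Assumed global default; in the real pipeline this comes from lora_merge_mode.
GLOBAL_LORA_MERGE_MODE = "runtime_peft"

@dataclass
class LoRAConfig:
    path: str
    scale: float = 1.0
    # Optional per-adapter strategy; None falls back to the global setting.
    merge_mode: Optional[str] = None  # "permanent_merge" | "runtime_peft"

def resolve_merge_mode(config: LoRAConfig,
                       global_mode: str = GLOBAL_LORA_MERGE_MODE) -> str:
    """Consolidated mode resolution: the per-LoRA value wins, else the global one."""
    return config.merge_mode or global_mode

print(resolve_merge_mode(LoRAConfig(path="style.safetensors")))
# -> runtime_peft (fallback to global)
print(resolve_merge_mode(LoRAConfig(path="fast.safetensors",
                                    merge_mode="permanent_merge")))
# -> permanent_merge (per-LoRA override)
```

Resolving the mode in one place keeps the frontend dropdown, the timeline export, and the manager's grouping logic consistent.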

Benefits

  • Mix permanent_merge (max FPS) and runtime_peft (live adjustments) LoRAs simultaneously
  • Fine-grained control per adapter without pipeline-wide tradeoffs
  • Backward compatible: defaults to global strategy if not specified

Technical Details

  • LoRAManager refactored: consolidated mode resolution, generic grouping helper
  • Frontend UI consistency: both Strategy and Scale labels use LabelWithTooltip
  • Timeline export/import automatically preserves per-LoRA mergeMode
  • Scale slider tooltip dynamically reflects effective merge mode
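
The generic grouping helper and the load order can be sketched as below. Names (`group_by`, `LOAD_ORDER`) are hypothetical; the real helper lives in the refactored LoRAManager.

```python
from collections import defaultdict

def group_by(items, key):
    """Generic grouping helper: bucket items by key(item), preserving order."""
    groups = defaultdict(list)
    for item in items:
        groups[key(item)].append(item)
    return dict(groups)

# Load order matters: permanently merged weights must be baked into the base
# model before runtime PEFT adapters are attached on top of them.
LOAD_ORDER = ("permanent_merge", "runtime_peft")

loras = [
    {"name": "fast-style", "mode": "permanent_merge"},
    {"name": "live-detail", "mode": "runtime_peft"},
    {"name": "fast-bg", "mode": "permanent_merge"},
]

grouped = group_by(loras, key=lambda l: l["mode"])
ordered = [l["name"] for mode in LOAD_ORDER for l in grouped.get(mode, [])]
print(ordered)  # ['fast-style', 'fast-bg', 'live-detail']
```

Runtime scale updates are then routed only to the `runtime_peft` group, since permanently merged adapters cannot be adjusted after loading.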

Example Use Case

Some LoRAs are desirable for performance and never need scale updates. Users who usually or always include these at maximum performance, while still making runtime adjustments to other LoRAs, need this feature.
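
That use case might look like the config below, mixing both strategies in one pipeline. The structure and file names are illustrative, not the project's exact schema.

```python
# Hypothetical pipeline config mixing merge strategies per adapter.
pipeline_config = {
    "lora_merge_mode": "runtime_peft",  # global default
    "loras": [
        # Performance LoRA: merged once for max FPS; scale fixed at load time.
        {"path": "turbo.safetensors", "scale": 0.8,
         "merge_mode": "permanent_merge"},
        # Style LoRA: inherits the global runtime_peft mode; scale is
        # adjustable live from the LoRA Manager slider.
        {"path": "style.safetensors", "scale": 0.6},
    ],
}
```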

An example of such a performance LoRA:

[screenshot omitted]

Next Steps

A more explicit and elegant pattern for merge ordering can be revisited when moving scale updates into a modular block in place of the Mixin.

@ryanontheinside ryanontheinside changed the base branch from main to ryanontheinside/perf/in-mem-lora-conversion November 28, 2025 16:53
@ryanontheinside ryanontheinside force-pushed the ryanontheinside/feat/independent-lora-merge-strategies branch 2 times, most recently from 6229088 to 3d48743 on November 28, 2025 17:23
Signed-off-by: RyanOnTheInside <7623207+ryanontheinside@users.noreply.github.com>
@ryanontheinside ryanontheinside force-pushed the ryanontheinside/perf/in-mem-lora-conversion branch from bf512fb to 050c763 on November 28, 2025 17:26
Signed-off-by: RyanOnTheInside <7623207+ryanontheinside@users.noreply.github.com>
@ryanontheinside ryanontheinside force-pushed the ryanontheinside/feat/independent-lora-merge-strategies branch from 3d48743 to 3ba228e on November 28, 2025 17:27
Base automatically changed from ryanontheinside/perf/in-mem-lora-conversion to main November 28, 2025 17:32
Contributor

@yondonfu yondonfu left a comment


LGTM!

@yondonfu
Contributor

Squashing to avoid dealing with rebase conflicts

@yondonfu yondonfu merged commit 65fd932 into main Nov 28, 2025
1 check passed
@yondonfu yondonfu deleted the ryanontheinside/feat/independent-lora-merge-strategies branch November 28, 2025 17:40