
Combining adapters linearly with negative weights is broken #3004

@orr-hirundo

Description

System Info

Hi,

I've noticed a bug when combining adapters with negative weights via `_generalized_task_arithmetic_weighted_adapter`. Negative weights are indeed handled (here), but later (e.g. in `task_arithmetic` and in the other combination methods) the signed weight is applied to both the A and B LoRA factors. With a single adapter the two signs cancel, so the negative weight has no effect; with multiple adapters, the cross-terms of the `(A_1 + A_2) * (B_1 + B_2)` product produce unexpected behavior.
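The cross-term problem can be simulated outside of PEFT. The sketch below (a minimal numpy model of the behavior described above, not PEFT's actual code; `buggy` and `expected` are hypothetical names) applies the signed weights to both factors of two LoRA adapters and then multiplies the summed factors, which differs from the intended weighted sum of deltas by the cross-terms `B1@A2` and `B2@A1`:

```python
import numpy as np

rng = np.random.default_rng(0)
r, d = 4, 8
A1, A2 = rng.normal(size=(r, d)), rng.normal(size=(r, d))
B1, B2 = rng.normal(size=(d, r)), rng.normal(size=(d, r))

w1, w2 = 1.0, -1.0

# Simulated buggy path: signed weights applied to both factors before summing.
A_comb = w1 * A1 + w2 * A2
B_comb = w1 * B1 + w2 * B2
buggy = B_comb @ A_comb
# Expanding: w1**2 * B1@A1 + w2**2 * B2@A2 + w1*w2 * (B1@A2 + B2@A1)
# -> adapter 2's own delta enters with weight (+1), not (-1), plus cross-terms.

# Intended result: weighted sum of each adapter's delta.
expected = w1 * (B1 @ A1) + w2 * (B2 @ A2)

assert not np.allclose(buggy, expected)
```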

Who can help?

No response

Reproduction

I don't have a code snippet I can share, but calling `add_weighted_adapter` with a single adapter will produce the same result whether using weights `[1.0]` or `[-1.0]`.
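As a stand-in for a full reproduction, this numpy sketch (assuming, per the description above, that the signed weight is applied to both factors; `buggy_combine` is a hypothetical name) shows why `[1.0]` and `[-1.0]` give identical results for a single adapter: the sign enters twice and cancels.

```python
import numpy as np

rng = np.random.default_rng(0)
r, d_in, d_out = 4, 8, 8
A = rng.normal(size=(r, d_in))   # LoRA A factor
B = rng.normal(size=(d_out, r))  # LoRA B factor

def buggy_combine(weight):
    # Weight applied to both factors, as described in the issue:
    # (w * B) @ (w * A) == w**2 * (B @ A), so the sign is lost.
    return (weight * B) @ (weight * A)

assert np.allclose(buggy_combine(1.0), buggy_combine(-1.0))
```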

Expected behavior

Negative weights should be handled correctly, e.g. the sign applied to only one of the factors, or the adapter negated upfront so that the linear combination only involves positive weights.
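One possible shape of the fix suggested above, sketched in numpy (an illustration of the idea, not a proposed patch; `scale_factors` is a hypothetical name): split the magnitude of the weight across both factors but apply the sign exactly once, so a weight of `-1.0` actually negates the adapter's delta.

```python
import numpy as np

rng = np.random.default_rng(0)
r, d = 4, 8
A = rng.normal(size=(r, d))
B = rng.normal(size=(d, r))

def scale_factors(weight, A, B):
    # Split |weight| evenly across both factors; apply the sign only to B.
    s = np.sqrt(abs(weight))
    return s * A, np.sign(weight) * s * B

Aw, Bw = scale_factors(-1.0, A, B)
assert np.allclose(Bw @ Aw, -(B @ A))  # the sign survives
```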
