FEAT Add sine-LoRA #2434 #2457
base: main
@@ -785,14 +785,16 @@ def __init__(
             lora_bias=lora_bias,
         )

-    def resolve_lora_variant(self, *, use_dora: bool, **kwargs) -> Optional[LoraVariant]:
-        if not use_dora:
-            return None
-
-        from .variants import DoraEmbeddingVariant
-
-        return DoraEmbeddingVariant()
+    def resolve_lora_variant(self, *, use_dora: bool, use_sine_lora: bool, **kwargs) -> Optional[LoraVariant]:
+        if use_dora:
+            from .variants import DoraEmbeddingVariant
+            return DoraEmbeddingVariant()
+        elif use_sine_lora:
+            from .variants import SineLoraLinearVariant
+            return SineLoraLinearVariant()
+        else:
+            return None

     def update_layer(
         self, adapter_name, r, lora_alpha, lora_dropout, init_lora_weights, use_rslora, use_dora, lora_bias
     ):
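For orientation, here is a minimal sketch of how the new flag would be driven from the config side. It assumes this PR also adds `use_sine_lora`, `sinelora_frequency`, and `sinelora_scaling` fields to `LoraConfig`; those field names are taken from the review discussion below and are not confirmed by this hunk.

```python
# Minimal usage sketch, not taken from the PR. Assumes `use_sine_lora`,
# `sinelora_frequency`, and `sinelora_scaling` are exposed on LoraConfig.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    use_sine_lora=True,        # routes resolve_lora_variant to SineLoraLinearVariant
    sinelora_frequency=200.0,  # hypothetical value; the default is not specified in this diff
    sinelora_scaling=None,     # None -> the variant derives a default scaling
)

model = get_peft_model(base_model, config)
model.print_trainable_parameters()
```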
@@ -285,3 +285,22 @@ class DoraConv3dVariant(_DoraConvNdVariant):
     def init(module: Conv3d, adapter_name: str) -> None:
         dora_layer = DoraConv3dLayer(fan_in_fan_out=False)
         _DoraConvNdVariant.init_convd_variant(module, adapter_name, dora_layer=dora_layer)
+
+
+class SineLoraLinearVariant(LoraVariant):
+    @staticmethod
+    def init(module: Linear, adapter_name: str) -> None:
Suggested change:

-    def init(module: Linear, adapter_name: str) -> None:
+    def init(module: Linear, adapter_name: str, **kwargs) -> None:

With PR #2455 now merged, `init()` receives all the parameters that `update_layer()` receives.
Hmmm, I did not use that. Do you think that is ok?
I don't think so, no.

Currently the tests do not work because of the changes necessary in `Linear.__init__` and `Embedding.__init__`. Once those changes are in place, you'll see that calls to `init` complain about unexpected arguments being passed to `init()`. That's because all the config args are passed to `init`, and without the wildcard `**kwargs` you would have to define them all (which we don't want, of course).

You also need a place to set `module.sinelora_scaling` and `module.sinelora_frequency`. This is that place: take them from the kwargs, e.g. `module.sinelora_frequency = kwargs['sinelora_frequency']`. For `sinelora_scaling` you need to check whether `kwargs['sinelora_scaling']` is `None`.
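A rough sketch of what `init()` could look like once it accepts `**kwargs`, following the comments above. The fallback used when `sinelora_scaling` is `None` is a placeholder (the thread does not say what the default should be), and the snippet assumes it lives in the same module as the other variants, where `LoraVariant` and `Linear` are already in scope.

```python
import math


class SineLoraLinearVariant(LoraVariant):
    @staticmethod
    def init(module: Linear, adapter_name: str, **kwargs) -> None:
        # Since PR #2455, all config args are forwarded to init(), so accept
        # **kwargs and pull out only what this variant needs.
        module.sinelora_frequency = kwargs["sinelora_frequency"]

        # sinelora_scaling may be None in the config; fall back to a derived
        # default in that case. Using the square root of the adapter rank is a
        # placeholder choice, not something specified in this thread.
        sinelora_scaling = kwargs["sinelora_scaling"]
        if sinelora_scaling is None:
            sinelora_scaling = math.sqrt(module.r[adapter_name])
        module.sinelora_scaling = sinelora_scaling
```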