System Info
peft=0.14.0
Who can help?
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder
- My own task or dataset (give details below)
Reproduction
from diffusers import SanaTransformer2DModel
from peft import PeftModel

# Load the base SANA transformer
transformer = SanaTransformer2DModel.from_pretrained("frutiemax/twistedreality-sana-1600m-1024px")
print(transformer)

# Load the attached rank-4 LoRA adapter (local directory '0') on top of the base model
peft_model = PeftModel.from_pretrained(transformer, '0')

# Merging the LoRA into the base weights raises a shape mismatch here
model = peft_model.merge_and_unload()
Expected behavior
I've trained a LoRA adapter with PEFT on a SANA checkpoint. Training and inference with the PEFT model both work. However, when I try to merge the LoRA into the base checkpoint with merge_and_unload(), I hit a shape mismatch. I've attached the LoRA adapter, which was trained with rank 4.
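For reference, a minimal sketch of how the adapter was attached for training; only the rank (r=4) is confirmed above, while the target modules and alpha are assumptions for illustration:

from diffusers import SanaTransformer2DModel
from peft import LoraConfig, get_peft_model

transformer = SanaTransformer2DModel.from_pretrained("frutiemax/twistedreality-sana-1600m-1024px")

# r=4 matches the attached adapter; lora_alpha and target_modules are assumed values
lora_config = LoraConfig(
    r=4,
    lora_alpha=4,
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],
)
peft_model = get_peft_model(transformer, lora_config)

# Training and inference with peft_model work; the failure only appears when calling merge_and_unload()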