Issue merging a LoRA model into a SANA transformer #2318

@frutiemax92

Description


System Info

peft==0.14.0

Who can help?

@BenjaminBossan @sayakpaul

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder
  • My own task or dataset (give details below)

Reproduction

from diffusers import SanaTransformer2DModel
from peft import PeftModel

# Load the SANA base transformer.
transformer = SanaTransformer2DModel.from_pretrained("frutiemax/twistedreality-sana-1600m-1024px")
print(transformer)

# '0' is the directory holding the attached rank-4 LoRA adapter (0.zip).
peft_model = PeftModel.from_pretrained(transformer, "0")

# This call raises a shape-mismatch error.
model = peft_model.merge_and_unload()
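
For reference, merge_and_unload folds each adapter's low-rank update (alpha / r) * B @ A into the matching base weight, so the merge can only succeed where that update and the base weight share a shape. Below is a minimal diagnostic sketch of mine (not part of the original report) that walks the PEFT model and prints any Linear LoRA layer whose update does not line up with its base weight; it assumes the adapter was loaded under PEFT's default adapter name "default".

import torch
from peft.tuners.lora import LoraLayer

# Walk every LoRA-wrapped module and compare the shape of the low-rank
# update B @ A against the base weight it would be merged into.
for name, module in peft_model.named_modules():
    if isinstance(module, LoraLayer) and "default" in module.lora_A:
        lora_a = module.lora_A["default"]
        lora_b = module.lora_B["default"]
        if not isinstance(lora_a, torch.nn.Linear):
            continue  # only check Linear LoRA layers here
        delta = lora_b.weight @ lora_a.weight  # (out_features, in_features)
        base_weight = module.get_base_layer().weight
        if delta.shape != base_weight.shape:
            print(f"{name}: update {tuple(delta.shape)} vs base {tuple(base_weight.shape)}")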

Expected behavior

I've trained a LoRA model with PEFT on a SANA checkpoint. Training and inference with the PEFT model both work. However, when I try to merge the LoRA into the base checkpoint, I encounter a shape mismatch. I've attached the rank-4 LoRA adapter below.

[Screenshot: shape-mismatch error traceback]

Attachment: 0.zip (the rank-4 LoRA adapter)
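
For completeness, this is the per-layer arithmetic I understand merge_and_unload to perform for a Linear layer, shown with made-up dimensions (not SANA's real sizes) just to make it concrete where a shape mismatch would surface:

import torch

# Toy dimensions for illustration only; rank 4 matches the attached adapter.
in_features, out_features, r, lora_alpha = 64, 64, 4, 4

w0 = torch.randn(out_features, in_features)  # frozen base weight
lora_a = torch.randn(r, in_features)         # lora_A.weight
lora_b = torch.randn(out_features, r)        # lora_B.weight

# merge_and_unload effectively computes W = W0 + (alpha / r) * B @ A,
# which requires B @ A to have exactly the base weight's shape.
merged = w0 + (lora_alpha / r) * (lora_b @ lora_a)
assert merged.shape == w0.shape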
