Description
Error occurs while loading state_dict for FluxTransformer2DModel
```
Traceback (most recent call last):
  File "/XVerse_202507/inference_single_sample.py", line 293, in <module>
    main()
  File "/XVerse_202507/inference_single_sample.py", line 235, in main
    load_dit_lora(model, model.pipe, config, dtype, init_device, f"{ckpt_root}", is_training=False)
  File "/XVerse_202507/src/flux/pipeline_tools.py", line 649, in load_dit_lora
    pipe.transformer.load_lora_adapter(ckpt_dir, use_safetensors=True, adapter_name="default", weight_name="pytorch_lora_weights.safetensors")  # TODO: check if they are trainable
  File "/opt/conda/envs/XVerse/lib/python3.10/site-packages/diffusers/loaders/peft.py", line 302, in load_lora_adapter
    incompatible_keys = set_peft_model_state_dict(self, state_dict, adapter_name, **peft_kwargs)
  File "/opt/conda/envs/XVerse/lib/python3.10/site-packages/peft/utils/save_and_load.py", line 452, in set_peft_model_state_dict
    load_result = model.load_state_dict(peft_model_state_dict, strict=False)
  File "/opt/conda/envs/XVerse/lib/python3.10/site-packages/torch/nn/modules/module.py", line 2581, in load_state_dict
    raise RuntimeError(
RuntimeError: Error(s) in loading state_dict for FluxTransformer2DModel:
    size mismatch for x_embedder.lora_A.default.weight: copying a param with shape torch.Size([128, 64]) from checkpoint, the shape in current model is torch.Size([128, 384]).
```
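The mismatch (`[128, 64]` in the checkpoint vs. `[128, 384]` in the model) suggests the LoRA was trained against an `x_embedder` with a different input width than the currently instantiated `FluxTransformer2DModel`. To confirm what the checkpoint actually contains before loading it, the shapes can be read straight from the safetensors header with the standard library only (no torch needed), since the format is an 8-byte little-endian header length followed by a JSON header. This is a diagnostic sketch; the file path is the one from the traceback and is illustrative:

```python
import json
import struct

def safetensors_shapes(path):
    """Return {tensor_name: shape} by parsing only the JSON header of a
    .safetensors file, without loading any tensor data.

    Format: 8 bytes little-endian unsigned header length, then a JSON
    object mapping each tensor name to its dtype, shape, and offsets.
    """
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    return {
        name: tuple(meta["shape"])
        for name, meta in header.items()
        if name != "__metadata__"  # optional metadata entry, not a tensor
    }

if __name__ == "__main__":
    shapes = safetensors_shapes("pytorch_lora_weights.safetensors")
    for name, shape in sorted(shapes.items()):
        if "x_embedder" in name:
            print(name, shape)
```

If the checkpoint reports `x_embedder.lora_A.default.weight` with an inner dimension of 64 while the model's `x_embedder` expects 384 inputs, the checkpoint and the model config do not match and the fix is on the configuration side, not in the loading call.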