Replies: 3 comments 1 reply
-
The SuperMerger extension should be able to do this; I have done it myself with it.
-
There are two scripts in this repo for merging, and there is a document for them (Japanese only). I hope this helps you.
-
When I use the Kohya_ss GUI utilities to merge four LoRA models with weights, it works if they come from the same subset of a training set. But if they come from different training sets, the output seems to forget the keyword and does not produce the subject. However, if I combine them under Additional Networks in the AUTOMATIC1111 GUI, it works fine. Any idea why?
-
Hi,
I would like to know if it is possible to merge several LoRAs, some of which are identical but with different weights, in order to create a single one.
I have done many tests on a LoRA that I am trying to refine, and I would like to keep the best of each test.
For example, in my prompt, I would like to transform
"<lora:ejlor04_dream-000004:0.3> <lora:ejlor04_dream-000008:0.2> <lora:ejlor04_dream-000009:0.2> <lora:ejlor09_dream-000008:0.2> <lora:ejlor06_dream:0.1>"
into a single LoRA, like "<lora:ejlor_final:1>".
I saw the possibility to do it in the Kohya GUI, but it only works for two LoRAs, and I can't specify the weights...
Any ideas? 🤔
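For what it's worth, the weighted merge itself is just a per-key weighted sum of the LoRA state dicts, so a GUI limited to two inputs is not a fundamental restriction. Below is a minimal sketch of that idea, not any specific tool's implementation: tensors are modeled as plain Python lists to keep the example self-contained, whereas a real script would load `.safetensors` files into torch tensors, and LoRAs being merged this way need matching keys and ranks.

```python
def merge_loras(loras, weights):
    """Merge several LoRA state dicts as a weighted sum, key by key.

    loras:   list of dicts mapping parameter name -> flat list of floats
             (stand-ins for real tensors in this sketch)
    weights: one merge weight per LoRA, e.g. the 0.3/0.2/... ratios
             from a prompt
    """
    if len(loras) != len(weights):
        raise ValueError("need exactly one weight per LoRA")
    merged = {}
    for lora, w in zip(loras, weights):
        for key, values in lora.items():
            # Accumulate w * value elementwise into the merged tensor.
            acc = merged.setdefault(key, [0.0] * len(values))
            for i, v in enumerate(values):
                acc[i] += w * v
    return merged

# Hypothetical two-LoRA example with equal weights:
a = {"lora_up.weight": [1.0, 2.0]}
b = {"lora_up.weight": [3.0, 4.0]}
print(merge_loras([a, b], [0.5, 0.5]))  # {'lora_up.weight': [2.0, 3.0]}
```

Note that the weights in the prompt above already sum to 1.0, which keeps the merged LoRA at roughly the same overall strength as the original combination used at those ratios.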