How to train GAN with DDP without setting "find_unused_parameters=True" #18081
Unanswered
function2-llx asked this question in DDP / multi-GPU / multi-node
Replies: 1 comment · 2 replies
-

Hello everyone,

I am developing a GAN using Lightning and Distributed Data Parallel (DDP) for training. Because the generator and discriminator are updated in alternating steps, each backward pass only produces gradients for one of the two modules, which conflicts with DDP's default requirement that every model parameter contribute to the loss in every pass.

Setting find_unused_parameters=True in the DDP configuration makes training run, but it adds overhead to every iteration. Is there a way to train a GAN with DDP without setting find_unused_parameters=True? Any insights or recent developments related to this issue would be much appreciated. Thank you in advance!
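To make the failure mode concrete, here is an illustrative manual-optimization `training_step` (module names, shapes, and hyperparameters are placeholders, not code from the question). Lightning's DDP strategy wraps the whole LightningModule in a single DDP instance, so in the discriminator step below the detached generator's parameters receive no gradients and the reducer raises the unused-parameters error:

```python
import torch
import torch.nn.functional as F
import lightning.pytorch as pl

class GAN(pl.LightningModule):
    def __init__(self, generator, discriminator, latent_dim=100):
        super().__init__()
        self.automatic_optimization = False  # alternate G/D updates manually
        self.generator, self.discriminator = generator, discriminator
        self.latent_dim = latent_dim

    def configure_optimizers(self):
        return (
            torch.optim.Adam(self.generator.parameters(), lr=2e-4),
            torch.optim.Adam(self.discriminator.parameters(), lr=2e-4),
        )

    def training_step(self, batch, batch_idx):
        real = batch  # a batch of real samples (illustrative)
        opt_g, opt_d = self.optimizers()
        n = real.size(0)
        z = torch.randn(n, self.latent_dim, device=self.device)
        fake = self.generator(z)

        # Discriminator step: fake.detach() cuts the graph to the generator,
        # so no generator parameter gets a gradient in this backward pass.
        # With one DDP instance around the whole LightningModule, the reducer
        # flags those parameters as unused unless find_unused_parameters=True.
        d_loss = 0.5 * (
            F.binary_cross_entropy_with_logits(
                self.discriminator(real), torch.ones(n, 1, device=self.device))
            + F.binary_cross_entropy_with_logits(
                self.discriminator(fake.detach()), torch.zeros(n, 1, device=self.device))
        )
        opt_d.zero_grad()
        self.manual_backward(d_loss)
        opt_d.step()

        # Generator step: here gradients flow through the discriminator too,
        # so all parameters participate; the problem is the step above.
        g_loss = F.binary_cross_entropy_with_logits(
            self.discriminator(fake), torch.ones(n, 1, device=self.device))
        opt_g.zero_grad()
        self.manual_backward(g_loss)
        opt_g.step()
```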
-

Well, I find that using Fabric might be a good idea, since it lets you wrap the generator and the discriminator in separate DDP instances.
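Below is a minimal, self-contained sketch of that idea with Lightning Fabric (the toy modules, data, and hyperparameters are illustrative, not from the discussion). Each `fabric.setup()` call wraps its model in its own DDP instance, so a backward pass only has to produce gradients for the parameters of the modules actually in the graph, and `find_unused_parameters=True` is unnecessary:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from lightning.fabric import Fabric

fabric = Fabric(accelerator="gpu", devices=2, strategy="ddp")
fabric.launch()

latent_dim = 100
generator = nn.Sequential(nn.Linear(latent_dim, 784), nn.Tanh())  # toy stand-in
discriminator = nn.Sequential(nn.Linear(784, 1))                  # toy stand-in
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

# One setup() call per model: each gets its own DDP wrapper.
generator, opt_g = fabric.setup(generator, opt_g)
discriminator, opt_d = fabric.setup(discriminator, opt_d)

dataset = torch.randn(256, 784)  # toy "real" data
dataloader = fabric.setup_dataloaders(
    torch.utils.data.DataLoader(dataset, batch_size=32))

for real in dataloader:
    n = real.size(0)
    z = torch.randn(n, latent_dim, device=fabric.device)
    fake = generator(z)
    ones = torch.ones(n, 1, device=fabric.device)
    zeros = torch.zeros(n, 1, device=fabric.device)

    # Discriminator step: the generator is detached and, crucially, lives in
    # a separate DDP instance, so its parameters are simply not tracked by
    # this reducer. No unused-parameter error.
    d_loss = 0.5 * (
        F.binary_cross_entropy_with_logits(discriminator(real), ones)
        + F.binary_cross_entropy_with_logits(discriminator(fake.detach()), zeros)
    )
    opt_d.zero_grad()
    fabric.backward(d_loss)
    opt_d.step()

    # Generator step: gradients also flow into the discriminator here, so
    # both reducers see a full set of gradients; the stale discriminator
    # gradients are discarded by opt_d.zero_grad() in the next iteration.
    g_loss = F.binary_cross_entropy_with_logits(discriminator(fake), ones)
    opt_g.zero_grad()
    fabric.backward(g_loss)
    opt_g.step()
```

The key design choice is that each DDP reducer only tracks the parameters of the one module it wraps, so alternating updates never leave a tracked parameter without a gradient in the backward pass that its reducer observes.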