Hi, I'm trying to train DenseNet-121 on the ImageNet dataset, but the result is poor...
Now I wonder how to calculate the batch_size across multiple GPUs. You said "It took us 10 days to train 40M densenet for 120 epochs on 4 TITAN X GPUs, with batchsize 128" on issue (https://github.com/liuzhuang13/DenseNet/issues/5). Do you mean "each GPU uses batchsize 128" or "each GPU uses 32, for a sum of 128"?
Thank you!
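For what it's worth, the two readings can be made concrete. Under the common data-parallel convention (e.g. Torch's DataParallel-style splitting), the stated batch size is the global one and each GPU receives an even shard of it; whether the DenseNet authors followed that convention is exactly the open question here. A minimal sketch of the arithmetic, with `per_gpu_batch` being a hypothetical helper for illustration:

```python
def per_gpu_batch(global_batch: int, num_gpus: int) -> int:
    """Split a global batch evenly across GPUs; fail loudly if it doesn't divide."""
    if global_batch % num_gpus != 0:
        raise ValueError("global batch must be divisible by the GPU count")
    return global_batch // num_gpus

# "Sum is 128" reading: 128 split over 4 GPUs -> 32 examples per GPU.
print(per_gpu_batch(128, 4))  # -> 32

# The alternative reading ("each GPU uses 128") would instead imply an
# effective global batch of 128 * 4 = 512, which changes the learning-rate
# scaling one would normally apply.
```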