Hi, I am trying to replicate and pretrain BLIP for distillation purposes. I am using Flickr30K + COCO, and my ITM loss gets stuck at 0.63; on an initial look, all of the ITM predictions are 1. Is this a dataset-size issue or a batch issue? I have tried lowering the learning rate, increasing the size of the model, and more, but nothing seems to work.
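For context, a loss stuck near 0.63 is suspicious because BLIP's ITM batch is built from one positive pair plus two mined negatives per sample (positive fraction p = 1/3), and the cross-entropy of a predictor that only outputs the base rate is the binary entropy H(1/3) ≈ 0.636. Below is a minimal diagnostic sketch of how one might check for that kind of collapse; the names `itm_logits` and `itm_labels` are assumptions about the local training loop, and the 1:2 positive-to-negative split follows the standard BLIP/ALBEF recipe, which may differ in a modified fork.

```python
import torch

@torch.no_grad()
def itm_prediction_stats(itm_logits: torch.Tensor, itm_labels: torch.Tensor) -> dict:
    """Hypothetical diagnostic for an ITM head that may have collapsed.

    itm_logits: (N, 2) logits from the ITM classification head (assumed shape).
    itm_labels: (N,) targets, 1 = matched pair, 0 = mined negative (assumed layout).
    """
    preds = itm_logits.argmax(dim=1)
    pred_pos_rate = preds.float().mean().item()            # fraction predicted "matched"
    accuracy = (preds == itm_labels).float().mean().item()

    # Loss floor for a constant predictor that always outputs the base rate p:
    # H(p) = -[p*log(p) + (1-p)*log(1-p)].
    # With BLIP's 1 positive : 2 negatives batching, p = 1/3 and H(p) ≈ 0.636,
    # which matches an ITM loss stuck around 0.63.
    p = itm_labels.float().mean().clamp(1e-6, 1 - 1e-6)
    constant_pred_floor = -(p * p.log() + (1 - p) * (1 - p).log()).item()

    return {
        "pred_pos_rate": pred_pos_rate,
        "accuracy": accuracy,
        "constant_pred_floor": constant_pred_floor,
    }
```

If `pred_pos_rate` sits at 0 or 1 and the training loss hovers at `constant_pred_floor`, the ITM head is ignoring the inputs, which usually points at the hard-negative mining (e.g. the ITC similarities used to sample negatives) rather than at dataset size or batch size.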