Hi,
First of all, thanks for a great paper! It's quite interesting to see how you separate ID and OOD classes.
When I try to apply this concept to my data, I run into some issues with the WRN model.
My input transform looks like the following:
As you can see, my images are of size 300 (in contrast to the 32 you use for CIFAR-10).
Now when I feed these images to the model, the output batch size is totally different from what I feed in. For example, with the above transformation and a batch size of 2, my output shape is (162, num_classes).
But if the transformation size is 32, I get the expected output of (2, num_classes). May I know what is happening here?
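Doing the shape arithmetic myself, 162 seems consistent with the CIFAR WideResNet using a fixed 8x8 average pool followed by `view(-1, nChannels)`, which folds any leftover spatial grid into the batch dimension. This is just my guess; the sketch below reproduces the arithmetic, not your actual code:

```python
# Sketch of why a 300x300 input inflates the batch dimension in a 32x32 WRN.
# Assumes the common CIFAR WideResNet layout: two stride-2 downsamplings,
# then a fixed 8x8 average pool, then out.view(-1, nChannels).
def wrn_output_batch(batch_size, image_size):
    spatial = image_size // 2 // 2   # two stride-2 stages: 300 -> 150 -> 75
    pooled = spatial // 8            # fixed avg_pool2d(out, 8): 75 -> 9
    # view(-1, nChannels) folds the remaining 9x9 grid into the batch dim
    return batch_size * pooled * pooled

print(wrn_output_batch(2, 32))   # 2, the expected (2, num_classes)
print(wrn_output_batch(2, 300))  # 162, the surprising (162, num_classes)
```

For a 32x32 input the pooled feature map is exactly 1x1, so the batch dimension survives unchanged; for 300x300 it does not.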
Another question: can your OECC concept be applied to other models, EfficientNet for example?
Many thanks!
Venki