If my understanding is correct, in `DeformableDETR()` in `deformable_detr.py`, the line

```python
self.class_embed = nn.Linear(hidden_dim, num_classes)
```

should use `num_classes + 1` instead of `num_classes`. The same applies to the bias initialization in the same function:

```python
self.class_embed.bias.data = torch.ones(num_classes) * bias_value
```

which should likewise use `num_classes + 1`. Otherwise I'm just getting confused somewhere, but I'm fairly sure there should be an extra logit for the background ("no object") class. That is how it was done in the original DETR code, at least.
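For reference, a minimal sketch of the two head conventions side by side (the values of `hidden_dim` and `num_classes` below are illustrative, not taken from the repo). The second variant follows the official Deformable DETR release, which trains the classification head with sigmoid focal loss rather than softmax cross-entropy, so it has no separate background logit and instead initializes the bias to the focal-loss prior:

```python
import math
import torch
import torch.nn as nn

hidden_dim, num_classes = 256, 91  # illustrative values, not from the repo

# Original DETR: softmax over num_classes + 1 logits,
# where the extra last logit stands for "no object" (background).
detr_head = nn.Linear(hidden_dim, num_classes + 1)

# Official Deformable DETR: sigmoid focal loss, one logit per real class,
# no background logit; the bias is set to the focal-loss prior instead.
prior_prob = 0.01
bias_value = -math.log((1 - prior_prob) / prior_prob)
deformable_head = nn.Linear(hidden_dim, num_classes)
deformable_head.bias.data = torch.ones(num_classes) * bias_value
```

Under the focal-loss convention, `num_classes` (without `+ 1`) is intentional; whether this repo follows that convention or the softmax one is what determines if the `+ 1` is needed.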