RT-DETR loss attempts to return auxiliary_outputs when auxiliary_loss=False #42219

@sinAshish

System Info

I am trying to fine-tune RT-DETRv2. When I load the model with updated config values such as `auxiliary_loss=False` or `use_focal_loss=False`, training throws an error because the code tries to access a variable that was never initialized.

A rudimentary fix is to set `auxiliary_outputs = None` at L450 (see the sketch below).
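
For illustration, here is a self-contained sketch of the failing pattern (the function name and body are simplified stand-ins, not the actual library code): `auxiliary_outputs` is only bound inside the `auxiliary_loss` branch but is referenced unconditionally afterwards.

```python
# Simplified stand-in for the failing pattern; not the actual library code.
def loss_function(config):
    # auxiliary_outputs = None  # proposed one-line fix: bind the name up front
    if config["auxiliary_loss"]:
        auxiliary_outputs = ["aux"]  # only bound on this path
    return auxiliary_outputs

loss_function({"auxiliary_loss": False})
# UnboundLocalError: cannot access local variable 'auxiliary_outputs'
# where it is not associated with a value
```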

- `transformers` version: 4.57.0
- Platform: Linux-6.8.0-78-generic-x86_64-with-glibc2.35
- Python version: 3.13.7
- Huggingface_hub version: 0.35.3
- Safetensors version: 0.6.2
- Accelerate version: 1.10.1
- Accelerate config:    not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu128 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?:

Who can help?

Add `auxiliary_outputs = None` at L450.

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

  1. Update the model config for RT-DETRv2 with auxiliary_loss=False
  2. Load the model
  3. Run one step of training (a minimal sketch follows this list)
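
A short script along these lines reproduces it (a sketch assuming transformers 4.57.0; the input shapes and label values are arbitrary placeholders):

```python
# Reproduction sketch, assuming transformers 4.57.0; shapes/labels are placeholders.
import torch
from transformers import RTDetrV2Config, RTDetrV2ForObjectDetection

# Step 1: config with auxiliary_loss disabled
config = RTDetrV2Config(auxiliary_loss=False)

# Step 2: load the (randomly initialized) model
model = RTDetrV2ForObjectDetection(config)

# Step 3: one training step; passing labels triggers the loss computation
pixel_values = torch.randn(1, 3, 640, 640)
labels = [{"class_labels": torch.tensor([0]),
           "boxes": torch.tensor([[0.5, 0.5, 0.2, 0.2]])}]
outputs = model(pixel_values=pixel_values, labels=labels)  # raises UnboundLocalError
```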

Expected behavior

Training should run without errors; instead, the following is raised:

UnboundLocalError: cannot access local variable 'auxiliary_outputs' where it is not associated with a value
