
VGT: Prefix Handling for Checkpoint State Dictionary #175

Open
@meichenberger78

Description

When attempting to load model weights from a checkpoint in VGT, the model's state dictionary keys do not match the checkpoint keys due to a prefix mismatch. As a result, the affected parameters are not loaded, which leads to a significant increase in "total_loss" when training resumes from the checkpoint.
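
One way to confirm the mismatch is to diff the two key sets before loading. A minimal diagnostic sketch (the checkpoint path is a placeholder, and model is assumed to be the instantiated VGT model; depending on how the checkpoint was saved, the weights may also sit under a nested "model" key):

import torch

# List checkpoint keys that have no counterpart in the model,
# i.e. parameters that would not be loaded.
checkpoint_state_dict = torch.load("checkpoint.pth", map_location="cpu")
model_state_dict = model.state_dict()

unmatched = [k for k in checkpoint_state_dict if k not in model_state_dict]
print(f"{len(unmatched)} checkpoint keys not found in the model, e.g. {unmatched[:5]}")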

In my opinion, no prefixes are necessary. However, if prefixes are needed, we can implement a simple check to address this issue.

In MyDetectionCheckpointer.py, check if prefixes are required for the keys:

if needs_prefix(checkpoint_state_dict, model_state_dict):
    new_checkpoint_state_dict = {}

    # Re-key the detection checkpoint entries with the prefix the model expects.
    for k, v in checkpoint_state_dict.items():
        new_checkpoint_state_dict[append_prefix(k)] = v

    # Do the same for the DiT branch's checkpoint entries.
    for k, v in DiT_checkpoint_state_dict.items():
        new_checkpoint_state_dict[DiT_append_prefix(k)] = v

    checkpoint_state_dict = new_checkpoint_state_dict
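
The snippet assumes append_prefix and DiT_append_prefix helpers, plus a DiT_checkpoint_state_dict holding the weights of the DiT branch. A minimal sketch of the helpers, assuming fixed prefix strings (the actual prefixes depend on how VGT names its sub-modules; the values below are assumptions, not taken from the repository):

MODEL_PREFIX = "model."  # assumed prefix for the detection branch
DIT_PREFIX = "dit."      # assumed prefix for the DiT branch

def append_prefix(k):
    # Prepend the prefix only if the key does not already carry it.
    return k if k.startswith(MODEL_PREFIX) else MODEL_PREFIX + k

def DiT_append_prefix(k):
    return k if k.startswith(DIT_PREFIX) else DIT_PREFIX + k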

Here’s the function to determine if prefixes are needed:

def needs_prefix(checkpoint_state_dict, model_state_dict):
    # A prefix is needed if some checkpoint key is missing from the model's
    # state dict but its prefixed counterpart is present.
    for k in checkpoint_state_dict.keys():
        if k not in model_state_dict:
            prefixed_key = append_prefix(k)
            if prefixed_key in model_state_dict:
                return True
    return False
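
As a quick sanity check, needs_prefix can be exercised on toy dictionaries (only the keys matter), reusing the append_prefix sketch above with MODEL_PREFIX = "model.":

ckpt = {"backbone.weight": 0}
model_sd = {"model.backbone.weight": 0}

assert needs_prefix(ckpt, model_sd)          # unprefixed key matches once prefixed
assert not needs_prefix(model_sd, model_sd)  # keys already match; no prefix needed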

After implementing these changes, re-training from a checkpoint works flawlessly.
