FloatingPointError: Loss became infinite or NaN at iteration=166! #29

@Santosh-Kumar-Singh

Description

I keep getting this error every time I try to train the model. The loss values abruptly become NaN at a random iteration, and training stops with this error:

FloatingPointError: Loss became infinite or NaN at iteration=166!
loss_dict = {'loss_cls_ce': nan, 'loss_box_reg': nan, 'loss_ins_con': 0.0, 'loss_cls_up': nan}
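For context on where this error comes from: trainers built on detectron2 typically run a finiteness check over the loss dict each iteration and raise a FloatingPointError like the one above when any loss is infinite or NaN. A minimal sketch of such a check (the helper name `check_losses` is hypothetical, not the library's API; it uses plain floats instead of tensors):

```python
import math

def check_losses(iteration, loss_dict):
    """Hypothetical sketch of a per-iteration anomaly check:
    raise FloatingPointError if any loss value is infinite or NaN."""
    if not all(math.isfinite(v) for v in loss_dict.values()):
        raise FloatingPointError(
            f"Loss became infinite or NaN at iteration={iteration}!\n"
            f"loss_dict = {loss_dict}"
        )

# Loss values from the report: three of the four losses are already NaN,
# so the check fires and training stops.
losses = {'loss_cls_ce': float('nan'), 'loss_box_reg': float('nan'),
          'loss_ins_con': 0.0, 'loss_cls_up': float('nan')}
try:
    check_losses(166, losses)
except FloatingPointError as e:
    print(e)
```

Because the check fires on the first non-finite value, the divergence itself happens at or before iteration 166; common culprits in this situation are a too-high learning rate, missing gradient clipping, or a bad sample in the data.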
