
Conversation


@karray commented Jul 19, 2024

Fixes #208

@chr5tphr (Owner) left a comment

Thanks a lot for the PR.

There remains a potential issue: since we use torch.nn.Module's setattr to register a Parameter if it did not exist before, the Parameter may still be registered after the Canonizer is removed, because we set the attribute back to None without going through the Module's setattr, which would detect this change. We need to explicitly unregister the Parameter in the Canonizer's remove(), e.g. by calling register_parameter or using the Module's setattr explicitly, but only if the original bias was None.

- Edit by @chr5tphr: break the long line and use direct assignment
@chr5tphr (Owner) commented

I reviewed this once more, and it turns out setting the parameter back to None is enough to unregister it again, so the issue I raised earlier is invalid.

I rebased and made a small edit: I fixed the styling of the long line and switched to direct attribute assignment, since it is equivalent to using setattr.

@chr5tphr chr5tphr merged commit 1362a83 into chr5tphr:main Jul 12, 2025
7 checks passed
@karray (Author) commented Jul 14, 2025

@chr5tphr By the way, another idea is to mark new parameters so that they can be deleted on remove()

For example:

# ...
for module in modules:
    if module.bias is None:
        module.bias = torch.nn.Parameter(
            torch.zeros(1, device=module.weight.device, dtype=module.weight.dtype)
        )
        module.bias.is_temporary = True
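A hypothetical sketch of the matching remove() side of this idea, assuming the `is_temporary` flag from the snippet above; the `remove_temporary_biases` name and the `modules` list are illustrative only, not Zennit's actual API:

```python
import torch


def remove_temporary_biases(modules):
    """Delete any bias that was flagged as temporary (hypothetical helper)."""
    for module in modules:
        # getattr(None, ..., False) is False, so untouched modules are skipped
        if getattr(module.bias, 'is_temporary', False):
            # assigning None goes through Module.__setattr__, so the Parameter
            # is no longer yielded by named_parameters()
            module.bias = None
```

A module whose bias was never marked is left unchanged, so the helper is safe to call on a mixed list of modules.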

@chr5tphr (Owner) commented

Thanks for the reply!
I verified that setting a parameter attribute to None does unregister it, so we do not need to delete it explicitly, since we store the original attribute regardless of whether it was a torch.nn.Parameter or None.
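This behavior can be checked directly. A minimal sketch, assuming torch is installed, with Conv2d standing in for any module whose bias may be None:

```python
import torch

module = torch.nn.Conv2d(3, 8, kernel_size=3, bias=False)
assert module.bias is None

# temporarily attach a zero bias, as the canonizer does
module.bias = torch.nn.Parameter(
    torch.zeros(module.out_channels, device=module.weight.device, dtype=module.weight.dtype)
)
assert 'bias' in dict(module.named_parameters())

# plain attribute assignment goes through Module.__setattr__,
# which unregisters the Parameter again
module.bias = None
assert 'bias' not in dict(module.named_parameters())
```

Internally, Module.__setattr__ routes a None assignment to register_parameter(name, None), which is exactly why no explicit deletion is needed.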


Development

Successfully merging this pull request may close these issues.

Unexpected behavior with object.__setattr__ in canonizers.MergeBatchNorm

2 participants