
Conversation


@fritzo fritzo commented Nov 10, 2021

This attempts to apply LocScaleReparam in AutoReparam only to latent variables whose prior distributions depend on upstream variables. When the prior has fixed parameters, there is no need to reparametrize, and reparametrizing introduces unnecessary complexity and obfuscation.
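The core idea of a loc-scale reparametrization can be illustrated without Pyro's machinery. The sketch below is a minimal pure-Python illustration (not Pyro's actual LocScaleReparam API): instead of sampling x ~ Normal(loc, scale) directly, the decentered form samples a standard normal and shifts/rescales it deterministically, which helps inference when loc and scale themselves depend on upstream latents.

```python
import random

def sample_centered(loc, scale):
    # Centered parametrization: draw x ~ Normal(loc, scale) directly.
    return random.gauss(loc, scale)

def sample_decentered(loc, scale):
    # Decentered (LocScaleReparam-style) parametrization:
    # draw z ~ Normal(0, 1), then shift and rescale deterministically.
    # The two parametrizations define the same distribution over x.
    z = random.gauss(0.0, 1.0)
    return loc + scale * z
```

When loc and scale are fixed constants, the two forms are equivalent for inference purposes, which is why reparametrizing fixed-parameter priors adds nothing but indirection.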

This first attempt is blocked by nuances in the interaction between get_dependencies() and AutoReparam(): the model's latent variables are determined only dynamically, yet AutoReparam wants to use the dependency structure to decide its strategy. The fix may involve subtler use of ProvenanceTensor 🤔
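Once a dependency structure is available, the selection rule itself is simple. The sketch below is a hypothetical illustration, assuming a plain dict mapping each latent site to the set of sites its prior's parameters depend on (the function name and the exact dict shape are assumptions for illustration, not the actual format returned by Pyro's get_dependencies()):

```python
def sites_to_reparam(prior_dependencies):
    # prior_dependencies: dict mapping each latent site name to the set
    # of site names its prior's parameters depend on (possibly including
    # a self-edge for the site itself).
    # Reparametrize only sites whose prior depends on some *other*
    # latent; sites with fixed-parameter priors are left alone.
    return {
        site
        for site, upstream in prior_dependencies.items()
        if any(dep != site for dep in upstream)
    }
```

For example, with deps = {"a": {"a"}, "b": {"a", "b"}}, only "b" would be selected, since "a"'s prior depends on nothing upstream.

```python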

@fritzo fritzo added this to the 1.8 release milestone Nov 10, 2021
Comment on lines -203 to -206
# TODO reparametrize only if parameters are variable. We might guess
# based on whether parameters are differentiable, .requires_grad. See
# https://github.com/pyro-ppl/pyro/pull/2824

Member Author

This PR resolves this TODO

@fritzo fritzo added the Blocked label Nov 10, 2021
@fritzo fritzo modified the milestones: 1.8 release, 1.9 release Dec 13, 2021
@fritzo fritzo modified the milestones: 1.9 release, 1.10 Mar 18, 2022
@fritzo fritzo modified the milestones: 1.9 release, 1.10 release Feb 7, 2024
