Releases: pyro-ppl/numpyro
0.9.0
New Features
- New VI inference: SteinVI. Check out a couple of examples in PRs #1297 and #1298 for usage.
- New distributions: MultivariateStudentT, DiscreteUniform, Kumaraswamy, RelaxedBernoulli (see the sketch after this list).
- New tutorials and examples:
- Tutorial for Truncated distributions: a complete guide to constructing a NumPyro distribution.
- Bayesian Hierarchical Stacking case study to average models based on weights from a hierarchical structure.
- Sine-skewed sine (bivariate von Mises) mixture to model the dihedral angles that occur in the backbone of a protein.
- AR2 processes to show how to avoid the (slow) Python for-loop.
- Holt-Winters Exponential Smoothing example for time series forecasting.
- Hilbert space approximation for Gaussian processes example is significantly revised.
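As a quick, hedged sketch of one of the new distributions, the snippet below samples from DiscreteUniform; it assumes the conventional inclusive low/high parameterization.

```python
import jax.random as random
import numpyro.distributions as dist

# Uniform over the integers {0, ..., 9} (bounds assumed inclusive).
d = dist.DiscreteUniform(low=0, high=9)
samples = d.sample(random.PRNGKey(0), (1000,))
log_p = d.log_prob(samples)  # each entry equals log(1/10)
```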
Enhancements and Bug Fixes
- #1305 Fixes HMCECS bug for likelihoods with multiple plates
- #1304 Improves warning mechanism when plates are missing.
- #1301 Fixes sparse Poisson density sometimes returning integer output.
- #1289 Make HMC Gibbs algorithms work with improper distributions
- #1284 Adds various KL divergences for Gamma/Beta families
- #1281 Raises error if there are duplicated deterministic sites
- #1271 Better warning mechanism with stacklevel
- #1270 Incorporate KL divergences of TensorFlow Probability distributions
- #1259 #1266 Allow TruncatedNormal/TruncatedCauchy to take both low and high (see the sketch after this list)
- #1254 numpyro.contrib.indexing is moved to numpyro.ops.indexing
- #1252 Use multipledispatch for kl_registry
- #1250 Added cdf methods for gamma, inverse gamma, log normal densities
- #1248 Add ProvenanceArray to infer relational structure in a model
- #1244 Raise warning for the automatic enumeration behavior
- #1237 Enhance warnings for invalid parameters of BetaProportion distribution
- #1227 Allow prior to be callable in random_flax_module and random_haiku_module
- #1226 Allow init_to_sample to work with scalar values
- #1225 Add color for divergences in Neal's example
- #1196 Allow custom precision function in laplace approximation autoguide
- #1194 Option to specify init state for SVI run
- #1185 #1189 Avoid -inf/nan samples in truncated distributions
- #1182 Extend scope handler for plate stack frames
- #1179 Support enumerate_support for zero-inflated distributions
- #1169 Allow pickling autoguides
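As a small, hedged illustration of #1259/#1266 (two-sided truncation), the sketch below draws from a Normal truncated on both sides; the loc/scale/low/high values are illustrative.

```python
import jax.random as random
import numpyro.distributions as dist

# Truncate Normal(0, 1) to the interval [-1, 2]; previously only one-sided
# truncation (low or high alone) was supported.
d = dist.TruncatedNormal(loc=0.0, scale=1.0, low=-1.0, high=2.0)
samples = d.sample(random.PRNGKey(0), (1000,))
log_p = d.log_prob(samples)
```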
This release is composed of great contributions and feedback from the Pyro community: @amalvaidya @MarcoGorelli @omarfsosa @maw501 @bjeffrey92 @hessammehr @OlaRonning @dykim29 @Carlosbogo @wataruhashimoto52 @Vedranh13 @ahmadsalim @austereantelope and many others. Thank you!
0.8.0
Breaking changes
Switch to softplus transforms for autoguide scales (thanks to experiments performed by @vitkl).
New Features
- New autoguide: AutoDAIS leverages HMC and annealed importance sampling within a variational inference framework
- New distributions: MixtureSameFamily, and directional distributions SineBivariateVonMises, SineSkewed (see the sketch after this list)
- New constraints: l1_ball for vectors with L1 norm less than 1
- New transforms: L1BallTransform, SimplexToOrderedTransform, ScaledUnitLowerCholeskyTransform
- #1116 New format_shapes utility to interpret the shapes of random variables/plates in a model.
- #1109 Allow direct use of TFP distributions in numpyro.sample
- New tutorials and examples:
- Principled prior with Dirichlet distribution for Ordinal Regression case study
- Horseshoe regression
- Bad posterior geometry and how to deal with it
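A hedged sketch of the new MixtureSameFamily distribution: a Categorical mixing distribution combined with a batched Normal whose rightmost batch dimension indexes the components (the weights and component parameters are illustrative).

```python
import jax.numpy as jnp
import jax.random as random
import numpyro.distributions as dist

# Three-component Gaussian mixture.
mixing = dist.Categorical(probs=jnp.array([0.3, 0.5, 0.2]))
components = dist.Normal(loc=jnp.array([-2.0, 0.0, 3.0]), scale=jnp.ones(3))
mixture = dist.MixtureSameFamily(mixing, components)
x = mixture.sample(random.PRNGKey(0), (500,))
log_p = mixture.log_prob(x)
```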
Enhancements and Bug Fixes
- #1108 Avoid numerical problems when using BernoulliProbs
- #1118 Recommend the AutoNormal guide when the Hessian in AutoLaplace is singular
- #1126 Smarter warning about discrete inference in SVI models
- #1136 Support to use SA sampler with arviz
- #1139 Document Poisson is_sparse argument
- #1140 Make Sigmoid and StickBreakingTransform more stable
- #1149 Raise a ValueError if num_steps is invalid in svi.run
- #1162 Use black[jupyter] in notebooks
This release is composed of great contributions and feedback from the Pyro community: @MarcoGorelli @OlaRonning @d-diaz @quattro @svilupp @peterroelants @prashjet @freddyaboulton @tcbegley @julianstastny @alexlyttle and many others. Thank you!
0.7.2
This is a patch release with the following new feature and fixes:
- New example Hilbert space approximation of Gaussian processes #1097 thanks to @omarfsosa
- Fix for rendering models with only discrete variables #1099 thanks to @bdatko
- Fix progress-bar issues when running multi-chain MCMC #1101
0.7.1
0.7.0
Since this release, NumPyro can be installed along with the latest jax and jaxlib releases (their version restrictions have been relaxed). In addition, NumPyro will use the default JAX platform, so if you installed JAX with GPU/TPU support, those devices will be used by default.
New Features
- New distributions: SoftLaplace, Weibull, BetaProportion, NegativeBinomial, NegativeBinomial2, ZeroInflatedDistribution, ZeroInflatedPoisson, ZeroInflatedNegativeBinomial2, FoldedDistribution
- Support for DeepMind's Optax optimizers in SVI (see the sketch after this list)
- New ELBO objective: TraceGraph_ELBO for non-reparameterized latent variables (e.g. discrete latent variables)
- A new wrapper NestedSampler to leverage the nested sampling package jaxns for NumPyro models
- Implement cdf and icdf methods for many distributions
- New cond primitive.
- New infer_discrete handler to sample discrete sites under enumeration. Check out the annotation example for usage.
- Structural mass matrix can be specified via dense_mass argument of the HMC/NUTS constructor #963
- New examples:
- Thompson sampling for Bayesian optimization with GPs
- Latent Dirichlet Allocation for topic modeling: a great example to illustrate the usage of Flax/Haiku in NumPyro
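A hedged sketch of the Optax support: pass an optax optimizer to SVI where a numpyro.optim optimizer used to go (the toy model, guide, and learning rate are illustrative).

```python
import jax.numpy as jnp
import optax
from jax import random

import numpyro
import numpyro.distributions as dist
from numpyro.infer import SVI, Trace_ELBO, autoguide

def model(data):
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    numpyro.sample("obs", dist.Normal(mu, 1.0), obs=data)

guide = autoguide.AutoNormal(model)
# An Optax optimizer is passed directly in place of a numpyro.optim optimizer.
svi = SVI(model, guide, optax.adam(1e-2), Trace_ELBO())
svi_result = svi.run(random.PRNGKey(0), 2000, jnp.array([0.3, -0.2, 1.1]))
params, losses = svi_result.params, svi_result.losses
```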
Enhancements and Bug Fixes
- Documentation and examples are greatly enhanced to make features more accessible
- Fix chain detection for various CPU device strings #1077
- Fix AutoNormal's quantiles method for models with non-scalar latent sites #1066
- Fix LocScaleReparam with center=1 #1059
- Enhance auto guides to support models with deterministic sites #1022
- Support for mutable states in Flax and Haiku modules #1016
- Fix a bug in auto guides that happens when using the guide in Predictive #1013
- Support decorator syntax for effect handlers #1009 (see the sketch after this list)
- Implement sparse Poisson log probability #1003
- Support total_count=0 in the Multinomial distribution #1000
- Add a flag to control mass matrix regularization behavior during mass matrix adaptation #998
- Add experimental Dockerfiles #996
- Allow setting max tree depth of NUTS sampler during warmup phase #984
- Fix mixed-up dimensions in the ExpandedDistribution.sample method #972
- MCMC objects can now be pickled #968
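A hedged sketch of the decorator syntax from #1009: an effect handler instance is applied directly to the model definition instead of wrapping calls in a with block (the toy model is illustrative).

```python
import numpyro
import numpyro.distributions as dist
from numpyro import handlers

# Instead of `with handlers.seed(rng_seed=0): ...`, apply the handler
# as a decorator on the model definition.
@handlers.seed(rng_seed=0)
def model():
    return numpyro.sample("x", dist.Normal(0.0, 1.0))

x = model()  # runs under the seed handler, so no explicit PRNG key is needed
```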
This release is composed of great contributions and feedback from the Pyro community: @ahoho, @kpj, @gustavehug, @AndrewCSQ, @jatentaki, @tcbegley, @dominikstrb, @justinrporter, @dirmeier, @irustandi, @MarcoGorelli, @lumip, and many others. Thank you!
0.6.0
New Features
- Progress bar is available for running parallel MCMC chains.
- New samplers:
- BarkerMH - a Metropolis-Hastings sampler that uses a skew-symmetric proposal distribution that depends on the gradient of the potential
- New taylor_proxy for HMCECS sampler. This control variate significantly improves the performance of HMCECS on tall data.
- MixedHMC for mixed discrete and continuous variables
- New distributions:
- ProjectedNormal is similar to von Mises and von Mises-Fisher distributions but permits tractable variational inference via reparametrizers
- TruncatedDistribution to truncate over a family of symmetric distributions: Cauchy, Laplace, Logistic, Normal, StudentT
- New method Distribution.infer_shapes() for static shape analysis.
- New constraints: sphere, positive_ordered_vector, softplus_positive, softplus_lower_cholesky
- New transforms: SoftplusTransform, SoftplusLowerCholeskyTransform
- New reparameterizer: ProjectedNormalReparam for the ProjectedNormal distribution
- New obs_mask argument in the sample primitive for masked conditioning (see the sketch after this list)
- New examples:
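A hedged sketch of the new obs_mask argument (the data and mask are illustrative): entries where the mask is True are conditioned on, while entries where it is False are treated as latent and imputed.

```python
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist

data = jnp.array([0.5, -1.0, 0.0, 2.0])
mask = jnp.array([True, True, False, True])  # the third entry is missing

def model():
    loc = numpyro.sample("loc", dist.Normal(0.0, 1.0))
    with numpyro.plate("N", data.shape[0]):
        # Observed where mask is True; imputed as a latent value where False.
        numpyro.sample("x", dist.Normal(loc, 1.0), obs=data, obs_mask=mask)
```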
Enhancements and Bug Fixes
- Improve precision for Dirichlet distributions with small concentration #943
- Make it easy to use softplus transforms in autoguides #941
- Improve compilation time in MCMC samplers - compilation is 2x faster than before #924
- Reduce memory requirement for AutoLowRankMultivariateNormal.quantiles #921
- Example of how to use Distribution.mask #917
- Add goodness of fit helpers for testing distributions #916
- Enable sampling with intermediates for ExpandedDistribution #909
- Fix DiscreteHMCGibbs to work with multiple chains #908
- Fix missing infer key in handlers.lift #892
Thanks @loopylangur, Dominik Straub @dominikstrb, Jeremie Coullon @jeremiecoullon, Ola Rønning @OlaRonning, Lukas Prediger @lumip, Raúl Peralta Lozada @RaulPL, Vitalii Kleshchevnikov @vitkl, Matt Ludkin @ludkinm, and many others for your contributions and feedback!
0.5.0
New documentation page with galleries of tutorials and examples at num.pyro.ai.
New Features
- New primitive: prng_key to draw a random key under the seed handler.
- New autoguide: AutoDelta
- New samplers:
- HMCGibbs: a general HMC/NUTS-within-Gibbs interface.
- DiscreteHMCGibbs: HMC/NUTS-within-Gibbs for models with discrete latent variables.
- HMCECS: HMC/NUTS with energy conserving subsampling.
- New example:
- New kernels module in numpyro.contrib.einstein, in preparation for (Ein)Stein VI inference in future releases.
- New user-friendly SVI.run method to simplify the training phase of SVI inference.
- New feasible_like method in constraints.
- New methods forward_shape and inverse_shape in Transform to infer the output shape given an input shape.
- Transform.inv now returns an inverted transform, which enables many new (inverted) transforms.
- Support thinning in MCMC.
- Add post_warmup_state and last_state to allow a sequential sampling strategy in MCMC: keep calling the .run method to get more samples (see the sketch after this list).
- New history argument to support Markov models with history > 1 in scan.
- New forward_mode_differentiation argument in HMC/NUTS kernels to allow the use of forward-mode differentiation.
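A hedged sketch of thinning and the sequential sampling pattern mentioned above (the toy model is illustrative).

```python
from jax import random

import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model():
    numpyro.sample("x", dist.Normal(0.0, 1.0))

# thinning=2 keeps every second post-warmup sample.
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=500, thinning=2)
mcmc.run(random.PRNGKey(0))
first_batch = mcmc.get_samples()

# Sequential sampling: reuse the adapted state to draw more samples
# without re-running warmup.
mcmc.post_warmup_state = mcmc.last_state
mcmc.run(mcmc.post_warmup_state.rng_key)
second_batch = mcmc.get_samples()
```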
Enhancements and Bug Fixes
- #886 Make TransformReparam compatible with .to_event()
- #883 Improve gradient computation of Euclidean kinetic energy.
- #872 Enhance masked distribution so that gradients propagate properly when using the mask handler for invalid data.
- #865 Make subsample faster on CPU.
- #860 Fix for memory leak in MCMC.
- #849 Expose the logits attribute of some discrete distributions
- #848 Add has_rsample and rsample attributes to distributions
- #832 Allow a callable to return an init value in the param primitive
- #824 Fix for not being able to use the sample method of TFP distributions in the sample primitive.
- #823 Demo of how to use various init strategies in the Gaussian Process example.
- #822 Allow haiku/flax modules to take general args/kwargs in init.
- #821 Better error messages when rng_key is missing.
- #818 Better error messages when an error happens in the middle of inference.
- #805 Display the correct progress bar message after running MCMC.warmup.
- #801 Raise an error early if plates are missing for models with discrete latent variables.
- #797 The MCMC vectorized chain method works for models with deterministic sites.
- #796 Bernoulli distribution returns an int instead of a boolean.
- #795 Reveal the signature for help(Distribution).
Thanks Ola Ronning @OlaRonning, Armin Stepanjan @ab-10, @cerbelaut, Xi Wang @xidulu, Wouter van Amsterdam @vanAmsterdam, @loopylangur, and many others for your contributions and helpful feedback!
0.4.1
New Features
- #772 Add DirichletMultinomial distribution (see the sketch after this list).
- #773 (experiment) Add collapse handler to exploit conjugacy relations.
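A hedged sketch of the new DirichletMultinomial distribution (the concentration and count values are illustrative).

```python
import jax.numpy as jnp
import jax.random as random
import numpyro.distributions as dist

# Counts over three categories whose probabilities follow a Dirichlet prior,
# with the probabilities marginalized out analytically.
d = dist.DirichletMultinomial(concentration=jnp.ones(3), total_count=10)
counts = d.sample(random.PRNGKey(0), (5,))
log_p = d.log_prob(counts)
```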
Enhancements and Bug Fixes
- #764 Make exception chaining more user-friendly. Thanks, @akihironitta!
- #766 Relax interval constraint.
- #776 Fix bugs in the log_prob and sample methods of the VonMises distribution.
- #775 Make the validation mechanism compatible with omnistaging since JAX 0.2.
- #780 Fix name dimensions of sample sites under contrib.funsor's plate handler.
0.4.0
Experimental integrations with JAX-based TensorFlow Probability and neural network libraries Flax and Haiku. New high-quality tutorials written by NumPyro contributors. JAX 0.2 enables "omnistaging" by default (see this guide for what omnistaging means and how to update your code if it is broken after the upgrade - you can also disable this new behavior with jax.config.disable_omnistaging()).
New Features
- New primitives in numpyro.contrib.module to create Bayesian Neural Networks (BNN) using Flax or Haiku: flax_module, random_flax_module, haiku_module, random_haiku_module. See the random_flax_module doc for an end-to-end example of constructing, training, and making predictions with a BNN.
- Wrappers for many TensorFlow Probability distributions in numpyro.contrib.tfp.distributions.
- Wrappers for many TensorFlow Probability MCMC kernels in numpyro.contrib.tfp.mcmc. A user-defined TensorFlow Probability MCMC kernel can be converted to a NumPyro-compatible one using TFPKernel.
- New distribution: Geometric distribution.
- New primitive: subsample for data/param subsampling (see the sketch after this list).
- New auto guide: AutoNormal, which is similar to AutoDiagonalNormal but more suitable for mean field ELBO and param subsampling.
- New SVI objective: TraceMeanField_ELBO.
- New optimizer: Minimize with BFGS method.
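A hedged sketch of the subsample primitive (the data and subsample size are illustrative): inside a plate with subsample_size, numpyro.subsample selects the corresponding minibatch and the log density is rescaled to the full data size.

```python
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist

data = jnp.arange(1000.0)

def model(data):
    loc = numpyro.sample("loc", dist.Normal(0.0, 1.0))
    # Each execution sees a random minibatch of 100 points; the likelihood
    # contribution is automatically rescaled to the full dataset size.
    with numpyro.plate("N", data.shape[0], subsample_size=100):
        batch = numpyro.subsample(data, event_dim=0)
        numpyro.sample("obs", dist.Normal(loc, 1.0), obs=batch)
```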
New Examples
- Bayesian Imputation for Missing Values in Discrete Covariates tutorial: leverages the enumeration mechanism to marginalize discrete missing covariates - applied to synthetic clinical data.
- Bayesian Hierarchical Linear Regression tutorial: practical Bayesian inference for Kaggle competitions.
- Ordinal Regression tutorial: how to deal with ordered discrete data.
Deprecation
Changes to match the Pyro API:
- The ELBO objective is renamed to Trace_ELBO.
- The value argument in the Delta distribution is replaced by v.
- The init_strategy argument in autoguides is replaced by init_loc_fn.
Enhancements and Bug Fixes
- Relax the simplex constraint. #725 #737
- Fix the init_strategy argument not being respected in HMC and SA kernels. #728
- Validate the model when valid initial params cannot be found. #733
- Avoid nan acceptance probability in the SA kernel. #740
Thanks @xidulu, @vanAmsterdam, @TuanNguyen27, @ucals, @elchorro, @RaulPL, and many others for your contributions and helpful feedback!
0.3.0
Breaking Changes
- HMC's find_heuristic_step_size (this functionality is different from the step size adaptation scheme) is disabled by default to improve compilation time. The previous behavior can be enabled by setting find_heuristic_step_size=True.
- The automatic reparameterization mechanism introduced in NumPyro 0.2 is removed, in favor of the reparam handler. See the eight schools example for the new usage pattern.
- The Automatic Guide Generation module is moved from numpyro.contrib.autoguide to the main inference module numpyro.infer.autoguide.
- Various API changes to match the Pyro API:
- mask handler: the mask_array arg is renamed to mask.
- scale handler: the scale_factor arg is renamed to scale.
- condition and substitute handlers: param_map is renamed to data.
- The MultivariateAffineTransform transform is renamed to LowerCholeskyAffine.
- The init_to_prior strategy is renamed to init_to_sample.
New Features
- Funsor-based NumPyro: allow enumeration over discrete latent variables. See mixture and Markov examples below for some applications.
- New primitives: plate_stack and scan. If your model has a Python for loop, consider using scan instead to improve compilation time (see the sketch after this list).
- New handlers: reparam, scope, and lift.
- New distributions: von Mises, Gumbel, Logistic, Laplace, TruncatedPolyaGamma, ExpandedDistribution, MaskedDistribution, and ImproperUniform.
- Distribution has new properties is_discrete and has_enumerate_support, and new methods shape, enumerate_support, expand, expand_by, and mask. In addition, Distribution has been registered as a JAX Pytree class, with corresponding methods tree_flatten and tree_unflatten.
- New constraint: less_than.
- Port Tensor Indexing from Pyro.
- Port some Reparameterizers from Pyro.
- Add a batch_ndims arg to Predictive and log_likelihood to allow using those utilities with an arbitrary number of batch dimensions.
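A hedged sketch of replacing a Python for loop with the scan primitive in a simple random-walk model (the model itself is illustrative).

```python
import numpyro
import numpyro.distributions as dist
from numpyro.contrib.control_flow import scan

def model(ys):
    x0 = numpyro.sample("x0", dist.Normal(0.0, 1.0))

    def transition(x_prev, y):
        x = numpyro.sample("x", dist.Normal(x_prev, 1.0))
        numpyro.sample("y", dist.Normal(x, 0.5), obs=y)
        return x, x  # (carry, per-step output)

    # scan rolls out the Markov chain without a slow Python for loop.
    _, xs = scan(transition, x0, ys, length=ys.shape[0])
```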
New Examples
- Proportion Test example: You are managing a business and want to test if calling your customers will increase their chance of making a purchase.
- Bayesian Models of Annotation examples: illustrates enumeration for mixture models.
- Enumerate HMM examples: illustrates enumeration for Markov models.
- Bayesian Imputation tutorial.
Enhancements and Bug Fixes
- HMC/NUTS compilation time is greatly improved, especially for large models.
- More efficient BTRS algorithm for sampling from Binomial distribution. #537
- Allow arbitrary order of plate statements. #555
- Fix KeyError with the scale handler and deterministic primitive. #577
- Fix Poisson sampler entering an infinite loop under vmap. #582
- Fix the double compilation issue in numpyro.optim classes. #603
- Use ExpandedDistribution in numpyro.plate. #616
- Timeseries forecasting tutorial is updated with the scan primitive and the usage of Predictive for forecasting. #608 #657
- Tweak sparse regression example to bring the model into exact alignment with the reference. #669
- Add MetropolisHastings algorithm as an example of MCMCKernel. #680
Thanks Nikolaos @daydreamt, Daniel Sheldon @dsheldon, Lukas Prediger @lumip, Freddy Boulton @freddyaboulton, Wouter van Amsterdam @vanAmsterdam, and many others for their contributions and helpful feedback!