Posterior, MCSampler & Closure Refactors, Entropy Search Acquisition Functions
Highlights
This release includes some backwards incompatible changes.
- Refactor `Posterior` and `MCSampler` modules to better support non-Gaussian distributions in BoTorch (#1486).
  - Introduced a `TorchPosterior` object that wraps a PyTorch `Distribution` object and makes it compatible with the rest of the `Posterior` API.
  - `PosteriorList` no longer accepts Gaussian base samples. It should be used with a `ListSampler` that includes the appropriate sampler for each posterior.
  - The MC acquisition functions no longer construct a Sobol sampler by default. Instead, they rely on a `get_sampler` helper, which dispatches an appropriate sampler based on the posterior provided (see the first sketch below).
  - The `resample` and `collapse_batch_dims` arguments to `MCSampler`s have been removed. The `ForkedRNGSampler` and `StochasticSampler` can be used to get the same functionality.
  - Refer to the PR for additional changes. We will update the website documentation to reflect these changes in a future release.
- #1191 refactors much of `botorch.optim` to operate based on closures that abstract away how losses (and gradients) are computed. By default, these closures are created using multiply-dispatched factory functions (such as `get_loss_closure`), which may be customized by registering methods with an associated dispatcher (e.g. `GetLossClosure`). Future releases will contain tutorials that explore these features in greater detail; a minimal sketch of the closure pattern appears below.
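
The snippet below is a minimal, hypothetical sketch of the new sampler flow described above, using toy data. It assumes a BoTorch version at or after this release where `get_sampler` and the `sample_shape`-based sampler constructors are available; import paths and signatures may differ slightly, so treat it as illustrative rather than the full API surface of #1486.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.acquisition import qExpectedImprovement
from botorch.sampling import SobolQMCNormalSampler, get_sampler

# Toy training data (hypothetical).
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)

# No sampler passed: the acquisition function dispatches one via `get_sampler`
# based on the posterior it receives, instead of building a Sobol sampler up front.
acqf = qExpectedImprovement(model=model, best_f=train_Y.max())

# Or dispatch a sampler explicitly for a given posterior.
posterior = model.posterior(train_X)
sampler = get_sampler(posterior, sample_shape=torch.Size([256]))

# Samplers are now configured with `sample_shape`; the `resample` and
# `collapse_batch_dims` arguments are gone.
qmc_sampler = SobolQMCNormalSampler(sample_shape=torch.Size([512]))
acqf_qmc = qExpectedImprovement(model=model, best_f=train_Y.max(), sampler=qmc_sampler)
```

Similarly, here is a hand-rolled sketch of the closure pattern that the `botorch.optim` refactor is built around. The factory functions named above (`get_loss_closure`, `GetLossClosure`) produce closures like this automatically; their exact signatures are not shown here, and the example uses only standard PyTorch/GPyTorch calls.

```python
import torch
from botorch.models import SingleTaskGP
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy training data (hypothetical).
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
mll.train()

def forward_backward_closure() -> torch.Tensor:
    """Zero gradients, compute the negative MLL, backpropagate, and return the loss."""
    mll.zero_grad()
    output = model(train_X)
    loss = -mll(output, model.train_targets)
    loss.backward()
    return loss

# Any closure-driven optimizer can now fit the model; how the loss and its
# gradients are computed is fully encapsulated in the closure.
optimizer = torch.optim.LBFGS(mll.parameters(), max_iter=50)
optimizer.step(forward_backward_closure)
```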
New Features
- Add mixed optimization for list optimization (#1342).
- Add entropy search acquisition functions (#1458).
- Add utilities for straight-through gradient estimators for discretization functions (#1515); see the sketch after this list.
- Add support for categoricals in the `Round` input transform and use STEs (#1516).
- Add closure-based optimizers (#1191).
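
For reference, a straight-through estimator (in the general sense referenced above) discretizes in the forward pass while letting gradients pass through unchanged. The minimal sketch below is illustrative only and does not reproduce the actual utilities added in #1515/#1516.

```python
import torch

def round_ste(x: torch.Tensor) -> torch.Tensor:
    """Round in the forward pass; use the identity gradient in the backward pass."""
    return x + (x.round() - x).detach()

x = torch.tensor([0.2, 1.7, 2.4], requires_grad=True)
y = round_ste(x)   # forward values: tensor([0., 2., 2.])
y.sum().backward()
print(x.grad)      # straight-through gradient: tensor([1., 1., 1.])
```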
Other Changes
- Do not count hitting `maxiter` as an optimization failure & update the default `maxiter` (#1478).
- `BoxDecomposition` cleanup (#1490).
- Deprecate `torch.triangular_solve` in favor of `torch.linalg.solve_triangular` (#1494).
- Various docstring improvements (#1496, #1499, #1504).
- Remove `__getitem__` method from `LinearTruncatedFidelityKernel` (#1501).
- Handle Cholesky errors when fitting a fully Bayesian model (#1507).
- Make `eta` configurable in `apply_constraints` (#1526).
- Support SAAS ensemble models in RFFs (#1530).
- Deprecate `botorch.optim.numpy_converter` (#1191).
- Deprecate `fit_gpytorch_scipy` and `fit_gpytorch_torch` (#1191).