Releases: madminer-tool/madminer
v0.6.2
New features:
- Reweighting of existing samples (generated either with MadMiner or with standalone MadGraph) through `MadMiner.reweight_existing_sample()`
- Custom parameter grids / evaluation points in `AsymptoticLimits` through the new keyword `thetas_eval`
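As a rough illustration of the idea behind sample reweighting (this is plain Python, not the `MadMiner.reweight_existing_sample()` implementation): each event weight is multiplied by the ratio of the squared matrix element evaluated at the new parameter point to the one at the original point. All matrix-element values below are made-up illustration numbers.

```python
# Sketch of the reweighting idea: each event weight is scaled by the ratio
# of |M|^2 at the new parameter point to |M|^2 at the original point.
# The matrix-element values are illustrative, not from a real process.

def reweight(weights, me_orig, me_new):
    """Reweight event weights from one parameter point to another."""
    return [w * (new / old) for w, old, new in zip(weights, me_orig, me_new)]

weights = [0.5, 1.0, 2.0]   # original event weights
me_orig = [1.0, 2.0, 4.0]   # |M|^2 at the original benchmark (illustrative)
me_new = [2.0, 2.0, 1.0]    # |M|^2 at the new benchmark (illustrative)

print(reweight(weights, me_orig, me_new))  # -> [1.0, 1.0, 0.5]
```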
v0.6.1
New features:
- Dropout support
- Many more activation functions
- Number of workers for data loading can be specified
Bug fixes:
- Fixed crash in `DelphesReader` and `LHEReader` when no systematics are used
- Fixed logging error with double parameterized ratio estimation methods
v0.6.0
New features:
- Expanded systematics system. Users now declare systematics with `MadMiner.add_systematics()`, which in addition to the previous PDF and scale variations also allows normalization uncertainties. When adding samples in the `LHEReader` and `DelphesReader` functions, each sample can be linked to an arbitrary subset of systematics, giving the user a lot of flexibility.
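To illustrate what a normalization uncertainty means in practice (plain Python, not the MadMiner implementation): a nuisance parameter `nu` rescales all event weights of the affected sample by a common factor, changing the overall rate but not the shape of any distribution.

```python
# Illustration (not MadMiner API) of a normalization uncertainty:
# a nuisance parameter nu scales every event weight of the affected
# sample by (1 + delta)**nu, leaving the distribution shape unchanged.

def apply_norm_uncertainty(weights, delta, nu):
    factor = (1.0 + delta) ** nu
    return [w * factor for w in weights]

weights = [1.0, 2.0, 3.0]
shifted = apply_norm_uncertainty(weights, delta=0.1, nu=1.0)  # +10% at nu = 1
print(shifted)  # roughly [1.1, 2.2, 3.3]
```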
Breaking / API changes:
- For processes with systematic uncertainties, the MadMiner file format changed in a backward-incompatible way. Files with systematic uncertainties that were generated with MadMiner versions before v0.6.0 cannot be used with the new code version (MadMiner will crash). Sorry about this.
Bug fixes:
- Fixed a bug that wrongly included the central element in the calculation of the PDF uncertainties.
Documentation:
- Updated and expanded tutorial on systematic uncertainties.
Internal changes:
- Some internal changes related to nuisance parameters, including the MadMiner file format.
v0.5.1
New features:
- Automatic shuffling of MadMiner HDF5 files after reading in LHE or Delphes files
Bug fixes:
- Fixed rare crash in `AsymptoticLimits`
v0.5.0
New features:
- Clean separation between training and validation events: the `SampleAugmenter` functions have new keywords `partition` and `validation_split`. With `partition="validation"`, validation data without potential overlap with the training samples can be generated. In the `madminer.ml` classes, this data can be provided through new keywords like `x_val` and `theta_val` when calling `train()`.
- More consistent calculation of the Fisher information covariance: the covariance matrices in `mode="score"` are now the ensemble covariance, without dividing by `sqrt(n_estimators)` as before. This is now the default behavior. The old default behavior can be used with `mode="modified_score"`.
- Parameter rescaling also for `DoubleParameterizedRatioEstimator` and `LikelihoodEstimator`.
- When continuing the training of a pre-trained model, the parameter and observable rescaling is not overwritten.
- SALLY limits can now be calculated with `Ensemble` instances of multiple score estimators.
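A minimal sketch of the idea behind the `partition` / `validation_split` keywords (plain Python, not the `SampleAugmenter` implementation): one shuffled list of event indices is split once, so the training and validation partitions can never overlap.

```python
# Sketch of a clean train/validation split: shuffle the event indices
# once with a fixed seed, then cut the list in two. Because both
# partitions come from the same single shuffle, they are disjoint.
import random

def split_indices(n_events, validation_split=0.25, seed=42):
    indices = list(range(n_events))
    random.Random(seed).shuffle(indices)
    n_val = int(n_events * validation_split)
    return indices[n_val:], indices[:n_val]  # (train, validation)

train, val = split_indices(100, validation_split=0.25)
assert not set(train) & set(val)       # disjoint by construction
assert len(train) + len(val) == 100
```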
Breaking / API changes:
- The `SampleAugmenter` functions no longer accept the keyword `switch_train_test_events`. Use `partition="train"` or `partition="test"` instead.
Bug fixes:
- Fixed bug in the logging output about relative cross-section uncertainties during the Fisher information calculation.
- Fixed MadMiner crashing when calculating adaptive histogram binnings.
- Fixed bug in `AsymptoticLimits` where only 1 observed event was returned.
Internal changes:
- Abstracted `Estimator` classes with parameter rescaling into a new `ConditionalEstimator` class.
v0.4.10
New features:
- `ParameterizedRatioEstimator` now optionally rescales parameters (`theta`) to zero mean and unit variance during training. Use the keyword `rescale_params` in `ParameterizedRatioEstimator.train()`.
- Batching of parameter points in the `AsymptoticLimits` functions now also works when using weighted events.
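The parameter rescaling mentioned above is ordinary standardization; the sketch below shows it in plain Python (not MadMiner code) for a one-dimensional set of parameter points.

```python
# Sketch of the rescale_params idea: standardize parameter points
# to zero mean and unit variance before training.
from statistics import mean, pstdev

def rescale(thetas):
    mu, sigma = mean(thetas), pstdev(thetas)
    return [(t - mu) / sigma for t in thetas]

scaled = rescale([1.0, 2.0, 3.0, 4.0])
# After rescaling, the mean is 0 and the (population) std is 1.
```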
Breaking / API changes:
- In MET smearing, the relative term is now multiplied with HT, defined as the scalar sum over the pT of all visible particles (before it was all particles).
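The HT definition above can be sketched in a few lines (plain Python, illustrative values): HT is the scalar sum of the transverse momenta of the visible particles.

```python
# Sketch of the HT definition used for the relative MET smearing term:
# the scalar sum of pT over all *visible* particles.
from math import hypot

def ht(visible_particles):
    """visible_particles: list of (px, py) tuples (illustrative input)."""
    return sum(hypot(px, py) for px, py in visible_particles)

print(ht([(3.0, 4.0), (6.0, 8.0)]))  # -> 15.0 (pT of 5 plus pT of 10)
```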
Bug fixes:
- Fixed critical bug in the MET calculation in `LHEReader`, which caused the y component of the MET object to be wrong.
- Fixed crashes when training a likelihood ratio estimator without providing joint score information.
v0.4.9
New features:
- `plot_histograms()` can now also visualize observed data / Asimov data.
- In 2D parameter spaces, calculating limits with `mode="adaptive-sally"` now works as described in https://arxiv.org/abs/1805.00020. In higher dimensions, it still just concatenates the scalar product of score and parameter vector with all score components to form a `(d+1)`-dimensional observable space.
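The higher-dimensional summary described above can be sketched as follows (plain Python, not MadMiner internals): for a d-dimensional estimated score t(x) and a parameter point theta, the observable is the scalar product theta . t(x) concatenated with the d score components, giving a (d+1)-dimensional vector.

```python
# Sketch of the (d+1)-dimensional observable used by "adaptive-sally"
# in higher-dimensional parameter spaces: [theta . t(x), t_1, ..., t_d].

def adaptive_sally_summary(theta, score):
    dot = sum(th * s for th, s in zip(theta, score))
    return [dot] + list(score)

theta = [0.5, -1.0, 2.0]   # d = 3 parameter point (illustrative)
score = [1.0, 2.0, 0.5]    # estimated score t(x) (illustrative)
summary = adaptive_sally_summary(theta, score)
print(len(summary))  # -> 4, i.e. d + 1
```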
Bug fixes:
- Fixed bug in `plot_histograms()`.
Tutorials and documentation:
- The MadMiner paper is out!
- Updated README and docs, including a new troubleshooting list.
- Cleaned up examples folder.
v0.4.8
New features:
- In `AsymptoticLimits`, the adaptive histogram binning can now be based on the weights summed over the whole parameter grid instead of just a central point. This is now also the default option.
- New function `plot_histograms()` in `madminer.plotting` to visualize the histograms used by `AsymptoticLimits`.
Bug fixes:
- Substantially improved automatic histogram binning and fixed some numerical issues in `AsymptoticLimits` functions.
Tutorials and documentation:
- Updated tutorial with new histogram plots.
v0.4.7
New features:
- More observables for `LHEReader.add_observable`: Users can use `"p_truth"` to access particles before smearing, and (at least with XML parsing) there are new global observables `"alpha_qcd"`, `"alpha_qed"`, and `"scale"`. `LHEReader.add_observable_from_function()` now accepts functions that take unsmeared particles as first argument.
Bug fixes:
- Fixed bug in `sample_train_ratio()` with `return_individual_n_effective=True`
- Fixed bug in `DelphesReader` when no events survive cuts
Tutorials and documentation:
- Removed outdated Docker link from docs
- Changed morphing basis in particle physics tutorial to work around a weird bug in the MG-Pythia interface, see #371
Internal changes:
- Refactored LHE parsing. LHE files are now not read into memory all at once, but sequentially.
v0.4.6
New features:
- `AsymptoticLimits` now supports the SALLINO method, estimating the likelihood with one-dimensional histograms of the scalar product of `theta` and the estimated score.
- Improved default histogram binning in `AsymptoticLimits` and added more binning options, including fully manual specification of the binning.
- Histograms now calculate a rough approximation of the statistical uncertainty in each bin and issue a warning if it is large. (At DEBUG logging level the uncertainties are always printed, and `Histogram.histo_uncertainties` lets the user access them.)
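A common estimate of the statistical uncertainty of a weighted histogram bin is the square root of the sum of squared weights; the sketch below illustrates that rule (it is not MadMiner's actual implementation, which may differ in detail).

```python
# Standard estimate of the statistical uncertainty of a weighted
# histogram bin: sqrt(sum of squared event weights).
from math import sqrt

def bin_uncertainty(weights):
    return sqrt(sum(w * w for w in weights))

# For unit weights this reduces to sqrt(N), the usual Poisson estimate:
print(bin_uncertainty([1.0] * 9))   # -> 3.0
```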
Breaking / API changes:
- The `AsymptoticLimits` functions `expected_limits()` and `observed_limits()` now return `(theta_grid, p_values, i_ml, llr_kin, log_likelihood_rate, histos)`. `histos` is a list of histogram classes; the tutorial shows how they allow us to plot the histograms. The `returns` keyword to these functions is removed. The keywords `theta_ranges` and `resolutions` were renamed to `grid_ranges` and `grid_resolutions`.
- Changed some ML default settings: fewer hidden layers, smaller batch size.
- Changed function names in the `FisherInformation` class (the old names are still available as aliases for now, but deprecated).
Bug fixes:
- Various small bug fixes.
Tutorials and documentation:
- `AsymptoticLimits` is finally properly documented.
- All incomplete user-facing docstrings were updated.
Internal changes:
- Refactored histogram class.
- `AsymptoticLimits` is now much more memory efficient.