This repository contains the experiments for the paper *Class-wise and reduced calibration methods* by Panchenko, Benmerzoug and de Benito Delgado, submitted to the 21st IEEE International Conference on Machine Learning and Applications (ICMLA 2022).
This project uses Poetry (specifically version 1.2.0) for dependency management.
Start by installing it, then install the requirements:

```shell
poetry install --no-root
```
Then activate the created virtual environment:

```shell
poetry shell
```
Alternatively, you can build a Docker image from the Dockerfile:

```shell
docker build . -t classwise-calibration:local
```
And then simply start a container:

```shell
docker container run -it --rm classwise-calibration:local
```
To start the notebooks from within the container, use:

```shell
docker run -it --rm -p 8888:8888 classwise-calibration:local jupyter notebook --NotebookApp.default_url=/lab/ --ip=0.0.0.0 --port=8888
```
| Experiment | Binder |
|---|---|
| Random Forest on Synthetic Data | |
| LightGBM on Sorel20M | |
| ResNet56 on CIFAR10 | |
| DistilBERT on IMDB | |
| DeiT on RVL-CDIP | |
To run the experiments, use:

```shell
python -m src.experiments.<Experiment Module>
```
where `<Experiment Module>` is replaced with the name of one of the experiment modules.
For example, to run the Random Forest experiment with Synthetic Data, use:

```shell
python -m src.experiments.random_forest_synthetic
```
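The module name passed to `python -m` is just the script path with the `.py` suffix dropped and slashes replaced by dots. A minimal sketch of that mapping (the path shown is illustrative and uses bash string substitution):

```shell
# Turn an experiment script path into the dotted module path for `python -m`.
script="src/experiments/random_forest_synthetic.py"
module="${script%.py}"        # strip the .py suffix
module="${module//\//.}"      # replace slashes with dots
echo "$module"                # -> src.experiments.random_forest_synthetic
```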
You can also use the notebooks to run the experiments interactively.
They are generated from the experiment scripts as follows:

```shell
bash scripts/generate_notebooks.sh
```
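Since each notebook is generated from an experiment script, a natural naming convention pairs the two. This is only a sketch of that assumed convention (the actual logic lives in `scripts/generate_notebooks.sh`, and the `notebooks/` output directory is hypothetical):

```shell
# Assumed script-to-notebook naming: same base name, .ipynb extension.
script="src/experiments/random_forest_synthetic.py"
notebook="notebooks/$(basename "${script%.py}").ipynb"
echo "$notebook"   # -> notebooks/random_forest_synthetic.ipynb
```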
This repository is distributed under the LGPL-3.0 license. A complete copy can be found in two files: here and here.
All contributions will be distributed under this license.