Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex

This is an official PyTorch implementation of "Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex", presented at AISTATS 2024 (acceptance rate: 27.6%). [arXiv | PMLR]

(Overview figure)

Preparation

  1. Clone this repository.

    git clone https://github.com/ToyotaCRDL/SimplexTS.git
  2. Build a Docker image using the Dockerfile and requirements.txt.

    docker build --build-arg USER=${USER} -t simplex-ts SimplexTS
  3. Launch a container.

    docker run -it --rm --gpus all \
            --mount type=bind,source=$(pwd)/SimplexTS,target=${HOME}/SimplexTS \
            --workdir ${HOME}/SimplexTS \
            simplex-ts

We conducted our experiments on an NVIDIA A100 GPU.

Example

The following demonstrates the workflow on FashionMNIST.

  1. Train a classification model. This is Step 1, as described in Section 4.1 of the paper.

    bash scripts/classification.sh
  2. Calibrate the trained model using Simplex Temperature Scaling (STS). This is Step 2, as described in Section 4.1 of the paper (see the conceptual sketch after this list).

    bash scripts/calibration.sh
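
For intuition, the sketch below shows ordinary (scalar) temperature scaling, the classical baseline that STS generalizes: a single temperature T is fitted on held-out logits to minimize the negative log-likelihood, and since dividing logits by a positive scalar never changes the argmax, accuracy is preserved. This is not the STS implementation from this repository (that lives behind scripts/calibration.sh); the names `model`, `val_loader`, and `fit_temperature` are illustrative placeholders.

    # A minimal sketch of scalar temperature scaling, assuming a trained classifier
    # `model` and a validation DataLoader `val_loader` (both placeholders, not
    # objects defined in this repository).
    import torch
    import torch.nn.functional as F

    def fit_temperature(model, val_loader, device="cuda", max_iter=50):
        """Fit a scalar temperature T > 0 on held-out logits by minimizing the NLL."""
        model.eval()
        logits_list, labels_list = [], []
        with torch.no_grad():
            for x, y in val_loader:
                logits_list.append(model(x.to(device)))
                labels_list.append(y.to(device))
        logits = torch.cat(logits_list)
        labels = torch.cat(labels_list)

        # Optimize log T so that the temperature stays positive.
        log_t = torch.zeros(1, device=device, requires_grad=True)
        optimizer = torch.optim.LBFGS([log_t], lr=0.1, max_iter=max_iter)

        def closure():
            optimizer.zero_grad()
            loss = F.cross_entropy(logits / log_t.exp(), labels)
            loss.backward()
            return loss

        optimizer.step(closure)
        # Calibrated predictions at test time: softmax(new_logits / T).
        return log_t.exp().item()

As the paper's title indicates, STS instead performs calibration via statistical modeling on the probability simplex, but the overall workflow is analogous: collect held-out outputs, fit calibration parameters, and rescale predictions at test time.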

Citation

To cite our work, please use the following BibTeX entry:

@inproceedings{SimplexTS,
  title     = {Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex},
  author    = {Esaki, Yasushi and Nakamura, Akihiro and Kawano, Keisuke and Tokuhisa, Ryoko and Kutsuna, Takuro},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {1666--1674},
  year      = {2024},
  volume    = {238},
  series    = {PMLR}
}

License

Copyright (C) 2025 TOYOTA CENTRAL R&D LABS., INC. All Rights Reserved.
