Implementation of the bias-to-text (B2T) algorithm described in "Bias-to-Text: Debiasing Unknown Visual Biases through Language Interpretation." B2T identifies and mitigates visual biases in image classifiers and text-to-image generative models using language descriptions.
See the link for more detailed bias discovery and debiasing results.
Download datasets.
- CelebA
- Waterbirds (direct download link), formed from Caltech-UCSD Birds 200 + Places
Clone our repository.
$ git clone https://github.com/Erena-Kim/b2t.git

Run the commands below to create a virtual environment for b2t and install all prerequisites.

$ pip install pipenv
$ pipenv --python 3.8
$ pipenv install

To run our code, you need to place the datasets and model checkpoints in the right directories.
You can download the ClipCap pretrained model here and place it in [root_dir]/function. (Note that our paper uses the model trained on Conceptual Captions.)
The main point of entry to the code is b2t.py, which takes the following options:

- dataset: `celeba` or `waterbird`
- model: download pretrained checkpoints of CelebA and Waterbirds here and put them into `[root_dir]/model`
- extract_caption: `True` or `False`. If set `True`, automatically generates `[root_dir]/data/[dataset]/caption/` and stores the extracted captions there.
- save_result: `True` or `False`. If set `True`, automatically generates `[root_dir]/diff/` and stores a CSV file of the results there.
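As a rough sketch of how these options map to a command line, they could be wired up with argparse as below. This is a hypothetical reconstruction, not the actual code of b2t.py; the real script's flag names, types, and defaults may differ.

```python
import argparse

def build_parser():
    # Hypothetical CLI sketch based on the options listed above;
    # b2t.py's actual argument handling may differ.
    parser = argparse.ArgumentParser(description="Bias-to-Text (B2T)")
    parser.add_argument("--dataset", choices=["celeba", "waterbird"],
                        required=True, help="target dataset")
    parser.add_argument("--model", required=True,
                        help="checkpoint filename under [root_dir]/model")
    parser.add_argument("--extract_caption", type=lambda s: s == "True",
                        default=True,
                        help="if True, write captions to data/[dataset]/caption/")
    parser.add_argument("--save_result", type=lambda s: s == "True",
                        default=True,
                        help="if True, write a results CSV to diff/")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args(
        ["--dataset", "celeba", "--model", "best_model_CelebA_erm.pth"]
    )
    print(args.dataset, args.model, args.extract_caption, args.save_result)
```

Booleans are parsed from the literal strings `True`/`False` here only to mirror the option descriptions above; a real CLI might use store_true flags instead.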
Our code expects the following files/folders in the [root_dir]/data/celebA directory:
- data/list_eval_partition.csv
- data/list_attr_celeba.csv
- data/image_align_celeba/
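To catch path mistakes before a long run, a small helper (not part of the repo; the helper name and usage are our own) can verify that the expected CelebA layout is in place:

```python
from pathlib import Path

# Expected CelebA files/folders relative to [root_dir]/data/celebA,
# taken from the list above.
CELEBA_REQUIRED = [
    "data/list_eval_partition.csv",
    "data/list_attr_celeba.csv",
    "data/image_align_celeba",
]

def missing_paths(root, required=CELEBA_REQUIRED):
    """Return the required paths that do not exist under root."""
    root = Path(root)
    return [p for p in required if not (root / p).exists()]

if __name__ == "__main__":
    missing = missing_paths("data/celebA")
    if missing:
        print("missing:", ", ".join(missing))
    else:
        print("CelebA layout looks complete")
```

The same pattern applies to the Waterbirds directory by swapping in its expected paths.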
A sample command to run b2t on CelebA with a pretrained ERM model is:
$ python b2t.py --dataset celeba --model best_model_CelebA_erm.pth

Our code expects the following files/folders in the [root_dir]/data/cub directory:
- data/waterbird_complete95_forest2water2/
A sample command to run b2t on Waterbirds with a pretrained ERM model is:
$ python b2t.py --dataset waterbird --model best_model_Waterbirds_erm.pth

To reproduce the debiasing classifier experiments, see b2t_debias.
To reproduce the diffusion model experiments, see b2t_diffusion.
