GPS

This is the official repository for the paper: Gradient-based Parameter Selection for Efficient Fine-Tuning

For the segmentation task on the SAM model with our GPS method, please see SAM GPS.
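
GPS fine-tunes only a small, task-specific subset of a pretrained model's parameters, chosen by gradient magnitude on the downstream task. Below is a minimal, illustrative PyTorch sketch of this selection idea, not the code in this repository: it accumulates gradients on downstream data, keeps the top-k entries per output neuron of each 2-D weight, and masks the gradients of everything else so that only the selected parameters are updated. The function names (build_gps_masks, freeze_unselected) and the top-k-per-neuron granularity are assumptions made for illustration.

```python
import torch

def build_gps_masks(model, loss_fn, data_loader, k=1, device="cpu"):
    """Return a {name: bool mask} dict marking the top-k entries per output
    neuron (row) of each 2-D weight, ranked by accumulated gradient magnitude."""
    model.to(device).train()
    model.zero_grad()
    for inputs, targets in data_loader:               # accumulate downstream-task gradients
        loss = loss_fn(model(inputs.to(device)), targets.to(device))
        loss.backward()

    masks = {}
    for name, p in model.named_parameters():
        if p.grad is None or p.dim() != 2:
            continue                                  # this sketch handles 2-D weights only
        top_idx = p.grad.abs().topk(k, dim=1).indices # top-k entries per output neuron
        mask = torch.zeros_like(p, dtype=torch.bool)
        mask.scatter_(1, top_idx, True)
        masks[name] = mask
    model.zero_grad()
    return masks

def freeze_unselected(model, masks):
    """Zero the gradients of unselected entries so the optimizer only
    updates the parameters chosen by build_gps_masks."""
    for name, p in model.named_parameters():
        if name in masks:
            p.register_hook(lambda g, m=masks[name]: g * m)
```

In this sketch you would call build_gps_masks once before fine-tuning, then freeze_unselected, and train with a standard optimizer; the hooks keep all unselected parameters at their pretrained values.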

Environment

Please follow SSF for installation.

Datasets

FGVC

Please follow VPT to download them.

You can also download them from BaiduYun (code: nc9f).

If you only need the split annotations, download fgvc_split.zip.

VTAB

Please follow SSF to download them.

You can also download them from BaiduYun (code: r1s7).

If you only need the split annotations, download vtab-1k_split.zip.

Train

Take the Stanford Cars task in FGVC as an example:

  1. Replace /path/to/FGVC/ with the path to your FGVC dataset in train_scripts/vit/fgvc/stanford_cars.sh
  2. cd GPS
  3. Run bash train_scripts/vit/fgvc/stanford_cars.sh

For the VTAB tasks, please use the updated scripts in train_scripts/vit/vtab.

Cite

If this project is helpful for your work, please cite our paper:

@inproceedings{zhang2024gradient,
  title={Gradient-based Parameter Selection for Efficient Fine-Tuning},
  author={Zhang, Zhi and Zhang, Qizhe and Gao, Zijun and Zhang, Renrui and Shutova, Ekaterina and Zhou, Shiji and Zhang, Shanghang},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={28566--28577},
  year={2024}
}

Acknowledgement

Our experimental setup follows SSF.

The code is built upon SSF and VPT.
