Sparse_PEFT: Exploring Sparsity for Parameter-Efficient Fine-Tuning | Paper: https://arxiv.org/abs/2505.12532

Figure: original images (top), WaveFT results (middle), LoRA results (bottom).

Overview

This repository contains the implementation and results of our research on sparse parameter-efficient fine-tuning in the wavelet domain. We explore the benefits of incorporating structured sparsity into PEFT approaches to achieve better parameter efficiency while maintaining or improving performance.

Method

WaveFT expresses each weight update as the inverse wavelet transform of a learned, highly sparse set of coefficients, so the trainable-parameter budget is set directly by the number of nonzero coefficients rather than by a rank; see the paper for the full formulation.

Usage

To use this repository, follow these steps:

  1. Clone the repository:
    git clone https://github.com/Bilican/Sparse_PEFT.git
    cd Sparse_PEFT
  2. Install dependencies:
    pip install -r requirements.txt
  3. Run experiments:
    bash personalization.sh

To test other methods, run the corresponding script in other_scripts/, for example:

  bash other_scripts/vera_personalization.sh

PEFT Comparison

Comparison of different PEFT methods: our sparse approaches against established baselines (a baseline-configuration sketch follows below).
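
For context on the baselines, adapters such as LoRA can be attached through the Hugging Face PEFT library credited in the Acknowledgements. The model and hyperparameters below are illustrative examples, not this repository's experiment settings.

  # Illustrative LoRA baseline via Hugging Face PEFT; the model and
  # hyperparameters are examples, not this repository's settings.
  from peft import LoraConfig, get_peft_model
  from transformers import AutoModel

  model = AutoModel.from_pretrained("bert-base-uncased")
  config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.1,
                      target_modules=["query", "value"])
  model = get_peft_model(model, config)
  model.print_trainable_parameters()  # prints trainable vs. total parameter counts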

Rank vs Sparsity Trade-off

Analysis of the relationship between rank and sparsity parameters in our methods.
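
For intuition on this trade-off, the parameter budgets can be compared directly. The layer sizes below are hypothetical; the LoRA count uses the standard r * (d_in + d_out) formula.

  # Hypothetical budget arithmetic; layer sizes are illustrative.
  d_in, d_out = 4096, 4096
  r = 4
  lora_params = r * (d_in + d_out)  # LoRA trains r*(d_in + d_out) = 32768 values
  k = lora_params                   # equal-budget sparse update: k coefficients
  print(f"LoRA rank {r}: {lora_params:,} params; sparse update: k = {k:,}")
  # LoRA's smallest nonzero budget is (d_in + d_out) at r = 1, whereas a
  # sparse method can set k to any integer, allowing much finer control
  # in the low-budget regime.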

WaveFT: Wavelet-based Fine-Tuning

WaveFT parameterizes the weight update as a highly sparse, learnable set of coefficients in the wavelet domain of each weight matrix, achieving strong results with very few trainable parameters (a sketch follows below).
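
The repository's actual layer implementation is not reproduced here, but the core idea can be sketched with ptwt (acknowledged below). In this minimal, assumption-laden sketch, a frozen nn.Linear gains an additive update reconstructed from a fixed random subset of learnable wavelet coefficients; the class name WaveFTLinear, the density hyperparameter, and the random coefficient mask are all illustrative choices, not this repository's API.

  # Minimal sketch of a wavelet-domain sparse update, assuming a one-level
  # 2D transform via ptwt. Everything named here (WaveFTLinear, density,
  # the fixed random mask) is illustrative, not this repository's API.
  import torch
  import torch.nn as nn
  import torch.nn.functional as F
  import ptwt  # pip install ptwt

  class WaveFTLinear(nn.Module):
      """Frozen nn.Linear plus an additive update reconstructed from a
      sparse, learnable set of wavelet coefficients."""

      def __init__(self, base: nn.Linear, wavelet: str = "haar",
                   density: float = 1e-3):
          super().__init__()
          self.base = base
          for p in self.base.parameters():
              p.requires_grad_(False)  # pretrained weights stay frozen
          self.wavelet = wavelet
          out_f, in_f = base.weight.shape
          # Decompose a zero matrix once to obtain the coefficient band
          # shapes: one approximation band and three detail bands.
          ll, (lh, hl, hh) = ptwt.wavedec2(
              base.weight.new_zeros(out_f, in_f), wavelet, level=1)
          self.coeffs = nn.ParameterList(
              nn.Parameter(torch.zeros_like(c)) for c in (ll, lh, hl, hh))
          # Fixed random sparsity pattern: only ~density of the
          # coefficients ever receive gradient updates.
          for i, c in enumerate((ll, lh, hl, hh)):
              self.register_buffer(
                  f"mask{i}", (torch.rand_like(c) < density).to(c.dtype))

      def delta_weight(self) -> torch.Tensor:
          ll, lh, hl, hh = (p * getattr(self, f"mask{i}")
                            for i, p in enumerate(self.coeffs))
          dw = ptwt.waverec2([ll, (lh, hl, hh)], self.wavelet)
          # Crop any padding the transform added for odd-sized weights.
          return dw[:self.base.weight.shape[0], :self.base.weight.shape[1]]

      def forward(self, x: torch.Tensor) -> torch.Tensor:
          return self.base(x) + F.linear(x, self.delta_weight())

Since the coefficients are zero-initialized, the wrapped layer reproduces the pretrained output exactly until training updates the masked coefficients, mirroring the zero-init convention of adapters like LoRA.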

Performance Metrics

Comprehensive evaluation metrics across different tasks and model configurations.

Dataset Acknowledgment

The dataset used in this work is from the DreamBooth repository by Google. We use their dataset of subjects for our fine-tuning experiments to maintain consistency with prior work and enable fair comparison.

Acknowledgements

This work utilizes the following open-source libraries:

  • Hugging Face PEFT: A library for state-of-the-art parameter-efficient fine-tuning methods.

    • Repository: https://github.com/huggingface/peft
    • Citation:
      @Misc{peft,
        title =        {PEFT: State-of-the-art Parameter-Efficient Fine-Tuning methods},
        author =       {Sourab Mangrulkar and Sylvain Gugger and Lysandre Debut and Younes Belkada and Sayak Paul and Benjamin Bossan},
        howpublished = {\url{https://github.com/huggingface/peft}},
        year =         {2022}
      }
  • PyTorch Wavelet Toolbox (ptwt): A toolbox for differentiable fast wavelet transforms in PyTorch with GPU support.

    • Repository: https://github.com/v0lta/PyTorch-Wavelet-Toolbox
    • Citation:
      @article{JMLR:v25:23-0636,
        author  = {Moritz Wolter and Felix Blanke and Jochen Garcke and Charles Tapley Hoyt},
        title   = {ptwt - The PyTorch Wavelet Toolbox},
        journal = {Journal of Machine Learning Research},
        year    = {2024},
        volume  = {25},
        number  = {80},
        pages   = {1--7},
        url     = {http://jmlr.org/papers/v25/23-0636.html}
      }

Citation

If you find this work useful, please cite our paper:

@misc{bilican2025exploringsparsityparameterefficient,
      title={Exploring Sparsity for Parameter Efficient Fine Tuning Using Wavelets}, 
      author={Ahmet Bilican and M. Akın Yılmaz and A. Murat Tekalp and R. Gökberk Cinbiş},
      year={2025},
      eprint={2505.12532},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2505.12532}, 
}

License

This project is licensed under the Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license.
