Conversation

@BenjaminBossan (Member) commented Dec 13, 2024

This is a refactor of how new PEFT methods are registered. It got bigger than I initially expected.

Goal

The goal of this refactor is the following: Right now, when a new PEFT method is added, a new directory is created in src/peft/tuners/<name> with a config, model, etc. This is fine and self-contained.

However, in addition to that, a couple of other places in the PEFT code base need to be touched for this new PEFT method to become usable.

As an example, take the recently added Bone method (#2172). Ignoring tests, docs, and examples, we have the additions to src/peft/tuners/bone, but also need to:

  1. Add an entry to PEFT_TYPE_TO_CONFIG_MAPPING in mapping.py.
  2. Add an entry to PEFT_TYPE_TO_TUNER_MAPPING in mapping.py.
  3. Add an entry to PEFT_TYPE_TO_MODEL_MAPPING in peft_model.py.
  4. Add an entry to PEFT_TYPE_TO_PREFIX_MAPPING in utils/constants.py.
  5. Add some code to get_peft_model_state_dict in utils/save_and_load.py.

With the changes in this PR, all these steps can be omitted.

On top of that, we also have the re-imports in peft/__init__.py and peft/tuners/__init__.py, but those are still required (I'm hesitant to mess with the import system). Furthermore, it's still required to add an entry to PeftType in utils/peft_types.py. Since this is an enum, it can't easily be generated automatically. Therefore, adding a new PEFT method is still not 100% self-contained.
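For illustration, the one registration step that remains manual could look roughly like this (a minimal sketch; the real PeftType enum in utils/peft_types.py has many more members):

```python
from enum import Enum

# Minimal sketch of the remaining manual step: since PeftType is an enum,
# each new PEFT method still needs an explicit member added by hand.
# The members shown here are illustrative, not the full real enum.
class PeftType(str, Enum):
    LORA = "LORA"
    P_TUNING = "P_TUNING"
    BONE = "BONE"  # a new method still adds its entry manually
```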

Changes in this PR

With this PR, less bookkeeping is required. Instead of the 5 steps described above, contributors now only need to call

# example for the Bone method
register_peft_method(name="bone", config_cls=BoneConfig, model_cls=BoneModel)

in the __init__.py of their PEFT method. In addition to registering the method, this also performs a couple of sanity checks (e.g. no duplicate names, method name and method prefix being identical).
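A minimal sketch of what such a helper could look like; the mapping names mirror those listed above, but the real implementation in PEFT may differ in detail, and the Bone classes here are empty stand-ins:

```python
# Hypothetical sketch of a register_peft_method helper that updates all
# internal mappings in one call and runs basic sanity checks.
PEFT_TYPE_TO_CONFIG_MAPPING = {}
PEFT_TYPE_TO_TUNER_MAPPING = {}
PEFT_TYPE_TO_PREFIX_MAPPING = {}

def register_peft_method(*, name, config_cls, model_cls, prefix=None):
    """Register a new PEFT method so no manual mapping edits are needed."""
    if name in PEFT_TYPE_TO_CONFIG_MAPPING:
        # sanity check: no duplicate method names
        raise KeyError(f"A PEFT method named {name!r} is already registered.")
    # default the prefix to the method name, keeping the two consistent
    prefix = prefix if prefix is not None else name + "_"
    PEFT_TYPE_TO_CONFIG_MAPPING[name] = config_cls
    PEFT_TYPE_TO_TUNER_MAPPING[name] = model_cls
    PEFT_TYPE_TO_PREFIX_MAPPING[name] = prefix

# stand-ins for the real Bone classes
class BoneConfig: ...
class BoneModel: ...

register_peft_method(name="bone", config_cls=BoneConfig, model_cls=BoneModel)
```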

Moreover, since so much bookkeeping is removed, this PR reduces the overall number of lines of code (at the moment +317, -343).

Implementation

The real difficulty of this task is that the module structure in PEFT is quite messy, which easily results in circular imports. This has been an issue in the past and was especially painful here. For this reason, some things had to be moved around:

  • MODEL_TYPE_TO_PEFT_MODEL_MAPPING is now in auto.py instead of mapping.py
  • PEFT_TYPE_TO_PREFIX_MAPPING has been moved to mapping.py from constants.py
  • get_peft_model had to be moved out of mapping.py and is now in its own module, func.py (better name suggestions welcome). This should be safe, as the function is re-imported to the main PEFT namespace, which all examples use.

The PEFT_TYPE_TO_MODEL_MAPPING dict could be removed entirely, as it was basically redundant with PEFT_TYPE_TO_TUNER_MAPPING. The get_peft_model_state_dict function could also be simplified, as a lot of its code was almost duplicated.

There were a few instances in peft_model.py like:

        elif config.peft_type == PeftType.P_TUNING:
            prompt_encoder = PromptEncoder(config)

Now, instead of hard-coding the model class, I just do model_cls = PEFT_TYPE_TO_TUNER_MAPPING[config.peft_type].
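The dispatch change can be sketched as follows; the classes and mapping keys here are simplified stand-ins for the real PEFT ones:

```python
# Sketch of replacing a hard-coded elif chain with a mapping lookup keyed
# by the PEFT type. Stand-in classes simulate the real tuner classes.
class PromptEncoder:
    def __init__(self, config):
        self.config = config

class PromptEmbedding:
    def __init__(self, config):
        self.config = config

# hypothetical mapping, analogous to PEFT_TYPE_TO_TUNER_MAPPING
PEFT_TYPE_TO_TUNER_MAPPING = {
    "P_TUNING": PromptEncoder,
    "PROMPT_TUNING": PromptEmbedding,
}

class Config:
    peft_type = "P_TUNING"

config = Config()
# one lookup replaces the whole elif chain
model_cls = PEFT_TYPE_TO_TUNER_MAPPING[config.peft_type]
prompt_encoder = model_cls(config)
```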

Overall, I think this is a cleaner module structure, but still not very clean overall.

Open questions

I'm not 100% sure if this should be merged. AFAICT it is a safe refactor that does not affect user code, but other packages that rely on PEFT internals could break. If we decide to merge this, we should consider alerting potentially affected packages so they can test against it.

I'm also open to the argument that the benefits are not outweighing the cost of the refactor.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan BenjaminBossan marked this pull request as ready for review December 16, 2024 14:38
@githubnemo (Collaborator) left a comment

As for the name of func.py: Maybe mapping_func.py is a more appropriate name?

@BenjaminBossan (Member, Author)

As for the name of func.py: Maybe mapping_func.py is a more appropriate name?

Good idea, I renamed the module.

@githubnemo (Collaborator) left a comment

In general I do like static mapping more than dynamic, makes the code more readable without interpreting too much. However, in this case I can see that it is easy to forget certain mappings and keeping them at one place is not always easy. It also makes for a good public interface.

If you could add the doc string for register_peft_method we can merge this, I think.

@BenjaminBossan (Member, Author)

In general I do like static mapping more than dynamic, makes the code more readable without interpreting too much. However, in this case I can see that it is easy to forget certain mappings and keeping them at one place is not always easy. It also makes for a good public interface.

I agree. The change is not pure upside, but overall I think it is cleaner than it was before. Another possible advantage: If we ever require something more to happen when a new PEFT method is added, we could possibly change it in a single place instead of having to touch each method individually.

If you could add the doc string for register_peft_method we can merge this, I think.

Thanks for catching this, I added the docstring.

@BenjaminBossan BenjaminBossan merged commit 1e8bc60 into huggingface:main Jan 13, 2025
14 checks passed
@BenjaminBossan BenjaminBossan deleted the refactor-peft-method-registration branch January 13, 2025 14:07
BenjaminBossan added a commit to BenjaminBossan/AutoGPTQ that referenced this pull request Jan 14, 2025
The PEFT_TYPE_TO_MODEL_MAPPING variable was removed in PEFT PR
huggingface/peft#2282. This leads to an error
when using AutoGPTQ with PEFT installed from main (and later starting
with PEFT v0.15.0).

This PR fixes this issue by using the PEFT_TYPE_TO_TUNER_MAPPING
variable instead. The fix is implemented in a way that the code should
continue to work with older PEFT versions while being compatible with
the new PR.
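The compatibility approach described in this commit message could be sketched like so; the resolver function and the SimpleNamespace stand-ins (simulating the two PEFT module layouts) are assumptions for illustration:

```python
import types

# Hypothetical sketch: prefer the new mapping attribute and fall back to
# the removed name on older PEFT versions, so one code path works on both.
def resolve_tuner_mapping(peft_module):
    mapping = getattr(peft_module, "PEFT_TYPE_TO_TUNER_MAPPING", None)
    if mapping is None:
        # older PEFT versions only expose the old attribute name
        mapping = peft_module.PEFT_TYPE_TO_MODEL_MAPPING
    return mapping

# simulated module layouts: new PEFT vs. older PEFT
new_peft = types.SimpleNamespace(PEFT_TYPE_TO_TUNER_MAPPING={"LORA": "LoraModel"})
old_peft = types.SimpleNamespace(PEFT_TYPE_TO_MODEL_MAPPING={"LORA": "LoraModel"})
```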
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request Jan 14, 2025
This is for backwards compatibility: In huggingface#2282,
PEFT_TYPE_TO_MODEL_MAPPING was removed as it was redundant with
PEFT_TYPE_TO_TUNER_MAPPING. However, third party code could still use
this mapping, e.g.:

https://github.com/AutoGPTQ/AutoGPTQ/blob/6689349625de973b9ee3016c28c11f32acf7f02c/auto_gptq/utils/peft_utils.py#L8

Therefore, it is reinstated here, but a DeprecationWarning will be given
if it's used.
BenjaminBossan added a commit that referenced this pull request Jan 14, 2025
(commit message identical to the backwards-compatibility commit above)
Qubitium pushed a commit to AutoGPTQ/AutoGPTQ that referenced this pull request Jan 16, 2025
(commit message identical to the AutoGPTQ fix commit above)
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025
(commit message identical to the PR description above)
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025
(commit message identical to the backwards-compatibility commit above)
cyyever pushed a commit to cyyever/peft that referenced this pull request Sep 4, 2025
(commit changelog unrelated to this PR omitted)