Fix docstring interlinks #4221
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
docs/source/peft_integration.md
Outdated
  ## How to use it?

- Simply declare a `PeftConfig` object in your script and pass it through `.from_pretrained` to load the TRL+PEFT model.
+ Simply declare a [`~peft.PeftConfig`] object in your script and pass it through `.from_pretrained` to load the TRL+PEFT model.
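For readers following along, the line being edited describes the TRL+PEFT pattern; a minimal sketch of it (model name and LoRA hyperparameters are illustrative, and the `peft_config` keyword is assumed from the TRL docs):

```python
from peft import LoraConfig
from trl import AutoModelForCausalLMWithValueHead

# Declare a PeftConfig (here a LoRA config) ...
peft_config = LoraConfig(
    r=16,                 # illustrative LoRA rank
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

# ... and pass it through `.from_pretrained` to load the TRL+PEFT model
model = AutoModelForCausalLMWithValueHead.from_pretrained(
    "gpt2",               # illustrative base model
    peft_config=peft_config,
)
```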
Yes, I know, @qgallouedec: see my previous comment in:
The interlink will be properly rendered (as it is already the case for other HF libs, like `transformers` or `datasets`) once this PR in `doc-builder` is merged:
trl/models/modeling_base.py
Outdated
- A wrapper class around a (`transformers.PreTrainedModel`) to be compatible with the (`~transformers.PreTrained`)
+ A wrapper class around a [`~transformers.PreTrainedModel`] to be compatible with the (`~transformers.PreTrained`)
  class in order to keep some attributes and methods of the (`~transformers.PreTrainedModel`) class.
  """
`~transformers.PreTrained` doesn't exist?
No, it doesn't exist. I guess it was intended to mean `PreTrainedModel` instead.
I changed the description:
Wrapper for a [`~transformers.PreTrainedModel`] implemented as a standard PyTorch [`torch.nn.Module`].
This class provides a compatibility layer that preserves the key attributes and methods of the original
[`~transformers.PreTrainedModel`], while exposing a uniform interface consistent with PyTorch modules. It enables
seamless integration of pretrained Transformer models into custom training, evaluation, or inference workflows.
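To make the intent of that description concrete, here is a simplified sketch of the pattern (a hypothetical `SimpleModelWrapper`, not TRL's actual implementation):

```python
import torch.nn as nn

class SimpleModelWrapper(nn.Module):
    """Sketch of a wrapper that keeps key attributes/methods of a `transformers.PreTrainedModel`."""

    def __init__(self, pretrained_model):
        super().__init__()
        self.pretrained_model = pretrained_model
        # Preserve frequently accessed attributes of the wrapped model
        self.config = pretrained_model.config

    def forward(self, *args, **kwargs):
        # Delegate the forward pass so the wrapper behaves like the underlying model
        return self.pretrained_model(*args, **kwargs)

    def generate(self, *args, **kwargs):
        # Expose generation so inference workflows keep working through the wrapper
        return self.pretrained_model.generate(*args, **kwargs)
```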
  - **v_head_init_strategy** (`str`, `optional`, defaults to `None`) -- The initialization strategy for the
-   `ValueHead`. Currently, the supported strategies are:
+   [`ValueHead`]. Currently, the supported strategies are:
    - **`None`** -- Initializes the weights of the `ValueHead` with a random distribution. This is the
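For context, a usage sketch of the option being documented: the default `None` is taken from the snippet above, while `"normal"` is assumed to be the other supported strategy (verify against the current docstring):

```python
from trl import AutoModelForCausalLMWithValueHead

# Default: `v_head_init_strategy=None` keeps the random initialization of the value head
model = AutoModelForCausalLMWithValueHead.from_pretrained("gpt2")

# Assumed alternative: initialize the value head weights from a normal distribution
model = AutoModelForCausalLMWithValueHead.from_pretrained(
    "gpt2",
    v_head_init_strategy="normal",
)
```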
Ok! I checked most of them. I think this can be merged despite my comments.
Fix docstring interlinks.
This PR updates documentation and type annotations throughout the codebase to use explicit, linkable references to classes from external libraries such as `transformers`, `datasets`, and `peft`. The changes improve clarity for users by making it easier to identify and navigate to the relevant external types and configuration objects used in the code and documentation.