Conversation

albertvillanova
Member

Fix docstrings interlinks.

This PR updates documentation and type annotations throughout the codebase to use explicit, linkable references to classes from external libraries such as transformers, datasets, and peft. The changes improve clarity for users by making it easier to identify and navigate to the relevant external types and configuration objects used in the code and documentation.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

## How to use it?

Simply declare a `PeftConfig` object in your script and pass it through `.from_pretrained` to load the TRL+PEFT model.
Simply declare a [`~peft.PeftConfig`] object in your script and pass it through `.from_pretrained` to load the TRL+PEFT model.
Member

For some reason, this one doesn't render.


Member Author

Yes, I know, @qgallouedec: see my previous comment in:

The interlink will be properly rendered (as is already the case for other HF libs, like transformers or datasets) once this PR in doc-builder is merged:
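For readers of this thread, here is a minimal sketch of the usage that doc line describes; the checkpoint name and LoRA values are illustrative, not taken from the PR:

```python
# Illustrative sketch: declare a PeftConfig (here a LoraConfig) and pass it
# through `.from_pretrained` to load a TRL model with PEFT adapters attached.
from peft import LoraConfig
from trl import AutoModelForCausalLMWithValueHead

peft_config = LoraConfig(
    r=16,  # LoRA rank (illustrative value)
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

model = AutoModelForCausalLMWithValueHead.from_pretrained(
    "gpt2",  # any causal LM checkpoint works here
    peft_config=peft_config,
)
```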

A wrapper class around a (`transformers.PreTrainedModel`) to be compatible with the (`~transformers.PreTrained`)
class in order to keep some attributes and methods of the (`~transformers.PreTrainedModel`) class.
"""
A wrapper class around a [`~transformers.PreTrainedModel`] to be compatible with the (`~transformers.PreTrained`)
Member

`~transformers.PreTrained` doesn't exist?

Member Author

No, it doesn't exist. I guess it was intended to mean `PreTrainedModel` instead.

Member Author

I changed the description:

    Wrapper for a [`~transformers.PreTrainedModel`] implemented as a standard PyTorch [`torch.nn.Module`].

    This class provides a compatibility layer that preserves the key attributes and methods of the original
    [`~transformers.PreTrainedModel`], while exposing a uniform interface consistent with PyTorch modules. It enables
    seamless integration of pretrained Transformer models into custom training, evaluation, or inference workflows.
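
As a rough illustration of what that description refers to, such a wrapper can look like the sketch below. This is not the actual trl implementation; the class name and the attributes forwarded are assumptions, and the real class keeps more of the wrapped model's interface.

```python
import torch.nn as nn
from transformers import PreTrainedModel


class PreTrainedModelWrapper(nn.Module):
    """Sketch of a compatibility wrapper around a transformers model."""

    def __init__(self, pretrained_model: PreTrainedModel):
        super().__init__()
        self.pretrained_model = pretrained_model
        # Preserve key attributes of the wrapped model so downstream code that
        # expects a PreTrainedModel (config inspection, generation, saving)
        # keeps working; a real wrapper forwards more than shown here.
        self.config = pretrained_model.config

    def forward(self, *args, **kwargs):
        # Delegate the forward pass to the wrapped transformers model.
        return self.pretrained_model(*args, **kwargs)
```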

- **v_head_init_strategy** (`str`, `optional`, defaults to `None`) -- The initialization strategy for the
`ValueHead`. Currently, the supported strategies are:
- **`None`** -- Initializes the weights of the `ValueHead` with a random distribution. This is the
[`ValueHead`]. Currently, the supported strategies are:
Member

This one doesn't render.
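
For reference, that option is passed as a keyword argument when loading the value-head model. A minimal sketch based on the docstring above; the `"normal"` strategy shown here is an assumption about the other supported value, not quoted from the PR:

```python
from trl import AutoModelForCausalLMWithValueHead

# v_head_init_strategy=None (the default) initializes the ValueHead weights
# with a random distribution; "normal" draws them from a normal distribution.
model = AutoModelForCausalLMWithValueHead.from_pretrained(
    "gpt2",
    v_head_init_strategy="normal",
)
```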

Member

@qgallouedec left a comment

Ok! I checked most of them. I think this can be merged despite my comments.

@albertvillanova merged commit 2aa9506 into huggingface:main on Oct 13, 2025
9 of 10 checks passed