Conversation

@yonigozlan (Member)

What does this PR do?

As the title says

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zucchini-nlp (Member) left a comment

I saw your reply about this new kwarg under a different PR, so I will move my question here for easier communication.
Do we need to add it as a default kwarg for all image processors? IMO only the few VLMs where image_seq_length is saved in the config should have it in the model's image processor.

Otherwise LGTM, just want to clarify before merging
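
For context, a minimal sketch of the distinction being raised, using hypothetical class names rather than the actual transformers implementations: a VLM whose config saves image_seq_length would expose it on its image processor, while a generic image processor would not accept it at all.

```python
# Hypothetical sketch, not the real transformers classes.
class MyVlmImageProcessor:
    """Image processor for a VLM that actually consumes image_seq_length."""

    def __init__(self, image_seq_length=576, **kwargs):
        # Saved here because the model's config also stores it and the
        # processor needs it to expand image placeholder tokens.
        self.image_seq_length = image_seq_length


class GenericImageProcessor:
    """A model-agnostic image processor with no notion of image_seq_length."""

    def __init__(self, **kwargs):
        # Accepting image_seq_length as a default kwarg here would only
        # add unused-attribute noise and warnings.
        pass
```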

@yonigozlan (Member, Author)

@zucchini-nlp Fully agreed, I'll try to think of something better. The difficulty I found is that some models on the Hub use this attribute in their image processor even though there's no mention of it anywhere in the corresponding model files, so it's hard to determine which models should support this kwarg.

In the meantime though, this missing arg is causing a lot of annoying warnings (totally my fault), so if we can merge this quickly that'd be great.

@zucchini-nlp (Member) left a comment

Yep, also experienced the warnings today 😄 Okay, let's merge and clean up later by adding it to the image processors where it is indeed needed/used.

IIRC these are now accessed by the processor as self.image_processor.image_seq_length, so we can search for models with this pattern.
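
As an illustration of that access pattern (class names here are hypothetical, not actual transformers code), a processor that consumes the attribute looks roughly like this:

```python
# Hypothetical sketch of a processor that reads image_seq_length from its
# image processor; names are illustrative, not actual transformers classes.
class MyVlmProcessor:
    def __init__(self, image_processor, tokenizer):
        self.image_processor = image_processor
        self.tokenizer = tokenizer

    def __call__(self, text, images=None, **kwargs):
        # This is the pattern to search for across models:
        # self.image_processor.image_seq_length
        num_image_tokens = self.image_processor.image_seq_length
        # Expand each image placeholder into the right number of tokens.
        text = [t.replace("<image>", "<image>" * num_image_tokens) for t in text]
        return self.tokenizer(text, **kwargs)
```

Grepping the models directory for the literal string self.image_processor.image_seq_length would then narrow the cleanup to the processors that actually use it.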

@yonigozlan yonigozlan merged commit 1c2e50a into huggingface:main Nov 6, 2025
23 checks passed
Abdennacer-Badaoui pushed a commit to Abdennacer-Badaoui/transformers that referenced this pull request Nov 10, 2025