Add handling for Qwen3VLMoe on older transformers versions #2040
Conversation
Signed-off-by: Fynn Schmitt-Ulms <[email protected]>
👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review. Note: this is required to complete the testing suite; please only add the label once the PR is code complete and local testing has been performed.
Summary of Changes

Hello @fynnsu, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request addresses compatibility issues with the Qwen3VLMoe model on older versions of the transformers library.
Code Review
This pull request aims to provide compatibility with older versions of the transformers library by making the import of Qwen3VLMoe components conditional. The changes in the test file are well-implemented, using a try-except block to gracefully handle missing dependencies and skip tests. However, the corresponding change in the source file src/llmcompressor/modeling/qwen3_vl_moe.py is less robust. I've provided a suggestion to improve its reliability by adopting a similar try-except pattern, which will prevent potential runtime errors.
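For illustration, here is a minimal sketch of the guarded-import pattern described above. The module path, class name, and availability flag are assumptions for the example, not the exact identifiers used in llm-compressor or transformers:

```python
# Sketch only: guard the Qwen3VLMoe import so the module still loads on
# transformers releases that do not ship the model.
try:
    from transformers.models.qwen3_vl_moe.modeling_qwen3_vl_moe import (
        Qwen3VLMoeTextSparseMoeBlock,  # assumed class name
    )

    _QWEN3_VL_MOE_AVAILABLE = True
except ImportError:
    # Older transformers releases raise ImportError here; fall back to a
    # sentinel and let callers check the availability flag instead.
    Qwen3VLMoeTextSparseMoeBlock = None
    _QWEN3_VL_MOE_AVAILABLE = False
```

Callers can then branch on `_QWEN3_VL_MOE_AVAILABLE` rather than risking an `ImportError` at import time.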
Signed-off-by: Fynn Schmitt-Ulms <[email protected]>
Signed-off-by: Fynn Schmitt-Ulms <[email protected]>
Force-pushed from 75d9325 to 9027587
SUMMARY: The goal of this PR is to speed up time to signal on new PRs. Currently, we have the following CI tests:

- `base-tests`: total runtime `4m26s`, install deps `3m43s`, run test `30s`
- `pytorch-tests`: total runtime `3m27s`, install deps `3m4s`, run test `12s`
- `quality-check`: total runtime `3m3s`, install deps `2m58s`, run test `0s`

*times based on runs from this PR #2040. Note exact times will fluctuate substantially.

After this PR, we have:

- `base-tests`: total runtime `1m14s`, install deps `16s`, run test `41s`
- `pytorch-tests`: total runtime `55s`, install deps `14s`, run test `22s`
- `quality-check`: total runtime `19s`, install deps `10s`, run test `0s`

*times based on runs from this PR fynnsu#1 on my fork.

TEST PLAN: Tested by running CI.

Signed-off-by: Fynn Schmitt-Ulms <[email protected]>
SUMMARY:
#1981 added Qwen3VLMoe with associated tests; however, this model isn't available on all transformers versions that we support. Therefore (similar to #2030), this PR ensures we don't import or test the model when using a transformers version that doesn't support it.
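As a rough illustration of this kind of gate, a version check like the following can decide whether the model should be wired up at all; the minimum version shown is an assumed placeholder, not a value taken from this PR:

```python
# Sketch only: compute a flag for whether the installed transformers release
# is new enough to provide Qwen3VLMoe. The threshold below is a placeholder.
import transformers
from packaging.version import parse

_MIN_QWEN3_VL_MOE_VERSION = "4.57.0"  # assumed minimum, for illustration

SUPPORTS_QWEN3_VL_MOE = parse(transformers.__version__) >= parse(
    _MIN_QWEN3_VL_MOE_VERSION
)
```

The try/except import guard sketched earlier achieves the same effect without hard-coding a version number.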
TEST PLAN:
Confirmed that this change fixes `import llmcompressor` when using the oldest supported transformers version, 4.54.0. Ran the test with an old transformers version (test gets skipped) and with a new transformers version (test passes).
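For reference, a hedged sketch of the skip behavior described in the test plan, using pytest; the imported symbol and test name are assumptions, not the exact contents of the test file:

```python
import pytest

# Sketch only: skip the Qwen3VLMoe test when the installed transformers
# release does not provide the model.
try:
    from transformers.models.qwen3_vl_moe.modeling_qwen3_vl_moe import (
        Qwen3VLMoeTextSparseMoeBlock,  # assumed class name
    )
except ImportError:
    Qwen3VLMoeTextSparseMoeBlock = None


@pytest.mark.skipif(
    Qwen3VLMoeTextSparseMoeBlock is None,
    reason="Qwen3VLMoe requires a newer transformers release",
)
def test_qwen3_vl_moe_import():
    # Placeholder body: the real test exercises llm-compressor's handling of
    # the Qwen3VLMoe module.
    assert Qwen3VLMoeTextSparseMoeBlock is not None
```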