
Commit 773fe7f

brian-dellabetta authored and kylesayrs committed
awq -- hotfix to missing kwargs (#1395)
SUMMARY: This PR resolves the issue with an Optional parameter passed into a torch module's .forward method during AWQ. The previous attempt to resolve this in #1384 also added kwargs for parameters that are passed in positionally later on. This change makes the addition to kwargs stricter: a kwarg is only added if its annotation indicates that it is an optional field.

This hotfix will still fail if optional fields are passed in positionally, if the typing annotation is `a: int | None` rather than `a: typing.Optional[int]`, or if there is no typehint at all and the field is not provided. A more general solution will follow; see #1385.

TEST PLAN: The new test was run with Python 3.9 and passed -- https://github.com/neuralmagic/llm-compressor-testing/actions/runs/14713323963/job/41291028422

Signed-off-by: Brian Dellabetta <[email protected]>
1 parent addef4e commit 773fe7f

File tree

  • src/llmcompressor/modifiers/awq/base.py

1 file changed: +1 −0 lines changed

src/llmcompressor/modifiers/awq/base.py (+1)

@@ -623,6 +623,7 @@ def _sanitize_kwargs(inputs_kwargs, module):
             k not in sanitized_kwargs
             and k != "use_cache"
             and v.default is inspect.Parameter.empty
+            and str(v.annotation).startswith("typing.Optional")
         ):
             sanitized_kwargs[k] = None
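
The added condition relies on the string form of the parameter's annotation. Below is a minimal standalone sketch of that behavior, not code from llm-compressor: the helper `is_optional_param` and the `forward_*` signatures are hypothetical, and `forward_b` uses a string annotation so the example also runs on Python 3.9. It shows why `typing.Optional[...]` is recognized while an `int | None` union or a missing typehint is not, matching the limitations listed in the summary.

import inspect
from typing import Optional


def is_optional_param(param: inspect.Parameter) -> bool:
    # Same test as the added diff line: check the string form of the annotation.
    return str(param.annotation).startswith("typing.Optional")


# Hypothetical forward signatures, used only to exercise the check.
def forward_a(hidden_states, attention_mask: Optional[int]):
    ...


def forward_b(hidden_states, attention_mask: "int | None"):
    ...


def forward_c(hidden_states, attention_mask):
    ...


for fn in (forward_a, forward_b, forward_c):
    param = inspect.signature(fn).parameters["attention_mask"]
    print(fn.__name__, is_optional_param(param))
# forward_a True   -> treated as optional, kwarg filled with None
# forward_b False  -> PEP 604 `int | None` union is not recognized
# forward_c False  -> no typehint, so the field is not recognized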

0 commit comments
