Conversation

@AbdulmalikDS
Contributor

Motivation

  • Transformers 4.56.0 (see https://newreleases.io/project/pypi/transformers/release/4.56.0) deprecated torch_dtype in favor of dtype and now prints:
    UserWarning: torch_dtype is deprecated! Use dtype instead!
  • Any harness run that instantiates HFLM or IPEXLM logs that warning once per worker, so long evaluations repeat the same message over and over.

Changes

  • lm_eval/models/huggingface.py: call from_pretrained(..., dtype=get_dtype(dtype)) in the main load path and the delta-load path.
  • lm_eval/models/optimum_ipex.py: same change for the IPEX loader.
  • The CLI is unchanged; users still pass dtype via --model_args.

- Update huggingface.py to use dtype instead of torch_dtype (2 places)
- Update optimum_ipex.py to use dtype instead of torch_dtype (1 place)
- Aligns with Transformers v4.56.0+ API and removes deprecation warnings
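For codebases that need to support both old and new Transformers releases, a version check can pick the right keyword. The helpers below are a hypothetical sketch (not part of this PR, which targets 4.56.0+ only); the kwargs dict would be splatted into from_pretrained().

```python
# Hypothetical compatibility shim: choose the from_pretrained() keyword
# for the model dtype based on the installed transformers version, since
# torch_dtype was deprecated in favor of dtype in transformers 4.56.0.

def dtype_kwarg(transformers_version: str) -> str:
    """Return "dtype" for transformers >= 4.56, else "torch_dtype"."""
    major, minor = (int(p) for p in transformers_version.split(".")[:2])
    return "dtype" if (major, minor) >= (4, 56) else "torch_dtype"

def dtype_load_kwargs(transformers_version: str, dtype="auto") -> dict:
    # e.g. AutoModelForCausalLM.from_pretrained(name, **dtype_load_kwargs(...))
    return {dtype_kwarg(transformers_version): dtype}
```

In practice the version string would come from transformers.__version__; it is hard-coded here only so the sketch is self-contained.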
@CLAassistant

CLAassistant commented Nov 18, 2025

CLA assistant check
All committers have signed the CLA.

@baberabb
Contributor

Thank you!

@baberabb baberabb merged commit 7ddb2b1 into EleutherAI:main Nov 19, 2025
6 checks passed
@AbdulmalikDS AbdulmalikDS deleted the fix-torch-dtype-deprecation branch November 19, 2025 08:13