
Re-introduce "XLA_USE_32BIT_LONG" flag #8589

Merged · 3 commits merged into pytorch:r2.6 on Jan 17, 2025

Conversation

rpsilva-aws (Contributor)

Cherry-pick of #8571

In this PR, we reintroduce the XLA_USE_32BIT_LONG flag to give customers the flexibility to use INT32 for their workloads. The flag was previously removed in 2.5 in #7582. PyTorch's incomplete INT32 support across the board (pytorch/pytorch#141994) remains an issue, but since some hardware compilers do not support 64-bit types, this flag helps unblock customers who require the implicit XLA downcasting.
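
For illustration, a minimal sketch of how a flag like this is typically used (not part of this PR; the debug-helper call at the end is an assumption about torch_xla internals):

```python
import os
# The flag must be set before torch_xla is imported, since environment
# flags are read at initialization time.
os.environ["XLA_USE_32BIT_LONG"] = "1"

import torch
import torch_xla
import torch_xla.core.xla_model as xm

t = torch.tensor([1, 2, 3], dtype=torch.long, device=xm.xla_device())
# The PyTorch-visible dtype remains int64; the implicit downcast to a
# 32-bit integer happens in the underlying XLA representation, which is
# what unblocks hardware compilers without 64-bit type support.
print(t.dtype)  # torch.int64
# Assumption: internal debug helper that exposes the XLA-side shape/type.
print(torch_xla._XLAC._get_xla_tensor_debug_info(t))
```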

There were previously no tests for XLA_USE_32BIT_LONG, so this PR also refines the tests so that the flag no longer needs to be set before invoking them, and extends them to cover more cases.
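
A hypothetical sketch of that test pattern, where the test enables the flag in a child process rather than requiring it to be exported beforehand (names and structure are illustrative, not the PR's actual test code):

```python
import os
import subprocess
import sys
import unittest


class Use32BitLongTest(unittest.TestCase):

  def test_long_downcast(self):
    if os.environ.get("XLA_USE_32BIT_LONG") != "1":
      # Re-run this file in a subprocess with the flag enabled, since
      # torch_xla only reads the flag at import time.
      env = dict(os.environ, XLA_USE_32BIT_LONG="1")
      proc = subprocess.run([sys.executable, __file__], env=env)
      self.assertEqual(proc.returncode, 0)
      return

    import torch
    import torch_xla.core.xla_model as xm
    t = torch.tensor([1], dtype=torch.long, device=xm.xla_device())
    # The torch-level dtype is unchanged; the downcast is internal to XLA.
    self.assertEqual(t.dtype, torch.int64)


if __name__ == "__main__":
  unittest.main()
```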

rpsilva-aws marked this pull request as ready for review on January 17, 2025 at 21:53.
rpsilva-aws (Contributor, Author)

@jeffhataws @tengyifei PTAL

tengyifei merged commit fccd395 into pytorch:r2.6 on Jan 17, 2025.
12 checks passed
rpsilva-aws deleted the rpsilva_r2.6_implicit_downcast branch on January 17, 2025 at 23:38.