Missing torch-xla-gpu-plugin #8876
@ysiraichi, do you know the answer to this? Would it be possible to document it somewhere?
I don't actually know how the nightly builds are built, stored, or released.
Gotcha. Do you know, conceptually, what a "torch-xla-gpu-plugin" is? How is it different from the "torch-xla" package built with CUDA support? I'm very unfamiliar with these, unfortunately. Presumably our CI knows where to find these things, since we have GPU tests among the GitHub Actions checks, I think?
As far as I understand, it's a library provided by OpenXLA with the device-specific implementation of PJRT.
I don't think there's a difference...
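One way to check whether a PJRT device plugin is visible is to look at the entry points that plugin wheels register. A minimal sketch, assuming the entry-point group is named `torch_xla.plugins` (the group name used by the plugin package's setup, to the best of my understanding; treat it as an assumption):

```python
from importlib.metadata import entry_points

def installed_pjrt_plugins():
    """Return names of installed PJRT device plugins registered under the
    (assumed) 'torch_xla.plugins' entry-point group."""
    try:
        eps = entry_points(group="torch_xla.plugins")  # Python >= 3.10 API
    except TypeError:
        eps = entry_points().get("torch_xla.plugins", [])  # Python 3.8/3.9 API
    return sorted(ep.name for ep in eps)

# In an environment with the CUDA plugin wheel installed, this should
# include an entry such as 'cuda'; in a bare environment it prints [].
print(installed_pjrt_plugins())
```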
Okay, I did some more digging.
It turns out this is incorrect. In fact, we build another wheel! The project folder for that wheel is https://github.com/pytorch/xla/tree/master/plugins/cuda. IIUC, using PyTorch/XLA on GPU requires two wheels: the base torch_xla wheel and the torch_xla_cuda_plugin wheel.
There are two separate wheels. Now the issue is that we have somehow stopped uploading newer torch_xla_cuda_plugin wheels.
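For anyone trying to fetch these wheels directly, a gs:// object path maps to a public HTTPS download URL in a fixed way. A small sketch, using the 2.6.0/CUDA 12.1 plugin path quoted in the report below:

```python
def gs_to_https(gs_path: str) -> str:
    """Convert a gs://bucket/object path to its public HTTPS download URL."""
    prefix = "gs://"
    assert gs_path.startswith(prefix), "expected a gs:// path"
    return "https://storage.googleapis.com/" + gs_path[len(prefix):]

# The plugin wheel path is the one quoted in this issue's report.
plugin_url = gs_to_https(
    "gs://pytorch-xla-releases/wheels/cuda/12.1/"
    "torch_xla_cuda_plugin-2.6.0-py3-none-any.whl"
)
print(plugin_url)  # pip-installable URL for the plugin wheel
```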
@ysiraichi recently re-enabled the CUDA build jobs in CI. It's possible that the nightly wheel build was missed and simply never turned back on. Is that a separate job, or do we just have a nightly trigger that adds an upload workflow?
Those are good questions; I don't know the answer to any of them. If Yukio doesn't have internal access, maybe @zpcore could help check the triggers.
The CUDA build nightly trigger is there. We should have …
You are right. However, that wheel is only composed of …
I don't think so. I have been using XLA:CUDA without that plugin.
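For context on how XLA:CUDA gets selected at runtime: the PJRT runtime picks its backend from the `PJRT_DEVICE` environment variable, which must be set before `torch_xla` is imported. A minimal sketch (the `GPU_NUM_DEVICES` value is an assumed single-GPU example):

```python
import os

# Select the CUDA backend for the PJRT runtime; this must happen
# before torch_xla is imported.
os.environ["PJRT_DEVICE"] = "CUDA"
os.environ.setdefault("GPU_NUM_DEVICES", "1")  # assumed example: one local GPU

# With torch_xla installed, the device would then resolve to XLA:CUDA:
# import torch_xla.core.xla_model as xm
# device = xm.xla_device()
```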
Yes. If you compile PyTorch/XLA without CUDA support (i.e. …
@ysiraichi I see. But if I go to https://github.com/pytorch/xla/blob/master/README.md?plain=1#L94-L107, which is the installation instruction, does that use PyTorch/XLA built with or without CUDA support? I assume the final stable versions of PyTorch/XLA uploaded to PyPI will not be built with CUDA support, so we would need a plugin, IIUC? Do we publish any wheel built with …
That makes sense to me. But I don't actually know what gets published to PyPI.
A user reported the following issue:

We have been trying to use torch-xla nightly builds to get around some of the slowness issues seen in torch-xla 2.5. We found torch-xla nightly builds for GPU under gs://pytorch-xla-releases/wheels/cuda/12.6; however, these don't contain torch-xla-gpu-plugin (this was present for older torch-xla versions, e.g. gs://pytorch-xla-releases/wheels/cuda/12.1/torch_xla_cuda_plugin-2.6.0-py3-none-any.whl). Is there any location that contains the CUDA plugin nightly builds for torch-xla 2.8.0?
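To check what actually exists in the release bucket, one can query the public Google Cloud Storage JSON API, which lists objects under a prefix. A sketch that only builds the listing URL (fetching it with any HTTP client returns JSON whose "items" field would show whether any torch_xla_cuda_plugin wheels are present):

```python
from urllib.parse import quote

def bucket_listing_url(bucket: str, prefix: str) -> str:
    """Build the public GCS JSON-API URL that lists objects under a prefix."""
    return (f"https://storage.googleapis.com/storage/v1/b/{bucket}/o"
            f"?prefix={quote(prefix, safe='')}")

# Listing URL for the CUDA 12.6 wheel directory mentioned in the report:
print(bucket_listing_url("pytorch-xla-releases", "wheels/cuda/12.6/"))
```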