torch.baddbmm with beta=0 propagates nan on torch_musa
Quoted from the PyTorch doc:

If beta is 0, then input will be ignored, and nan and inf in it will not be propagated.

But torch_musa propagates nan:
>>> import torch
>>> import torch_musa
>>> dev = torch.device('musa')
>>> inp = torch.full((8, 8, 8), float('nan'), dtype=torch.float16).to(dev)
>>> batch1 = torch.zeros((8, 8, 8), dtype=torch.float16).to(dev)
>>> batch2 = torch.zeros((8, 8, 8), dtype=torch.float16).to(dev)
>>> torch.baddbmm(inp, batch1, batch2, beta=0)
tensor([[[nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan]],
(more outputs are omitted)
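For reference, the documented beta=0 semantics can be sketched in plain NumPy: when beta is 0, the input term must be skipped entirely rather than multiplied by zero, so nan/inf in input can never reach the output. The function name and shapes below are illustrative, not part of any API:

```python
import numpy as np

def baddbmm_ref(inp, batch1, batch2, beta=1.0, alpha=1.0):
    """Illustrative sketch of torch.baddbmm's documented semantics.

    out = beta * inp + alpha * (batch1 @ batch2), except that when
    beta == 0 the inp term is dropped entirely, so nan/inf in inp
    cannot propagate (0 * nan would otherwise yield nan).
    """
    prod = alpha * (batch1 @ batch2)  # batched matmul over the leading dim
    if beta == 0:
        return prod  # inp ignored: its nan/inf never touched
    return beta * inp + prod

inp = np.full((2, 2, 2), np.nan)
b1 = np.zeros((2, 2, 2))
b2 = np.zeros((2, 2, 2))
out = baddbmm_ref(inp, b1, b2, beta=0)
print(np.isnan(out).any())  # expected behavior: False
```

The `if beta == 0` branch is the point of the bug report: naively computing `beta * inp` would turn `0 * nan` into `nan`, which appears to be what torch_musa does.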
Tested with torch_musa 1.1.0+fb1871f.
Thank you! We are fixing it!