Quoted from the PyTorch doc for `torch.baddbmm`:

> If `beta` is 0, then `input` will be ignored, and NaN and Inf in it will not be propagated.

But torch_musa propagates NaN:
```python
>>> import torch
>>> import torch_musa
>>> dev = torch.device('musa')
>>> inp = torch.full((8, 8, 8), float('nan'), dtype=torch.float16).to(dev)
>>> batch1 = torch.zeros((8, 8, 8), dtype=torch.float16).to(dev)
>>> batch2 = torch.zeros((8, 8, 8), dtype=torch.float16).to(dev)
>>> torch.baddbmm(inp, batch1, batch2, beta=0)
tensor([[[nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan],
         [nan, nan, nan, nan, nan, nan, nan, nan]],
```

(more output omitted)
Tested with torch_musa 1.1.0+fb1871f.