Warning about non-writable tensors from params.py #9

Open
ScottTodd opened this issue Apr 29, 2024 · 1 comment

@ScottTodd (Member)

Observed while running the downstream test https://github.com/nod-ai/sharktank/blob/main/sharktank/tests/types/dataset_test.py:

sharktank/tests/types/dataset_test.py::DatasetTest::testDatasetRoundtrip
  D:\dev\projects\sharktank\deps\shark-turbine\shark_turbine\aot\params.py:163: UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors. This means writing to this tensor will result in undefined behavior. You may want to copy the array to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:212.)
    return torch.from_numpy(wrapper)

The warning goes away if I flip copy=False to copy=True here:

if self.raw.is_file:
    wrapper = np.array(self.raw.file_view, copy=False)
elif self.raw.is_splat:
    wrapper = np.array(self.raw.splat_pattern, copy=True)

I imagine we want to avoid copying large files, though, and the behavior seems correct despite the warning's mention of undefined behavior. Is that warning relevant for our usage? https://github.com/pytorch/pytorch/blob/f1d1e3246f3203a4c9641fcda28b0ed66eb8f4d4/torch/csrc/utils/tensor_numpy.cpp#L205C6-L232
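
For context, a minimal standalone sketch (not the sharktank or shark-turbine code; the buffer here is made up) showing that torch.from_numpy emits this warning for any read-only NumPy array, and that a writable copy silences it at the cost of duplicating the data:

```python
import numpy as np
import torch

# An immutable buffer stands in for a memory-mapped file view wrapped with copy=False.
buf = bytes(16)
ro = np.frombuffer(buf, dtype=np.uint8)  # zero-copy wrapper; ro.flags.writeable is False

t = torch.from_numpy(ro)      # emits the "not writable" UserWarning, no data copy

rw = np.array(ro, copy=True)  # writable copy: no warning, but duplicates the memory
t2 = torch.from_numpy(rw)
```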

@stellaraccident (Collaborator)

It's fine. Ignore it.
