
torch.cuda.OutOfMemoryError: CUDA out of memory. #15

@sankexin

Description


torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 1226.19 GiB. GPU has a total capacity of 63.98 GiB of which 36.29 GiB is free. Of the allocated memory 24.03 GiB is allocated by PyTorch, and 987.10 MiB is reserved by PyTorch but unallocated.

How can I reduce memory usage during inference?
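An allocation of 1226 GiB on a 64 GiB GPU usually means a single intermediate tensor (often an attention matrix over a long input) has grown far beyond the model weights, so the usual fixes are to run under `torch.inference_mode()` (or `torch.no_grad()`), cast the model to fp16/bf16, and shrink the batch size or input length. A quick back-of-the-envelope check shows how those knobs drive the allocation size; the shapes below are purely illustrative and not taken from this repository:

```python
def tensor_gib(*shape, bytes_per_element=4):
    """Memory for a dense tensor: product of dims * element size, in GiB."""
    n = 1
    for d in shape:
        n *= d
    return n * bytes_per_element / 1024**3

# Illustrative only: a batch of 32 pairwise matrices over a ~101k-element
# sequence in fp32 is on the order of the 1226 GiB this error reports.
full    = tensor_gib(32, 101_000, 101_000)                        # fp32
half    = tensor_gib(32, 101_000, 101_000, bytes_per_element=2)   # fp16 halves it
chunked = tensor_gib(1, 101_000, 101_000, bytes_per_element=2)    # batch of 1

print(f"fp32, batch 32: {full:.0f} GiB")
print(f"fp16, batch 32: {half:.0f} GiB")
print(f"fp16, batch 1:  {chunked:.0f} GiB")
```

The arithmetic suggests that precision and batch size alone cannot bring such an allocation under 64 GiB; the input would also need to be shortened or the offending operation computed in chunks.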

Metadata

Labels

help wanted (Extra attention is needed)
