Conversation

@nshaheed nshaheed commented Mar 9, 2024

  • Added torch.no_grad to main. Without this, gradients accumulate during inference and memory consumption explodes on longer files. Without this fix I was getting OOM errors on 24 GB of VRAM with a 17-minute file; with this change, roughly 9 GB of VRAM is used for the same file (and ~1 GB when using a small chunk size).
  • Removed the re.sub call. It isn't used and caused rave generate to fail with Windows filepaths (because \ was incorrectly read as an escape character).
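The second bullet can be reproduced with a minimal stdlib sketch (the pattern and path below are hypothetical stand-ins, not the actual strings from rave generate): backslash sequences in a Windows path, such as \U, are parsed as replacement escapes by re.sub and raise re.error.

```python
import re

# Hypothetical example of the failure mode: a Windows path used as the
# replacement string in re.sub. The \U in "C:\Users" is parsed as a
# replacement escape sequence and raises re.error.
path = r"C:\Users\nick\audio\file.wav"  # hypothetical path

try:
    re.sub("OUT", path, "writing to OUT")
except re.error as e:
    print("re.sub failed:", e)

# A plain string replace treats the path literally and is safe here:
result = "writing to OUT".replace("OUT", path)
print(result)
```

When the replacement is a literal string rather than a template, str.replace (or re.escape on the replacement) avoids the escape-sequence parsing entirely.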
