
[Bug]: Insufficient memory when processing large files #158

Open
echoIIImk2 opened this issue Nov 25, 2024 · 1 comment
Labels
bug (Something isn't working) · good first issue (Good for newcomers) · help wanted (Extra attention is needed)

Comments

@echoIIImk2

Describe the bug

When I processed an 8-hour audio file, memory usage exceeded 100 GB and the run crashed with an out-of-memory error. I'd like a feature that automatically splits the file (for example, into 2-hour segments), processes each segment separately, and then merges the results back together.

Have you searched for existing issues? 🔎

  • I have searched and found no existing issues.

Screenshots or Videos

No response

Logs

No response

System Info

Operating System: 
Python version: 
Other...

Additional Information

No response

@echoIIImk2 echoIIImk2 added the bug Something isn't working label Nov 25, 2024
@beveradb
Collaborator

beveradb commented Dec 8, 2024

A PR to add this splitting/combining for large files would be welcomed!

See existing discussion and code on #44 (comment)

@beveradb beveradb added good first issue Good for newcomers help wanted Extra attention is needed labels Dec 8, 2024
2 participants