
Enable S3 Upload of >5GB files #38

Open
william-silversmith opened this issue Dec 27, 2020 · 4 comments
Labels
feature A new capability is born.

Comments

@william-silversmith
Contributor

S3 won't allow upload of files larger than 5 GB without performing a multipart upload.

Related to #7
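
For context, boto3's high-level transfer manager already handles the multipart mechanics transparently once a file exceeds a configurable threshold. A minimal sketch, assuming a hypothetical bucket `example-bucket` and local file `large.bin` (neither is from this issue):

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,  # upload in 64 MB parts
    max_concurrency=8,
)

# upload_file performs a multipart upload automatically when the file
# exceeds multipart_threshold, so >5 GB objects work without extra code.
s3.upload_file("large.bin", "example-bucket", "path/to/large.bin", Config=config)
```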

@william-silversmith added the feature label Dec 27, 2020
@sunnysidesounds

Kind of an old issue, but we need this feature. I've been digging around in the cloud-files code a little. How hard do you think this would be to implement? Maybe I (or my team) could contribute this functionality? Thoughts?

@william-silversmith
Contributor Author

If you have the time and energy, I think this would be a great addition. It's not particularly difficult, but I never got around to it. Here's some helpful info: https://medium.com/analytics-vidhya/aws-s3-multipart-upload-download-using-boto3-python-sdk-2dedb0945f11

The CloudFiles interface would look something like:

cf.put_multipart(...), and later, if we're clever, we can make it get called inside cf.put when the file is too big.
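
A minimal sketch of what such a put_multipart could do internally, using boto3's low-level multipart calls. The function name follows the comment above; the signature and body are assumptions for illustration, not the actual CloudFiles implementation:

```python
import boto3

def put_multipart(bucket, key, fileobj, part_size=100 * 1024 * 1024):
    """Hypothetical sketch of a multipart upload; not CloudFiles code."""
    s3 = boto3.client("s3")
    upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
    upload_id = upload["UploadId"]
    parts = []
    try:
        part_number = 1
        while True:
            # All parts except the last must be at least 5 MB.
            chunk = fileobj.read(part_size)
            if not chunk:
                break
            resp = s3.upload_part(
                Bucket=bucket, Key=key, PartNumber=part_number,
                UploadId=upload_id, Body=chunk,
            )
            parts.append({"PartNumber": part_number, "ETag": resp["ETag"]})
            part_number += 1
        s3.complete_multipart_upload(
            Bucket=bucket, Key=key, UploadId=upload_id,
            MultipartUpload={"Parts": parts},
        )
    except Exception:
        # Abort so incomplete parts don't accumulate storage charges.
        s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id)
        raise
```

A cf.put wrapper could then fall back to this path whenever the payload exceeds the 5 GB single-PUT limit, as suggested above.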

@sunnysidesounds

Awesome, let me take a look. I appreciate the quick response 👍

@william-silversmith
Contributor Author

This might be solved in #85, though >5 GB files were not tested.
