Unify buffer_size across TorchData DataPipes #1077
alexanderbattig wants to merge 1 commit into meta-pytorch:main from
Conversation
Hi @alexanderbattig! Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA Signed.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!
ejguan left a comment:
I updated the summary and PR title. An early comment: we might want to add a constant DEFAULT_BUFFER_SIZE to torchdata/_constants.py and reference all usages from that one place. Then we could add a function/context manager to update it, for users who need to change every buffer size at once.
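A minimal sketch of what that could look like. The module path torchdata/_constants.py comes from the comment above, but the default value and the buffer_size context manager (and its name) are assumptions for illustration, not TorchData's actual API:

```python
# Hypothetical sketch of torchdata/_constants.py; the default value and
# the context manager are assumptions, not the real TorchData API.
from contextlib import contextmanager

DEFAULT_BUFFER_SIZE = 10000  # assumed shared default for all DataPipes


@contextmanager
def buffer_size(size: int):
    """Temporarily override DEFAULT_BUFFER_SIZE inside a `with` block."""
    global DEFAULT_BUFFER_SIZE
    previous = DEFAULT_BUFFER_SIZE
    DEFAULT_BUFFER_SIZE = size
    try:
        yield
    finally:
        DEFAULT_BUFFER_SIZE = previous
```

For the override to take effect, DataPipes would need to read the module attribute at construction time (e.g. `_constants.DEFAULT_BUFFER_SIZE`) rather than binding it with `from torchdata._constants import DEFAULT_BUFFER_SIZE`, since a `from ... import` copies the value once at import time.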
Fixes: #335
Submitted as a draft, since the tests don't run for me at the moment.
Also, the CLA is still pending at the moment.