Download a folder with more than 50 files #185
Replies: 8 comments 2 replies
-
I thought that if I could get the links/IDs of all small folders (<50 files) and of the individual files, then I would be able to iterate through them and download the contents of the full folder. I also thought that recording the download status would be a good idea in case the process is interrupted, so that you don't need to start from scratch. Here is how you can get the links to files/folders: StackExchange Answer. I recorded my attempt at using it in the TCGA-lung-download repository. It did not fully work, since after some time I started running into the problem described in issue #43 for […]. Do you have any idea how I can catch the […]
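A minimal sketch of that iterate-and-record idea, assuming the subfolder IDs have already been collected; the ID list, status-file name, and output layout below are placeholders, not taken from the TCGA-lung-download repository:

```python
import json
from pathlib import Path

import gdown

# Hypothetical list of Google Drive IDs for subfolders that each hold fewer
# than 50 files; in practice these would come from the StackExchange recipe
# linked above.
SUBFOLDER_IDS = ["<folder-id-1>", "<folder-id-2>"]

STATUS_FILE = Path("download_status.json")


def load_status():
    # Resume support: remember which subfolders finished in a previous run.
    if STATUS_FILE.exists():
        return json.loads(STATUS_FILE.read_text())
    return {}


def save_status(status):
    STATUS_FILE.write_text(json.dumps(status, indent=2))


def download_all(output_dir="downloads"):
    status = load_status()
    for folder_id in SUBFOLDER_IDS:
        if status.get(folder_id) == "done":
            continue  # finished in an earlier, interrupted run
        try:
            # Each subfolder goes into its own directory so runs don't collide.
            gdown.download_folder(
                id=folder_id,
                output=str(Path(output_dir) / folder_id),
                quiet=False,
            )
            status[folder_id] = "done"
        except Exception as exc:
            # e.g. the intermittent failure referred to above (issue #43)
            status[folder_id] = f"failed: {exc}"
        save_status(status)


if __name__ == "__main__":
    download_all()
```

Writing the status file after every subfolder keeps the bookkeeping crash-safe enough that an interrupted run can resume without starting from scratch.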
-
Line 183 in 12217e3
-
Thank you! I will try to use this somehow. It does not solve the problem straight away, since I was using gdown/gdown/download_folder.py (line 359 in 12217e3). By the way, why do you limit the number of downloadable files within one folder to 50? (gdown/gdown/download_folder.py, line 27 in 12217e3)
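For context, this is roughly how that 50-file cap surfaces when calling the Python API; the `remaining_ok` keyword is what I see in released gdown versions for acknowledging the cap, so treat its presence and exact behavior at commit 12217e3 as an assumption:

```python
import gdown

# Placeholder URL for a Google Drive folder holding more than 50 files.
url = "https://drive.google.com/drive/folders/<FOLDER_ID>"

# With the default remaining_ok=False, download_folder refuses a folder whose
# listing exceeds MAX_NUMBER_FILES (50). Passing remaining_ok=True (assumed to
# exist in the installed gdown version) downloads only the files gdown can
# see, i.e. at most the first 50, instead of raising.
gdown.download_folder(url, quiet=False, remaining_ok=True)
```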
-
This is why: #90 (comment)
-
Makes sense! Thank you!
-
Hm, this would be interesting. An alternative would be to script […]
-
Have not tried […]
-
It could be a good workaround if […]
-
I tried downloading a folder from Google Drive with more than 50 files. Is there a way to use gdown to download such a folder?