Forum Discussion
ncw
6 years ago · Collaborator | Level 8
Rate limiting when uploading files with rclone
I've received complaints from rclone users that file uploads are progressing really slowly.
Digging into it, what I see is this:
2020-09-03 11:44:33 DEBUG : too_many_requests/: Too many re...
- 6 years ago
Thanks for the detailed feedback! I'm sharing this with the team.
Greg-DB
Dropbox Community Moderator
6 years ago
Thanks for the information! I'll follow up here once I have an update on this.
That batch upload functionality can improve performance overall, especially when you are seeing contention issues, but it will vary from case to case. It helps by only taking a lock once per batch, as opposed to once per file.
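For reference, the pattern Greg describes maps onto the upload_session/start and upload_session/finish_batch endpoints: one session per file, one commit per batch. The sketch below is a minimal illustration using raw HTTP calls with the `requests` library; the access token, the file paths, and uploading each file in a single start request (suitable only for small files) are assumptions made for the example, not details from this thread.

```python
# Minimal sketch of Dropbox batch uploads: one upload session per file,
# then a single finish_batch commit for the whole batch.
# ACCESS_TOKEN and the paths below are placeholders.
import json
import time
import requests

ACCESS_TOKEN = "..."  # placeholder OAuth 2 access token
CONTENT_URL = "https://content.dropboxapi.com/2"
RPC_URL = "https://api.dropboxapi.com/2"
AUTH = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def start_session(data: bytes) -> str:
    """Upload one file's bytes into a closed upload session and return its session_id."""
    resp = requests.post(
        f"{CONTENT_URL}/files/upload_session/start",
        headers={
            **AUTH,
            "Dropbox-API-Arg": json.dumps({"close": True}),
            "Content-Type": "application/octet-stream",
        },
        data=data,
    )
    resp.raise_for_status()
    return resp.json()["session_id"]

def finish_batch(entries: list) -> dict:
    """Commit all sessions in one call, so the lock is taken once per batch."""
    resp = requests.post(
        f"{RPC_URL}/files/upload_session/finish_batch",
        headers={**AUTH, "Content-Type": "application/json"},
        json={"entries": entries},
    )
    resp.raise_for_status()
    result = resp.json()
    # The commit may run asynchronously; poll finish_batch/check until it is done.
    if result.get(".tag") == "async_job_id":
        job_id = result["async_job_id"]
        while True:
            time.sleep(1)
            check = requests.post(
                f"{RPC_URL}/files/upload_session/finish_batch/check",
                headers={**AUTH, "Content-Type": "application/json"},
                json={"async_job_id": job_id},
            )
            check.raise_for_status()
            result = check.json()
            if result.get(".tag") != "in_progress":
                break
    return result

# Usage: build one batch (hypothetical local/remote paths), then commit it all at once.
files = {"a.jpg": "/photos/a.jpg", "b.jpg": "/photos/b.jpg"}
entries = []
for local_path, remote_path in files.items():
    with open(local_path, "rb") as f:
        data = f.read()
    entries.append({
        "cursor": {"session_id": start_session(data), "offset": len(data)},
        "commit": {"path": remote_path, "mode": "add", "autorename": False, "mute": False},
    })
print(finish_batch(entries))
```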
ncw
6 years ago · Collaborator | Level 8
I have implemented batching, and yes, it does make a HUGE difference. In fact, for my test directory of 200MB of small images (average size 400k), it transfers 20x faster!
I think the major disadvantage for rclone is that it can't check the hashes of the uploaded files any more. This is because, due to rclone's architecture, the batch completes after rclone has finished with the input file. However, the user can run "rclone check" after transfers, and that is what I'll recommend to users of the batching feature.
So thank you for pointing me at that feature. It makes a huge difference.
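On the hash-checking point: Dropbox reports a content_hash for every file (in the finish_batch result entries and from files/get_metadata), so uploads can still be verified after the fact. Below is a minimal sketch of the documented content-hash scheme; the file path and the metadata comparison are illustrative placeholders.

```python
# Sketch of Dropbox's documented content_hash: SHA-256 each 4 MB block,
# then SHA-256 the concatenation of the block digests, hex-encoded.
# Compare the result with the content_hash field in the file's metadata.
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MB blocks, per the content-hash spec

def dropbox_content_hash(path: str) -> str:
    block_digests = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            block_digests.append(hashlib.sha256(block).digest())
    return hashlib.sha256(b"".join(block_digests)).hexdigest()

# e.g. dropbox_content_hash("a.jpg") == metadata["content_hash"]
```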
- Greg-DB · 6 years ago
Dropbox Community Moderator
Great, thanks for following up! I'm glad to hear it helps a lot in this case.
- ncw · 6 years ago · Collaborator | Level 8
My test user is uploading 10TB of images, so lots of 2-3MB files.
Using batching, the upload is proceeding very quickly (at about 35MB/s), which is great. However, every 30 minutes or so they get this error:
too_many_requests/.: Too many requests or write operations.
Along with a 300 second Retry-After which rclone obeys.
Rclone is now batching 1000 files at a time. It only sends one batch at a time and waits for it to complete. I think those are all the ingredients for successful batching.
If I upload lots of very small (10 byte) files, I can provoke this message after uploading about 5000 of them.
Any ideas on how I can avoid this 300 second lockout? I tried pacing the uploads but it didn't seem to help.
Thanks
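As an aside, the retry behaviour ncw describes, backing off for as long as the server asks, looks roughly like the sketch below. Dropbox signals this case with an HTTP 429 response and a Retry-After header; `send_request` here is a hypothetical callable standing in for whatever issues the batch commit.

```python
# Sketch of honoring Retry-After on a rate-limited (HTTP 429) response.
# `send_request` is a placeholder callable returning a requests.Response.
import time
import requests

def call_with_retry(send_request, max_attempts: int = 5) -> requests.Response:
    for _ in range(max_attempts):
        resp = send_request()
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp
        # Sleep for as long as the server asks; fall back to 300s, as seen in the logs above.
        time.sleep(int(resp.headers.get("Retry-After", 300)))
    raise RuntimeError("still rate limited after retries")
```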
- Greg-DB · 6 years ago
Dropbox Community Moderator
That rate limit with the longer Retry-After should just be a result of making a very large number of API calls for a particular user in a short period of time (as opposed to lock contention). If you're sending thousands of requests very quickly like that, you can run into it. (It sounds like you would normally have run into lock contention before you hit this limit, but that's no longer an issue with the batch commit, so you can run much faster.) The only way to avoid this, then, is to call at a slower rate. It sounds like you already tried that, but if it didn't help, you'll need to limit it further.
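For "calling at a slower rate", a simple client-side pacer that enforces a minimum interval between batch commits is one way to do it. The interval below is an assumption to tune against your own workload, not a documented Dropbox limit.

```python
# Minimal interval-based pacer: call wait() before each finish_batch commit
# to space requests out and stay under the per-user rate limit.
import time

class Pacer:
    def __init__(self, min_interval_s: float):
        self.min_interval_s = min_interval_s
        self._last = 0.0

    def wait(self) -> None:
        sleep_for = self.min_interval_s - (time.monotonic() - self._last)
        if sleep_for > 0:
            time.sleep(sleep_for)
        self._last = time.monotonic()

pacer = Pacer(min_interval_s=2.0)  # e.g. at most one batch commit every 2 seconds
# pacer.wait(); finish_batch(entries)
```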