Dropbox API Support & Feedback
Find help with the Dropbox API from other developers.
I am using Duplicacy to back up data to my Dropbox Business account. It is a Dropbox API app and writes to the Apps directory. The issue I am having is that when downloading (for restoring or verifying a backup), I'm being rate-limited to an effective download rate of about 20 MB/s, which is a lot slower than I'd like, especially in a disaster recovery scenario.
Duplicacy seems to be making one `/2/files/get_metadata` call and one `/2/files/download` call per chunk being downloaded. Each chunk (file) tends to be around 5-20 MB. Duplicacy does respect the `Retry-After` header and backs off as requested.
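For reference, Dropbox signals rate limiting with an HTTP 429 response that may carry a `Retry-After` header. A minimal sketch of the back-off logic described above (the function name and fallback defaults are my own, not Duplicacy's actual code):

```python
def backoff_delay(retry_after_header, attempt, base=1.0, cap=60.0):
    """Pick a sleep time after a 429 response: honor the server's
    Retry-After value if it was sent, otherwise fall back to capped
    exponential backoff based on the attempt number."""
    if retry_after_header is not None:
        # Server told us exactly how long to wait.
        return float(retry_after_header)
    # No header: back off exponentially, never longer than `cap` seconds.
    return min(cap, base * (2 ** attempt))
```

A client would call `time.sleep(backoff_delay(resp.headers.get("Retry-After"), attempt))` before retrying the chunk download.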
I have two questions:
1. The Dropbox API rate-limiting system operates on the number of calls per time period, not on bandwidth/throughput.
2. There may be some optimizations that could be made, but they would need to be made by the developer of the app. It sounds like you're an end user of Duplicacy, not the developer, so you may want to reach out to them about this. For example, based on your description, I would suggest they investigate whether they actually need to make one `/2/files/get_metadata` call per "chunk". For a large number of files, that can add up to a large number of calls. They may want to look into using `/2/files/list_folder[/continue]` instead, as that enables apps to list multiple files/folders under any particular path using fewer calls (since each call can return multiple entries).
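To illustrate the suggestion in #2: `/2/files/list_folder` returns entries a page at a time, with a `cursor` and a `has_more` flag that drive `/2/files/list_folder/continue`. A sketch of the pagination loop, where the two callables stand in for the real HTTP calls (their names are placeholders, but the response fields match the actual Dropbox API shape):

```python
def list_all_entries(list_folder, list_folder_continue, path):
    """Enumerate every entry under `path` by paging through
    /2/files/list_folder and /2/files/list_folder/continue.
    Each callable returns a dict with 'entries', 'cursor',
    and 'has_more' keys, mirroring the Dropbox response."""
    result = list_folder(path)
    entries = list(result["entries"])
    # Keep following the cursor until the server says there are no more pages.
    while result["has_more"]:
        result = list_folder_continue(result["cursor"])
        entries.extend(result["entries"])
    return entries
```

One listing call can return metadata for many files at once, so for a folder of N files this costs a handful of calls instead of N `get_metadata` calls.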
Wow, it's nice to see you're still at Dropbox, Greg! Thanks for the very informative and concise answer as always, especially on #1.
Duplicacy is open source, so it can be changed, but its architecture makes removing the `get_metadata` call here difficult (for one thing, Duplicacy is generic and needs to work with a variety of storage backends). `list_folder` doesn't work either, because successive chunks are unlikely to be stored in the same directory, and enumerating the entire storage with the recursive option is infeasible for large data sets (I have millions of files). I appreciate the pointers!
-davidz
Thanks for the additional context! In that case, I'm sending this along as a feature request for a batch version of `/2/files/get_metadata`, but I can't promise if or when that might be implemented.