journeymigration
4 years ago · Helpful | Level 5
Retrieve thousands of files and folders without throttling
I'm trying to generate a nice big array for each user that contains all the files and folders the user owns as well as the users that each file/folder is shared with (e.g. users: ["John Doe": [{"name...
Greg-DB
Dropbox Community Moderator
4 years ago

[Cross-linking for reference: https://stackoverflow.com/questions/74813769/retrieve-thousands-of-files-and-folders-without-throttling ]
It sounds like you have the right idea here, though be sure to use batch options/endpoints whenever possible. For instance, use recursive:true on /2/files/list_folder and then page through with /2/files/list_folder/continue instead of recursing for each folder yourself. Also, you can use /2/sharing/list_file_members/batch to get the members of multiple files at once.
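A minimal sketch of that flow in Python, calling the raw HTTP endpoints named above with the requests library. The access token, helper names, and the chunk size of 100 are my own assumptions for illustration, not something from this thread; check the API docs for the current per-request limits:

```python
import requests

API = "https://api.dropboxapi.com"
TOKEN = "YOUR_ACCESS_TOKEN"  # assumption: a user-scoped access token
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def list_all_entries(path=""):
    """Page through /2/files/list_folder with recursive=true instead of
    recursing into each folder yourself."""
    resp = requests.post(f"{API}/2/files/list_folder", headers=HEADERS,
                         json={"path": path, "recursive": True}).json()
    entries = resp["entries"]
    while resp["has_more"]:
        resp = requests.post(f"{API}/2/files/list_folder/continue",
                             headers=HEADERS,
                             json={"cursor": resp["cursor"]}).json()
        entries.extend(resp["entries"])
    return entries

def list_members_batched(file_ids, chunk=100):
    """Fetch sharing members for many files per call via
    /2/sharing/list_file_members/batch. The chunk size of 100 and the
    per-file member limit of 20 are assumptions; verify against the docs."""
    results = []
    for i in range(0, len(file_ids), chunk):
        resp = requests.post(f"{API}/2/sharing/list_file_members/batch",
                             headers=HEADERS,
                             json={"files": file_ids[i:i + chunk],
                                   "limit": 20}).json()
        results.extend(resp)
    return results

entries = list_all_entries()
file_ids = [e["id"] for e in entries if e[".tag"] == "file"]
members = list_members_batched(file_ids)
```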
Also, you may want to start from /2/team/namespaces/list and /2/team/namespaces/list/continue to iterate over and list unique namespaces (such as team member folders, shared folders, etc.) as that may be more efficient than working from the member list.
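If you're on a Dropbox Business team, that namespace enumeration might look like the sketch below, reusing API and HEADERS from the first sketch (note this endpoint requires a team-scoped access token, and the limit value is an assumption):

```python
def list_all_namespaces():
    """Page through /2/team/namespaces/list to enumerate unique namespaces
    (team member folders, shared folders, etc.)."""
    resp = requests.post(f"{API}/2/team/namespaces/list", headers=HEADERS,
                         json={"limit": 1000}).json()
    namespaces = resp["namespaces"]
    while resp["has_more"]:
        resp = requests.post(f"{API}/2/team/namespaces/list/continue",
                             headers=HEADERS,
                             json={"cursor": resp["cursor"]}).json()
        namespaces.extend(resp["namespaces"])
    return namespaces
```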
And for reference, the Dropbox API does have a rate limiting system, but if/when you hit it, the back-off period is generally only on the scale of seconds to minutes, so it hopefully wouldn't be too disruptive if you do run into it. You can find more information in the error documentation and the Error Handling Guide.
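One way to absorb those back-offs is to retry on HTTP 429 using the Retry-After header that accompanies rate limit responses. A sketch, again reusing HEADERS from above (the fallback wait of 5 seconds and the retry cap are assumptions):

```python
import time

def post_with_retry(url, body, max_retries=5):
    """POST to an endpoint, sleeping out the back-off period whenever the
    rate limiter answers with HTTP 429."""
    for attempt in range(max_retries):
        resp = requests.post(url, headers=HEADERS, json=body)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        # The back-off is reported in seconds; fall back to 5s if the
        # header is missing (that default is an assumption).
        wait = int(resp.headers.get("Retry-After", 5))
        time.sleep(wait)
    raise RuntimeError(f"still rate limited after {max_retries} retries")
```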