Forum Discussion
Alek S.
10 years ago · New member | Level 1
Downloading all the files in a folder using CoreSDK
Since you are deprecating the Datastore and Sync SDKs for iOS, I'll need to redo quite a bit of stuff. Anyway, one of the things I'll need to do now is:
I want to download all the files in a particular folder; it might have on the order of 1,000 to 10,000 files (each around 500 bytes). The approach I'm taking is to call loadMetadata for the folder that contains these files, then iterate through metadata.contents and call loadFile for each one. These folders are flat, meaning there are no sub-folders below them; all they contain is a bunch of JSON files.
The problem I'm seeing at the moment is that when I try to download all of them using the Core SDK, even with a queueing mechanism, it always fails after just 172 files. I've tried different approaches, but no matter what, 172 files is where it stops. The errors I get after this magical cutoff relate to timeouts (error code -1001).
2015-10-01 14:51:25.601 Constructor[18334:922733] [WARNING] DropboxSDK: error making request to /1/files/sandbox/data/Hello/Stores/docs/f63a8f83b9cf13e0d8607748cecc6a9a76783c4ba209e122cb557c9e53b60ca4 - (-1001) Error Domain=NSURLErrorDomain Code=-1001 "(null)" UserInfo={path=/data/Hello/Stores/docs/f63a8f83b9cf13e0d8607748cecc6a9a76783c4ba209e122cb557c9e53b60ca4, rev=19a031bc0a88, destinationPath=/Users/skela/Library/Developer/CoreSimulator/Devices/5E17BFB4-E369-472B-AEFA-E9BB5D46BAF4/data/Containers/Data/Application/1C99C300-1128-423D-AB9F-170036FD8C2C/Documents/boxes/db/data/Hello/Stores/docs/f63a8f83b9cf13e0d8607748cecc6a9a76783c4ba209e122cb557c9e53b60ca4}
So I'm wondering, is there some kind of mechanism on your end that's only allowing me to download 172 files? If there is, how do I work around it?
7 Replies
- Greg-DB · 10 years ago
Dropbox Community Moderator
There isn't any particular limit on our side around 172 files. The -1001 error code corresponds to NSURLErrorTimedOut though, which can happen for a variety of reasons, e.g., if the network connection fails, or if your connection is too congested. How many simultaneous connections are you allowing? I've seen other developers choosing a value of 4 for that.
- Alek S. · 10 years ago · New member | Level 1
I'm doing it one at a time; in essence it's:
// Queue up a download request for each entry in the folder's metadata.
for m in metadata.contents
{
    let req = client?.createLoadFileRequest(m.path, atRev: m.rev, intoPath: local.stringByAppendingPathComponent(m.filename))
    addToQueue(req)
}
Where createLoadFileRequest can be seen here:
https://github.com/skela/dbcore/blob/master/DropboxCore/DBCRestClient.m#L180
I can't for the life of me figure out why it's always 172 files, without fail, with or without a queue.
Just wanted to make sure that this isn't an issue on your end before I spend too much time trying to solve this.
What's considered best practice if I have to download 10k very small files in a folder using CoreSDK?
- Greg-DB · 10 years ago
Dropbox Community Moderator
There certainly shouldn't be anything on our side causing this. If the API were going to intentionally fail a request, it would respond with an actual HTTP error code; -1001 is just a client-side error code.
The best practice is to only download files if/when you need them, but if you do need to download many files, just loop through and download them individually, limiting the number of simultaneous connections.
Is it always the same file that this occurs on? Are you able to see how long the request is in fact taking before that error occurs?
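For reference, a minimal sketch (in current Swift syntax) of the pattern described above: loop through the folder's entries and cap the number of simultaneous downloads, here at 4. The download closure is a placeholder for whatever wrapper you have around the Core SDK's loadFile call; none of these names are SDK APIs.

import Foundation

// Cap concurrent downloads with a semaphore: at most 4 requests in flight.
typealias Downloader = (_ path: String, _ done: (Bool) -> Void) -> Void

func downloadAll(paths: [String], download: @escaping Downloader, finished: @escaping () -> Void)
{
    let gate = DispatchSemaphore(value: 4)   // 4 free download slots
    let group = DispatchGroup()
    DispatchQueue.global(qos: .utility).async {
        for path in paths {
            gate.wait()                      // block until a slot frees up
            group.enter()
            download(path) { success in
                if !success { print("download failed: \(path)") }  // collect for a retry pass in practice
                gate.signal()                // release the slot
                group.leave()
            }
        }
        group.notify(queue: .main) { finished() }
    }
}

Called with the paths from metadata.contents, this keeps thousands of small downloads flowing without ever opening more than a few connections at once.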
- Alek S. · 10 years ago · New member | Level 1
Thing is, this is a replacement for the Dropbox Datastores (thanks again /s), which were always synced to the device, so yes, I do need all the files in that folder ;)
The gist of what I'm doing and plan on doing is:
1) App starts up
2) App loads the metadata for 1 particular folder where these 10k json files are stored
3) LoadMetadata finishes, with 1 metadata object (called metadata) that has a member called contents, which has 10k items in it.
4) for m in metadata.contents { addToLoadFileQueueIfNecessary(m) } (sketched below)
5) 1 by 1 go through the queue, loading each file.
6) poll to check if there are any changes and repeat.
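Step 4's "if necessary" check might amount to something like the following sketch: remember the rev of each file already on disk and only queue a download when the rev in the fresh metadata differs. The map and function names are illustrative, not SDK APIs; path and rev come from the items in metadata.contents.

// Only download a file when its server rev differs from the local copy's rev.
var knownRevs: [String: String] = [:]   // path -> rev of the copy already downloaded, persisted between launches

func needsDownload(path: String, rev: String) -> Bool {
    return knownRevs[path] != rev       // true when there is no local copy or the rev changed
}

// After a successful download, record knownRevs[path] = rev, so the next
// poll (step 6) skips files that haven't changed.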
It's a bit odd that it always happens at file number 172, so perhaps there's something wrong with that file. It would be weird if there were, though, because loadMetadata is called right before.
Part of me was wondering whether it's because I'm creating all of the DBRequests up front without starting them: perhaps once the queue gets further down the list, they time out immediately when I try to start them, because they were created X seconds ago. But it's still too odd that it's always 172 files.
I'll take a closer look tomorrow though, thanks for your help Gregory.
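One way to rule that out is to keep only lightweight (path, rev, destination) entries in the queue and build each DBRequest at the moment it is started, so nothing sits around long enough to go stale. A rough sketch: createLoadFileRequest and client come from the dbcore fork linked above, PendingFile is an illustrative type, and how the request is actually started is left as a comment because it depends on that fork's API.

// Queue descriptors instead of pre-built DBRequests; create each request lazily.
struct PendingFile {
    let path: String
    let rev: String
    let destination: String
}

var pending: [PendingFile] = []

func startNextDownload() {
    guard !pending.isEmpty else { return }
    let next = pending.removeFirst()
    // `client` is the fork's rest client instance (see the linked source).
    let req = client?.createLoadFileRequest(next.path, atRev: next.rev, intoPath: next.destination)
    // ...hand `req` to whatever starts and tracks the in-flight request...
}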
- Alek S. · 10 years ago · New member | Level 1
After playing around with the timeouts and moving some code around, it seems more robust. So thanks, Gregory.
Obviously it will still time out from time to time; what's your preferred strategy for retrying failed downloads? I assume you had to build some kind of logic like that into the Sync SDK.
- Steve M. · 10 years ago
Dropbox Staff
I believe that the Sync SDK employed capped exponential backoff. Pseudocode:
failCount = 0
while true
    if attemptDownload().success
        break
    sleep(min(maxSleep, minSleep * (2**failCount)))
    failCount += 1
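In Swift that loop might look roughly like the sketch below; attemptDownload() is a placeholder for a synchronous call that tries the download and returns true on success, and the sleep bounds are arbitrary.

import Foundation

// Capped exponential backoff: retry until the download succeeds, doubling the
// wait after each failure but never sleeping longer than maxSleep.
func downloadWithBackoff(minSleep: TimeInterval = 1.0,
                         maxSleep: TimeInterval = 60.0,
                         attemptDownload: () -> Bool)
{
    var failCount = 0
    while true {
        if attemptDownload() {
            break
        }
        let delay = min(maxSleep, minSleep * pow(2.0, Double(failCount)))
        Thread.sleep(forTimeInterval: delay)
        failCount += 1
    }
}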
- Alek S. · 10 years ago · New member | Level 1
Thanks!