Forum Discussion
Cynthia B.2
10 years ago · New member | Level 1
500 internal Server Error https://api.dropboxapi.com/2/files/list_folder/continue
I've been using the following endpoint to retrieve files and folders:
https://api.dropboxapi.com/2/files/list_folder. This returns 2000 entries, and then I get has_more = true along with a cursor. I then use th...
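The pagination flow described above (call list_folder, then keep calling list_folder/continue with the returned cursor while has_more is true) can be sketched roughly as follows. This is a sketch, not official sample code: the `dbx` argument is assumed to be a client exposing `files_list_folder` and `files_list_folder_continue`, as the official Dropbox Python SDK does.

```python
# Sketch: collect every entry under `path`, following pagination cursors.
# Assumes a client object with files_list_folder / files_list_folder_continue
# (the method names used by the official Dropbox Python SDK).

def list_all_entries(dbx, path=""):
    """Return all entries under `path`, following has_more/cursor pagination."""
    result = dbx.files_list_folder(path)
    entries = list(result.entries)
    # Each page holds up to ~2000 entries; keep calling .../continue
    # with the returned cursor until has_more is false.
    while result.has_more:
        result = dbx.files_list_folder_continue(result.cursor)
        entries.extend(result.entries)
    return entries
```

Because the client is passed in, the loop logic can be exercised against a stub before pointing it at a real account.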
attic
9 years ago · Explorer | Level 4
I'm also experiencing the 500 error. Please let me know if this is the same issue or something new, and hopefully when we could expect a fix.
x-dropbox-request-id:1aad2bb340a5f060d4890c5a1185a8ab
Greg-DB
Dropbox Community Moderator
9 years ago
Thanks for the report, attic! That's the same issue as the one volkeru reported. We're working on it.
- aaronboodman · 9 years ago · New member | Level 2
Can you please give some indication of how long you expect this to take? We need to decide whether to work around it or not.
- Greg-DB · 9 years ago
Dropbox Community Moderator
I don't have an ETA for a fix right now. I'll post here once I have an update.
- attic · 9 years ago · Explorer | Level 4
FYI, my problem seems to have been resolved/fixed as of Wednesday, January 18th. Thanks for helping to ensure it got fixed!
- Greg-DB · 9 years ago
Dropbox Community Moderator
Thanks for confirming that, attic. Yes, the issue that volkeru/attic reported should be (mostly) resolved now. You may see some more errors, which we're working on fixing, but the rate should be much lower now.
- jackolicious · 9 years ago · New member | Level 2
Hi there, also seeing this happen repeatedly with no way to get around it.
Running:
files_list_folder('', recursive=True, include_media_info=True)
Then getting:
dropbox.exceptions.InternalServerError: InternalServerError('b1b9642f8210d246a2654454721d6469', 500, u'')
Any help would be greatly appreciated!
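While a server-side fix is pending, transient 500s like the one above are usually handled by retrying with exponential backoff. Below is a minimal, generic sketch: the retryable exception class is passed in (for the official Python SDK that would be `dropbox.exceptions.InternalServerError`), and the sleep function is injectable, which is an illustration choice rather than anything the SDK requires.

```python
import time

# Sketch: retry a callable on a transient error class, backing off
# exponentially between attempts. `retry_on` is the exception type to
# catch (e.g. dropbox.exceptions.InternalServerError in the Python SDK).

def call_with_retries(fn, retry_on, attempts=5, base_delay=1.0, sleep=time.sleep):
    """Call fn(), retrying up to `attempts` times on `retry_on` exceptions."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of retries; surface the original error
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

Usage would look something like `call_with_retries(lambda: dbx.files_list_folder('', recursive=True), dropbox.exceptions.InternalServerError)`. Note this only helps if the 500 is intermittent; the reports in this thread suggest some folders fail deterministically, for which retries won't help.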
- Greg-DB · 9 years ago
Dropbox Community Moderator
Thanks for the report! We're still working on these remaining errors.
- thedeck · 9 years ago · Helpful | Level 6
Hello,
I'm experiencing the exact same symptoms while calling list_folder/continue.
X-Dropbox-Request-Id: 5f1b4cdc32283d77ee8f6fecb42597f0
Any ETA on the fix? The v1 end date is coming fast.
Thanks !
- jackolicious · 9 years ago · New member | Level 2
If it helps, I was able to rectify the problem by:
- iterating manually over all folders (instead of the recursive list)
- finding the folder (or folders, in my case) that caused a 500 error
- deleting the folder(s) (sending them to the Dropbox trash)
- once the complete iterative test ran successfully, restoring all folders from the Dropbox trash
- tada! The recursive list_folder works again.
:thinking:
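The first two steps of the workaround above (iterate top-level folders individually and note which ones 500) could be sketched as below. This is an illustration, not the poster's actual script: the `dbx` client, the error class, and the per-folder recursive lister are all passed in as assumptions.

```python
# Sketch: instead of one recursive listing from the root, list each
# top-level folder separately so a 500 can be pinned to specific folders.
# `list_all` is any function that recursively lists one folder and
# raises `error_cls` (e.g. dropbox.exceptions.InternalServerError) on failure.

def find_failing_folders(dbx, error_cls, list_all):
    """Return paths of top-level folders whose recursive listing raises."""
    failing = []
    for entry in dbx.files_list_folder("").entries:
        try:
            list_all(dbx, entry.path_lower)  # recursive listing, one folder at a time
        except error_cls:
            failing.append(entry.path_lower)
    return failing
```

Once the failing paths are known, the poster's remaining steps (trash, re-test, restore) can be done by hand or scripted the same way.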
- thedeck · 9 years ago · Helpful | Level 6
Thanks for the info. Unfortunately it's not really a possible workaround for my use case :cry:
- Greg-DB · 9 years ago
Dropbox Community Moderator
Thanks for checking in, thedeck. I don't have an update on that one yet. I'll follow up here once I do.
And thanks for the information, jackolicious. For reference though, while the errors from different people on this thread are reported the same way to the API clients (i.e., a 500 error), they may be caused by different issues on the backend, so the workaround may not work for everyone.
- volkeru · 9 years ago · Explorer | Level 3
As I stated earlier in this thread, it's not necessary to delete files or folders. It's sufficient to rename the affected folder to a different name and then rename it back to its initial name.