axew3
8 years ago · Collaborator | Level 8
Linked files not retrieved with list_shared_links while files exist in Dropbox and appear as linked
Hi all! A bit off topic, but I'm about to release a new WordPress plugin on the WP repo, with a custom chooser (which I may also publish on GitHub, since the chooser is a standalone JS piece of ...
- 8 years ago
We'll be happy to look into this, but we'll likely need some more specific information to make sure we have a good understanding of the issue you're describing. Can you share the following:
- the specific steps to reproduce the issue
- the code used in those steps to reproduce the issue
- sample output showing the issue
If you'd prefer to share that privately, you can open an API ticket with it here:
https://www.dropbox.com/developers/contact
Thanks in advance!
axew3
8 years ago · Collaborator | Level 8
Unfortunately the code is inside a plugin, and it would take too much time to strip everything out to make a shorter standalone version (except for the upload part, which is essentially an HTML/JS file): easy, but too long! I will send the entire plugin, which needs to be installed into a WordPress site to work. If possible, maybe you can take a look at the two main PHP files I will point out, which contain the code that executes the PHP cURL calls. But I assume they do the work correctly, since normally everything works fine and this behavior only happens sometimes; then again, maybe I'm wrong.
I'm experiencing the reported issue again right now, and once more I see this:
Using full Dropbox access, I had a single folder named wtest with a shared link: I uploaded 9 files to its root (though the same result happened last time when deleting inside a linked test subfolder, so not only at root).
Then I made cURL calls to delete these 9 files.
Result: the next cURL call to list_shared_links returns nothing.
Looking into the axew3 Dropbox account via the browser explains nothing, because the folder wtest apparently shows as linked, and anyone with the link should have access to it. A cURL call should return it, but instead it is ignored. I've left the folder wtest in this state, in case anyone would like (or has the ability) to take a look at this folder's settings. It shows as linked, but it is not retrieved.
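For reference, the failing call is essentially this (a simplified sketch, not the plugin code; the token value is a placeholder):

<?php
// Minimal sketch: ask /2/sharing/list_shared_links for the links on /wtest.
$token = 'YOUR_ACCESS_TOKEN'; // placeholder

$ch = curl_init('https://api.dropboxapi.com/2/sharing/list_shared_links');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => [
        'Authorization: Bearer ' . $token,
        'Content-Type: application/json',
    ],
    CURLOPT_POSTFIELDS     => json_encode(['path' => '/wtest']),
]);
$body = curl_exec($ch);
curl_close($ch);

$res = json_decode($body);
// After deleting the 9 files, $res->links comes back empty for /wtest,
// even though the browser still shows the folder as linked.
var_dump($res->links);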
The wtest folder's link settings are as follows:
Who can view this link? -> Anyone with the link can view
I'm quite sure that if I unlink the folder and re-create the link, the folder will be returned again, as happened the previous times this occurred. For the moment I've left the folder wtest in this state and created a new folder testme with a link for it. testme is correctly retrieved, while the wtest folder is not. So I assume this is a problem that happens on file deletion.
Sorry for the delay in posting the code to you; hard times here due to "no coding" reasons. I'll get to it as soon as I can.
axew3
8 years ago · Collaborator | Level 8
OK Greg, I've taken a look in various spare moments these days, and I've discovered that there is a problem with my PHP recursion function.
The problem: the function works fine on the first call, but falls into a loop after the second recursion. It never surfaced before because the code stopped for other reasons and the recursion never actually ran, so that was the reason for the "folders not retrieved" behavior.
Please forgive me for this big mistake!
I see now that the first cursor returned for the recursion differs from the second one logged, but after that the function retrieves the same cursor every time. That's my doubt at the moment: I can't understand why, after another cURL call, the function returns the previous cursor value again. So the recursive function loops, because the next files array is always the same as before, and
$res_more->has_more
never becomes false!
This is what leads to the loop. It should be easy to resolve, though I still haven't understood why it happens.
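To show the shape of the bug, the pagination should go roughly like this (a simplified sketch, not my plugin code; list_links() is just an illustrative wrapper around the cURL call from my earlier post):

<?php
// Illustrative wrapper: POST $params as JSON to list_shared_links
// and return the decoded response object.
function list_links($token, array $params)
{
    $ch = curl_init('https://api.dropboxapi.com/2/sharing/list_shared_links');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => [
            'Authorization: Bearer ' . $token,
            'Content-Type: application/json',
        ],
        CURLOPT_POSTFIELDS     => json_encode($params),
    ]);
    $body = curl_exec($ch);
    curl_close($ch);
    return json_decode($body);
}

$token = 'YOUR_ACCESS_TOKEN'; // placeholder
$links = [];

$res   = list_links($token, ['path' => '/wtest']);
$links = array_merge($links, $res->links);

while (!empty($res->has_more)) {
    // My bug was here: I kept passing the FIRST cursor instead of the one
    // from the most recent response, so every call returned the same page
    // and has_more never became false.
    $res   = list_links($token, ['cursor' => $res->cursor]);
    $links = array_merge($links, $res->links);
}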
This stupid code I wrote more than 2 years ago is driving me crazy. But I'm near the solution, I guess! Sorry Greg, and thank you again!
- axew3
8 years ago · Collaborator | Level 8
Here again with a question:
it seems to me that a cursor is returned in the results even when the first call doesn't retrieve 200 files?
I remember (maybe I'm wrong) that each cURL call retrieves 200 files.
So I'm asking myself why, if the first call retrieves fewer than 200 files, it still returns a cursor and has_more set to 1.
Am I wrong? I got everything working after I applied count() to the first call: I now iterate only if 200 or more files are returned, rather than stopping when has_more is false (because has_more returns 1 on the first cURL call even when there aren't more than 200 files). Isn't that right? Something like the sketch below.
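A sketch of the workaround (assuming the list_links() wrapper from my previous post; 200 is the page size I observe, not a documented constant):

<?php
$token = 'YOUR_ACCESS_TOKEN'; // placeholder

$res   = list_links($token, ['path' => '/wtest']);
$links = $res->links;

// Iterate on the returned count rather than trusting has_more, because in
// my tests a cursor and has_more come back even when the page isn't full.
while (count($res->links) >= 200) {
    $res   = list_links($token, ['cursor' => $res->cursor]);
    $links = array_merge($links, $res->links);
}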
- axew3
8 years ago · Collaborator | Level 8
So just a hint for all the cool people who may eventually fall into the same mess as me:
1) Remember that a cursor is always returned, and you need to count the retrieved links to decide whether to continue or stop the iteration (this is what I observe using HTTP with PHP cURL). I had forgotten this after a long time (more than 2 years), and I've wasted my time on this stupid thing once again.
2) Remember that the error
PHP Fatal error: Maximum execution time of 30 seconds exceeded
can come up, as it did in my case: set the PHP ini limit to at least 60 seconds (see the snippet after this list), or you may think, as I did, that the code is wrongly looping somewhere.
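For point 2, the limit can also be raised from the script itself instead of editing php.ini (60 is just the value that worked for me):

<?php
// Raise the execution limit before the long cURL run.
ini_set('max_execution_time', '60');
// or equivalently, restart the timer for this script:
set_time_limit(60);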
I will add a preloader for the cURL execution time; I realize it is necessary. My cache class was wrong, and the recursive function was failing in a specific situation.
Just to let you know that everything is now working like a charm.
... as we like to say: I hope this saves someone else a headache!