
bb-jacin
Explorer | Level 3
3 years ago

/files/save_url has high rate of failure

Hi, I am currently testing the /files/save_url API endpoint and am experiencing a high rate of failure. I am testing with a set of 6 small image files stored in S3-compatible storage environments, passing their public URLs to the aforementioned endpoint. In every test run, at least one file fails to be created in Dropbox, and it is a different file from the set each time: a file that fails in test #1 will succeed in tests #2 and #3, then fail again in test #4, etc. These image files are standard .jpg files and generally 500 KB or less.

Upon checking the status of the API requests using /files/save_url/check_job_status, the failed IDs return the "download_failed" response. I know there is a 5-minute limit on file transfers using /files/save_url; however, all of these failed IDs return their failure response only a few seconds after the API request is initiated.
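
For reference, here is roughly the pattern I'm using, simplified (the access token, destination path, and source URL below are placeholders):

```python
import time
import dropbox

dbx = dropbox.Dropbox("ACCESS_TOKEN")  # placeholder token

# Kick off a remote save; small files may complete synchronously,
# otherwise we get an async job ID to poll.
result = dbx.files_save_url("/test/image1.jpg",
                            "https://bucket.example.com/image1.jpg")

if result.is_complete():
    print("saved:", result.get_complete().path_display)
else:
    job_id = result.get_async_job_id()
    while True:
        status = dbx.files_save_url_check_job_status(job_id)
        if status.is_in_progress():
            time.sleep(1)
        elif status.is_complete():
            print("saved:", status.get_complete().path_display)
            break
        else:
            # This is where the failures show up, e.g. SaveUrlError('download_failed', None)
            print("failed:", status.get_failed())
            break
```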

 

It may be worth mentioning that the API requests to /files/save_url are all made at the same time. I don't believe the URLs are being rate limited on the S3-compatible storage side, as I have tested with multiple environments, including self-hosted storage without such restrictions.

I've done some googling and found other users reporting the same issue, but was unable to find anything that resolves it.

Some example failed job IDs:

nJekCKfZPnwAAAAAAAAAAQ
aPQH1k_ei7cAAAAAAAAAAQ

Is there a way to resolve this issue?

14 Replies

  • DB-Des
    Dropbox Community Moderator
    1 month ago

    Hi nyou045,

    Apologies, I looked into this more deeply and found that your samples were not being explicitly rate limited, but were hitting the same internal constraint discussed earlier in this thread. This is still open with the team and I'll follow up with any updates.

  • nyou045
    Explorer | Level 4
    7 days ago

    My colleague has just sent me an email asking me to copy 3.8 TB from an AWS S3 bucket into his Dropbox team folder. Some of the files in the bucket are 1.7 GB. At ~1 MB/s, these files will take ~29 minutes to save, which is longer than the 15-minute timeout. Can this 15-minute timeout please be increased?
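
    Rough numbers behind that estimate (the ~1 MB/s figure is just the throughput observed here, not a documented limit):

    ```python
    TIMEOUT_S = 15 * 60            # current save_url transfer window
    OBSERVED_BPS = 1 * 1024 ** 2   # ~1 MB/s throughput observed in this thread

    def fits_in_timeout(size_bytes):
        """Estimate whether a remote file can transfer inside the timeout."""
        est_seconds = size_bytes / OBSERVED_BPS
        return est_seconds <= TIMEOUT_S, est_seconds

    ok, est = fits_in_timeout(int(1.7 * 1024 ** 3))   # a 1.7 GB object
    print(ok, round(est / 60, 1), "minutes")          # False 29.0 minutes
    ```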

  • nyou045
    Explorer | Level 4
    6 days ago

    As predicted:

    FAILED after 952s https://quadrant-universityofauckland.s3.amazonaws.com/delivery/year%3D2025/month%3D07/day%3D06/000004_0_c8601ccb-23fe-4e2d-8e3c-043e13aa37b4_20260203_185011_00313_7w84i.gz?AWSAccessKeyId=AKIAZDICVH6VPNMUQAXQ&Signature=49QsO2MiZxeObXnnsibD6cq94qY%3D&Expires=1770581161 2147247536 bytes SaveUrlResult('async_job_id', 'IBewBfjP7IIAAAAAAAAAAQ')
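
    (One fallback for objects this large, not suggested anywhere in this thread but worth noting: download the object yourself and push it to Dropbox with upload sessions, which are not subject to the save_url timeout. A rough sketch using requests plus the official Python SDK; the source URL and destination path below are placeholders.)

    ```python
    import dropbox
    import requests

    CHUNK = 8 * 1024 * 1024  # 8 MiB per append call (well under the 150 MB per-request limit)

    def stream_to_dropbox(dbx, source_url, dest_path):
        """Stream a remote object into Dropbox via an upload session."""
        with requests.get(source_url, stream=True) as resp:
            resp.raise_for_status()
            chunks = resp.iter_content(chunk_size=CHUNK)
            first = next(chunks)
            session = dbx.files_upload_session_start(first)
            cursor = dropbox.files.UploadSessionCursor(
                session_id=session.session_id, offset=len(first))
            for chunk in chunks:
                dbx.files_upload_session_append_v2(chunk, cursor)
                cursor.offset += len(chunk)
            commit = dropbox.files.CommitInfo(path=dest_path)
            return dbx.files_upload_session_finish(b"", cursor, commit)

    dbx = dropbox.Dropbox("ACCESS_TOKEN")  # placeholder token
    stream_to_dropbox(dbx,
                      "https://example-bucket.s3.amazonaws.com/big-object.gz?signed-params",
                      "/team-folder/big-object.gz")
    ```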

  • DB-Des
    Dropbox Community Moderator
    5 days ago

    nyou045,

    We’re not able to increase rate limits for a specific app, user, or team. That said, I'll pass this along as a feature request to increase the 15-minute timeout. I can't promise if or when that might be implemented, though.
