Forum Discussion
bb-jacin
3 years ago · Explorer | Level 3
/files/save_url has high rate of failure
Hi, I am currently testing the /files/save_url API endpoint and am experiencing a high rate of failure. I am testing with a set of 6 small image files stored in S3-compatible storage environments, using their public URLs with the aforementioned API endpoint. In every test run, at least one file fails to be created in Dropbox, and it is a different file from the set each time. A file that fails in test #1 will succeed in tests #2 and #3 and fail again in test #4, etc. These image files are standard .jpg files, generally 500 KB or less.
Upon checking the status of the API requests using /files/save_url/check_job_status, the failed IDs return the "download_failed" response. I know there is a 5-minute limit on file transfers using /files/save_url; however, all of these failed IDs return their failure response only a few seconds after the API request is initiated.
It may be worth mentioning that the API requests to /files/save_url are all issued at the same time. I don't believe the URLs are being rate-limited on the S3-compatible storage side, as I have tested with multiple environments, including self-hosted storage without such restrictions.
I've done some googling and found other users with the same issue, but couldn't find anything that resolves it.
Some example failed job IDs:
nJekCKfZPnwAAAAAAAAAAQ
aPQH1k_ei7cAAAAAAAAAAQ
Is there a way to resolve this issue?
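For reference, here's a minimal sketch of the calls involved, using the raw HTTP endpoints; the access token, Dropbox paths, and bucket URLs below are placeholders:

```python
import time
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder; a real OAuth 2 token is required
API = "https://api.dropboxapi.com/2"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

# Hypothetical public URLs standing in for the six S3-hosted image files.
urls = [f"https://example-bucket.example.com/image_{i}.jpg" for i in range(6)]

# Kick off all save_url jobs together, as in the failing test runs.
jobs = {}
for i, url in enumerate(urls):
    resp = requests.post(f"{API}/files/save_url", headers=HEADERS,
                         json={"path": f"/test/image_{i}.jpg", "url": url})
    resp.raise_for_status()
    result = resp.json()
    if result[".tag"] == "async_job_id":
        jobs[result["async_job_id"]] = url

# Poll each outstanding job until it completes or fails.
while jobs:
    for job_id, url in list(jobs.items()):
        status = requests.post(f"{API}/files/save_url/check_job_status",
                               headers=HEADERS,
                               json={"async_job_id": job_id}).json()
        if status[".tag"] == "complete":
            print("OK    ", url)
            del jobs[job_id]
        elif status[".tag"] == "failed":
            print("FAILED", url, status["failed"][".tag"])  # e.g. "download_failed"
            del jobs[job_id]
    time.sleep(2)
```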
8 Replies
Looks like this is still a problem in 2026. Here's some example error output from my Python script:
FAILED after 9s https://nz-imagery.s3.amazonaws.com/auckland/auckland_2022_0.075m/rgb/2193/AY30_1000_4743.json 1162 bytes SaveUrlResult('async_job_id', 'XGNn-ChuPpYAAAAAAAAAAQ') SaveUrlJobStatus('failed', SaveUrlError('download_failed', None))
FAILED after 6s https://nz-imagery.s3.amazonaws.com/auckland/auckland_2022_0.075m/rgb/2193/AY31_1000_2427.json 1161 bytes SaveUrlResult('async_job_id', 'wE6jqobcPoEAAAAAAAAAAQ') SaveUrlJobStatus('failed', SaveUrlError('download_failed', None))
FAILED after 6s https://nz-imagery.s3.amazonaws.com/auckland/auckland_2022_0.075m/rgb/2193/AY31_1000_3031.json 1163 bytes SaveUrlResult('async_job_id', '8M1dCfp8QgUAAAAAAAAAAQ') SaveUrlJobStatus('failed', SaveUrlError('download_failed', None))
When it does work, the performance seems a lot lower than I expected too: it takes ~95 s to save a 99.4 MB file (about 1 MB/s, or 8.374 Mbit/s).
Copied after 95s https://nz-imagery.s3.amazonaws.com/auckland/auckland_2022_0.075m/rgb/2193/AY31_1000_3520.tiff 99435762 bytes
- Greg-DB · 1 year ago
Dropbox Community Moderator
julio_diniz_perdigao There isn't webhook functionality specifically for /2/files/save_url jobs, but webhooks would be sent for files added via /2/files/save_url in a connected account just like files from any other source. To check the status of any particular /2/files/save_url job, you should poll /2/files/save_url/check_job_status.
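For example, a minimal polling sketch with the official Dropbox Python SDK (the token, Dropbox path, and source URL are placeholders):

```python
import time
import dropbox

dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")  # placeholder token

# Start the job; /2/files/save_url may also complete synchronously.
result = dbx.files_save_url("/test/image.jpg", "https://example.com/image.jpg")

if result.is_complete():
    print("Saved immediately:", result.get_complete().path_display)
else:
    job_id = result.get_async_job_id()
    while True:
        status = dbx.files_save_url_check_job_status(job_id)
        if status.is_complete():
            print("Saved:", status.get_complete().path_display)
            break
        if status.is_failed():
            # e.g. SaveUrlError('download_failed', None), as in the output above
            print("Failed:", status.get_failed())
            break
        time.sleep(2)  # still in_progress; poll again shortly
```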
- julio_diniz_perdigao · 1 year ago · Explorer | Level 3
Great! Another question: does this method have a webhook, or do we need to check for updates periodically using save_url/check_job_status to see if the process is done?
Thanks
- Greg-DB · 1 year ago
Dropbox Community Moderator
julio_diniz_perdigao The documentation for /2/files/save_url is correct; the limit was changed from 5 minutes to 15 minutes in 2023.
- julio_diniz_perdigao · 1 year ago · Explorer | Level 3
The documentation says 15 minutes to upload using save_url, but here you said it is 5 minutes. Was the doc updated, and can we trust it? Or is it really 5 minutes?
Thanks
- Greg-DB · 3 years ago
Dropbox Community Moderator
I don't have much more specific information to share, but it does seem that it may be related to issuing multiple jobs at or around the same time, likely resulting in the same sort of lock contention discussed here.
I'll also pass this along as a feature request for a batch version of this call to better support use cases where you need to submit many of these, but I can't promise if or when that might be implemented.
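Until such a batch endpoint exists, one way to avoid issuing multiple jobs at or around the same time is to serialize them, waiting for each job to finish before starting the next. A rough sketch, reusing the SDK calls from the polling example above (save_url_and_wait is a hypothetical helper, and the token and URLs are placeholders):

```python
import time
import dropbox

def save_url_and_wait(dbx, path, url, poll_interval=2):
    """Hypothetical helper: start one save_url job and block until it finishes."""
    result = dbx.files_save_url(path, url)
    if result.is_complete():
        return result.get_complete()
    job_id = result.get_async_job_id()
    while True:
        status = dbx.files_save_url_check_job_status(job_id)
        if status.is_complete():
            return status.get_complete()
        if status.is_failed():
            raise RuntimeError(f"save_url failed for {url}: {status.get_failed()}")
        time.sleep(poll_interval)

dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")  # placeholder token
urls = [f"https://example-bucket.example.com/image_{i}.jpg" for i in range(6)]

# Submit jobs one at a time so that at most one save_url job is in flight,
# reducing the chance of contending for the same internal lock.
for i, url in enumerate(urls):
    save_url_and_wait(dbx, f"/test/image_{i}.jpg", url)
```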
- bb-jacin · 3 years ago · Explorer | Level 3
Greg-DB wrote: Thanks for the report! It looks like this is hitting an internal constraint. We'll look into what we may be able to do about that. In the meantime, to work around this, you may want to add an automatic retry or two (i.e., starting save_url again for the relevant URL) when this occurs.
Thanks for the response. Unfortunately, our intended use case for our Dropbox API implementation would involve transferring far more than 6 files at once, so automatic retries would not really be viable at the current rate of failure. Are you able to provide any information on what this internal constraint is? If so, we may be able to work around it.
- Greg-DB · 3 years ago
Dropbox Community Moderator
Thanks for the report! It looks like this is hitting an internal constraint. We'll look into what we may be able to do about that. In the meantime, to work around this, you may want to add an automatic retry or two (i.e., starting save_url again for the relevant URL) when this occurs.
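A minimal sketch of that retry workaround (max_attempts and the delay are arbitrary choices, and save_url_and_wait is the hypothetical helper from the earlier sketch):

```python
import time

def save_url_with_retry(dbx, path, url, max_attempts=3, delay=2):
    """Hypothetical wrapper: re-issue save_url when a job fails, e.g. with download_failed."""
    for attempt in range(1, max_attempts + 1):
        try:
            return save_url_and_wait(dbx, path, url)  # helper from the earlier sketch
        except RuntimeError as exc:
            if attempt == max_attempts:
                raise
            print(f"Attempt {attempt} failed ({exc}); retrying...")
            time.sleep(delay)
```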