Dropbox API Support & Feedback
Find help with the Dropbox API from other developers.
Say I've got a 300MB file. The example docs state to read the file to be uploaded in one gulp, since an append method is not available (https://www.dropboxforum.com/t5/API-support/How-to-append-to-existing-file/m-p/271603/highlight/true...). Via the API (or any method that allows the file to be uploaded in chunks), is there an API call that tells us how much of the file has been uploaded, so that we can measure the status and construct something like a progress bar for our applications?
There isn't a way to check the progress of an in-progress upload remotely via the API. Depending on what SDK or library you're using, though, there may be a local progress callback available. (For example, here's a sample of using that in the Objective-C SDK.) Check the documentation for whatever SDK/library, if any, you're using for more information.
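Since you read the file yourself and hand the bytes to each API call, you can also count progress locally at read time. Below is a minimal sketch of that idea, not part of the Dropbox SDK: `ProgressFile` and its callback are hypothetical names, and the wrapper simply counts bytes as they're read, which works for any upload pattern where you call `read()` per chunk.

```python
import io

class ProgressFile:
    """Wraps a binary file object and reports bytes consumed via a callback.

    Hypothetical helper (not part of the Dropbox SDK): each read() fires
    on_progress(bytes_read_so_far, total_size), so a progress bar can be
    driven from the reads you pass to the upload calls.
    """

    def __init__(self, fileobj, total_size, on_progress):
        self._f = fileobj
        self._total = total_size
        self._read_so_far = 0
        self._on_progress = on_progress

    def read(self, size=-1):
        data = self._f.read(size)
        self._read_so_far += len(data)
        self._on_progress(self._read_so_far, self._total)
        return data

    def __getattr__(self, name):
        # Delegate everything else (seek, tell, close, ...) to the wrapped file.
        return getattr(self._f, name)

if __name__ == "__main__":
    payload = b"x" * 10_000
    seen = []
    wrapped = ProgressFile(io.BytesIO(payload), len(payload),
                           lambda done, total: seen.append((done, total)))
    while wrapped.read(4096):
        pass
    print(seen)
```

In a real upload you'd open your file, wrap it, and pass `wrapped.read(chunk_size)` to the upload call each iteration; the callback then fires once per chunk with the running byte count.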
Note that for files larger than 150 MB, you should be using "upload sessions", which have you upload the file in pieces:
https://www.dropbox.com/developers/documentation/http/documentation#files-upload_session-start
https://www.dropbox.com/developers/documentation/http/documentation#files-upload_session-append_v2
https://www.dropbox.com/developers/documentation/http/documentation#files-upload_session-finish
That way, you can track (again, on the API client side) how much you've uploaded between calls.
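To illustrate tracking progress between upload-session calls, here's a minimal local sketch. The `iter_chunks` helper is my own name, not part of the SDK, and the actual `files_upload_session_start`/`append_v2`/`finish` calls are only indicated in comments so the sketch runs without an API key:

```python
import io

def iter_chunks(f, file_size, chunk_size):
    """Yield (chunk, offset_after_chunk, percent_done) for a chunked upload."""
    while True:
        chunk = f.read(chunk_size)
        if not chunk:
            break
        offset = f.tell()
        yield chunk, offset, 100.0 * offset / file_size

if __name__ == "__main__":
    data = b"x" * (10 * 1024)          # stand-in for the file being uploaded
    f = io.BytesIO(data)
    for chunk, offset, pct in iter_chunks(f, len(data), 4 * 1024):
        # In a real upload, `chunk` would go to files_upload_session_start /
        # append_v2 / finish here; `pct` is the progress-bar value in between calls.
        print(f"uploaded {offset} bytes ({pct:.0f}%)")
```

The point is simply that the client always knows the current offset, so the progress bar comes for free from `f.tell()` after each chunk.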
Yes, I actually found out about upload sessions recently, but they're giving me errors. For example, I'm using this core logic for upload sessions, but I'm getting a connection timeout or a similar API error.
```python
test_files = ['testfile.iso']
file_path = os.path.join(os.path.dirname(os.path.abspath('__file__')), test_files[0])
file_size = os.path.getsize(file_path)
chunk_size = 4*1024*1024  # 4mb chunks
dest_path = '/upload_filetest.iso'
dbx = dropbox.Dropbox("{}".format(dropbox_id))  # dropbox_id is my api key
f = open(file_path, 'rb')
if file_size <= chunk_size:
    dbx.files_upload(f.read(), dest_path)
    f.close()
    print("file has been uploaded")
else:
    upload_session_info = dbx.files_upload_session_start(f.read(chunk_size), close=False)
    print("Chunk uploaded from start: ", f.tell()/(1024*1024))
    #session_id = upload_session_info.session_id
    print("The session_id is: ", upload_session_info.session_id)
    cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_info.session_id, offset=f.tell())
    commit = dropbox.files.CommitInfo(dest_path)
    while f.tell() < file_size:
        if (file_size - f.tell()) <= chunk_size:
            dbx.files_upload_session_finish(f.read(chunk_size), cursor, commit)
            print("chunk uploaded in finish: ", f.tell()/(1024*1024))
            time.sleep(1)
        else:
            dbx.files_upload_session_append_v2(f.read(chunk_size), cursor)
            cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_info.session_id, offset=f.tell())
            print("Chunk uploaded in append: ", f.tell()/(1024*1024))
            time.sleep(1)
    f.close()
```
I had been researching this, and found that I should replace
cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_info.session_id,offset=f.tell())
with
cursor = f.tell()
I don't know if that was the issue, but the error is gone. I tried cutting off my network and got a similar issue, so it could have been related to network connectivity as well.
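For what it's worth, `files_upload_session_append_v2` expects a `dropbox.files.UploadSessionCursor` object rather than a plain integer, so instead of rebinding the name each iteration (or replacing it with `f.tell()`), the usual pattern is to advance the same cursor's `offset` in place. A minimal local sketch of that pattern, using a stand-in `Cursor` class in place of `dropbox.files.UploadSessionCursor` so it runs without the SDK or a network connection:

```python
import io

class Cursor:
    """Stand-in for dropbox.files.UploadSessionCursor: session_id + offset."""
    def __init__(self, session_id, offset):
        self.session_id = session_id
        self.offset = offset

if __name__ == "__main__":
    data = b"x" * 10_000                  # stand-in for the file contents
    f = io.BytesIO(data)
    chunk_size = 4096

    first = f.read(chunk_size)            # would go to files_upload_session_start(...)
    cursor = Cursor("fake-session-id", f.tell())

    while f.tell() < len(data):
        chunk = f.read(chunk_size)        # would go to files_upload_session_append_v2(chunk, cursor)
        cursor.offset = f.tell()          # advance the same cursor, don't rebind the name
    print(cursor.offset)
```

The session ID stays fixed for the whole session; only the offset moves, which is why mutating `cursor.offset` is enough.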