Dropbox API Support & Feedback

Find help with the Dropbox API from other developers.

[Python V2] How to do batch upload?

ivanhigueram
Explorer | Level 4

Hi! 

I am trying to upload several files using the upload sessions and batch operations in the Dropbox SDK for Python. I'm trying to do something like this:

dbx = dropbox.Dropbox(<API KEY>)

commit_info = []
for df in list_pandas_df:
    df_raw_str = df.to_csv(index=False)
    upload_session = dbx.files_upload_session_start(df_raw_str.encode())
    commit_info.append(dropbox.files.CommitInfo(path="/path/to/db/folder.csv"))

dbx.files_upload_session_finish_batch(commit_info)

But I don't completely understand from the documentation how I should pass the commit data and the session info to the "files_upload_session_finish_batch" function. The "files_upload_session_finish" function takes a commit and a cursor, while the documentation states that the batch version only takes a list of commits.

My files are not particularly big, but they are numerous; that's why I'm not using any of the append options. Should I use a cursor at all? I'm a little bit lost here 😞

12 Replies

Greg-DB
Dropbox Staff

[Cross-linking for reference: https://stackoverflow.com/questions/54758978/dropbox-python-api-upload-multiple-files ]

Apologies for the confusion! The Python SDK documentation unfortunately doesn't do a good job identifying the types expected in certain parameters like this; I'll ask the team to work on improving that in the future.

The `files_upload_session_finish_batch` method does work differently than the `files_upload_session_finish` method. The `files_upload_session_finish_batch` method expects a list of `UploadSessionFinishArg`, where each one encapsulates the cursor and commit info together.

Here's a basic working example that shows how to do this:

import dropbox

ACCESS_TOKEN = "..."

dbx = dropbox.Dropbox(ACCESS_TOKEN)

local_file_path = "..."

upload_entry_list = []

for i in range(5):
    with open(local_file_path, "rb") as f:  # binary mode, so the bytes are uploaded unmodified
        upload_session_start_result = dbx.files_upload_session_start(f.read(), close=True) # assuming small files
        cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_start_result.session_id,
                                                   offset=f.tell())
        commit = dropbox.files.CommitInfo(path="/test_329517/%s" % i)
        upload_entry_list.append(dropbox.files.UploadSessionFinishArg(cursor=cursor, commit=commit))

print(dbx.files_upload_session_finish_batch(upload_entry_list))

# then use files_upload_session_finish_batch_check to check on the job

 

ivanhigueram
Explorer | Level 4

Hi Greg! 

Again, thanks for replying so promptly. I have some follow-up questions. I am looping through different Python objects, not files; that's why I'm first converting each pd.DataFrame to a string and then passing it to the dbx.files_upload_session_start() function.

Since each file is complete, and I am not passing a file in a context manager or a StringIO, I did not specify any offset in the cursor. Now that I'm trying to run the loop, I receive the following error:

Traceback (most recent call last):
  File "/Users/ivan/.pyenv/versions/weather_data/lib/python3.6/site-packages/dropbox/stone_serializers.py", line 337, in encode_struct
    field_value = getattr(value, field_name)
  File "/Users/ivan/.pyenv/versions/weather_data/lib/python3.6/site-packages/dropbox/files.py", line 10278, in offset
    raise AttributeError("missing required field 'offset'")
AttributeError: missing required field 'offset'

 

What are the real advantages of using the batch operations to upload files? It seems like a convoluted use case for objects in memory, rather than files.

Thanks again for your help.

Greg-DB
Dropbox Staff

Regardless of where the data is coming from, the `UploadSessionCursor` object does require an `offset`, in order "to make sure upload data isn’t lost or duplicated in the event of a network error". It sounds like in your case the `offset` value would be the length of the encoded data.
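
For example, adapting the loop from your post, a minimal sketch might look like this (the destination paths are placeholders, and `list_pandas_df` is from your example):

import dropbox

dbx = dropbox.Dropbox(ACCESS_TOKEN)

upload_entry_list = []
for i, df in enumerate(list_pandas_df):
    data = df.to_csv(index=False).encode()  # the bytes to upload
    result = dbx.files_upload_session_start(data, close=True)
    # offset is the number of bytes uploaded so far, i.e. the length of the encoded data
    cursor = dropbox.files.UploadSessionCursor(session_id=result.session_id,
                                               offset=len(data))
    commit = dropbox.files.CommitInfo(path="/path/to/db/folder_%s.csv" % i)
    upload_entry_list.append(dropbox.files.UploadSessionFinishArg(cursor=cursor, commit=commit))

dbx.files_upload_session_finish_batch(upload_entry_list)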

The main advantage of using `files_upload_session_finish_batch` is to minimize the number of "locks" needed when uploading multiple files. The Data Ingress Guide covers this in more detail. This applies to uploading from memory or files.

The main advantage of using "upload sessions" to begin with is to enable apps to upload large files. If you're just uploading small files, you can certainly do so just using `files_upload`, but you'd need to do so serially to avoid lock contention. Based on the code you provided, you're already uploading serially though, so it may not make much of a difference if you wish to switch to `files_upload` for simplicity.
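
If you do switch, each upload collapses to a single call per file, along these lines (again with placeholder paths):

# one serial files_upload call per file; no session or cursor needed
for i, df in enumerate(list_pandas_df):
    dbx.files_upload(df.to_csv(index=False).encode(), "/path/to/db/folder_%s.csv" % i)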

9krausec
Explorer | Level 4

Hey Greg. Would you mind going over an approach for when the files being batch uploaded aren't small? That is, where "files_upload_session_append_v2" would be needed to read each file in by a chunk size? Thank you.


Greg-DB
Dropbox Staff

@9krausec I posted a sample here that may be useful. Hope this helps!

9krausec
Explorer | Level 4

Thank you for the reply Greg. I did stumble on that post.

I should have been more specific with my question:

If I have 20 files that I want uploaded in a single session, using "dbx.files_upload_session_finish_batch()" and "dbx.files_upload_session_append_v2()" for files exceeding the set chunk size, how would I go about doing so?

The example you posted here works great without using "files_upload_session_append_v2", as the files are assumed to be small.

The example you linked works great for uploading a single large file and chunking it out.

I'm having difficulty figuring out how to combine the two examples with successful results.

Thank you for any help.

9krausec
Explorer | Level 4

@Greg-DB  - Sorry, forgot to mention you in reply.

Greg-DB
Dropbox Staff

@9krausec Thanks for following up with the clarification. First, note that one "upload session" is required per file to be uploaded. So, you'd need to call files_upload_session_start once per file (20 total in this case), and files_upload_session_append_v2 zero or more times per file, depending on each file's size.

Then, to commit them together and avoid contention, you'd call files_upload_session_finish_batch once with the information for all 20 upload sessions. (That's instead of calling files_upload_session_finish once per file.)

So, you will need to combine portions of each of the examples for your use case. The example in this thread shows how to start multiple upload sessions and then commit them all together, while the example in the other thread shows how to append to any given upload session repeatedly as needed. In outline, the combined flow looks like the sketch below.
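
A minimal sketch of that flow, where `files_to_upload` is a placeholder list of (local path, remote path) pairs and the chunk size is arbitrary:

import os
import dropbox

CHUNK_SIZE = 4 * 1024 * 1024

upload_entry_list = []
for local_path, remote_path in files_to_upload:
    file_size = os.path.getsize(local_path)
    with open(local_path, "rb") as f:
        # one upload session per file, started with the first chunk
        session = dbx.files_upload_session_start(f.read(CHUNK_SIZE))
        cursor = dropbox.files.UploadSessionCursor(session_id=session.session_id,
                                                   offset=f.tell())
        # append any remaining chunks to this file's session
        while f.tell() < file_size:
            dbx.files_upload_session_append_v2(f.read(CHUNK_SIZE), cursor)
            cursor.offset = f.tell()
        commit = dropbox.files.CommitInfo(path=remote_path)
        upload_entry_list.append(dropbox.files.UploadSessionFinishArg(cursor=cursor, commit=commit))

# one finish_batch call commits all of the sessions together
print(dbx.files_upload_session_finish_batch(upload_entry_list))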

9krausec
Explorer | Level 4

@Greg-DB - Thank you for following this and providing input. I think I'm close to a solution and will continue tinkering. In the meantime, please review my method below and offer any further suggestions.

For output, I am getting an "UploadSessionFinishBatchLaunch" return at the end with an async_job_id, but nothing is uploaded to Dropbox after completion. Thank you.

import os
from os import listdir
from os.path import isfile, join
import dropbox
from dropbox import DropboxTeam

def batchUpload(srcFiles, dstDir, dbx=None):
    if not dbx: dbx = DropboxTeam(DROPBOX_TOKEN).as_admin(ADMIN_ID)
    chunkSize = 2 * 1024 * 1024
    uploadEntryList = []
    for src in srcFiles:
        baseName = os.path.basename(src)
        dstPath  = dstDir + "/" + baseName
        fileSize = os.path.getsize(src)

        print("{0} ==> {1}".format(src, dstPath))

        with open(src, "rb") as fp:
            if fileSize <= chunkSize: # file smaller than chunkSize, no files_upload_session_append_v2
                uploadSessionStartResult = dbx.files_upload_session_start(fp.read(), close=True)
                cursor = dropbox.files.UploadSessionCursor(session_id=uploadSessionStartResult.session_id,
                                                           offset=fp.tell())
                commit = dropbox.files.CommitInfo(path=dstPath)

            else: # file larger than chunkSize, use files_upload_session_append_v2
                uploadSessionStartResult = dbx.files_upload_session_start(fp.read(chunkSize))
                cursor = dropbox.files.UploadSessionCursor(session_id=uploadSessionStartResult.session_id,
                                                           offset=fp.tell())
                commit = dropbox.files.CommitInfo(path=dstPath)

                while fp.tell() <= fileSize:
                    if ((fileSize - fp.tell()) <= chunkSize): # file chunk append complete
                        print("COMPLETE CHUNKS")
                        uploadEntryList.append(dropbox.files.UploadSessionFinishArg(cursor=cursor, commit=commit))
                        break
                    else:
                        print(fp.tell())
                        dbx.files_upload_session_append_v2(fp.read(chunkSize), cursor)
                        cursor.offset = fp.tell()
            uploadEntryList.append(dropbox.files.UploadSessionFinishArg(cursor=cursor, commit=commit))

    print(dbx.files_upload_session_finish_batch(uploadEntryList))


srcDir = "C:/Users/alpha/Documents/scratch/LIGHTS - Copy"
srcFiles = [join(srcDir, f).replace('\\', '/') for f in listdir(srcDir) if isfile(join(srcDir, f))]
dstDir = "/scratch/poopOne"
dbx = DropboxTeam(DROPBOX_TOKEN).as_admin(ADMIN_ID)
batchUpload(srcFiles, dstDir, dbx)