Dropbox API Support & Feedback

Find help with the Dropbox API from other developers.

python upload file OverflowError

hramosvz
Explorer | Level 4

I need to upload all the files in a directory through the API. The largest file is about 20 GB, but I keep getting an OverflowError for files larger than 2 GB. I'm using the example from this thread (https://www.dropboxforum.com/t5/API-Support-Feedback/python-upload-big-file-example/m-p/166626), but I can't make it work.

This is the code:

import os
import dropbox

def subebackups():
    dbx = dropbox.Dropbox('MY_APP_TOKEN', timeout=36000)
    rootdir = 'ROOT_DIR'
    file_size = os.path.getsize(rootdir)
    CHUNK_SIZE = 50 * 1024 * 1024

    for dir, dirs, files in os.walk(rootdir):
        for file in files:
            file_path = os.path.join(dir, file)
            dest_path = os.path.join('/Backups/Backups', file).replace("\\", "/")

            with open(file_path, "rb") as f:
                if file_size <= CHUNK_SIZE:
                    dbx.files_upload(f.read(), dest_path)
                else:
                    upload_session_start_result = dbx.files_upload_session_start(f.read(CHUNK_SIZE))
                    cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_start_result.session_id,
                                                               offset=f.tell())
                    commit = dropbox.files.CommitInfo(path=dest_path)

                    while f.tell() < file_size:
                        if (file_size - f.tell()) <= CHUNK_SIZE:
                            print(dbx.files_upload_session_finish(f.read(CHUNK_SIZE),
                                                                  cursor,
                                                                  commit))
                        else:
                            dbx.files_upload_session_append(f.read(CHUNK_SIZE),
                                                            cursor.session_id,
                                                            cursor.offset)
                            cursor.offset = f.tell()

This is the output with files larger than 2 GB:

Traceback (most recent call last):
  File "C:\Users\Ordico\Downloads\tools-master\tools-master\dropbox\SubeBackupsDropbox.py", line 66, in <module>
    subebackups()
  File "C:\Users\Ordico\Downloads\tools-master\tools-master\dropbox\SubeBackupsDropbox.py", line 40, in subebackups
    dbx.files_upload(f.read(), dest_path)
  File "C:\env\dropbox\lib\site-packages\dropbox\base.py", line 2960, in files_upload
    r = self.request(
  File "C:\env\dropbox\lib\site-packages\dropbox\dropbox.py", line 311, in request
    res = self.request_json_string_with_retry(host,
  File "C:\env\dropbox\lib\site-packages\dropbox\dropbox.py", line 461, in request_json_string_with_retry
    return self.request_json_string(host,
  File "C:\env\dropbox\lib\site-packages\dropbox\dropbox.py", line 557, in request_json_string
    r = self._session.post(url,
  File "C:\env\dropbox\lib\site-packages\requests\sessions.py", line 578, in post
    return self.request('POST', url, data=data, json=json, **kwargs)
  File "C:\env\dropbox\lib\site-packages\requests\sessions.py", line 530, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\env\dropbox\lib\site-packages\requests\sessions.py", line 643, in send
    r = adapter.send(request, **kwargs)
  File "C:\env\dropbox\lib\site-packages\requests\adapters.py", line 439, in send
    resp = conn.urlopen(
  File "C:\env\dropbox\lib\site-packages\urllib3\connectionpool.py", line 670, in urlopen
    httplib_response = self._make_request(
  File "C:\env\dropbox\lib\site-packages\urllib3\connectionpool.py", line 392, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "c:\users\ordico\appdata\local\programs\python\python38\lib\http\client.py", line 1230, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "c:\users\ordico\appdata\local\programs\python\python38\lib\http\client.py", line 1276, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "c:\users\ordico\appdata\local\programs\python\python38\lib\http\client.py", line 1225, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "c:\users\ordico\appdata\local\programs\python\python38\lib\http\client.py", line 1043, in _send_output
    self.send(chunk)
  File "c:\users\ordico\appdata\local\programs\python\python38\lib\http\client.py", line 965, in send
    self.sock.sendall(data)
  File "c:\users\ordico\appdata\local\programs\python\python38\lib\ssl.py", line 1204, in sendall
    v = self.send(byte_view[count:])
  File "c:\users\ordico\appdata\local\programs\python\python38\lib\ssl.py", line 1173, in send
    return self._sslobj.write(data)
OverflowError: string longer than 2147483647 bytes

2 Replies

Greg-DB
Dropbox Staff

From the stack trace, I can see that this code is attempting to use the 'files_upload' method to upload this file, but that method doesn't support files of this size. You should instead use the upload sessions functionality for large files like this.

Looking at your code, the 'files_upload' code path is mistakenly being taken because the 'file_size' check uses the size of 'rootdir' instead of the size of the actual file to upload. You'll need to fix the code to use the size of each file to determine which upload functionality to use; that way, large files will go through the upload sessions code path.
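
For reference, here is a minimal sketch of that fix, assuming the rest of the original script is unchanged: os.path.getsize is called on file_path inside the loop instead of on rootdir, so each file's own size decides which upload path is used. The token, folder names, and chunk size are placeholders carried over from the question.

import os
import dropbox

def subebackups():
    dbx = dropbox.Dropbox('MY_APP_TOKEN', timeout=36000)
    rootdir = 'ROOT_DIR'
    CHUNK_SIZE = 50 * 1024 * 1024

    for dir, dirs, files in os.walk(rootdir):
        for file in files:
            file_path = os.path.join(dir, file)
            dest_path = os.path.join('/Backups/Backups', file).replace("\\", "/")
            # Size of this particular file, not of rootdir.
            file_size = os.path.getsize(file_path)

            with open(file_path, "rb") as f:
                if file_size <= CHUNK_SIZE:
                    # Small files can be sent in a single files_upload call.
                    dbx.files_upload(f.read(), dest_path)
                else:
                    # Large files go through an upload session in CHUNK_SIZE pieces.
                    upload_session_start_result = dbx.files_upload_session_start(f.read(CHUNK_SIZE))
                    cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_start_result.session_id,
                                                               offset=f.tell())
                    commit = dropbox.files.CommitInfo(path=dest_path)

                    while f.tell() < file_size:
                        if (file_size - f.tell()) <= CHUNK_SIZE:
                            print(dbx.files_upload_session_finish(f.read(CHUNK_SIZE),
                                                                  cursor,
                                                                  commit))
                        else:
                            dbx.files_upload_session_append(f.read(CHUNK_SIZE),
                                                            cursor.session_id,
                                                            cursor.offset)
                            cursor.offset = f.tell()

The only functional change from the posted code is moving the os.path.getsize call inside the loop and pointing it at file_path.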

hramosvz
Explorer | Level 4

You are right! It's working now. Thanks!
