
Forum Discussion

Vlopez1
Explorer | Level 3
2 years ago

Continuous timeout errors for a few minutes when uploading files

I have timeouts which may be related to uploading files using upload sessions.
For instance, after uploading a 200MB file I get a read timeout when trying to list shared links for that file.
I have had the same issue with an 11MB file, also uploaded with sessions.
 
After that timeout, I attempt to upload again and get a write timeout.
I have uninterrupted internet connectivity the whole time.
After some time, the problem clears.
 
Read timeout when listing shared links:
----------------------------------------------------
Uploading chunk... 92.7314361944079%
Uploading chunk... 94.79213477650586%
Uploading chunk... 96.8528333586038%
Uploading chunk... 98.91353194070176%
File uploaded successfully to test/attachments_20240716_124331.zip.
Checking if shared link with password already exists...
[ERROR] [1721126840.768204]: Error sending notification: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 536, in _make_request
    response = conn.getresponse()
  File "/usr/local/lib/python3.8/dist-packages/urllib3/connection.py", line 464, in getresponse
    httplib_response = super().getresponse()
  File "/usr/lib/python3.8/http/client.py", line 1348, in getresponse
    response.begin()
  File "/usr/lib/python3.8/http/client.py", line 316, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.8/http/client.py", line 277, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.8/socket.py", line 669, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.8/ssl.py", line 1241, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.8/ssl.py", line 1099, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
 
The above exception was the direct cause of the following exception:
 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
  File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
  File "/usr/local/lib/python3.8/dist-packages/urllib3/util/retry.py", line 474, in increment
    raise reraise(type(error), error, _stacktrace)
  File "/usr/local/lib/python3.8/dist-packages/urllib3/util/util.py", line 39, in reraise
    raise value
  File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
  File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 538, in _make_request
    self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
  File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 369, in _raise_timeout
    raise ReadTimeoutError(
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='api.dropboxapi.com', port=443): Read timed out. (read timeout=100)
 
During handling of the above exception, another exception occurred:
 
Traceback (most recent call last):
  File "/deployed_ws/ros1/key_external_notifications/lib/python3/dist-packages/key_external_notifications/dropbox_api.py", line 117, in upload_and_share
    shared_link = create_shared_link_with_password(
  File "/deployed_ws/ros1/key_external_notifications/lib/python3/dist-packages/key_external_notifications/dropbox_api.py", line 97, in create_shared_link_with_password
    links = dbx.sharing_list_shared_links(file_name, direct_only=True)
  File "/usr/lib/python3/dist-packages/dropbox/base.py", line 5190, in sharing_list_shared_links
    r = self.request(
  File "/usr/lib/python3/dist-packages/dropbox/dropbox_client.py", line 326, in request
    res = self.request_json_string_with_retry(host,
  File "/usr/lib/python3/dist-packages/dropbox/dropbox_client.py", line 476, in request_json_string_with_retry
    return self.request_json_string(host,
  File "/usr/lib/python3/dist-packages/dropbox/dropbox_client.py", line 589, in request_json_string
    r = self._session.post(url,
  File "/usr/local/lib/python3.8/dist-packages/requests/sessions.py", line 637, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/requests/adapters.py", line 713, in send
    raise ReadTimeout(e, request=request)
requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='api.dropboxapi.com', port=443): Read timed out. (read timeout=100)
--------------------------------
 

8 Replies

  • Jay
    Dropbox Community Moderator
    2 years ago

    Hi Vlopez1, thanks for messaging the Community.

     

    Could you clarify how you're uploading these files? What device and OS are you using?

     

    Is this using an app that uses the Dropbox API?

     

    This will help me to assist further!

  • Здравко
    Legendary | Level 20
    2 years ago

    Vlopez1 wrote:
    I have timeouts which may be related to uploading files using upload sessions.
    ...
      File "/deployed_ws/ros1/key_external_notifications/lib/python3/dist-packages/key_external_notifications/dropbox_api.py", line 97, in create_shared_link_with_password
        links = dbx.sharing_list_shared_links(file_name, direct_only=True)
      File "/usr/lib/python3/dist-packages/dropbox/base.py", line 5190, in sharing_list_shared_links
    ...

    Hi Vlopez1,

    How did you decide that an exception thrown when you list possible shared link(s) is somehow related to file uploads? 🤔

    You may post a code snippet that reproduces the issue... Also, check your network connectivity.

  • Vlopez1
    Explorer | Level 3
    2 years ago

    Hi,

     

    My process is the following:

    1. Check if the file already exists

    2. If not, upload it

    3. Check if a shared link exists

    4. If not, create a shared link

     

     

    When I have been able to reproduce this, the problem appears after the upload: step 3 fails, and executing the same process again then fails at step 1.

     

    import dropbox
    import os
    import hashlib
    
    # Your Dropbox API access token
    ACCESS_TOKEN = ''
    REFRESH_TOKEN = ''
    
    APP_KEY = ''
    APP_SECRET = ''
    
    # Initialize the Dropbox client
    dbx = dropbox.Dropbox(
        app_key=APP_KEY,
        app_secret=APP_SECRET,
        oauth2_refresh_token=REFRESH_TOKEN)
    
    
    def file_already_uploaded(file_path, customer_name):
        """Check if the file has already been uploaded by comparing content hashes."""
        try:
            hasher = DropboxContentHasher()
            with open(file_path, 'rb') as f:
                while True:
                    chunk = f.read(1024)  # or whatever chunk size you want
                    if len(chunk) == 0:
                        break
                    hasher.update(chunk)
            file_hash = hasher.hexdigest()
    
            response = dbx.files_list_folder('/' + customer_name)
            for entry in response.entries:
                if isinstance(entry, dropbox.files.FileMetadata):
                    metadata = dbx.files_get_metadata(
                        entry.path_lower)
                    if metadata.name == os.path.basename(file_path):
                        if (metadata.content_hash == file_hash):
                            print(
                                f"File '{file_path}' has already been uploaded as '{entry.name}'.")
                            return True
                        else:
                            print(
                                f"File '{file_path}' has been uploaded but the content has changed, deleting it.")
                            dbx.files_delete_v2(entry.path_lower)
                            return False
        except dropbox.exceptions.ApiError as err:
            print(f"Failed to list folder contents: {err}")
        return False
    
    
    def upload_large_file(file_path, customer_name):
        CHUNK_SIZE = 4 * 1024 * 1024  # 4MB
        file_size = os.path.getsize(file_path)
        with open(file_path, 'rb') as f:
            if file_size <= CHUNK_SIZE:
                print("Uploading small file in one go...")
                dbx.files_upload(f.read(), '/' + customer_name)
            else:
                print("Uploading large file in chunks...")
                upload_session_start_result = dbx.files_upload_session_start(
                    f.read(CHUNK_SIZE))
                cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_start_result.session_id,
                                                           offset=f.tell())
                commit = dropbox.files.CommitInfo(
                    path='/' + customer_name + '/' + os.path.basename(file_path))
    
                while f.tell() < file_size:
                    print("Uploading chunk... {}%".format(
                        f.tell() / file_size * 100))
                    if ((file_size - f.tell()) <= CHUNK_SIZE):
                        dbx.files_upload_session_finish(
                            f.read(CHUNK_SIZE), cursor, commit)
                    else:
                        dbx.files_upload_session_append_v2(
                            f.read(CHUNK_SIZE), cursor)
                        cursor.offset = f.tell()
    
        print("File uploaded successfully to {}.".format(
            customer_name + '/' + os.path.basename(file_path)))
    
    
    def create_shared_link_with_password(file_path, customer_name, password):
        file_name = '/' + customer_name + "/" + os.path.basename(file_path)
    
        try:
            print("Checking if shared link with password already exists...")
            links = dbx.sharing_list_shared_links(file_name, direct_only=True)
            for link in links.links:
                if link.link_permissions.require_password:
                    print(f"Shared link with password already exists: {link.url}")
                    return link.url
        except dropbox.exceptions.ApiError as err:
            print("Link not found")
    
        shared_link_metadata = dbx.sharing_create_shared_link_with_settings(
            file_name,
            dropbox.sharing.SharedLinkSettings(
                requested_visibility=dropbox.sharing.RequestedVisibility.password, link_password=password)
        )
        print(f"Shared link with password: {shared_link_metadata.url}")
        return shared_link_metadata.url
    
    
    def upload_and_share(file_path, customer_name, password):
        if not file_already_uploaded(file_path, customer_name):
            upload_large_file(file_path, customer_name)
        shared_link = create_shared_link_with_password(
            file_path, customer_name, password)
        return shared_link
    
    
    if __name__ == '__main__':
    
        file_path = '/home/user/Downloads/attachments_20240716_13012111.zip'
        password = 'your_passwordd'
        upload_and_share(file_path, "test_customer", password)
  • Jay
    Dropbox Community Moderator
    2 years ago

    As you're using a custom app you created, I'm moving this to the appropriate section of the Community.

  • Vlopez1
    Explorer | Level 3
    2 years ago

    Regarding internet connectivity.


    I have this running on a remote device, which I am connected to through SSH.

    I can run the tests from my office over the internet and I see the failures live, so the internet connection is not dropping at any point. It is a cellular network with 1 Mbps upload/download.

     

    I am not sure if this may be a scenario like the one described here, where I am getting blacklisted for a couple of minutes: https://www.dropboxforum.com/t5/Dropbox-API-Support-Feedback/Connection-timeouts/m-p/375263/highlight/true#M21071


    Running the tests locally, I have been able to upload five times without issue, but on the remote device it failed 50% of the time across six tests.

  • Здравко
    Legendary | Level 20
    2 years ago

    Unfortunately, I cannot reproduce your issue, Vlopez1. In spite of some imperfections in your code, listing shared links works.

    There is only one error:


    Vlopez1 wrote:

    ...

     

    ...
            hasher = DropboxContentHasher()
    ...
    
    def upload_large_file(file_path, customer_name):
    ...
                dbx.files_upload(f.read(), '/' + customer_name)
    ...

     


    Where and how did this DropboxContentHasher come into your code? Also, is the file path formatted correctly everywhere (did you test with small files)? 🙂

    You should dig deeper into what's going on in your case. Try inspecting your network traffic to see whether there is some "noise" during the failing method calls, and what exactly it is.
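
    As a starting point, here is a minimal sketch of one way to get a first look at that traffic from Python itself, using only the standard library's http.client debug switch and urllib3's debug logging (nothing Dropbox-specific is assumed):

    import http.client
    import logging

    # Print the raw request/response header lines to stdout
    # (requests/urllib3, which the Dropbox SDK uses, sit on top of http.client).
    http.client.HTTPConnection.debuglevel = 1

    # Also surface urllib3's own messages about connections, retries and timeouts.
    logging.basicConfig(level=logging.DEBUG)
    logging.getLogger("urllib3").setLevel(logging.DEBUG)

    Re-running the failing calls with this enabled shows whether the request goes out at all and where the wait for the response stalls.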

    Good luck.

  • Greg-DB
    Dropbox Community Moderator
    2 years ago

    Vlopez1 Given the description of the issue, such as how it works for you locally and only fails sometimes when using the remote device, it does sound like a transient network issue.

     

    While it wouldn't fix any network issues, you can configure the timeout for the Dropbox client using the "timeout" parameter on the Dropbox object. For example, you can increase it if you want the call to wait longer, in case the connection recovers, or you can decrease it if you want the call to fail sooner (e.g., so you can retry it sooner).
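
    For example, a minimal sketch reusing the placeholder credentials from the snippet above, with 300 seconds chosen purely for illustration:

    import dropbox

    APP_KEY = ''          # placeholders, as in the original snippet
    APP_SECRET = ''
    REFRESH_TOKEN = ''

    # "timeout" (in seconds) applies to the HTTP requests the client makes; the SDK
    # default is 100 seconds, matching the "read timeout=100" in the traceback above.
    # Raise it to wait longer on a slow link, or lower it to fail and retry sooner.
    dbx = dropbox.Dropbox(
        app_key=APP_KEY,
        app_secret=APP_SECRET,
        oauth2_refresh_token=REFRESH_TOKEN,
        timeout=300)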

  • Vlopez1
    Explorer | Level 3
    12 months ago

    Thank you.

     

    Yes, it seems that the network at some point has increased latency or decreased bandwidth; I will experiment with the timeout parameter.

     

     
