Forum Discussion
Vlopez1
2 years ago · Explorer | Level 3
Continuous timeout errors for a few minutes when uploading files
I have timeouts which may be related to uploading files using upload sessions.
For instance after uploading a 200MB file I get a read operation timeout when trying to list shared links to that file...
Здравко
2 years ago · Legendary | Level 20
Vlopez1 wrote:
I have timeouts which may be related to uploading files using upload sessions....
  File "/deployed_ws/ros1/key_external_notifications/lib/python3/dist-packages/key_external_notifications/dropbox_api.py", line 97, in create_shared_link_with_password
    links = dbx.sharing_list_shared_links(file_name, direct_only=True)
  File "/usr/lib/python3/dist-packages/dropbox/base.py", line 5190, in sharing_list_shared_links
...
Hi Vlopez1,
How did you decide that an exception thrown when you list possible shared link(s) is somehow related to file upload? 🤔
Could you post a code snippet that reproduces the issue? Also, check your network connectivity.
Vlopez1
2 years ago · Explorer | Level 3
Hi,
My process is the following:
1. Check if exists
2. If not exists, upload
3. Check if shared link exists
4. If not exists, create shared link
When I have been able to reproduce this, the problem appears after the upload: step 3 fails, and running the same process again then fails at step 1.
import dropbox
import os
import hashlib

# Your Dropbox API access token
ACCESS_TOKEN = ''
REFRESH_TOKEN = ''
APP_KEY = ''
APP_SECRET = ''

# Initialize the Dropbox client
dbx = dropbox.Dropbox(
    app_key=APP_KEY,
    app_secret=APP_SECRET,
    oauth2_refresh_token=REFRESH_TOKEN)

def file_already_uploaded(file_path, customer_name):
    """Check if the file has already been uploaded by comparing content hashes."""
    try:
        hasher = DropboxContentHasher()
        with open(file_path, 'rb') as f:
            while True:
                chunk = f.read(1024)  # or whatever chunk size you want
                if len(chunk) == 0:
                    break
                hasher.update(chunk)
        file_hash = hasher.hexdigest()
        response = dbx.files_list_folder('/' + customer_name)
        for entry in response.entries:
            if isinstance(entry, dropbox.files.FileMetadata):
                metadata = dbx.files_get_metadata(entry.path_lower)
                if metadata.name == os.path.basename(file_path):
                    if metadata.content_hash == file_hash:
                        print(f"File '{file_path}' has already been uploaded as '{entry.name}'.")
                        return True
                    else:
                        print(f"File '{file_path}' has been uploaded but the content has changed, deleting it.")
                        dbx.files_delete_v2(entry.path_lower)
                        return False
    except dropbox.exceptions.ApiError as err:
        print(f"Failed to list folder contents: {err}")
        return False

def upload_large_file(file_path, customer_name):
    CHUNK_SIZE = 4 * 1024 * 1024  # 4MB
    file_size = os.path.getsize(file_path)
    with open(file_path, 'rb') as f:
        if file_size <= CHUNK_SIZE:
            print("Uploading small file in one go...")
            dbx.files_upload(f.read(), '/' + customer_name)
        else:
            print("Uploading large file in chunks...")
            upload_session_start_result = dbx.files_upload_session_start(
                f.read(CHUNK_SIZE))
            cursor = dropbox.files.UploadSessionCursor(
                session_id=upload_session_start_result.session_id,
                offset=f.tell())
            commit = dropbox.files.CommitInfo(
                path='/' + customer_name + '/' + os.path.basename(file_path))
            while f.tell() < file_size:
                print("Uploading chunk... {}%".format(
                    f.tell() / file_size * 100))
                if (file_size - f.tell()) <= CHUNK_SIZE:
                    dbx.files_upload_session_finish(
                        f.read(CHUNK_SIZE), cursor, commit)
                else:
                    dbx.files_upload_session_append_v2(
                        f.read(CHUNK_SIZE), cursor)
                    cursor.offset = f.tell()
    print("File uploaded successfully to {}.".format(
        customer_name + '/' + os.path.basename(file_path)))

def create_shared_link_with_password(file_path, customer_name, password):
    file_name = '/' + customer_name + "/" + os.path.basename(file_path)
    try:
        print("Checking if shared link with password already exists...")
        links = dbx.sharing_list_shared_links(file_name, direct_only=True)
        for link in links.links:
            if link.link_permissions.require_password:
                print(f"Shared link with password already exists: {link.url}")
                return link.url
    except dropbox.exceptions.ApiError as err:
        print("Link not found")
    shared_link_metadata = dbx.sharing_create_shared_link_with_settings(
        file_name,
        dropbox.sharing.SharedLinkSettings(
            requested_visibility=dropbox.sharing.RequestedVisibility.password,
            link_password=password))
    print(f"Shared link with password: {shared_link_metadata.url}")
    return shared_link_metadata.url

def upload_and_share(file_path, customer_name, password):
    if not file_already_uploaded(file_path, customer_name):
        upload_large_file(file_path, customer_name)
    shared_link = create_shared_link_with_password(
        file_path, customer_name, password)
    return shared_link

if __name__ == '__main__':
    file_path = '/home/user/Downloads/attachments_20240716_13012111.zip'
    password = 'your_passwordd'
    upload_and_share(file_path, "test_customer", password)

Jay
2 years ago
Dropbox Community Moderator
As you're using a custom app you created, I'm moving this to the appropriate section of the Community.
Vlopez1
2 years ago · Explorer | Level 3
Regarding internet connectivity: this runs on a remote device that I'm connected to through SSH. I run the tests from my office over the internet and see the failures live, so the connection is never fully down at any point. It is a cell network with roughly 1 Mbps upload/download.
I am not sure if this may be a scenario like this one, where I get blacklisted for a couple of minutes: https://www.dropboxforum.com/t5/Dropbox-API-Support-Feedback/Connection-timeouts/m-p/375263/highlight/true#M21071
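If the calls really are being throttled for a couple of minutes, one common workaround (a sketch, not something suggested in the thread so far) is to retry the failing call with exponential backoff instead of failing on the first timeout:

```python
import time

def with_backoff(fn, retries=5, base_delay=2.0):
    """Call fn(), retrying with exponentially growing delays.

    For illustration this catches any Exception; in real code you would
    catch the narrower timeout/connection errors raised by the SDK's
    HTTP layer (e.g. requests.exceptions.ReadTimeout).
    """
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries, let the caller see the error
            delay = base_delay * (2 ** attempt)  # 2s, 4s, 8s, ...
            print(f"Call failed, retrying in {delay:.0f}s...")
            time.sleep(delay)
```

Usage would look like `with_backoff(lambda: dbx.sharing_list_shared_links(file_name, direct_only=True))`, assuming the `dbx` client and `file_name` from the snippet above.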
Running the tests locally, I was able to upload 5 times without issue, but on the remote device it failed 50% of the time across 6 tests.

Greg-DB
2 years ago
Dropbox Community Moderator
Vlopez1 Given the description of the issue, such as how it works for you locally and only fails sometimes when using the remote device, it does sound like a transient network issue.
While it wouldn't fix any network issues, you can configure the timeout for the Dropbox client using the "timeout" parameter on the Dropbox object. For example, you can increase it if you want the call to wait longer, in case the connection recovers, or you can decrease it if you want the call to fail sooner (e.g., so you can retry it sooner).
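A minimal sketch of that configuration (the 300-second value is only an example, and the credential placeholders stand in for your real app key, secret, and refresh token):

```python
import dropbox

# Placeholder credentials; substitute your real values.
APP_KEY = 'your_app_key'
APP_SECRET = 'your_app_secret'
REFRESH_TOKEN = 'your_refresh_token'

# "timeout" is the per-request HTTP timeout in seconds
# (the SDK default is 100).
dbx = dropbox.Dropbox(
    app_key=APP_KEY,
    app_secret=APP_SECRET,
    oauth2_refresh_token=REFRESH_TOKEN,
    timeout=300)  # wait up to 5 minutes before raising a read timeout
```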
Здравко
2 years ago · Legendary | Level 20
Unfortunately, I cannot reproduce your issue, Vlopez1. In spite of some imperfections in your code, listing shared links works.
There is only one error:
Vlopez1 wrote:
...
    hasher = DropboxContentHasher()
...
def upload_large_file(file_path, customer_name):
    ...
    dbx.files_upload(f.read(), '/' + customer_name)
...

Where and how did this DropboxContentHasher come into your code? Also, is the file path formatted correctly everywhere (did you test with small files)? 🙂
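For context: DropboxContentHasher is not part of the installed dropbox package; Dropbox ships it as example code that has to be copied into your project. The documented content_hash algorithm itself is simple: split the file into 4 MB blocks, SHA-256 each block, concatenate the binary digests, and SHA-256 the result. A stdlib-only sketch (the helper name here is ours, not from the thread):

```python
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # Dropbox hashes files in 4 MB blocks

def dropbox_content_hash(data: bytes) -> str:
    """Compute the Dropbox content_hash of an in-memory byte string."""
    block_digests = b''
    for start in range(0, len(data), BLOCK_SIZE):
        block = data[start:start + BLOCK_SIZE]
        block_digests += hashlib.sha256(block).digest()  # per-block digest
    # Hash of the concatenated block digests, as a hex string
    return hashlib.sha256(block_digests).hexdigest()
```

For files larger than memory you would stream the blocks instead of slicing a bytes object, which is exactly what Dropbox's example hasher class does.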
You should dig deeper into what's going on in your case. Try inspecting your network traffic and see whether there is some "noise" during the failing method calls, and what exactly it is.
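One lightweight way to watch the traffic from Python itself (assuming the SDK's HTTP layer is requests/urllib3, which it is in current SDK versions) is to switch on HTTP debug logging before running the failing calls:

```python
import http.client
import logging

# Print request/response lines and headers from http.client, which
# urllib3 (and therefore requests and the Dropbox SDK) builds on.
http.client.HTTPConnection.debuglevel = 1

# Also surface urllib3's own log records, which include connection
# retries and timeout details.
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("urllib3").setLevel(logging.DEBUG)
```

With this in place, each SDK call prints the outgoing request and incoming response headers, so you can see whether the connection stalls before or after the request is sent.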
Good luck.
About Dropbox API Support & Feedback
Find help with the Dropbox API from other developers.