How to Workaround "Zip File is Too Large" Error

Levi11
Helpful | Level 5

I'm sharing a large set of files, nearly 120GB in total, with colleagues via a Dropbox link. However, they tell me Dropbox won't let them download it, with an error saying the "zip file is too large". Confused, I tried the link in an incognito browser to see for myself, and I get the error too. Why is there a download file size limitation?

 

My colleagues don't have Dropbox accounts of their own to transfer these files to (and I doubt they'd be willing to make paid accounts to do it), and I'm not giving them my account login info (as a few threads on this issue suggested) so they can download it via the desktop app.

 

I saw in a thread made a few years ago that Dropbox is easily capable of raising the file size download limit. They did it some months after someone complained about there being a 1 Gigabyte download size limit. Why don't they raise the download size limit much higher? (Or better yet, not have a limit at all?) I'm a paying customer, and the very reason I subscribed to Dropbox in the first place was to have a cloud service to store and share large files with people. If I wanted to deal with sharing smaller file sizes, I wouldn't even be paying for Dropbox's 2TB plan to begin with. I'd be using the Dropbox free version, or sites like Google Drive.

8 Replies

Elixir
Super User

Hi @Levi11, the size limitation only applies to folders (there's a technical reason: when you download a folder, the server needs to compress the files into a single ZIP archive, and Dropbox sets limits on that). There is no such limit on downloading a single file. If you compress (ZIP) your 120GB of data yourself, upload it, and share that file via a link, others should be able to download it. But remember there is a daily bandwidth limit; for Plus, Family, and Professional accounts it is 400 GB/day. If you exceed that daily limit, your link will be banned.
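
For anyone unsure about the compression step, here is a minimal sketch, assuming a macOS/Linux shell with the zip/unzip tools available and a local folder named project_files/ (a hypothetical name) holding the data; ZIP archiving is lossless, so videos come out of the archive byte-for-byte identical:

# pack the whole folder into a single archive (-r = recurse into subfolders)
zip -r project_files.zip project_files/

# upload project_files.zip to Dropbox and share a link to that one file;
# recipients then unpack it locally with:
unzip project_files.zip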

Levi11
Helpful | Level 5

If this is the case like you say, why does Dropbox set limits on downloading large folders while downloading large files is no problem for them? They're making their cloud software unnecessarily convoluted with these specific restrictions. Again, I'd understand these limits if I were using a free version of Dropbox, but I'm not. I'm paying for a large plan only for them to still put limits on my account.

 

You mentioned compressing them into a zip file. How do I make Dropbox turn my files into a ZIP? And how do the people downloading it turn it back to normal when they download it on their ends? Also, some of the files I'm sending over are videos. If I compress them into a ZIP, won't that ruin the quality of my videos?

vladgrayson
New member | Level 2

Just make it work! The customers should not be finding workarounds for limitations you are putting on them! If the customers are paying, you have to deliver the service!

 

vladgrayson
New member | Level 2

Same issue here. Just ranted to support about this (who are useless). What a bunch of BS....

eddiejag
Helpful | Level 6

This is exactly why I left Dropbox for Google Drive. They try to force EVERYONE onto a pro plan; it's unbelievable. I've had no issues with Google Drive for my clients for 2 years. Now I'm forced to come back to DB because a vendor I'm editing for uploaded 400GB worth of footage there. I can't download the folder; I have to download each individual file using the link.

 

It's insane what their tech team comes up with. I hope this business dies. Pros should be leaving in droves.

nigthfire
Explorer | Level 4

For anybody still coming across this later:

 

I made myself a Python script to download files from a shared folder that would exceed the zip file size limit, without needing to add them to your own Dropbox account, of course.

 

https://github.com/rehroman/dropbox-file-downloader

Здравко
Legendary | Level 20

@eddiejag wrote:

... I can't download the folder, I have to download each individual file, using the link

...


If somebody wants to download a shared folder via its link as is, without compression/decompression (and the associated limits), I made a shell script that, once installed, can just be run; everything pointed to by the provided link will be downloaded with the folder structure kept as is. All types of shared links are supported: www.dropbox.com/s/... www.dropbox.com/sh/... www.dropbox.com/scl/fo/... www.dropbox.com/scl/fi/...

A suitable shell is typically preinstalled on macOS and Linux. cURL and jq are the only external tools used and may need to be installed if they are not already on the system. Here it goes:

#!/bin/bash
##############################################################################
#        Downloads file and/or folders pointed by Dropbox shared link
#        ------------------------------------------------------------
# Just make it executable (if need, using
#   $ chmod a+x download_link
# ) and run it.
# Author: Здравко
#   www.dropboxforum.com/t5/user/viewprofilepage/user-id/422790
##############################################################################

# Get from your application's settings:
#   https://www.dropbox.com/developers/apps/
# Required scopes: files.metadata.read and sharing.read
readonly APP_KEY=""
readonly APP_SECRET=""

fatal() {
  for msg in "$@"; do echo "$msg" >&2; done
  exit 1;
}

if [[ $# -lt 1 || $# -gt 2 ]]; then
  fatal "download_link:" \
  "  Required 1 or 2 command arguments." \
  "    First mandatory argument represents the pointed link and target content\
 to download." \
  "    Second optional argument represents desired local folder where content\
 to be downloaded. If missing, the download happens in current working\
 directory." \
  "  Found $# arguments."
fi
LINK="$1"
if [ $# -eq 2 ]
then FOLDER="$2"
else FOLDER=`pwd`
fi
readonly AUTORIZATION="$APP_KEY:$APP_SECRET"
readonly CONTENT_TYPE="Content-Type: application/json"
readonly LIST_FOLDER="https://api.dropboxapi.com/2/files/list_folder"
readonly LIST_FOLDER_CONTINUE="https://api.dropboxapi.com/2/files/list_folder/continue"
readonly SHARED_METADATA="https://api.dropboxapi.com/2/sharing/get_shared_link_metadata"
if ! which curl > /dev/null; then
  fatal "curl:" \
  "  Required for proper application work - trace and download,\
 but found missing."
fi
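# API helpers: each wraps a Dropbox HTTP endpoint via curl, authenticating
# with the app key/secret, and reports a fatal error on any curl failure.
#   getList         - /2/files/list_folder for a path inside the shared link
#   getListContinue - /2/files/list_folder/continue with a paging cursor
#   getMetadata     - /2/sharing/get_shared_link_metadata for the link itself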
getList() {
  curl -f -X POST $LIST_FOLDER -u "$AUTORIZATION" -H "$CONTENT_TYPE" -d\
   "{\"shared_link\":{\"url\":\"$LINK\"},\"path\":\"$1\"}" 2> /dev/null
  local err=$?
  if [ $err -ne 0 ]; then
    fatal "download_link:" \
    " Inconsistency found while trying link \"$LINK\" at position \"$1\"." \
    " Error curl: $err"
  fi
}
getListContinue() {
  curl -f -X POST $LIST_FOLDER_CONTINUE -u "$AUTORIZATION" -H "$CONTENT_TYPE" -d\
   "{\"cursor\":\"$1\"}" 2> /dev/null
  local err=$?
  if [ $err -ne 0 ]; then
    fatal "download_link:" \
    " Inconsistency found while trying link \"$LINK\" with cursor \"$1\"." \
    " Error curl: $err"
  fi
}
getMetadata() {
  local IN_PATH=""
  if [ $# -eq 1 ]; then
    IN_PATH=",\"path\":\"$1\""
  fi
  curl -f -X POST $SHARED_METADATA -u "$AUTORIZATION" -H "$CONTENT_TYPE" -d\
   "{\"url\":\"$LINK\"$IN_PATH}" 2> /dev/null
  local err=$?
  if [ $err -ne 0 ]; then
    fatal "download_link:" \
    " Inconsistency found while trying link \"$LINK\" at position \"$1\"." \
    " Error curl: $err"
  fi
}
if ! which jq > /dev/null; then
  fatal "jq:" \
  "  Required for proper application work - communication formatting,\
 but found missing."
fi
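# jq helpers: pull individual fields out of the JSON metadata returned above.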
metadata_type() {
  echo "$1" | jq -r '.[".tag"]'
}
metadata_url() {
  echo "$1" | jq -r .url
}
metadata_name() {
  echo "$1" | jq -r .name
}
metadata_client_modified() {
  echo "$1" | jq -r .client_modified
}
readonly URI_REGEX='^(([^:/?#]+):)?(//((([^:/?#]+)@)?([^:/?#]+)(:([0-9]+))?))?((/|$)([^?#]*))(\?([^#]*))?(#(.*))?$'
#                    ↑↑            ↑  ↑↑↑            ↑         ↑ ↑            ↑↑    ↑        ↑  ↑        ↑ ↑
#                    ||            |  |||            |         | |            ||    |        |  |        | |
#                    |2 scheme     |  ||6 userinfo   7 host    | 9 port       ||    12 rpath |  14 query | 16 fragment
#                    1 scheme:     |  |5 userinfo@             8 :...         ||             13 ?...     15 #...
#                                  |  4 authority                             |11 / or end-of-string
#                                  3  //...                                   10 path
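# Turns a shared link into a direct-download URL: keeps scheme, authority and
# path, rewrites the query string so it contains dl=1 (replacing any existing
# dl= parameter), and re-appends the fragment, if any.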
makeLinkDownloadable() {
  if [[ "$1" =~ $URI_REGEX ]]; then
    echo -n "${BASH_REMATCH[1]}${BASH_REMATCH[3]}${BASH_REMATCH[10]}?"
    local POST_URL="${BASH_REMATCH[15]}"
    local QUERY="${BASH_REMATCH[13]}"
    (
      local DL="dl=1"
      IFS='&'
      for param in $QUERY
      do
        if [[ "${param:0:1}" == "?" ]]
        then
          param="${param:1}"
        else
          echo -n '&'
        fi
        if [[ "${param:0:3}" == "dl=" ]]
        then
          echo -n "dl=1"
          DL=""
        else
          echo -n "$param"
        fi
      done
      echo -n "$DL"
    )
    echo "$POST_URL"
  else
    fatal "download_link:" \
          "  Cannot recognize link format: \"$1\"."
  fi
}
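# Downloads one file: $1 = its shared-link metadata (JSON), $2 = local target.
# Skips the download when a local copy exists that is not older than the
# remote client_modified timestamp. Note: the comparison relies on GNU date
# options (-r FILE, -d STRING); macOS's stock BSD date interprets these
# differently, so GNU coreutils date may be needed there.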
fileDownload() {
  if [ -f "$2" ]; then
    local LOCAL_TIME=`date -r "$2" +%s`
    local CLIENT_MODIFIED=`metadata_client_modified "$1"`
    local REMOTE_TIME=`date -d "$CLIENT_MODIFIED" +%s`
    if [[ "$LOCAL_TIME" -ge "$REMOTE_TIME" ]]; then
      return
    fi
  fi
  local URL=`metadata_url "$1"`
  local DOWNLOADABLE=`makeLinkDownloadable "$URL"`
  curl -fL --create-dirs -o "$2" "$DOWNLOADABLE" 2>/dev/null
  local err=$?
  if [ $err -ne 0 ]; then
    fatal "download_link:" \
    " Unexpected error while download \"$DOWNLOADABLE\" to \"$2\"." \
    " Error curl: $err"
  fi
}
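# Handles one page of list_folder entries: files are downloaded and subfolders
# traversed as background jobs; the collected job PIDs are echoed so the
# caller can wait for them. Dots on stderr serve as a progress indicator.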
processList() {
  local FOLDER="$1"
  local REMOTE_PATH="$2"
  local LIST=`echo "$3" | jq -c '.entries[] | {".tag", name}'`
  local PIDS=""
  IFS=$'\n'
  for entry in $LIST; do
    local NAME=`echo "$entry" | jq -r .name`
    local TYPE=`echo "$entry" | jq -r '.[".tag"]'`
    case "$TYPE" in
      "file")
        echo -n . >&2
        (
          fileDownload "`getMetadata "$REMOTE_PATH/$NAME"`" "$FOLDER/$NAME"
          echo -n . >&2
        ) &
        PIDS="$PIDS $!"
        ;;
      "folder")
        folderDownload "$FOLDER/$NAME" "$REMOTE_PATH/$NAME" &
        PIDS="$PIDS $!"
        ;;
      *)
        echo "'$NAME' residing in '$REMOTE_PATH' is neither file nor folder."\
          "Ignored!" >&2
        ;;
    esac
  done
  echo "$PIDS"
}
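# Recursively downloads a shared folder: lists its entries, keeps following
# the cursor while has_more is true, then waits for all jobs it started.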
folderDownload() {
  echo -n . >&2
  local FOLDER="$1"
  local REMOTE_PATH="$2"
  local LIST=`getList "$REMOTE_PATH"`
  local PIDS=`processList "$FOLDER" "$REMOTE_PATH" "$LIST"`
  while [[ `echo "$LIST" | jq .has_more` == "true" ]]; do
    local CURSOR=`echo "$LIST" | jq -r .cursor`
    LIST=`getListContinue "$CURSOR"`
    PIDS="$PIDS `processList "$FOLDER" "$REMOTE_PATH" "$LIST"`"
  done
  for pid in $PIDS; do wait $pid 2> /dev/null; done
  if [[ "$REMOTE_PATH" == "" ]]; then echo . >&2; fi
}
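# Entry point: resolve the link's metadata, then dispatch on its type.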
METADATA=`getMetadata`
TYPE=`metadata_type "$METADATA"`
NAME=`metadata_name "$METADATA"`
case "$TYPE" in
  "file")
    fileDownload "$METADATA" "$FOLDER/$NAME"
    ;;
  "folder")
    folderDownload "$FOLDER/$NAME" ""
    ;;
  *)
    fatal "download_link:" \
    "  Can handle only file or folder link, but found \"$TYPE\"."
    ;;
esac

Just put it somewhere accessible on your PATH, register an app by following the link at the beginning of the script, and fill in the application credentials (no user authentication is needed - no access token, etc.). That's it, just run it.
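
For illustration, here is a usage sketch with placeholder values; the app key, secret, and destination folder shown are hypothetical, and the install lines apply only if curl or jq is missing:

# one-time setup (only if the tools are missing):
#   sudo apt-get install curl jq    # Debian/Ubuntu
#   brew install jq                 # macOS
# register an app at https://www.dropbox.com/developers/apps/ with the
# files.metadata.read and sharing.read scopes, then fill in the constants
# near the top of the script, e.g. (hypothetical values):
#   readonly APP_KEY="abc123xyz"
#   readonly APP_SECRET="def456uvw"
chmod a+x download_link
./download_link "https://www.dropbox.com/scl/fo/..." ~/Downloads/shared_folder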

Good luck to all. 😉

Azimuth1
Explorer | Level 4

Unfortunately your script is not working for me... I repeatedly get an error saying "your link is not correct", even though the dlkey part is present.
