However, if it does, it seems I have to go and copy file by file, and then the process becomes EXTREMELY slow and often times out (perhaps running afoul of the API performance limits).
We run this process every 5 mins to make sure we are syncing up any straggling files. How can I efficiently run this ONLY when we are sure there are files that have changed/need to be synced?
Note: I can't simply delete the dest dir and recopy, because there are people syncing these dirs to their local Dropbox accounts and that causes a massive amount of churning. (There could be 100 dirs and 5,000 files in the extreme case.)
I'm using the PHP API in Laravel, but what I'm after is the *correct* way of doing this. If I could simply compare a last-modified date on each folder, or something similarly simple, I could then recopy only the folders that have changed.
That covers the options for efficiently detecting and listing just the changes that have occurred since you last checked, e.g., using the "cursor" functionality on /2/files/list_folder[/continue]. For cases where you can't just copy the folder in its entirety, you can use that to determine which specific items need to be copied.
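To illustrate the cursor flow: you call /2/files/list_folder once to get an initial cursor, then on each 5-minute run you call /2/files/list_folder/continue with the saved cursor, and only the entries that changed since the last call come back. (You're on PHP, but the endpoints are plain HTTPS POSTs with JSON bodies, so here's a language-neutral sketch in Python that just builds the request bodies and filters the returned entries; the paths are placeholders, and the actual HTTP call and token handling are omitted.)

```python
import json

DROPBOX_API = "https://api.dropboxapi.com/2"

def initial_list_body(path):
    """JSON body for /files/list_folder: start listing `path` and obtain
    the first cursor. `recursive` walks subfolders; `include_deleted`
    lets you mirror deletions too."""
    return json.dumps({
        "path": path,
        "recursive": True,
        "include_deleted": True,
    })

def continue_body(cursor):
    """JSON body for /files/list_folder/continue: returns only the
    entries changed since the cursor was issued."""
    return json.dumps({"cursor": cursor})

def changed_paths(entries):
    """From a list_folder[/continue] response's `entries`, pull the paths
    of items that actually need action (new/modified files and deletions),
    skipping plain folder entries."""
    return [e["path_lower"] for e in entries if e[".tag"] in ("file", "deleted")]
```

On each run, if /continue returns an empty `entries` list, nothing changed and you can skip the copy entirely; if `has_more` is true in the response, call /continue again with the new cursor before finishing.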
Also, for the actual copying, when you have multiple items to copy, you can use the /2/files/copy_batch_v2 endpoint to do so in bulk:
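For example, copy_batch_v2 takes a list of from/to path pairs in a single call, instead of one /2/files/copy_v2 request per file. A sketch of building that request body (again just the JSON payload, with placeholder paths; the HTTP call itself is omitted):

```python
import json

def copy_batch_body(pairs, autorename=False):
    """JSON body for /2/files/copy_batch_v2.
    `pairs` is a list of (from_path, to_path) tuples, one per file to copy."""
    return json.dumps({
        "entries": [
            {"from_path": src, "to_path": dst} for src, dst in pairs
        ],
        "autorename": autorename,
    })
```

Note that copy_batch_v2 may run asynchronously: the response can contain an `async_job_id` instead of an immediate result, in which case you poll /2/files/copy_batch/check_v2 with that ID until the batch completes.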