Hi, I need to iterate through a few tens of thousands of files in a Dropbox folder using Python; more specifically, I'm doing something like this:
import os
import pandas as pd

mynewdata = []
for subdir, dirs, files in os.walk(dropbox_folder):
    for file in files:
        with open(os.path.join(subdir, file)) as csvfile:
            df = pd.read_csv(csvfile)
My intent is not to change any file: just open it, load it into pandas, extract a tiny bit of information, close it, and move on to the next file. All of the files are already synced on both the cloud and my laptop. But "someone" is touching the files during the Python loop, causing Dropbox to re-sync, which slows everything down. If I open each file read-only, will that solve the problem, or am I doomed?
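For reference, here is a self-contained version of the loop I'm running, with an explicit read-only open. It uses a temporary stand-in folder and a tiny made-up CSV instead of my real Dropbox path, just so the sketch runs anywhere:

```python
import os
import tempfile

import pandas as pd

# Stand-in for the real Dropbox folder (hypothetical data, for illustration only)
folder = tempfile.mkdtemp()
with open(os.path.join(folder, "a.csv"), "w") as f:
    f.write("x,y\n1,2\n")

results = []
for subdir, dirs, files in os.walk(folder):
    for name in files:
        path = os.path.join(subdir, name)
        # "r" is read-only text mode; the file is never written to
        with open(path, "r") as csvfile:
            df = pd.read_csv(csvfile)
        # extract a tiny bit of information, e.g. the row count
        results.append(len(df))

print(results)
```

Note that `open(path)` already defaults to mode `"r"` (read-only), and `pd.read_csv(path)` would accept the path directly without an explicit `open` at all.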