
Discuss Dropbox Developer & API


Dropbox SDK error in chunked upload “cannot access a closed file”

ishan_PGT
Helpful | Level 5

I am using the code below to chunk-upload a 150 MB file, and I am getting an error at

var byteRead = fileStream.Read(buffer, 0, (int)ChunkSize);

The exception says "cannot access a closed file." The first chunk uploads fine, but on the second pass through the loop

"for (ulong idx = 0; idx < numChunks; idx++)"  I get the error at fileStream.Read(buffer, 0, (int)ChunkSize);

using(var fileStream = File.Open(Path.Combine("downloads", title), FileMode.Open)) {
  Console.WriteLine("chunked Upload");
  //ChunkUpload(remotePath, fileStream, (int)ChunkSize,dbx);
  ulong numChunks = (ulong) Math.Ceiling((double) fileStream.Length / (int) ChunkSize);
  byte[] buffer = new byte[(int) ChunkSize];
  string sessionId = null;
  for (ulong idx = 0; idx < numChunks; idx++) {
    var byteRead = fileStream.Read(buffer, 0, (int) ChunkSize);

    using(var memStream = new MemoryStream(buffer, 0, byteRead)) {
      if (idx == 0) {
        try {
          var result = dbx.Files.UploadSessionStartAsync(false, body: fileStream).Result;
          sessionId = result.SessionId;
          Console.WriteLine(sessionId);

        } catch (Exception ex) {
          Console.WriteLine(ex.Message);
        }
      } else {
        var cursor = new UploadSessionCursor(sessionId, (ulong)(int) ChunkSize * idx);

        if (idx == numChunks - 1) {
          FileMetadata fileMetadata = dbx.Files.UploadSessionFinishAsync(cursor, new CommitInfo(remotePath, mode: WriteMode.Overwrite.Instance), memStream).Result;
          Console.WriteLine(fileMetadata.PathDisplay);
        } else {
          dbx.Files.UploadSessionAppendV2Async(cursor, false, memStream).Wait();
        }
      }
    }
  }
}

 

Accepted Solution

ishan_PGT
Helpful | Level 5

I found my mistake: I was passing the main fileStream instead of the chunked memStream in the first iteration. The SDK reads (and disposes) the stream it is given, so fileStream was closed by the time the loop came back around to read the next chunk.
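For reference, a sketch of what the corrected first-iteration branch would look like, assuming the Dropbox .NET SDK's UploadSessionStartAsync and the dbx, idx, memStream, and sessionId variables from the code above:

```
if (idx == 0) {
    // Pass the per-chunk memStream, not fileStream: the SDK consumes and
    // closes the body stream it is given, which is what left fileStream
    // closed for the next loop iteration.
    var result = dbx.Files.UploadSessionStartAsync(false, body: memStream).Result;
    sessionId = result.SessionId;
    Console.WriteLine(sessionId);
}
```

The rest of the loop (UploadSessionAppendV2Async and UploadSessionFinishAsync) already used memStream, which is why only the first chunk's call needed changing.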

