Forum Discussion
Julien_Do
6 years ago · Explorer | Level 3
Chunk upload error if only one chunk (C#)
Hi,
I'm using the chunked upload and it works fine. I'd like to use it for all files, not just the big ones, because it makes my uploads easier to track. However, I'm getting an error when the file is smaller than the chunk size:
InnerException = {"Cannot access a closed Stream."}
Message = "Error while copying content to a stream."
my code is:
```csharp
using (var stream = new MemoryStream(File.ReadAllBytes(path)))
{
    int numChunks = (int)Math.Ceiling((double)stream.Length / chunkSize);
    byte[] buffer = new byte[chunkSize];
    string sessionId = null;

    for (var idx = 0; idx < numChunks; idx++)
    {
        // Read the next chunk from the source stream into the buffer.
        var byteRead = stream.Read(buffer, 0, chunkSize);

        using (MemoryStream memStream = new MemoryStream(buffer, 0, byteRead))
        {
            if (idx == 0)
            {
                // First chunk: start the upload session.
                var result = await client.Files.UploadSessionStartAsync(body: memStream);
                sessionId = result.SessionId;

                if (idx == numChunks - 1)
                {
                    // File fits in a single chunk: finish the session right away.
                    UploadSessionCursor cursor = new UploadSessionCursor(sessionId, (ulong)(chunkSize * idx));
                    await client.Files.UploadSessionFinishAsync(cursor, new CommitInfo(folder + "/" + fileName), memStream);
                }
            }
            else
            {
                UploadSessionCursor cursor = new UploadSessionCursor(sessionId, (ulong)(chunkSize * idx));

                if (idx == numChunks - 1)
                {
                    // Last chunk: finish the session and commit the file.
                    await client.Files.UploadSessionFinishAsync(cursor, new CommitInfo(folder + "/" + fileName), memStream);
                }
                else
                {
                    // Middle chunk: append to the session.
                    await client.Files.UploadSessionAppendV2Async(cursor, body: memStream);
                }
            }
        }
    }
}
```
Is there any way to upload a file that fits in a single chunk using the chunk upload, or is the only option to use the normal upload for files smaller than the chunk size?
Thanks in advance!
- Greg-DB (Dropbox Staff)
You can use upload sessions (a.k.a. "chunk upload") to upload small files. Using the normal upload is more efficient for small files though, since you only need to make one call as opposed to two, so I would recommend using that for them, despite the bit of added code complexity of handling both cases.
If you do want to use upload sessions to upload small files, you would need two calls:
- `UploadSessionStartAsync`: to start the upload session and upload the data.
- `UploadSessionFinishAsync`: to finish the upload session and commit the file. Make sure you send the correct offset.
(You can technically send the file data on either of the calls.)
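For example, a rough sketch of that two-call flow for a file that fits in a single chunk might look like the following (the method name `UploadSmallFileViaSessionAsync` and the `dropboxPath` parameter are just placeholders for illustration):

```csharp
using System.IO;
using System.Threading.Tasks;
using Dropbox.Api;
using Dropbox.Api.Files;

// Sketch only: upload a file small enough to fit in one chunk using an upload session.
static async Task UploadSmallFileViaSessionAsync(DropboxClient client, string path, string dropboxPath)
{
    byte[] data = File.ReadAllBytes(path);

    // Call 1: start the session and send all of the file's bytes in the same call.
    string sessionId;
    using (var body = new MemoryStream(data))
    {
        var start = await client.Files.UploadSessionStartAsync(body: body);
        sessionId = start.SessionId;
    }

    // Call 2: finish the session and commit the file. The cursor offset must equal
    // the number of bytes already uploaded in the session (here, the whole file),
    // and no further data needs to be sent.
    var cursor = new UploadSessionCursor(sessionId, (ulong)data.Length);
    using (var empty = new MemoryStream())
    {
        await client.Files.UploadSessionFinishAsync(cursor, new CommitInfo(dropboxPath), empty);
    }
}
```

The important part is that the offset in the finish cursor reflects everything already uploaded in the session, not zero.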
For the error you are getting though, I'd first check what line it's occurring on. It looks like you're attempting to read from a stream, presumably one of the two `MemoryStream`s you open, when the stream is already closed. I recommend stepping through with the debugger to track down the issue.
- Julien_Do (Explorer | Level 3)
Yes, it was a problem with the offset.
Thank you!
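In case it helps anyone else, here is a minimal sketch of what the corrected single-chunk branch from the code above could look like. It assumes the start call has already uploaded the chunk's bytes, so the finish call passes the actual number of bytes uploaded as the offset and an empty body rather than reusing the consumed stream:

```csharp
// Hypothetical fix for the single-chunk case, for illustration only.
if (idx == 0)
{
    var result = await client.Files.UploadSessionStartAsync(body: memStream);
    sessionId = result.SessionId;

    if (idx == numChunks - 1)
    {
        // The offset is the number of bytes already uploaded, not chunkSize * idx (which is 0 here).
        var cursor = new UploadSessionCursor(sessionId, (ulong)byteRead);

        // memStream has already been consumed by UploadSessionStartAsync, so send an empty body.
        using (var empty = new MemoryStream())
        {
            await client.Files.UploadSessionFinishAsync(cursor, new CommitInfo(folder + "/" + fileName), empty);
        }
    }
}
```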