Forum Discussion
ncw
5 years ago · Collaborator | Level 8
Rate limiting when uploading files with rclone
I've received complaints from rclone users that file uploads are progressing really slowly.
Digging into it, what I see is this:
2020-09-03 11:44:33 DEBUG : too_many_requests/: Too many re...
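As an illustration of the behaviour being reported, here is a minimal Go sketch of a client that honours the Retry-After header when the upload endpoint answers 429 `too_many_requests`. The token, path, and retry count are placeholders, and this is not rclone's actual retry code:

```go
package dbxupload

import (
	"bytes"
	"fmt"
	"net/http"
	"strconv"
	"time"
)

// uploadWithRetry uploads data to dropboxPath, honouring the Retry-After
// header whenever the API answers 429 too_many_requests.
func uploadWithRetry(token, dropboxPath string, data []byte) error {
	const endpoint = "https://content.dropboxapi.com/2/files/upload"
	arg := fmt.Sprintf(`{"path": %q, "mode": "add", "autorename": false, "mute": true}`, dropboxPath)

	for attempt := 0; attempt < 5; attempt++ {
		req, err := http.NewRequest("POST", endpoint, bytes.NewReader(data))
		if err != nil {
			return err
		}
		req.Header.Set("Authorization", "Bearer "+token)
		req.Header.Set("Dropbox-API-Arg", arg)
		req.Header.Set("Content-Type", "application/octet-stream")

		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			return err
		}
		resp.Body.Close()

		if resp.StatusCode == http.StatusOK {
			return nil
		}
		if resp.StatusCode != http.StatusTooManyRequests {
			return fmt.Errorf("upload failed: %s", resp.Status)
		}

		// Rate limited: wait for the server-suggested window before retrying.
		wait := 15 * time.Second // fallback matching the window reported in this thread
		if s := resp.Header.Get("Retry-After"); s != "" {
			if secs, err := strconv.Atoi(s); err == nil {
				wait = time.Duration(secs) * time.Second
			}
		}
		time.Sleep(wait)
	}
	return fmt.Errorf("still rate limited after retries: %s", dropboxPath)
}
```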
- 5 years ago
Thanks for the detailed feedback! I'm sharing this with the team.
Greg-DB
5 years ago · Dropbox Staff
Thanks for the report, Nick! This may be due to some changes in how we return rate limit/lock contention errors and their Retry-After time windows. We'll look into it, but for reference, can you let me know, to the best of your knowledge, when you started seeing this?
Also, does rclone ever submit multiple uploads for the same "namespace" at the same time? If so, I recommend reviewing the Performance Guide, if you haven't already. In particular check out the "Batch Upload" section for guidance on how to most efficiently upload multiple files.
By the way, for support with issues like this, you can always find us on the forum here, or contact us directly by opening an API ticket here. Either way is fine. If you would additionally like to consider becoming a technology partner, you can find more information and an application form here.
- ncw · 5 years ago · Collaborator | Level 8
> We'll look into it, but for reference, can you let me know, to the best of your knowledge, when you started seeing this?
Unfortunately these errors don't show up in the integration test logs, otherwise I'd have an accurate timeline for you.
I received the first user report about this on 2nd September 2020 and I verified it for myself today. The user who reported it to me was a new user though, so it is possible/likely that the problem existed before then.
> Also, does rclone ever submit multiple uploads for the same "namespace" at the same time?
Yes it does. My understanding is that a namespace can be a folder, so rclone does upload lots of files to the same folder, and this is indeed the problem area I'm seeing.
It looks like batch uploads could be helpful. Will they make a big difference to performance?
The API doesn't look hard to implement, but batch uploads are a really bad architectural fit for rclone - I'll need to think about how that might work.
I ran my tests again just now and it looks like rclone is rate limited to uploading roughly 2 files per second. After about 100 uploads I start seeing the `too_many_requests` errors with a Retry-After of 15s.
Does batch uploading work around that?
Thanks for the contact links - I couldn't find the developer ticket option earlier!
- Greg-DB · 5 years ago · Dropbox Staff
Thanks for the information! I'll follow up here once I have an update on this.
That batch upload functionality can improve performance overall, especially when you are seeing contention issues, but it will vary from case to case. It helps by only taking a lock once per batch, as opposed to once per file.
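For reference, a minimal Go sketch of the batch upload flow being referred to here, assuming the documented upload_session/start and upload_session/finish_batch endpoints. It is an illustration rather than rclone's implementation, and the finish_batch/check polling step is only noted in a comment:

```go
package dbxbatch

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// cursor, commitInfo, and batchEntry mirror the JSON shapes that finish_batch expects.
type cursor struct {
	SessionID string `json:"session_id"`
	Offset    int64  `json:"offset"`
}

type commitInfo struct {
	Path string `json:"path"`
	Mode string `json:"mode"`
}

type batchEntry struct {
	Cursor cursor     `json:"cursor"`
	Commit commitInfo `json:"commit"`
}

// startSession uploads one small file into a closed upload session and
// returns the session_id to be committed later in the batch.
func startSession(token string, data []byte) (string, error) {
	req, err := http.NewRequest("POST",
		"https://content.dropboxapi.com/2/files/upload_session/start",
		bytes.NewReader(data))
	if err != nil {
		return "", err
	}
	req.Header.Set("Authorization", "Bearer "+token)
	req.Header.Set("Dropbox-API-Arg", `{"close": true}`)
	req.Header.Set("Content-Type", "application/octet-stream")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var out struct {
		SessionID string `json:"session_id"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.SessionID, nil
}

// finishBatch commits every pending session in a single call, so the
// namespace lock is taken once per batch rather than once per file. The
// response may be asynchronous; the returned async_job_id would then be
// polled via /2/files/upload_session/finish_batch/check (not shown).
func finishBatch(token string, entries []batchEntry) error {
	body, err := json.Marshal(map[string][]batchEntry{"entries": entries})
	if err != nil {
		return err
	}
	req, err := http.NewRequest("POST",
		"https://api.dropboxapi.com/2/files/upload_session/finish_batch",
		bytes.NewReader(body))
	if err != nil {
		return err
	}
	req.Header.Set("Authorization", "Bearer "+token)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("finish_batch failed: %s", resp.Status)
	}
	return nil
}
```

Each entry's cursor offset is the number of bytes already sent to that session - here the whole file, since it was uploaded in the start call with close set to true.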
- ncw · 5 years ago · Collaborator | Level 8
I have implemented batching, and yes, it does make a HUGE difference. In fact, for my test directory with 200MB of small images (average size 400k), it transfers 20x faster!
I think the major disadvantage for rclone is that it can't check the hashes of the uploaded files any more. This is because, due to rclone's architecture, the batch only completes after rclone has finished with the input file. However, the user can run "rclone check" after the transfers finish, and that is what I'll recommend to users of the batching feature.
So thank you for pointing me at that feature. It makes a huge difference.
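As an aside on the hash-checking point: Dropbox documents a content_hash algorithm (SHA-256 of each 4 MiB block, then SHA-256 of the concatenated block digests) that can be compared with the content_hash in the uploaded file's metadata after the batch completes. A minimal Go sketch of that algorithm, not taken from rclone:

```go
package dbxhash

import (
	"crypto/sha256"
	"encoding/hex"
	"io"
)

const blockSize = 4 * 1024 * 1024 // 4 MiB, per the content hash spec

// ContentHash computes the Dropbox content_hash of r: SHA-256 each 4 MiB
// block, then SHA-256 the concatenation of the block digests.
func ContentHash(r io.Reader) (string, error) {
	overall := sha256.New()
	buf := make([]byte, blockSize)
	for {
		n, err := io.ReadFull(r, buf)
		if n > 0 {
			blockSum := sha256.Sum256(buf[:n])
			overall.Write(blockSum[:])
		}
		if err == io.EOF || err == io.ErrUnexpectedEOF {
			break
		}
		if err != nil {
			return "", err
		}
	}
	return hex.EncodeToString(overall.Sum(nil)), nil
}
```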