
Dropbox API Support & Feedback

Find help with the Dropbox API from other developers.


API rate limiting on file download

icydog
Explorer | Level 4

I am using Duplicacy to back up data to my Dropbox Business account. It is a Dropbox API app that writes to the Apps directory. The issue I am having is that when downloading (for restoring or verifying a backup), I'm being rate-limited to an effective download rate of about 20 MB/s, which is a lot slower than I'd like, especially in a disaster recovery scenario.


Duplicacy seems to be making one `/2/files/get_metadata` call and one `/2/files/download` call per chunk being downloaded. Each chunk (file) tends to be around 5-20 MB. Duplicacy does respect the Retry-After header and backs off as requested.
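
For illustration, that per-chunk pattern looks roughly like this with the official Dropbox Python SDK (the token and path are placeholders; Duplicacy itself is written in Go, so this is a sketch of the call pattern, not its actual code):

```python
import time
import dropbox
from dropbox.exceptions import RateLimitError

dbx = dropbox.Dropbox("ACCESS_TOKEN")  # placeholder token

def fetch_chunk(path):
    """One get_metadata call plus one download call per chunk,
    backing off when the API returns a rate-limit error."""
    while True:
        try:
            md = dbx.files_get_metadata(path)  # API call 1 per chunk
            _, res = dbx.files_download(path)  # API call 2 per chunk
            return md, res.content
        except RateLimitError as e:
            # The SDK surfaces the Retry-After header as e.backoff.
            time.sleep(e.backoff or 1)
```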


I have two questions:

  1. Is the Dropbox API rate limit based on number of API calls, throughput, or both?
  2. How could I optimize this to maximize overall throughput on download?
3 Replies

Greg-DB
Dropbox Staff

1. The Dropbox API rate limiting system operates on the number of calls per time period, not bandwidth/throughput.

2. There may be some optimizations to be made, but they would need to be made by the developer of the app. Since it sounds like you're an end-user of Duplicacy and not its developer, you may want to reach out to them about this. For example, based on your description, I would suggest they investigate whether they actually need to make one `/2/files/get_metadata` call per "chunk"; for a large number of files, that adds up to a large number of calls. They may want to look into using `/2/files/list_folder[/continue]` instead, which lets apps list multiple files/folders under a given path with fewer calls, since each call can return multiple entries.
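
A minimal sketch of that listing pattern with the Dropbox Python SDK (the recursive flag and placeholder token are assumptions for illustration):

```python
import dropbox

dbx = dropbox.Dropbox("ACCESS_TOKEN")  # placeholder token

def list_all_files(path):
    """Yield FileMetadata for every file under `path`, many entries
    per API call, instead of one get_metadata call per file."""
    result = dbx.files_list_folder(path, recursive=True)
    while True:
        for entry in result.entries:
            if isinstance(entry, dropbox.files.FileMetadata):
                yield entry
        if not result.has_more:
            break
        # The cursor resumes the listing where the last page ended.
        result = dbx.files_list_folder_continue(result.cursor)
```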

icydog
Explorer | Level 4

Wow, it's nice to see you're still at Dropbox, Greg! Thanks for the very informative and concise answer as always, especially on #1.


Duplicacy is open source, so it can be changed, but its architecture makes removing the `get_metadata` call here difficult (for one thing, Duplicacy is generic and needs to work with a variety of storage backends). `list_folder` doesn't work either, because successive chunks are unlikely to be stored in the same directory, and enumerating the entire storage with the recursive option is infeasible for large data sets (I have millions of files). I appreciate the pointers!


-davidz

Greg-DB
Dropbox Staff

Thanks for the additional context! In that case, I'm sending this along as a feature request for a batch version of /2/files/get_metadata, but I can't promise if or when that might be implemented.
