
Forum Discussion

lazyCoder
Explorer | Level 3
7 months ago

API - libcurl cannot use CURLOPT_RANGE or any other option to resume file downloads

Hi,

I use libcurl to call the Dropbox API to implement some functionality.

When calling files/download, the connection drops once the file exceeds 2 GB, so I want to implement resumable downloads.

Locally, I use curl to send

curl -X POST <url> \
<rest code>
--header 'Range: {"bytes": "xxx-"}'

to Dropbox and the download resumes.

But when I do the same with libcurl it doesn't work, and a curl 33 error (CURLE_RANGE_ERROR) appears.

Here's my C code (parts omitted):
<curl init code>
FILE *fp = fopen(tmp_file, "ab"); /* append in binary mode */
if (resume_flag) {
        wd_DEBUG("[DEBUG] add resume download header\n");
        char *resume_dl = NULL;
        resume_dl = (char *)malloc(strlen(filename) + 128);
        memset(resume_dl, 0, strlen(filename) + 128);
        snprintf(resume_dl, strlen(filename) + 128, "Range: {\"bytes\": \"%lld-\"}", tmp_file_size);
        headerlist = curl_slist_append(headerlist, resume_dl); /* curl_slist_append copies the string */
        free(resume_dl);
}
curl_easy_setopt(curl, CURLOPT_WRITEDATA, fp);
res = curl_easy_perform(curl);
<rest code>

I also tried CURLOPT_RANGE, CURLOPT_RESUME_FROM, and CURLOPT_RESUME_FROM_LARGE; all have the same problem: files always start downloading from 0.

I wonder whether the Dropbox API supports resuming file downloads, or whether the problem is in the code I wrote.


  • Greg-DB
    Dropbox Staff

    First, for reference, can you elaborate on what you mean when you say "the download will be disconnected when the file exceeds 2G"? Do you mean you're seeing an issue from the Dropbox API servers that causes this, or are you referring to some local constraint in your environment?

     

    The Dropbox API servers should support downloading files larger than 2 GB via the /2/files/download endpoint. I just tried that out and was able to successfully download a 5 GB file in one request, but please let us know if that's not working for you and you're seeing some server error. Please note, though, that there is a time limit of about 1 hour on these /2/files/download requests, so you may just be running into that, in which case resuming the download using Range requests is the best practice.

     

    Anyway, the Dropbox API does support Range requests for file downloads, such as via /2/files/download. The syntax you're using to specify the byte range does not look correct though; the Range value is not specified as JSON. There are some examples in the RFC that defines Range requests, but here's an example of how it would look to set it with curl on the command line:

    --header 'Range: bytes=668909568-'
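
    And in case it's useful, here's a minimal sketch of attaching that header with libcurl. (This is just an illustration; resume_offset is a hypothetical variable holding the number of bytes already downloaded, and the rest of the handle setup is assumed to already exist.)

    struct curl_slist *headerlist = NULL;
    char range_hdr[64];

    /* Standard HTTP Range syntax: bytes=<start>- (not JSON) */
    snprintf(range_hdr, sizeof(range_hdr), "Range: bytes=%lld-", (long long)resume_offset);
    headerlist = curl_slist_append(headerlist, range_hdr); /* the string is copied */
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);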

     

    • Здравко
      Legendary | Level 20

      lazyCoder, you definitely demonstrate laziness when reading documentation (something matching your nickname). 😁 When you're going to use some library, read that library's documentation first! 👈 Did you do that? 🧐 Sarcasm, of course.

      The libcurl documentation can be seen here. There are lots of options that cover everything the command-line tool does and more. You may find the CURLOPT_RESUME_FROM option useful. 🙋 Everything above (in the resume part) is done in just a single line, as safely as possible and without additional memory allocation. 😉

      Hope this helps.

       

      Add: Just in case, to avoid confusion: if your code is 32-bit (i.e. you're using a 32-bit machine or cross-compiling 32-bit code), you'll need the extended "LARGE" version of the above option, CURLOPT_RESUME_FROM_LARGE. The small option supports up to 2 GB, while the large one goes up to 8 EB. They are functionally equivalent when your code is 64-bit.
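
      For illustration, the single line could look like this (a sketch only; tmp_file_size stands for the number of bytes already downloaded, as in the code from the question):

      /* libcurl builds the Range header itself from this offset */
      curl_easy_setopt(curl, CURLOPT_RESUME_FROM_LARGE, (curl_off_t)tmp_file_size);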

    • lazyCoder
      Explorer | Level 3

      Greg-DB Thanks for the reply. I have found the reason: the header information was not correctly added to the headerlist I defined.

      As mentioned earlier, downloads of files over 2 GB break.
      The reason is that the download takes too long, probably exceeding the 1-hour limit you mentioned.
      Would adding a "Keep-Alive" parameter to the header improve this?

      Здравко Thank you for your suggestion. As mentioned earlier, several of the CURLOPT_XXX options didn't do the job for me. I will read the documentation carefully next time.

      • Здравко
        Legendary | Level 20

        Hi lazyCoder,

        I have to read more carefully too... or not. 🤔 It seems that in the libcurl documentation the header formats are swapped by mistake: the request Range there is formatted rather as one would for a response. I have no idea why; I just checked. So either the documentation is incomplete, or there is some bug, if we assume the documentation is correct.

        Anyway, my test setup follows (in case it's useful):

        /*****************************************************************************
         *                        Test curl partial download
         *                        ==========================
         *
         * Simulated breakage of the connection and next continuation from the place
         * where broke.
         *
         * range_test.c
         * Author: Здравко
         * https://www.dropboxforum.com/t5/user/viewprofilepage/user-id/422790
         *
         *****************************************************************************/
        
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <curl/curl.h>
        
        const char *access_token = ""; // Put a valid token here for the test!
        
        // const char *download_point = "http://127.0.0.1:8080/"; // netcat -l -p 8080
        const char *download_point = "https://content.dropboxapi.com/2/files/download";
        const char *param_template = "Dropbox-API-Arg: {\"path\":\"%s\"}";
        
        int main(int argc, char **argv) {
          const char *dropboxPath;
          const char *localPath;
          char *param;
          CURL *curl;
          CURLcode res;
          struct curl_slist *param_list = NULL;
          FILE *output;
          long current_pos;
          char rangeBuf[48];
        
          if (argc != 3) {
            fputs("2 command arguments needed - dropbox file path (or file id) and "
                  "local file path!", stderr);
            return EXIT_FAILURE;
          }
          dropboxPath = argv[1];
          localPath = argv[2];
          param = (char*)malloc(strlen(param_template) + strlen(dropboxPath) + 1);
          sprintf(param, param_template, dropboxPath);
          printf("Will try read from '%s'\nand write to '%s'\n\n",
                 dropboxPath, localPath);
        
          curl_global_init(CURL_GLOBAL_ALL);
        
          puts("Starting break simultaion...");
          output = fopen(localPath, "wb");
          if (output == NULL) {
            free(param);
            curl_global_cleanup();
            fputs("Cannot open target file for write - first step!", stderr);
            return EXIT_FAILURE;
          }
          curl = curl_easy_init();
          if (curl == NULL) {
            fclose(output);
            free(param);
            curl_global_cleanup();
            fputs("Cannot open cURL handle - first step!", stderr);
            return EXIT_FAILURE;
          }
          param_list = curl_slist_append(NULL, param);
          param_list = curl_slist_append(param_list, "Range: bytes=0-199");
          curl_easy_setopt(curl, CURLOPT_URL, download_point);
          curl_easy_setopt(curl, CURLOPT_POST, 1L);
          curl_easy_setopt(curl, CURLOPT_HTTPAUTH, CURLAUTH_BEARER);
          curl_easy_setopt(curl, CURLOPT_XOAUTH2_BEARER, access_token);
          curl_easy_setopt(curl, CURLOPT_HTTPHEADER, param_list);
          curl_easy_setopt(curl, CURLOPT_MIMEPOST, NULL); // Set empty post
          curl_easy_setopt(curl, CURLOPT_WRITEDATA, output);
          res = curl_easy_perform(curl);
          curl_slist_free_all(param_list);
          param_list = NULL;
          curl_easy_cleanup(curl);
          if (res != CURLE_OK) {
            fclose(output);
            free(param);
            curl_global_cleanup();
            fprintf(stderr, "Error while download - first step: %s!\n",
                    curl_easy_strerror(res));
            return EXIT_FAILURE;
          }
          printf("Current file size: %ld\n", ftell(output));
          fclose(output);
          puts("Simultaion completed.\n");
        
          puts("Starting resume download...");
          output = fopen(localPath, "ab");
          if (output == NULL) {
            free(param);
            curl_global_cleanup();
            fputs("Cannot open target file for write - second step!", stderr);
            return EXIT_FAILURE;
          }
          fseek(output, 0, SEEK_END); /* the initial position after fopen("ab") is implementation-defined */
          current_pos = ftell(output);
          printf("Resume position is %ld\n", current_pos);
          curl = curl_easy_init();
          if (curl == NULL) {
            fclose(output);
            free(param);
            curl_global_cleanup();
            fputs("Cannot open cURL handle - second step!", stderr);
            return EXIT_FAILURE;
          }
          param_list = curl_slist_append(NULL, param);
          sprintf(rangeBuf, "Range: bytes=%ld-", current_pos);
          param_list = curl_slist_append(param_list, rangeBuf);
          curl_easy_setopt(curl, CURLOPT_URL, download_point);
          curl_easy_setopt(curl, CURLOPT_POST, 1L);
          curl_easy_setopt(curl, CURLOPT_HTTPAUTH, CURLAUTH_BEARER);
          curl_easy_setopt(curl, CURLOPT_XOAUTH2_BEARER, access_token);
          curl_easy_setopt(curl, CURLOPT_HTTPHEADER, param_list);
          curl_easy_setopt(curl, CURLOPT_MIMEPOST, NULL); // Set empty post
          curl_easy_setopt(curl, CURLOPT_WRITEDATA, output);
          res = curl_easy_perform(curl);
          curl_slist_free_all(param_list);
          param_list = NULL;
          curl_easy_cleanup(curl);
          if (res != CURLE_OK) {
            fclose(output);
            free(param);
            curl_global_cleanup();
            fprintf(stderr, "Error while download - second step: %s!\n",
                    curl_easy_strerror(res));
            return EXIT_FAILURE;
          }
          printf("Current file size: %ld\n", ftell(output));
          fclose(output);
          puts("Download resume completed.\n");
        
          curl_global_cleanup();
          free(param);
          return EXIT_SUCCESS;
        }
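
        For reference: on a typical Linux setup this should build with just gcc range_test.c -o range_test -lcurl and run as ./range_test <dropbox path or file id> <local path>. The first step downloads bytes 0-199, so the second step should report resume position 200 and fetch the rest.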

        Good luck.
