
Forum Discussion

homopoluza
Helpful | Level 6
9 months ago

AssertionError: Expected content-type to be application/json, got 'application/grpc'

Hi, everybody!

I have a simple Python script to create an archive and store it in Dropbox. 

 

import subprocess
import os
import dropbox
import shutil
from datetime import datetime

class DropboxUploader:
    def __init__(self, dropbox_token, CHUNK_SIZE):
        self.dropbox_token = dropbox_token
        self.CHUNK_SIZE = CHUNK_SIZE
        self.dbx = dropbox.Dropbox(dropbox_token)

    def upload(self, file_name, file_size):
        # `pwd` is the module-level working directory set below
        with open(f"{pwd}/{file_name}", 'rb') as f:
            # Start an upload session with the first chunk
            upload_session_start_result = self.dbx.files_upload_session_start(f.read(self.CHUNK_SIZE))
            cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_start_result.session_id, offset=f.tell())
            # Destination path in Dropbox (here it mirrors the local path)
            commit = dropbox.files.CommitInfo(path=f"{pwd}/{file_name}")

            if f.tell() >= file_size:
                # The whole file fit into the first chunk; finish the session right away
                self.dbx.files_upload_session_finish(b"", cursor, commit)

            while f.tell() < file_size:
                if (file_size - f.tell()) <= self.CHUNK_SIZE:
                    # Last chunk: upload it and commit the file
                    self.dbx.files_upload_session_finish(f.read(self.CHUNK_SIZE), cursor, commit)
                else:
                    # files_upload_session_append_v2 expects the cursor object,
                    # not session_id and offset as separate arguments
                    self.dbx.files_upload_session_append_v2(f.read(self.CHUNK_SIZE), cursor)
                    cursor.offset = f.tell()

# Change these variables
root_dir = '/home/user/site_name'
site = 'site'
database = 'database'
dropbox_token = 'xxxxxxxxxxxxxxx'

CHUNK_SIZE = 8 * 1024 * 1024  # 8MB
archive_name = site + '_' + datetime.now().strftime("%Y%m%d_%H%M%S")
database_dump_name = database + '_' + datetime.now().strftime("%Y%m%d_%H%M%S") + '.sql'
pwd = os.getcwd()

archive_path = shutil.make_archive(archive_name, 'tar', root_dir)
archive_name = os.path.basename(archive_path)

command = f"mysqldump -u root {database} > {database_dump_name}"
subprocess.run(command, shell=True)

archive_size = os.path.getsize(f"./{archive_name}")
database_dump_size = os.path.getsize(f"./{database_dump_name}")

# Only for file sizes less than 150 MB (the names already include .tar/.sql)
# dbx = dropbox.Dropbox(dropbox_token)
# with open(f"./{archive_name}", 'rb') as f:
#     dbx.files_upload(f.read(), f"{pwd}/{archive_name}")
# with open(f"./{database_dump_name}", 'rb') as f:
#     dbx.files_upload(f.read(), f"{pwd}/{database_dump_name}")

uploader = DropboxUploader(dropbox_token, CHUNK_SIZE)
uploader.upload(archive_name, archive_size)
uploader.upload(database_dump_name, database_dump_size)

 

But there is an error:

 

Exception has occurred: AssertionError
Expected content-type to be application/json, got 'application/grpc'
  File "/home/user/dropbox/dropbox_backup.py", line 15, in upload
    upload_session_start_result = self.dbx.files_upload_session_start(f.read(self.CHUNK_SIZE))
  File "/home/user/dropbox/dropbox_backup.py", line 54, in <module>
    uploader.upload(archive_name, archive_size)
AssertionError: Expected content-type to be application/json, got 'application/grpc'

 

 
I saw this post: Python upload big file example - Dropbox Community (dropboxforum.com). My code is practically the same, and I don't get why mine isn't working.

  • I just renewed the token, and all is good. Thank you for your time, Greg-DB
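
    Worth noting: access tokens that start with "sl." are short-lived (Greg mentions below that they expire within about four hours), so a scheduled backup script will keep breaking as each token ages out. A minimal sketch of the longer-term fix, assuming an app key/secret and a refresh token obtained through the OAuth flow (all values below are placeholders):

    import dropbox

    dbx = dropbox.Dropbox(
        oauth2_refresh_token="YOUR_REFRESH_TOKEN",  # placeholder
        app_key="YOUR_APP_KEY",                     # placeholder
        app_secret="YOUR_APP_SECRET",               # placeholder
    )
    # The SDK now obtains and renews short-lived access tokens automatically
    print(dbx.users_get_current_account())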

  • Greg-DB
    Dropbox Staff

    Thanks for the report! This error indicates that the Dropbox API servers did not respond with the expected format.

     

    I just tried this out myself though, and I wasn't able to reproduce this. This kind of issue could just be a result of temporary server issues. Are you still seeing this problem?

     

    If you are still seeing this, please let me know:

    • about what percent of your calls fail like this
    • if this code was previously working for you, and if so, when you saw this problem start occurring
    • whether or not other calls, such as `self.dbx.users_get_current_account()`, also fail like this for you (a quick way to gather these data points is sketched below)
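
    As a quick way to gather those data points, a standalone loop along these lines could work; this is just a sketch, and the token string is a placeholder for the one the script uses:

    import dropbox

    dbx = dropbox.Dropbox("ACCESS_TOKEN_HERE")  # placeholder token
    attempts, failures = 20, 0
    for _ in range(attempts):
        try:
            # Any simple authenticated call works as a connectivity test
            dbx.users_get_current_account()
        except Exception:
            # The AssertionError in question is raised inside the SDK's
            # transport layer, so a broad except is used just to count
            failures += 1
    print(f"{failures}/{attempts} calls failed")
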
    • homopoluza
      Helpful | Level 6

      Sorry for the late reply.

      • about what percent of your calls fail like this
        All of them
      • if this code was previously working for you, and if so, when you saw this problem start occurring
        No, this is a new script for me. I wrote it with the help of this post:
         https://www.dropboxforum.com/t5/Dropbox-API-Support-Feedback/python-upload-big-file-example/td-p/166626
        It worked for others without problems.
      • whether or not other calls, such as `self.dbx.users_get_current_account()` also fail like this for you
        Unfortunately, yes. The token was provided to me by a colleague. Could it be a restriction of the trial account, perhaps?

        print(dbx.users_get_current_account())
        AssertionError: Expected content-type to be application/json, got 'application/grpc'

        Thank you for your assistance 
      • Greg-DB
        Dropbox Staff

        Thanks for following up. The API functionality you're attempting to use is available to all Dropbox plans/trials, and even if an API error is returned, it shouldn't be sent with that Content-Type.

         

        It's unclear what would be causing this, so we'll need to look into this further. Can you please try running the following at least 5 times and send me the output? This will enable more verbose output which may be helpful for investigating this issue:

        import http.client
        # Log the raw HTTP requests and responses to stdout
        http.client.HTTPConnection.debuglevel = 1

        import dropbox
        dbx = dropbox.Dropbox('ACCESS_TOKEN_HERE')
        print(dbx.users_get_current_account())

         

        Be sure to replace ACCESS_TOKEN_HERE with the access token you're using. As long as that token starts with "sl." and more than four hours have passed since it was created, please leave it in the output; it would be expired by then anyway, and it may be helpful for investigating this.