
How do I upload to a shared drive in Python with Google Drive API v3?

How do I upload to a shared drive using the Python version of Google Drive API v3?

asked Aug 20 '19 by cydonian



1 Answer

You just need to pass supportsAllDrives=True to the Files: create request (drive_service.files().create(...) in the Python client).
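
If you already have single-drive upload code working, the only change is that extra keyword argument on the create() call. A minimal sketch, assuming drive_service, file_metadata, and media are built the same way as in the full example below:

# supportsAllDrives=True is what allows the file to be created in a shared drive.
f = drive_service.files().create(
    body=file_metadata, media_body=media, supportsAllDrives=True).execute()

The full, self-contained example: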

# In case you need background on the credential/scoping code, it was copied from
#   https://developers.google.com/drive/api/v3/quickstart/python
# Basis for upload code from:
#   https://developers.google.com/drive/api/v3/manage-uploads

import pickle
import os.path
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from googleapiclient.http import MediaFileUpload

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive']

credentials_json = 'oauth-credentials.json'
credentials_pickle = 'token.pickle'

def get_creds():
    creds = None
    # Obtain OAuth token / user authorization.
    if os.path.exists(credentials_pickle):
        with open(credentials_pickle, 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                credentials_json, SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open(credentials_pickle, 'wb') as token:
            pickle.dump(creds, token)
    return creds


def main():
    creds = get_creds()

    # Build the drive service.
    drive_service = build('drive', 'v3', credentials=creds)

    # Get the drive ID of the first shared drive. You can introspect the
    # 'results' dict here to pick the right shared drive if it's not the
    # first one.
    results = drive_service.drives().list(pageSize=10).execute()
    shared_drive_id = results['drives'][0]['id']

    # Create the request metadata, letting the Drive API know what it's receiving.
    # In this example, we place the image inside the shared drive root folder,
    # which has the same ID as the shared drive itself, but we could instead
    # choose the ID of a folder inside the shared drive.
    file_metadata = {
        'name': 'wakeupcat.jpg',
        'mimeType': 'image/jpeg',
        'parents': [shared_drive_id]}

    # Now create the media file upload object and tell it what file to upload,
    # in this case, "wakeupcat.jpg"
    media = MediaFileUpload('/path/to/wakeupcat.jpg', mimetype='image/jpeg')

    # Upload the file, making sure supportsAllDrives=True to enable uploading
    # to shared drives.
    f = drive_service.files().create(
        body=file_metadata, media_body=media, supportsAllDrives=True).execute()

    print("Created file '%s' id '%s'." % (f.get('name'), f.get('id')))


if __name__ == '__main__':
    main()
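
As a quick sanity check, you can list the shared drive's contents after the upload. This is a sketch under the same setup as above (drive_service and shared_drive_id from main()); note that listing also needs the shared-drive flags:

# Listing files in a shared drive requires both supportsAllDrives=True and
# includeItemsFromAllDrives=True, plus corpora='drive' with the drive ID.
listing = drive_service.files().list(
    corpora='drive',
    driveId=shared_drive_id,
    includeItemsFromAllDrives=True,
    supportsAllDrives=True,
    fields='files(id, name)').execute()
for item in listing.get('files', []):
    print('%s (%s)' % (item['name'], item['id']))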
answered Oct 26 '22 by cydonian