Hello and thanks for your time and consideration. I am developing a Jupyter Notebook in Google Cloud Platform / Datalab. I have created a Pandas DataFrame and would like to write this DataFrame to both Google Cloud Storage (GCS) and BigQuery. I have a bucket in GCS and, via the following code, have created the following objects:
import gcp
import gcp.storage as storage

project = gcp.Context.default().project_id
bucket_name = 'steve-temp'
bucket_path = bucket_name
bucket = storage.Bucket(bucket_path)
bucket.exists()
I have tried various approaches based on the Google Datalab documentation, but they continue to fail. Thanks
from google.cloud import storage
import os
import pandas as pd

# Only needed if you're running this code locally.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = r'/your_GCP_creds/credentials.json'

# Note: rows are lists, not sets; sets are unordered and pandas rejects them.
df = pd.DataFrame(data=[[1, 2, 3], [4, 5, 6]], columns=['a', 'b', 'c'])

client = storage.Client()
bucket = client.get_bucket('my-bucket-name')

# Serialize the DataFrame to CSV in memory and upload it as a blob.
bucket.blob('upload_test/test.csv').upload_from_string(df.to_csv(), 'text/csv')
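Here upload_from_string serializes the DataFrame in memory, so no temporary file is needed. The question also asks about BigQuery; below is a minimal sketch for that half, assuming the google-cloud-bigquery client library (and its pyarrow dependency) is installed. The project, dataset, and table names are placeholders you would replace with your own:

from google.cloud import bigquery
import pandas as pd

df = pd.DataFrame(data=[[1, 2, 3], [4, 5, 6]], columns=['a', 'b', 'c'])

client = bigquery.Client()

# Fully qualified destination table: project.dataset.table (placeholder names).
table_id = 'my-project.my_dataset.my_table'

# Load the DataFrame directly into BigQuery and wait for the load job to finish.
job = client.load_table_from_dataframe(df, table_id)
job.result()

Alternatively, the pandas-gbq package offers df.to_gbq('my_dataset.my_table', project_id='my-project'), which does the same load through a pandas-style interface.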