
Save Pandas data frame to Google Cloud bucket

I want to upload a pandas DataFrame from my local machine directly to Google Cloud Storage (that is, I am not running in a Cloud Function). I tried the different approaches from write-a-pandas-dataframe-to-google-cloud-storage-or-bigquery, but I am not able to save the file.

Note: I can only use the google.cloud package.

Below is the code I tried:

from google.cloud import storage
import pandas as pd
input_dict = [{'Name': 'A', 'Id': 100}, {'Name': 'B', 'Id': 110}, {'Name': 'C', 'Id': 120}]
df = pd.DataFrame(input_dict)

Try 1:

destination = f'gs://bucket_name/test.csv'
df.to_csv(destination)

Try 2:

storage_client = storage.Client(project='project')
bucket = storage_client.get_bucket('bucket_name')
gs_file = bucket.blob('test.csv')
df.to_csv(gs_file)

I am getting the errors below:

For Try 1: No such file or directory: 'gs://bucket_name/test.csv'

For Try 2: 'Blob' object has no attribute 'close'

Thanks,

Raghunath.

asked Jun 14 '19 by Raghunath

1 Answer

from google.cloud import storage
import os
import pandas as pd
from io import StringIO  # only needed if you skip the local csv file

# say where your private key to google cloud exists
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'path/to/your-google-cloud-private-key.json'

gcs = storage.Client()

df = pd.DataFrame([{'Name': 'A', 'Id': 100}, {'Name': 'B', 'Id': 110}])

Either write it to a CSV file on your machine first and upload that:

df.to_csv('local_file.csv')
gcs.get_bucket('BUCKET_NAME').blob('FILE_NAME.csv').upload_from_filename('local_file.csv', content_type='text/csv')

Or, if you do not want to create a temporary CSV file, use StringIO:

f = StringIO()
df.to_csv(f)
f.seek(0)  # rewind so upload_from_file reads from the start of the buffer
gcs.get_bucket('BUCKET_NAME').blob('FILE_NAME.csv').upload_from_file(f, content_type='text/csv')
answered Sep 18 '22 by Ali Khosro