
Exporting data from BigQuery into a local PostgreSQL/MySQL database

I'm completely new to BigQuery and the Python API for BigQuery.

My task is: pull data from a Google open dataset using BigQuery and the Python API, then load the data into a local MySQL/PostgreSQL database (the one installed on my PC).

The way I see it (after my two hours of research) is: pull the data into a local CSV file (using some sample code along those lines), then load the data from the file into the database.
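
For reference, a minimal sketch of what I mean (assuming the google-cloud-bigquery client library is installed and credentials are configured; the project ID, query, and output file name are just placeholders):

    from google.cloud import bigquery
    import csv

    # Placeholder project ID; credentials come from the environment
    # (e.g. GOOGLE_APPLICATION_CREDENTIALS).
    client = bigquery.Client(project="my-project")

    # Example query against a public dataset.
    query = """
        SELECT name, number
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        LIMIT 1000
    """
    rows = client.query(query).result()  # runs the job and waits for it

    # Write the result rows to a local CSV file.
    with open("export.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([field.name for field in rows.schema])
        for row in rows:
            writer.writerow(list(row))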

Does this look like the right way to perform the task?

If not, could you please provide some advice/instruction?

Thanks in advance.

— asked by Denys, Oct 31 '22


1 Answer

The API method

The steps you listed above are correct. In summary, a developer would do the following (a sketch of these steps in Python follows the list):

  1. Use the BigQuery API's "Jobs: insert" method to insert a data Extraction job - this results in the data being sent to a Google Cloud Storage bucket.

  2. Use the Google Cloud Storage API's "Objects: get" method to download the file locally.

  3. For extra credit, I would also call the "Tables: get" method to retrieve the original BigQuery table schema, and use it to build the schema for the MySQL import step.
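
A rough sketch of those three steps using the Python client libraries (google-cloud-bigquery and google-cloud-storage); the project ID, bucket name, and table ID are placeholders you would replace with your own:

    from google.cloud import bigquery, storage

    project = "my-project"                       # placeholder project ID
    bucket_name = "my-export-bucket"             # placeholder GCS bucket you own
    table_id = "my-project.my_dataset.my_table"  # placeholder table to export

    bq = bigquery.Client(project=project)

    # 1. Extraction job: BigQuery writes the table as CSV files into the bucket.
    destination_uri = f"gs://{bucket_name}/export-*.csv"
    extract_job = bq.extract_table(table_id, destination_uri)
    extract_job.result()  # wait for the job to finish

    # 2. Download the exported file(s) from Cloud Storage to the local disk.
    gcs = storage.Client(project=project)
    for blob in gcs.bucket(bucket_name).list_blobs(prefix="export-"):
        blob.download_to_filename(blob.name)

    # 3. Fetch the table schema to help build the MySQL/PostgreSQL DDL.
    table = bq.get_table(table_id)
    for field in table.schema:
        print(field.name, field.field_type, field.mode)

Under the hood, extract_table submits the job via "Jobs: insert" with an extract configuration, downloading the blob uses "Objects: get", and get_table uses "Tables: get".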

The Web UI method

Why not just run whatever query you need using the Google BigQuery Web UI, then export the data into a Google Cloud Storage bucket via the UI? Why do you need to use Python?
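
Either way, once the exported CSV is on your local disk, the remaining step is loading it into the local database. A minimal sketch for PostgreSQL using psycopg2 (connection details, table name, and columns are placeholders matching a hypothetical two-column export; for MySQL, LOAD DATA LOCAL INFILE plays the same role):

    import psycopg2

    # Placeholder connection string for the local PostgreSQL instance.
    conn = psycopg2.connect("dbname=mydb user=myuser host=localhost")
    cur = conn.cursor()

    # Create a table whose columns match the exported CSV / BigQuery schema.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS my_table (
            name   TEXT,
            number INTEGER
        )
    """)

    # Bulk-load the downloaded CSV via COPY (PostgreSQL's fast import path).
    with open("export.csv") as f:
        cur.copy_expert(
            "COPY my_table FROM STDIN WITH (FORMAT csv, HEADER true)", f
        )

    conn.commit()
    cur.close()
    conn.close()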

— answered by Michael Manoochehri, Nov 15 '22