
How to read BigQuery table using python pipeline code in GCP Dataflow

Could someone please share the syntax to read from and write to a BigQuery table in a pipeline written in Python for GCP Dataflow?

Asked Jan 22 '18 by Aditya Dixit

People also ask

What is Dataflow in Python?

Dataflow is a managed service for executing a wide variety of data processing patterns. These pipelines are created using the Apache Beam programming model, which allows for both batch and streaming processing.

What is PCollection in Dataflow?

A PCollection represents a potentially distributed, multi-element dataset that acts as the pipeline's data. Apache Beam transforms use PCollection objects as inputs and outputs for each step in your pipeline.
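
A minimal sketch tying these two ideas together, using Beam's default local runner and a made-up in-memory input (not part of the original answer); every | step consumes one PCollection and produces another:

import apache_beam as beam

# Runs locally on the DirectRunner when no runner is configured.
with beam.Pipeline() as pipeline:
    words = pipeline | 'Create' >> beam.Create(['a', 'b', 'a'])    # PCollection of strings
    counts = (words
              | 'PairWithOne' >> beam.Map(lambda w: (w, 1))
              | 'CountPerWord' >> beam.CombinePerKey(sum))         # PCollection of (word, count) pairs
    counts | 'Print' >> beam.Map(print)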


1 Answer

Run on Dataflow

First, construct a Pipeline with the following options for it to run on GCP Dataflow:

import apache_beam as beam

options = {'project': <project>,          # GCP project ID that will run the job
           'runner': 'DataflowRunner',    # execute on the Dataflow service instead of locally
           'region': <region>,            # e.g. 'us-central1'
           'setup_file': <setup.py file>} # packages local dependencies for the workers
# Note: Dataflow jobs also need a GCS staging area,
# e.g. 'temp_location': 'gs://<bucket>/tmp'.
pipeline_options = beam.pipeline.PipelineOptions(flags=[], **options)
pipeline = beam.Pipeline(options=pipeline_options)
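
Once the read/write transforms below have been attached, submitting the job is just a matter of running the pipeline; pipeline.run() and wait_until_finish() are standard Beam calls, and blocking on completion is optional:

result = pipeline.run()      # submits the job to Dataflow
result.wait_until_finish()   # optionally block until the job completes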

Read from BigQuery

Define a BigQuerySource with your query and use beam.io.Read to read data from BQ:

BQ_source = beam.io.BigQuerySource(query=<query>)  # or pass table='<project>:<dataset>.<table>'
BQ_data = pipeline | beam.io.Read(BQ_source)
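
Each element of BQ_data is a Python dict keyed by column name, so downstream transforms can access fields directly. A small sketch, where 'name' stands in for a hypothetical column returned by <query>:

# Rows arrive as dicts, e.g. {'name': 'Alice', 'score': 3}
# ('name' is a hypothetical column, not from the original answer).
names = BQ_data | beam.Map(lambda row: row['name'])

Note that on newer Beam releases, beam.io.ReadFromBigQuery(query=<query>) is the preferred replacement for the now-deprecated BigQuerySource.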

Write to BigQuery

There are two options to write to BigQuery (a combined end-to-end sketch follows the list):

  • use a BigQuerySink and beam.io.Write:

    BQ_sink = beam.io.BigQuerySink(<table>, dataset=<dataset>, project=<project>)
    BQ_data | beam.io.Write(BQ_sink)
    
  • use beam.io.WriteToBigQuery:

    BQ_data | beam.io.WriteToBigQuery(<table>, dataset=<dataset>, project=<project>)
    
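Putting the pieces together, here is a minimal end-to-end sketch; the schema string, dispositions, and field names are illustrative assumptions rather than part of the original answer:

import apache_beam as beam

options = {'project': <project>, 'runner': 'DataflowRunner', 'region': <region>}
pipeline_options = beam.pipeline.PipelineOptions(flags=[], **options)

with beam.Pipeline(options=pipeline_options) as pipeline:
    rows = pipeline | beam.io.Read(beam.io.BigQuerySource(query=<query>))
    # Reshape each row dict into the output schema (hypothetical fields).
    out = rows | beam.Map(lambda row: {'name': row['name'], 'score': row['score']})
    out | beam.io.WriteToBigQuery(
        <table>, dataset=<dataset>, project=<project>,
        schema='name:STRING,score:INTEGER',  # required if the table may need to be created
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)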
Answered Sep 23 '22 by Robbe