Scheduling cron jobs on Google Cloud DataProc

I currently have a PySpark job that is deployed on a DataProc cluster (1 master & 4 worker nodes with sufficient cores and memory). This job runs on millions of records and performs an expensive computation (Point in Polygon). I am able to successfully run this job by itself. However, I want to schedule the job to be run on the 7th of every month.

What I am looking for is the most efficient way to set up cron jobs on a DataProc Cluster. I tried to read up on Cloud Scheduler, but it doesn't exactly explain how it can be used in conjunction with a DataProc cluster. It would be really helpful to see either an example of cron job on DataProc or some documentation on DataProc exclusively working together with Scheduler.

Thanks in advance!

asked Nov 18 '19 by Alabhya Mishra


1 Answer

For scheduled Dataproc interactions (create cluster, submit job, wait for job, delete cluster, while also handling errors), Dataproc's Workflow Templates API is a better choice than trying to orchestrate these steps yourself. A key advantage is that Workflows are fire-and-forget, and any clusters created are also deleted on completion.

If your Workflow Template is relatively simple, such that its parameters do not change between invocations, the simplest way to schedule it is with Cloud Scheduler. Cloud Functions are a good choice if you need to run a workflow in response to files landing in GCS or events in Pub/Sub. Finally, Cloud Composer is great if your workflow parameters are dynamic or there are other GCP products in the mix.

Assuming your use case is the simple one of running the same workflow with the same parameters every so often, I'll demonstrate using Cloud Scheduler:

I created a workflow in my project called terasort-example.
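
For reference, here is a minimal sketch of how a comparable workflow template could be defined with the google-cloud-dataproc Python client rather than the console. The project ID, region, bucket, script path, and cluster sizing below are placeholders modelled on the question's setup (1 master, 4 workers, a PySpark job), not values from this answer:

    from google.cloud import dataproc_v1

    # Assumed values -- replace with your own project, region and bucket.
    project_id = "example"
    region = "us-central1"

    # Non-global regions need the matching regional endpoint.
    client = dataproc_v1.WorkflowTemplateServiceClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    template = {
        "id": "monthly-pip-job",  # hypothetical template name
        "placement": {
            "managed_cluster": {
                "cluster_name": "pip-cluster",
                "config": {
                    "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
                    "worker_config": {"num_instances": 4, "machine_type_uri": "n1-highmem-8"},
                },
            }
        },
        "jobs": [
            {
                "step_id": "point-in-polygon",
                # Placeholder GCS path to the PySpark script.
                "pyspark_job": {"main_python_file_uri": "gs://my-bucket/pip_job.py"},
            }
        ],
    }

    client.create_workflow_template(
        parent=f"projects/{project_id}/regions/{region}", template=template
    )

Because the template bundles a managed cluster with the job, every instantiation creates the cluster, runs the PySpark step, and deletes the cluster when it finishes.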

I then created a new Service Account in my project, called [email protected], and gave it the Dataproc Editor role; however, something more restricted with just dataproc.workflowTemplates.instantiate would also be sufficient.

After enabling the Cloud Scheduler API, I headed over to Cloud Scheduler in the Developers Console and created a job as follows (a scripted equivalent is sketched below):

Target: HTTP

URL: https://dataproc.googleapis.com/v1/projects/example/regions/global/workflowTemplates/terasort-example:instantiate?alt=json

HTTP Method: POST

Body: {}

Auth Header: OAuth Token

Service Account: [email protected]

Scope: (left blank)

You can test it by clicking Run Now.
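
For completeness, the same Scheduler job can also be created programmatically. Below is a sketch using the google-cloud-scheduler Python client; the project, location, job name, and run time (06:00 UTC on the 7th of each month, matching the question) are assumptions, and the service account email is the redacted placeholder from above:

    from google.cloud import scheduler_v1

    client = scheduler_v1.CloudSchedulerClient()

    # Assumed project and location; Scheduler jobs live in a specific location.
    parent = client.common_location_path("example", "us-central1")

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/monthly-terasort",  # hypothetical job name
        # Unix cron format: 06:00 on day 7 of every month.
        schedule="0 6 7 * *",
        time_zone="Etc/UTC",
        http_target=scheduler_v1.HttpTarget(
            uri=(
                "https://dataproc.googleapis.com/v1/projects/example/regions/global"
                "/workflowTemplates/terasort-example:instantiate?alt=json"
            ),
            http_method=scheduler_v1.HttpMethod.POST,
            body=b"{}",
            # Scheduler fetches an OAuth token for this account at run time.
            oauth_token=scheduler_v1.OAuthToken(
                service_account_email="[email protected]"
            ),
        ),
    )

    client.create_job(parent=parent, job=job)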

Note that you can also paste the entire workflow content into the Body as a JSON payload; the last part of the URL would then become workflowTemplates:instantiateInline?alt=json.
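
If you would rather call the inline variant from code than from Scheduler, the Python client exposes it as instantiate_inline_workflow_template. A minimal sketch with hypothetical names; the deliberately bare cluster config falls back to Dataproc's defaults, so fill it in as in the earlier template for real use:

    from google.cloud import dataproc_v1

    client = dataproc_v1.WorkflowTemplateServiceClient(
        client_options={"api_endpoint": "us-central1-dataproc.googleapis.com:443"}
    )

    # Same shape as the stored template above, but passed directly in the request.
    inline_template = {
        "id": "pip-inline",  # hypothetical
        "placement": {"managed_cluster": {"cluster_name": "pip-cluster", "config": {}}},
        "jobs": [{
            "step_id": "point-in-polygon",
            "pyspark_job": {"main_python_file_uri": "gs://my-bucket/pip_job.py"},
        }],
    }

    operation = client.instantiate_inline_workflow_template(
        parent="projects/example/regions/us-central1", template=inline_template
    )
    operation.result()  # blocks until the workflow (and its managed cluster) is done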

Check out this official doc that discusses other scheduling options.

answered Sep 26 '22 by tix