 

Using custom docker containers in Dataflow

From this link ("Image for Google Cloud Dataflow instances") I found that Google Cloud Dataflow uses Docker containers for its workers.

I see it's possible to find out the image name of the docker container.

But is there a way I can get this Docker container (i.e., from which repository would I pull it?), modify it, and then tell my Dataflow job to use the new container?

The reason I ask is that we need to install various C++, Fortran, and other libraries in our containers so that the Dataflow jobs can call them, but these installations are very time consuming, so we don't want to use the "resource" property option in Dataflow.

asked Jun 09 '17 by Jonathan Sylvester

2 Answers

Update for May 2020

Custom containers are only supported within the Beam portability framework.

Pipelines launched within the portability framework currently must pass --experiments=beam_fn_api, either explicitly (a user-provided flag) or implicitly (for example, all Python streaming pipelines pass it).
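As a minimal sketch, this is how that flag might be passed when launching a Beam Python pipeline on Dataflow. The project, bucket, and image names are placeholders, and --worker_harness_container_image was the custom-image flag used by the Python SDK around that time:

    # Minimal sketch: launching a Python pipeline with the portability
    # framework enabled. Project, bucket, and image names are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--project=my-project",                # placeholder project ID
        "--region=us-central1",
        "--temp_location=gs://my-bucket/tmp",  # placeholder bucket
        "--experiments=beam_fn_api",           # opt in to the portability framework
        "--worker_harness_container_image=gcr.io/my-project/my-beam-image:latest",
    ])

    with beam.Pipeline(options=options) as p:
        (p | beam.Create(["hello", "world"])
           | beam.Map(print))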

See the documentation here: https://cloud.google.com/dataflow/docs/guides/using-custom-containers?hl=en#docker

There will be more Dataflow-specific documentation once custom containers are fully supported by the Dataflow runner. For support of custom containers in other Beam runners, see: http://beam.apache.org/documentation/runtime/environments.


The docker containers used for the Dataflow workers are currently private, and can't be modified or customized.

In fact, they are served from a private Docker repository, so I don't think you're able to pull them to your machine.

answered Nov 21 '22 by Pablo

Update Jan 2021: Custom containers are now supported in Dataflow.

https://cloud.google.com/dataflow/docs/guides/using-custom-containers?hl=en#docker
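As a sketch, assuming a recent Beam SDK, the custom image can be supplied via --sdk_container_image (which superseded the older --worker_harness_container_image flag); the image path below is a placeholder:

    # Sketch: pointing a Dataflow job at a custom SDK container image.
    # All project/bucket/image names below are placeholders.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--project=my-project",
        "--region=us-central1",
        "--temp_location=gs://my-bucket/tmp",
        "--sdk_container_image=gcr.io/my-project/beam-with-fortran:latest",
    ])

The image itself would typically be built on top of an apache/beam_python*_sdk base image, with the extra C++/Fortran libraries layered in, per the linked guide.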

answered Nov 21 '22 by Travis Webb