Google Dataflow - Failed to import custom python modules

My Apache Beam pipeline implements custom Transforms and ParDos in Python modules that in turn import other modules I wrote. On the local runner this works fine because all of the files are available on the same path. With the Dataflow runner, however, the pipeline fails with a module import error.

How do I make my custom modules available to all the Dataflow workers? Please advise.
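
For context, the pipeline looks roughly like this (DataAggregation is the module named in the error below; AggregateFn is an illustrative name for one of my DoFns):

    # pipeline.py -- the main workflow file; works with the local runner
    import apache_beam as beam

    import DataAggregation  # my module, which in turn imports other modules of mine

    with beam.Pipeline() as p:
        (p
         | beam.Create(['sample input'])
         | beam.ParDo(DataAggregation.AggregateFn()))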

Below is an example of the error:

ImportError: No module named DataAggregation

    at find_class (/usr/lib/python2.7/pickle.py:1130)
    at find_class (/usr/local/lib/python2.7/dist-packages/dill/dill.py:423)
    at load_global (/usr/lib/python2.7/pickle.py:1096)
    at load (/usr/lib/python2.7/pickle.py:864)
    at load (/usr/local/lib/python2.7/dist-packages/dill/dill.py:266)
    at loads (/usr/local/lib/python2.7/dist-packages/dill/dill.py:277)
    at loads (/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py:232)
    at apache_beam.runners.worker.operations.PGBKCVOperation.__init__ (operations.py:508)
    at apache_beam.runners.worker.operations.create_pgbk_op (operations.py:452)
    at apache_beam.runners.worker.operations.create_operation (operations.py:613)
    at create_operation (/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py:104)
    at execute (/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py:130)
    at do_work (/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py:642)
asked Jul 10 '18 by Karthik N

1 Answer

The issue is probably that you haven't grouped your files as a package. The Beam documentation has a section on it.

Multiple File Dependencies

Often, your pipeline code spans multiple files. To run your project remotely, you must group these files as a Python package and specify the package when you run your pipeline. When the remote workers start, they will install your package. To group your files as a Python package and make it available remotely, perform the following steps:

  1. Create a setup.py file for your project. The following is a very basic setup.py file.

    import setuptools

    setuptools.setup(
        name='PACKAGE-NAME',
        version='PACKAGE-VERSION',
        install_requires=[],
        packages=setuptools.find_packages(),
    )
    
  2. Structure your project so that the root directory contains the setup.py file, the main workflow file, and a directory with the rest of the files.

    root_dir/
        setup.py
        main.py
        other_files_dir/
    

See Juliaset for an example that follows this required project structure.

  3. Run your pipeline with the following command-line option:

    --setup_file /path/to/setup.py
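
Putting the steps together, here is a minimal sketch of what main.py might look like (the module and DoFn names, data_aggregation and AggregateFn, are illustrative, not taken from the question):

    # main.py -- lives next to setup.py
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Import from the packaged directory rather than from a loose top-level
    # script. Note that other_files_dir needs an __init__.py so that
    # setuptools.find_packages() picks it up.
    from other_files_dir import data_aggregation

    def run():
        # With no arguments, PipelineOptions parses sys.argv, so --runner,
        # --project, --setup_file, etc. come straight from the command line.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as p:
            (p
             | beam.Create(['a', 'b'])
             | beam.ParDo(data_aggregation.AggregateFn()))

    if __name__ == '__main__':
        run()

Because --setup_file makes each remote worker pip-install the package at startup, the other_files_dir import resolves on the Dataflow workers just as it does locally.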
    

Note: If you created a requirements.txt file and your project spans multiple files, you can get rid of the requirements.txt file and instead add all the packages it contains to the install_requires field of the setup call (in step 1).
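
For instance, if requirements.txt pinned a couple of dependencies, they would fold into the setup call like this (the package names here are only illustrative):

    setuptools.setup(
        name='PACKAGE-NAME',
        version='PACKAGE-VERSION',
        install_requires=['numpy>=1.14.0', 'pandas>=0.23.0'],  # formerly requirements.txt
        packages=setuptools.find_packages(),
    )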

answered Nov 03 '22 by Michael Butler