I have a very simple trainer that follows the sample directory structure:
/dist
    __init__.py
    setup.py
    /trainer
        __init__.py
        task.py
From the /dist directory, it runs fine locally:

$ gcloud ml-engine local train \
    --package-path=trainer \
    --module-name=trainer.task

Now, when I try to submit a training job from the same /dist directory with this command:

$ gcloud ml-engine jobs submit training testA \
    --package-path=trainer \
    --module-name=trainer.task \
    --staging-bucket=$JOB_DIR \
    --region us-central1

it fails with a "No module named trainer" error:
INFO 2017-04-13 12:28:35 -0700 master-replica-0 Installing collected packages: pyyaml, scipy, scikit-learn, trainer
INFO 2017-04-13 12:28:38 -0700 master-replica-0 Successfully installed pyyaml-3.12 scikit-learn-0.18.1 scipy-0.18.1 trainer-0.1
INFO 2017-04-13 12:28:38 -0700 master-replica-0 Running command: python -m trainer.task
ERROR 2017-04-13 12:28:38 -0700 master-replica-0 /usr/bin/python: No module named trainer
EDIT: here is the content of setup.py
from setuptools import find_packages
from setuptools import setup

REQUIRED_PACKAGES = [
    'pyyaml',
    'scipy==0.18.1',
    'scikit-learn'
]

setup(
    name='trainer',
    version='0.1',
    install_requires=REQUIRED_PACKAGES,
    include_package_data=True,
    description='Classifier test'
)
What am I doing wrong?
Thanks,
M
You are missing an important line in your setup.py: the packages argument to the setup() invocation (cf. these instructions). Try this:
from setuptools import find_packages
from setuptools import setup

REQUIRED_PACKAGES = ['pyyaml', 'scipy==0.18.1', 'scikit-learn']

setup(
    name='trainer',
    version='0.1',
    install_requires=REQUIRED_PACKAGES,
    packages=find_packages(),
    include_package_data=True,
    description='Classifier test'
)
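As an optional sanity check (a minimal sketch, assuming the corrected setup.py above), you can build a source distribution from /dist and list its contents; with packages=find_packages() in place, the trainer modules should appear in the archive:

$ cd /dist
$ python setup.py sdist                 # writes ./dist/trainer-0.1.tar.gz inside your project root
$ tar -tzf dist/trainer-0.1.tar.gz      # the listing should include the trainer/ modules (task.py, __init__.py)

If the archive only contains setup.py and metadata, the packages argument is still missing and ML Engine will hit the same import error.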
I've updated the CloudML Engine docs (the change may take a few days to propagate). I replicated your command with --package-path=trainer and the above changes, and the job runs properly in the cloud.
Finally, although it is harmless, the __init__.py in dist/ is unnecessary and can safely be removed.
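For reference, with that file removed the layout would then look like:

/dist
    setup.py
    /trainer
        __init__.py
        task.py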