I am trying to implement a Google Cloud Function based on code in a Git-style repository. I have the code linked to Google Cloud Platform's "Source Repositories", and my function runs fine when I copy and paste the code into the GCP Function "Inline editor". When I switch to the "Cloud Source repository" option, I can tell that it is reading from that repository, because I had to work through other errors first. However, after resolving those earlier issues, I am now getting this error:
Function load error: File main.py that is expected to define function doesn't exist
My files are in a structure similar to this, with main.py in the root directory:
.
├── package
| ├──script1.py
| └──script2.py
├── package2
├── ...
├── main.py
└── requirements.txt
It reads requirements.txt fine (some of the earlier errors came from that step), so why is it not reading main.py? [Screenshot of my GCP Function configuration omitted.]
I have tried moving main.py to another directory in the project and setting the "Directory with source code" to that directory, but that gave me an error saying it couldn't find the directory. Any constructive ideas?
I am using a branch other than master from my repository, and the function uses a Google Cloud Pub/Sub topic trigger.
Cloud Functions deployment can fail if the entry point to your code, that is, the name of the exported function, is not specified correctly. Your source code must contain an entry point function, and that name must be specified correctly in your deployment, either via the Cloud Console or the Cloud SDK.
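For a Pub/Sub-triggered Python function, a minimal main.py might look like the sketch below. The function name hello_pubsub is only an example; whatever name you use must match the "Entry point" field in the Cloud Console (or the --entry-point flag in the Cloud SDK):

    # main.py at the root of the "Directory with source code"
    import base64

    def hello_pubsub(event, context):
        """Background function triggered by a Pub/Sub message.

        event (dict): the Pub/Sub payload; the message body is
            base64-encoded under the "data" key.
        context: event metadata supplied by Cloud Functions.
        """
        if "data" in event:
            message = base64.b64decode(event["data"]).decode("utf-8")
        else:
            message = "No data in Pub/Sub message"
        print(message)
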
If you're uploading your code as a zip via a GCS bucket or the file upload option, make sure that you zip only the code files, not the folder that contains them. For example, with this layout:
CodeFolder
├── package
| ├──script1.py
| └──script2.py
├── package2
├── ...
├── main.py
└── requirements.txt
Do NOT create the zip file from CodeFolder itself. Instead, create the zip from main.py, requirements.txt, and package, so that those entries sit at the root of the archive.
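If you want to build that archive programmatically, a rough sketch like the one below (the paths are illustrative) writes each file with a path relative to CodeFolder, so main.py ends up at the root of the zip rather than nested under CodeFolder/:

    # build_source_zip.py: illustrative sketch; adjust paths for your project.
    import pathlib
    import zipfile

    code_folder = pathlib.Path("CodeFolder")  # assumed project root

    with zipfile.ZipFile("function-source.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        for path in code_folder.rglob("*"):
            if path.is_file():
                # arcname strips the CodeFolder/ prefix, keeping main.py,
                # requirements.txt, and package/ at the top of the archive.
                zf.write(path, arcname=path.relative_to(code_folder))
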