I have a repository hosted in a Google Cloud Source Repository with the following structure for Google Cloud Functions:
.
├── module.py
├── common
│   ├── module1.py
│   └── module2.py
├── cloudfunction1
│   ├── main.py
│   └── requirements.txt
└── cloudfunction2
    ├── main.py
    └── requirements.txt
Where each of the cloudfunction directories is deployed as a separate cloud function.
What I'd like to do is import modules from either the common directory or from the root; however, a sys.path.append('..')
approach doesn't appear to work. I presume this is because the Cloud Functions deployment process only includes the files in the directory in which main.py is located?
How can I resolve this?
In general, Python can only import a module if the directory containing it is on the import path (PYTHONPATH / sys.path). To reach a module in a parent or sibling directory you therefore have to add that directory yourself, for example with sys.path.append('..') before the import; otherwise the import fails with a ModuleNotFoundError. The alternative is to turn the enclosing directory into a package by adding an __init__.py file and use explicit relative imports from within that package. Since Python 3, implicit relative imports are no longer allowed, so there is no built-in way to simply reference a module that lives in a parent directory.
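As a concrete illustration (a sketch only; the exact path handling is an assumption), this is roughly what that workaround looks like inside cloudfunction1/main.py, and why it does not help on Cloud Functions:

# cloudfunction1/main.py -- sketch of the sys.path workaround described above.
# This works when the whole repository is on disk, but not after deployment:
# only the contents of cloudfunction1/ are uploaded, so ../common never exists
# in the deployed function's filesystem.
import os
import sys

# Put the repository root (the parent of this file's directory) on the import path.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))

from common import module1  # works locally, fails once deployed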
If you find yourself needing to modify sys.path or otherwise import "beyond the top level package", this is generally a code smell in Python indicating that your project is not correctly structured.
In this example of Cloud Functions, one thing you can do is structure your project like this:
.
├── common
│   ├── module1.py
│   └── module2.py
├── main.py
└── requirements.txt
Where main.py contains both functions:
from common import module1, module2

def cloudfunction1(request):
    ...

def cloudfunction2(request):
    ...
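As a rough sketch of what those handlers might look like once they use the shared code (do_something and do_something_else are made-up helper names, not part of the original layout):

from common import module1, module2

def cloudfunction1(request):
    # Hypothetical body: hand the work off to the shared module.
    name = request.args.get("name", "world")
    return module1.do_something(name), 200

def cloudfunction2(request):
    # Hypothetical body: same idea, using the other shared module.
    payload = request.get_json(silent=True) or {}
    return module2.do_something_else(payload), 200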
And you deploy those functions either directly by name:
$ gcloud functions deploy cloudfunction1 --runtime python37 --trigger-http
$ gcloud functions deploy cloudfunction2 --runtime python37 --trigger-http
Or by entrypoint:
$ gcloud functions deploy foo --runtime python37 --entry-point cloudfunction1 --trigger-http
$ gcloud functions deploy bar --runtime python37 --entry-point cloudfunction2 --trigger-http
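Once deployed, you can sanity-check both HTTP functions against their trigger URLs. A minimal sketch, assuming the default cloudfunctions.net URL for your project and region (use whatever gcloud functions describe reports for each function):

# Quick smoke test for the deployed HTTP functions. BASE_URL is an assumption;
# substitute the httpsTrigger URL that `gcloud functions describe <name>` prints.
import requests

BASE_URL = "https://us-central1-my-project.cloudfunctions.net"

for name in ("cloudfunction1", "cloudfunction2"):
    response = requests.get(f"{BASE_URL}/{name}")
    print(name, response.status_code, response.text[:200])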
Note that this has some downsides:

- Your requirements.txt file needs to contain all the dependencies for both functions
- If you make a change to the common directory, you'll need to redeploy both functions

That said, if your functions are so related that they share common code and often need to be deployed together, a better option might be to make them part of a single App Engine app. (This only applies if they both use HTTP triggers.)
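To make the App Engine option concrete, here is a minimal sketch of a single Flask service exposing both endpoints (the route paths and the do_something helpers are assumptions, not part of the original answer); it would ship with an app.yaml containing little more than runtime: python37 and be deployed with gcloud app deploy:

# main.py -- single App Engine (standard) service exposing both functions as routes.
# The helper names in common/ are hypothetical; adapt them to your real modules.
from flask import Flask, request

from common import module1, module2

app = Flask(__name__)

@app.route("/cloudfunction1")
def cloudfunction1():
    # Delegate to the shared code, just as the standalone function did.
    return module1.do_something(request.args.get("name", "world"))

@app.route("/cloudfunction2")
def cloudfunction2():
    return module2.do_something_else(request.args.to_dict())

if __name__ == "__main__":
    # Local development only; App Engine serves the app with gunicorn in production.
    app.run(host="127.0.0.1", port=8080, debug=True)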