I have a Python Cloud Function that reads a .txt file from GCS, parses it, and writes the rows into BigQuery. When I try to deploy this Cloud Function to Google Cloud from my macOS machine, it gives me the error below.
I have verified that the BigQuery API is enabled in my GCP project.
gcloud functions deploy sql_upload --runtime python37 --trigger-bucket test-bucket --entry-point load_sql
Deploying function (may take a while - up to 2 minutes)...failed.
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Function failed on loading user code. Error message: Code in file main.py can't be loaded.
Detailed stack trace: Traceback (most recent call last):
File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 305, in check_or_load_user_function
_function_handler.load_user_function()
File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 184, in load_user_function
spec.loader.exec_module(main)
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/user_code/main.py", line 24, in <module>
from google.cloud import bigquery
ImportError: cannot import name 'bigquery' from 'google.cloud' (unknown location)
Dependencies in Python are managed with pip and expressed in a metadata file called requirements.txt. This file must be in the same directory as the main.py file that contains your function code.
If this code is working on your computer, just create the requirements.txt file with:
pip freeze > requirements.txt
Otherwise, first install the BigQuery dependency and then create the requirements file:
pip install --upgrade google-cloud-bigquery
pip freeze > requirements.txt
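Note that pip freeze captures your entire environment. If you prefer a minimal requirements.txt, a single line for the missing package is enough; the pinned version below is only an example (use whatever `pip show google-cloud-bigquery` reports on your machine, or leave it unpinned):

```text
# requirements.txt — must sit next to main.py
google-cloud-bigquery==1.28.0
```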
See the docs:
https://cloud.google.com/functions/docs/writing/specifying-dependencies-python
https://cloud.google.com/bigquery/docs/reference/libraries#client-libraries-install-python
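Once the dependency is declared, an entry point like the one below should import and deploy cleanly. This is only a sketch: the tab-delimited format with a header row, and the dataset/table name, are assumptions, not from your question. (In your real main.py the google.cloud imports at module top are fine once requirements.txt exists; they are placed inside the handler here only so the parsing helper stands alone.)

```python
def parse_rows(text, delimiter="\t"):
    """Split a .txt payload into one dict per line.

    Assumes the first line is a header row and fields are tab-delimited —
    adjust to your actual file format.
    """
    lines = [line for line in text.splitlines() if line.strip()]
    if len(lines) < 2:
        return []
    header = lines[0].split(delimiter)
    return [dict(zip(header, line.split(delimiter))) for line in lines[1:]]


def load_sql(event, context):
    """Entry point for a GCS-triggered background function."""
    # These imports resolve only after google-cloud-bigquery (and
    # google-cloud-storage) are listed in requirements.txt.
    from google.cloud import bigquery, storage

    blob = storage.Client().bucket(event["bucket"]).blob(event["name"])
    rows = parse_rows(blob.download_as_string().decode("utf-8"))

    # "my_dataset.my_table" is a placeholder — substitute your own table.
    errors = bigquery.Client().insert_rows_json("my_dataset.my_table", rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```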