
GCS with GKE, 403 Insufficient permission for writing into GCS bucket [duplicate]

I'm currently trying to write files into a Google Cloud Storage bucket. For this, I'm using the django-storages package.

I have deployed my code and exec into the running container through the Kubernetes kubectl utility to check that writing to the GCS bucket works.

$ kubectl exec -it foo-pod -c foo-container --namespace=testing python manage.py shell

I am able to read from the bucket, but if I try to write into it, I get the traceback below.

>>> from django.core.files.storage import default_storage
>>> f = default_storage.open('storage_test', 'w')
>>> f.write('hi')
2
>>> f.close()
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 946, in upload_from_file
    client, file_obj, content_type, size, num_retries)
  File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 867, in _do_upload
    client, stream, content_type, size, num_retries)
  File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 700, in _do_multipart_upload
    transport, data, object_metadata, content_type)
  File "/usr/local/lib/python3.6/site-packages/google/resumable_media/requests/upload.py", line 98, in transmit
    self._process_response(result)
  File "/usr/local/lib/python3.6/site-packages/google/resumable_media/_upload.py", line 110, in _process_response
    response, (http_client.OK,), self._get_status_code)
  File "/usr/local/lib/python3.6/site-packages/google/resumable_media/_helpers.py", line 93, in require_status_code
    status_code, u'Expected one of', *status_codes)
google.resumable_media.common.InvalidResponse: ('Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "/usr/local/lib/python3.6/site-packages/storages/backends/gcloud.py", line 75, in close
    self.blob.upload_from_file(self.file, content_type=self.mime_type)
  File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 949, in upload_from_file
    _raise_from_invalid_response(exc)
  File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 1735, in _raise_from_invalid_response
    raise exceptions.from_http_response(error.response)
google.api_core.exceptions.Forbidden: 403 POST https://www.googleapis.com/upload/storage/v1/b/foo.com/o?uploadType=multipart: Insufficient Permission
>>> default_storage.url('new docker')
'https://storage.googleapis.com/foo.appspot.com/new%20docker'
>>>

This seems to be related to the bucket permissions, so I assigned the Storage Admin and Storage Object Creator roles to the Google Cloud Build service account (through bucket -> manage permissions), but it still shows the same error.
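For reference, the roles currently granted on the bucket itself can be listed with gsutil as a sanity check (the bucket name is left as a placeholder here):

$ gsutil iam get gs://BUCKET-NAME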

asked Nov 03 '17 by Avinash Raj

1 Answer

A possible explanation is that your cluster wasn't created with the correct scopes. If that's the case, the nodes in the cluster wouldn't have the authorisation required to write to Google Cloud Storage, which could explain the 403 error you're seeing.

If no scopes are set when the cluster is created, the default scopes are assigned, and these only provide read-only access to Cloud Storage.
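One way to confirm this from inside the running pod (assuming the pod uses the node's default service account and can reach the GCE metadata server) is to ask the metadata endpoint which scopes the default token carries:

curl -H "Metadata-Flavor: Google" http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes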

To check the cluster's current scopes using the Cloud SDK, you could run a 'describe' command from Cloud Shell, for example:

gcloud container clusters describe CLUSTER-NAME --zone ZONE

The oauthScopes section of the output contains the current scopes assigned to the cluster/nodes.

The default read-only Cloud Storage scope would display as:

https://www.googleapis.com/auth/devstorage.read_only

If the Cloud Storage read/write scope is set, the output will display:

https://www.googleapis.com/auth/devstorage.read_write
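If you only want the scopes rather than the full describe output, a --format filter along these lines should also work (the nodeConfig.oauthScopes field path is assumed from the cluster resource):

gcloud container clusters describe CLUSTER-NAME --zone ZONE --format="value(nodeConfig.oauthScopes)"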

The scope can be set during cluster creation using the --scopes switch followed by the desired scope identifier. In your case, this would be "storage-rw". For example, you could run something like:

gcloud container clusters create CLUSTER-NAME --zone ZONE --scopes storage-rw

The storage-rw scope, combined with your service account's roles, should then allow the nodes in your cluster to write to Cloud Storage.

Alternatively, if you don't want to recreate the cluster, you can create a new node pool with the desired scopes and then delete the old node pool. See the accepted answer to "Is it necessary to recreate a Google Container Engine cluster to modify API permissions?" for information on how to achieve this.
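As a rough sketch of that approach (the pool names below are placeholders, assuming the existing pool is the default one), you could create a replacement pool with the wider scope and then remove the old pool once your workloads have moved over:

gcloud container node-pools create new-pool --cluster CLUSTER-NAME --zone ZONE --scopes storage-rw
gcloud container node-pools delete default-pool --cluster CLUSTER-NAME --zone ZONE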

answered Nov 18 '22 by neilH