I am running a WordPress site on Google App Engine, using the GAE plugin for WordPress. The media library works on the App Engine server but not locally, and the same goes for most images unless they have hardcoded links. I get tons of 404 errors...
http://localhost:8080/_ah/gcs/<BUCKET_NAME>/image.png Failed to load resource: the server responded with a status of 404 (Not Found)
whereas this link works on the GAE server:
http://<BUCKET_NAME>.storage.googleapis.com/image.png
I am running my app locally like this:
dev_appserver.py --php_executable_path=/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/php55/php-cgi .
It seems Google's Python dev server is not correctly forwarding the request to the actual bucket... any ideas?
This is expected behaviour:
The link http://<BUCKET_NAME>.storage.googleapis.com/image.png
is the production link, and the request is served from the production server.
The link http://localhost:8080/_ah/gcs/<BUCKET_NAME>/image.png
is a local link, and the dev server has to actually contain <BUCKET_NAME>/image.png. It's likely that the object is only stored in production, leading to a 404 when the dev server attempts to find it locally.
As a solution, you might consider simply running your local tests against production Cloud Storage; transferring all the storage items to local (and keeping them in sync) will almost certainly cost more in network transfer and dev time than this simpler alternative.
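If you do instead need the local /_ah/gcs/... URL to resolve, one option is to seed the dev server's GCS stub yourself. Below is a minimal sketch, assuming the appengine-gcs-client library (cloudstorage) is on your path; the bucket name, object name, and source file are placeholders:

import cloudstorage

def seed_local_stub():
  # On dev_appserver, the client library writes to the local GCS stub by
  # default, so after this runs the object becomes reachable at
  # http://localhost:8080/_ah/gcs/<BUCKET_NAME>/image.png
  with open('image.png', 'rb') as src:
    data = src.read()
  with cloudstorage.open('/<BUCKET_NAME>/image.png', 'w',
                         content_type='image/png') as dst:
    dst.write(data)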
Not sure if this is the exact same issue, but I found that all of my GCS requests from dev_appserver.py were getting routed to
http://localhost:8080/_ah/gcs/<BUCKET_NAME>/<OBJECT_NAME>
... as opposed to the "real" GCS URL, e.g.
https://www.googleapis.com/storage/v1/b/<BUCKET_NAME>/<OBJECT_NAME>
... therefore resulting in 404 errors when making GET requests for GCS objects.
I was able to remedy this issue by simply setting an access token, e.g.
cloudstorage.common.set_access_token("<TOKEN>")
See the set_access_token docstring below for how to acquire an access token.
Once I did that, all of my GCS requests were properly routed.
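For example, here is a minimal sketch of wiring this up in appengine_config.py; the token value is a placeholder, and the SERVER_SOFTWARE check is just one way to restrict this to the dev server:

import os

import cloudstorage

# Placeholder: run `gsutil -d ls` and copy the string after 'Bearer'.
# The token expires, so this is only suitable for local development.
ACCESS_TOKEN = '<TOKEN>'

# Only force the library to talk to real GCS when running on dev_appserver;
# in production it requests and refreshes tokens automatically.
if os.environ.get('SERVER_SOFTWARE', '').startswith('Development'):
  cloudstorage.common.set_access_token(ACCESS_TOKEN)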
After digging through the source code of appengine-gcs-client, it looks like you must first set an access token if you would like to use dev_appserver to access live/remote content in GCS.
In cloudstorage.common:
def set_access_token(access_token):
  """Set the shared access token to authenticate with Google Cloud Storage.

  When set, the library will always attempt to communicate with the
  real Google Cloud Storage with this token even when running on dev appserver.
  Note the token could expire so it's up to you to renew it.

  When absent, the library will automatically request and refresh a token
  on appserver, or when on dev appserver, talk to a Google Cloud Storage
  stub.

  Args:
    access_token: you can get one by run 'gsutil -d ls' and copy the
      str after 'Bearer'.
  """
Some additional hints/options in cloudstorage.storage_api:
def _get_storage_api(retry_params, account_id=None):
  """Returns storage_api instance for API methods.

  Args:
    retry_params: An instance of api_utils.RetryParams. If none,
      thread's default will be used.
    account_id: Internal-use only.

  Returns:
    A storage_api instance to handle urlfetch work to GCS.
    On dev appserver, this instance will talk to a local stub by default.
    However, if you pass the arguments --appidentity_email_address and
    --appidentity_private_key_path to dev_appserver.py it will attempt to use
    the real GCS with these credentials. Alternatively, you can set a specific
    access token with common.set_access_token. You can also pass
    --default_gcs_bucket_name to set the default bucket.
  """
For reference, these are the package and SDK versions I was using:
google-api-python-client==1.6.4
GoogleAppEngineCloudStorageClient==1.9.22.1
Google Cloud SDK 200.0.0
alpha 2018.04.30
app-engine-python 1.9.69
app-engine-python-extras 1.9.69
beta 2018.04.30
bq 2.0.33
cloud-datastore-emulator 1.4.1
core 2018.04.30
gsutil 4.31