Goal
We would like users to be able to upload images to Google Cloud Storage.
Problem
We could achieve this indirectly with our server as a middleman: first, the user uploads to our server, then our privileged server uploads to Cloud Storage.
However, we think this is unnecessarily slow, and instead would like the user to upload directly to Cloud Storage.
Proposed Solution
To achieve a direct upload, we generate a Signed URL on our server. The Signed URL specifies an expiration time, and can only be used with the HTTP PUT verb. A user can request a Signed URL, and then - for a limited time only - upload an image to the path specified by the Signed URL.
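Concretely, the generation step we have in mind looks something like this (a minimal sketch using the @google-cloud/storage Node.js client; the bucket name and expiry are placeholders):

import { Storage } from '@google-cloud/storage';

// Minimal sketch: generate a v4 signed URL that only permits HTTP PUT
// and expires after 15 minutes. Bucket/object names are placeholders.
async function createUploadUrl(objectName: string): Promise<string> {
  const storage = new Storage();
  const [url] = await storage
    .bucket('example-bucket') // placeholder bucket
    .file(objectName)
    .getSignedUrl({
      version: 'v4',
      action: 'write', // signed for PUT only
      expires: Date.now() + 15 * 60 * 1000, // 15-minute expiry
    });
  return url;
}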
Problem with the Solution
Is there any way to enforce a maximum file upload size? Obviously we would like to avoid users attempting to upload 20GB files when we expect <1MB files.
This seems like an obvious vulnerability, yet I don't know how to address it while still using Signed URLs.
There seems to be a way to do this using Policy Documents (Stack Overflow answer), but the question is over 2 years old now.
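For what it's worth, that answer's approach seems to correspond to generateSignedPostPolicyV4 in the current @google-cloud/storage client, where a content-length-range condition caps the upload size. This is my untested reading of the docs, with placeholder names:

import { Storage } from '@google-cloud/storage';

// Untested sketch of the policy-document approach: the signed POST
// policy embeds a content-length-range condition, so Cloud Storage
// itself rejects uploads larger than the cap. Names are placeholders.
async function createUploadPolicy(objectName: string) {
  const storage = new Storage();
  const [policy] = await storage
    .bucket('example-bucket') // placeholder bucket
    .file(objectName)
    .generateSignedPostPolicyV4({
      expires: Date.now() + 60 * 1000, // 1 minute
      conditions: [
        ['content-length-range', 0, 1024 * 1024], // 0..1 MB
      ],
    });
  // policy.url and policy.fields go into a multipart/form-data POST.
  return policy;
}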
My working NodeJS code follows https://blog.koliseo.com/limit-the-size-of-uploaded-files-with-signed-urls-on-google-cloud-storage/. Note that you must use version v4 of signed URLs.
public async getPreSignedUrlForUpload(
  fileName: string,
  contentType: string,
  size: number,
  bucketName: string = this.configService.get('DEFAULT_BUCKET_NAME'),
): Promise<string> {
  const bucket = this.storage.bucket(bucketName);
  const file = bucket.file(fileName);
  // Signing X-Upload-Content-Length pins the upload size: a request
  // sending a different header value fails signature validation.
  const response = await file.getSignedUrl({
    action: 'write',
    contentType,
    extensionHeaders: {
      'X-Upload-Content-Length': size,
    },
    expires: Date.now() + 60 * 1000, // URL is valid for 1 minute
    version: 'v4',
  });
  const signedUrl = this.maskSignedUrl(response[0], bucketName);
  return signedUrl;
}
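For completeness, a caller might look like this; the controller, route, and UploadService names are hypothetical and not part of the original code:

import { Controller, Get, ParseIntPipe, Query } from '@nestjs/common';

// Hypothetical NestJS controller around the method above; the route
// and the UploadService name are illustrative only.
@Controller('uploads')
export class UploadsController {
  constructor(private readonly uploadService: UploadService) {}

  @Get('signed-url')
  async getSignedUrl(
    @Query('fileName') fileName: string,
    @Query('contentType') contentType: string,
    @Query('size', ParseIntPipe) size: number,
  ): Promise<{ url: string }> {
    const url = await this.uploadService.getPreSignedUrlForUpload(
      fileName,
      contentType,
      size,
    );
    return { url };
  }
}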
On the frontend, we must send the same size value in the X-Upload-Content-Length header:
export async function uploadFileToGCP(
  signedUrl: string,
  file: File
): Promise<string> {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    // Signed URLs carry their own auth, so no cookies/credentials are needed.
    xhr.addEventListener('readystatechange', function () {
      if (this.readyState === 4) {
        // Resolve only on a 2xx response; otherwise surface the failure.
        if (this.status >= 200 && this.status < 300) {
          resolve(this.responseText);
        } else {
          reject(new Error(`Upload failed with status ${this.status}`));
        }
      }
    });
    xhr.addEventListener('error', () => reject(new Error('Network error during upload')));
    xhr.open('PUT', signedUrl, true);
    xhr.setRequestHeader('Content-Type', file.type);
    // Must match the size the URL was signed with, or GCS rejects the request.
    xhr.setRequestHeader('X-Upload-Content-Length', String(file.size));
    xhr.send(file);
  });
}
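Wired together from a file input, the flow might look like this (the /uploads/signed-url endpoint is the hypothetical route sketched above):

// Hypothetical wiring from an <input type="file"> change handler;
// the endpoint matches the controller sketch above.
async function onFileSelected(file: File): Promise<void> {
  const params = new URLSearchParams({
    fileName: file.name,
    contentType: file.type,
    size: String(file.size),
  });
  const res = await fetch(`/uploads/signed-url?${params}`);
  const { url } = await res.json();
  await uploadFileToGCP(url, file);
}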
Also, don't forget to configure the responseHeader values in the bucket's CORS settings:
gsutil cors get gs://asia-item-images
[{"maxAgeSeconds": 3600, "method": ["GET", "OPTIONS", "PUT"], "origin": ["*"], "responseHeader": ["Content-Type", "Access-Control-Allow-Origin", "X-Upload-Content-Length", "X-Goog-Resumable"]}]