I've just noticed that my app's storage usage has grown to almost its 5 GB free-tier limit within the last few weeks. After looking into it in more detail, it turned out this was caused by the "artifacts" bucket.
I saw this SO question, which says that the "artifacts" bucket is related to the Node 10 environment.
I did indeed move to Node 10 a month ago, but after discovering that logs were no longer structured in the Firebase Functions console, I reverted to Node 8 a few days later and have been using only Node 8 since then.
However, I can see that the "artifacts" storage keeps increasing by roughly 800 MB every week, which worries me, to say the least (please check the screenshots below).
I assume this is related to Firebase Functions deploys (or is it not?), but is this really expected? Can I safely clean up these artifacts?
It seems very strange to me that it grew this dramatically within just a few weeks, when I had previously been using functions for a few years without any issues like that.
I'd appreciate any suggestions on how to safely manage storage size in this case and keep consumption to a minimum.
I'm also using a pubsub.schedule function, in case it matters here.
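For reference, here's a minimal sketch of what I mean (the function name, schedule string, and handler body are placeholders, not my real job):

```js
const functions = require('firebase-functions');

// Placeholder scheduled function, just to show the shape of what I'm running.
exports.scheduledJob = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async (context) => {
    console.log('Scheduled job ran at', context.timestamp);
    return null;
  });
```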
I've also noticed that the "artifacts" bandwidth spiked quite unexpectedly, which I assume also has cost implications, and I'd appreciate any input on possible ways to minimize such spikes as well (about 22 GB out of 22.5 GB came from the "artifacts" bucket):
Storage for your Hosting content is at no cost up to 10 GB. If you are not on the Blaze plan, and you reach the 10 GB limit of no-cost Hosting storage, you won't be able to deploy new content to your sites.
Basically, the artifacts are used to help build the final image that gets stored in the "gcf-sources" bucket. "You are free to delete the contents in 'XX.artifacts', but please leave the bucket untouched; it will be used in the following deployment cycles."
The bandwidth measured is the number of bytes read from the bucket, so typically files your application code downloads (uploads are not charged). So if you take the size of each file and multiply it by the number of times it was read, you'll end up with the bandwidth you used.
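As an illustrative example (the numbers are made up): a single 500 MB container image read 44 times over the period comes to 500 MB × 44 ≈ 22 GB of billed read bandwidth, which is the order of magnitude you're seeing.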
I figured out a solution: it turns out there is a way to set up an auto-deletion rule in the Google Cloud console for the images that clutter the storage.
1. Go to the Google Cloud console and select your project -> Storage -> Browser: https://console.cloud.google.com/storage/browser
2. Select the "artifacts" bucket.
3. Under the "Lifecycle" tab, add a rule to auto-delete old images (in my case, "delete after 1 day since update" works fine for me).
Storage is safe now!
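If you'd rather set the rule from code than click through the console, here is a rough sketch using the @google-cloud/storage Node client. The bucket name is a placeholder (yours will look something like us.artifacts.<project-id>.appspot.com), and note that the programmatic `age` condition counts days since object creation, which is the closest equivalent to the console rule above:

```js
const {Storage} = require('@google-cloud/storage');

const storage = new Storage();
// Placeholder bucket name; replace with your actual artifacts bucket.
const bucket = storage.bucket('us.artifacts.my-project.appspot.com');

async function addAutoDeleteRule() {
  // Delete objects once they are more than 1 day old.
  await bucket.addLifecycleRule({
    action: {type: 'Delete'},
    condition: {age: 1}, // days since object creation
  });
  console.log('Lifecycle rule added');
}

addAutoDeleteRule().catch(console.error);
```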
NOTE: if you face any deployment issues later (e.g. you deploy several days in a row and get an error on deploy), just manually delete the whole "container" folder in the artifacts bucket, which should solve it, and then redeploy. (Make sure not to delete the artifacts bucket itself!)
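If you want to script that cleanup instead of doing it by hand, something like this sketch should work. Again, the bucket name is a placeholder, and adjust the prefix to match the folder name you actually see in your bucket:

```js
const {Storage} = require('@google-cloud/storage');

const storage = new Storage();
// Placeholder bucket name; replace with your actual artifacts bucket.
const bucket = storage.bucket('us.artifacts.my-project.appspot.com');

// Deletes everything under the folder prefix but leaves the bucket itself intact.
bucket
  .deleteFiles({prefix: 'container/'})
  .then(() => console.log('Folder contents deleted'))
  .catch(console.error);
```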
I hope the Firebase team improves this; the current behavior is confusing, as it easily leads to an unexpected bill unless you take extra steps to prevent it, and you won't know it's going to happen until it does.