Since this morning, our Firebase application has had a problem writing data to the Realtime Database instance. Even the simplest task, such as adding one key-value pair to an object, triggers
Error: TRIGGER_PAYLOAD_TOO_LARGE: This request would cause a function payload exceeding the maximum size allowed.
It is especially strange since nothing in our code or database has changed for more than 24 hours.
Even something as simple as
Database.ref('environments/' + envkey).child('orders/' + orderkey).set({a: 1})
triggers the error.
Apparently the size of the payload is not the problem, so what could be causing this?
Database structure, as requested:

environments
  env1
  env2
    orders
      223344
        customer: "Peters"
        country: "NL"
        items
          item1
            code: "a"
            value: "b"
          item2
            code: "x"
            value: "2"
The TRIGGER_PAYLOAD_TOO_LARGE error is part of a new feature Firebase is rolling out, where our existing RTDB limits are being strictly enforced. The reason for the change is to make sure that we aren't silently dropping any Cloud Functions triggers, since any event exceeding those limits can't be sent to Functions.
You can turn this feature off yourself by making this REST call:
curl -X PUT -d "false" "https://<namespace>.firebaseio.com/.settings/strictTriggerValidation/.json?auth=<SECRET>"

where <SECRET> is your DB secret.
Note that if you disable this, the requests that are currently failing may go through, but any Cloud Functions you have that trigger on requests exceeding our limits will fail to run. If you are using database triggers for your functions, I would recommend you restructure your requests so that they stay within the limits.
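As an illustration of that restructuring (a minimal sketch under my own assumptions, not an official recipe; writeInChunks and CHUNK_SIZE are invented names), a single oversized set() can be split into several smaller update() calls so that each write produces its own, smaller trigger event:

// Hypothetical helper: writes a large object in smaller batches so that
// each update() call stays within the trigger payload limits.
const CHUNK_SIZE = 100; // assumption: tune this to fit your data under the limits

async function writeInChunks(ref, data) {
  const keys = Object.keys(data);
  for (let i = 0; i < keys.length; i += CHUNK_SIZE) {
    const batch = {};
    for (const key of keys.slice(i, i + CHUNK_SIZE)) {
      batch[key] = data[key];
    }
    await ref.update(batch); // one smaller trigger event per batch
  }
}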
OK, I figured this out. The issue is not related to the write itself, but to one of the Cloud Functions the write would trigger.
For example, we have a structure like
/collections/data/abc/items/a
which in JSON looks like:
"collections": {
"data": {
"abc": {
"name": "example Col",
"itemCount": 5,
"items": {
"a": {"name": "a"},
"b": {"name": "b"},
"c": {"name": "c"},
"d": {"name": "d"},
"e": {"name": "e"},
}
}
}
}
Any write to an item was failing, whatever the source: the API, JavaScript, even a basic write in the Firebase console.
I decided to look at our Cloud Functions and found this:
import * as functions from 'firebase-functions';
import * as firebaseAdmin from 'firebase-admin';

firebaseAdmin.initializeApp();

// Reads every item in the collection, counts those not flagged as trash,
// and stores the total under /collections/meta.
const countItems = (collectionId) => {
  return firebaseAdmin.database().ref(`/collections/data/${collectionId}/items`).once('value')
    .then(snapshot => {
      const items = snapshot.val() || {};
      const filtered = Object.keys(items).filter(key => {
        const item = items[key];
        return (item && !item.trash);
      });
      return firebaseAdmin.database().ref(`/collections/meta/${collectionId}/itemsCount`)
        .set(filtered.length);
    });
};

export const onCollectionItemAdd = functions.database.ref('/collections/data/{collectionId}/items/{itemId}')
  .onCreate((snapshot, context) => {
    const { collectionId } = context.params;
    return countItems(collectionId);
  });
On its own it's nothing, but that function reads ALL items, and by default Firebase Cloud Functions sends the entire snapshot at the trigger path to the function even if we don't use it. In fact, it sends the previous and new values too, so if you (like us) have a TON of items at that point, my guess is that the payload it tries to send to the Cloud Function is way too big.
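To make that concrete (a hypothetical illustration, not our actual code; onItemsWrite is an invented name, and it reuses the imports from the snippet above): a trigger registered on a wide path receives the full subtree at that path twice, once as the before value and once as the after value, so its event payload grows with the data under it:

// Hypothetical trigger on the whole items node: its event payload carries
// the ENTIRE subtree before AND after the write, regardless of what the
// handler actually uses.
export const onItemsWrite = functions.database
  .ref('/collections/data/{collectionId}/items')
  .onWrite((change) => {
    const before = change.before.val(); // whole subtree before the write
    const after = change.after.val();   // whole subtree after the write
    console.log(`payload now holds ${Object.keys(after || {}).length} items`);
    return null;
  });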
I removed the count function from our Cloud Functions and boom, back to normal. Not sure what the "correct" way to do the count is if we can't have the trigger at all, but I'll update this if we find one...
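One possible alternative (a sketch under my own assumptions, not something we have battle-tested; adjustCount and the trigger names are hypothetical): maintain the counter incrementally with a transaction, so the function never has to read the whole items subtree. Handling of the trash flag (e.g. via an onUpdate trigger) is left out of the sketch:

import * as functions from 'firebase-functions';
import * as firebaseAdmin from 'firebase-admin';

firebaseAdmin.initializeApp();

// Atomically adjusts the stored count by `delta` without reading the items.
const adjustCount = (collectionId, delta) =>
  firebaseAdmin.database()
    .ref(`/collections/meta/${collectionId}/itemsCount`)
    .transaction(current => (current || 0) + delta);

export const onItemAdd = functions.database
  .ref('/collections/data/{collectionId}/items/{itemId}')
  .onCreate((snapshot, context) => adjustCount(context.params.collectionId, 1));

export const onItemRemove = functions.database
  .ref('/collections/data/{collectionId}/items/{itemId}')
  .onDelete((snapshot, context) => adjustCount(context.params.collectionId, -1));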