Firebase batch updates and onWrite trigger synchronisation

I have an issue synchronizing two Firebase Cloud Functions: the first performs a batch update on multiple documents, and the second is triggered by an onWrite trigger on one of those documents.

For illustration, let us say I have two documents A and B (in two separate collections).

  • A first Cloud Function updates both documents A and B with a Firestore WriteBatch (both documents are successfully updated);
  • The write to document B triggers another Cloud Function (with an onWrite trigger). This function needs to read document A;
  • This second function fails because it reads the old version of document A (the version from before the batched write performed by the first function).

Is there a way to ensure the onWrite function is triggered only after both documents have been written?

I could update them separately and await the write to A before writing B in the first function, but I want to keep both updates in one atomic operation because these documents are linked, and I don't want to risk having one updated without the other.

Asked Sep 03 '25 by Lucas David

1 Answer

A batched write ensures that the writes are completed atomically: if one write fails, none of them is applied. On the other hand, to answer one of your comments above, a batched write does not ensure that all the writes will be "instantaneous", which, by the way, is a notion that is difficult to define in IT, IMHO :-). AFAIK, a batched write does not ensure either that the writes are applied in the order they were added to the batch.

So, if you want to trigger the second Cloud Function only when all the writes composing the batched write are completed, you could use a Cloud Function triggered by Pub/Sub.

Concretely, do as follows in your index.js Cloud Functions file:

Declare a function that publishes a message:

const { PubSub } = require('@google-cloud/pubsub');

async function publishMessage(messageConfig) {
    const pubSubClient = new PubSub();

    const topicName = messageConfig.topicName;
    const pubSubPayload = messageConfig.pubSubPayload;

    // Pub/Sub message payloads are Buffers
    const dataBuffer = Buffer.from(JSON.stringify(pubSubPayload));

    // Note: with @google-cloud/pubsub v3+, use
    // pubSubClient.topic(topicName).publishMessage({ data: dataBuffer })
    await pubSubClient.topic(topicName).publish(dataBuffer);
}

In your Cloud Function that commits the batch, publish a message once the batched write has completed:

    await batch.commit();

    const messageConfig = {
        topicName: 'your-topic',
        pubSubPayload: {
            docA_Id: '..........',  // ID of doc A
            docB_Id: '..........'   // ID of doc B
        }
    };
    await publishMessage(messageConfig);
    // ...

Write a Pub/Sub-triggered Cloud Function that executes the desired business logic. If the same business logic also needs to run from an onWrite trigger, share the code between the two functions:

    // Named differently from the shared helper below to avoid confusion
    exports.onBatchCommitted = functions.pubsub.topic('your-topic').onPublish(async (message) => {

        // message.json contains the payload published by publishMessage()
        const docA_Id = message.json.docA_Id;
        const docB_Id = message.json.docB_Id;

        await updateDocB(docA_Id, docB_Id);
        // ...

    });


    // Shared business logic: call this from both the Pub/Sub-triggered
    // function above and the onWrite-triggered Cloud Function
    async function updateDocB(docA_Id, docB_Id) {
        // ......
    }
If you want to avoid having the onWrite-triggered Cloud Function execute its business logic when the write comes from the batched write, you could flag docs A and B with the unique Cloud Function eventId via the batched write. If this flag is the same in A and B, you skip the business logic in the onWrite-triggered Cloud Function, because it will be handled by the Pub/Sub Cloud Function.
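One way to implement that guard is a pure check on the two documents' data. The `batchEventId` field name is an assumption (any field your batched write stamps on both docs would do), and the onWrite wiring is sketched in comments:

```javascript
// Pure predicate: decide in the onWrite trigger whether this write came
// from the batched write (and will therefore be handled by the Pub/Sub
// function). `batchEventId` is a hypothetical field that the batched
// write stamps on both doc A and doc B.
function wasWrittenByBatch(docAData, docBData) {
    return Boolean(
        docAData &&
        docBData &&
        docAData.batchEventId &&
        docAData.batchEventId === docBData.batchEventId
    );
}

// Inside the onWrite-triggered Cloud Function, something like:
// exports.onDocBWrite = functions.firestore
//     .document('collectionB/{docId}')
//     .onWrite(async (change, context) => {
//         const docBData = change.after.data();
//         const docASnap = await admin.firestore()
//             .collection('collectionA').doc(docBData.docA_Id).get();
//         if (wasWrittenByBatch(docASnap.data(), docBData)) {
//             return null; // the Pub/Sub function handles this case
//         }
//         // ...business logic for writes that did not come from the batch
//     });
```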


Of course, this is based on several assumptions and has to cope with the flow of events, but it could be a possible solution!

Answered Sep 05 '25 by Renaud Tarnec