Everywhere I read about Firestore, it is stated that it requires less denormalization than the Firebase Realtime Database. I assume this is because it is a document database, where you can point at a specific document and retrieve only that document's data (?).
However, I wonder how to manage a situation where denormalization is still useful (e.g., we can avoid an extra query to the document containing the full data by storing that same value on other documents as well). If that value then needs to be updated, is there something like the Realtime Database's multi-path update (to update the value in every document) to solve this issue?
I think what you meant to say is that 'Firestore requires less denormalisation than the Realtime Database' (both are Firebase products for storing data). I don't think that claim is necessarily true, because it all comes down to the architecture of your data. Firestore nudges you toward some good practices, but that does not mean you cannot achieve a similar architecture in the Realtime Database.
Updating Denormalised Data
You can use batch writes to update denormalised data located at different paths. Note, however, that a single batch can contain at most 500 operations.
If you do not need to read any documents in your operation set, you can execute multiple write operations as a single batch that contains any combination of set(), update(), or delete() operations. A batch of writes completes atomically and can write to multiple documents.
Example from the Firebase Firestore Documentation
// Get a new write batch
var batch = db.batch();
// Set the value of 'NYC'
var nycRef = db.collection("cities").doc("NYC");
batch.set(nycRef, {name: "New York City"});
// Update the population of 'SF'
var sfRef = db.collection("cities").doc("SF");
batch.update(sfRef, {"population": 1000000});
// Delete the city 'LA'
var laRef = db.collection("cities").doc("LA");
batch.delete(laRef);
// Commit the batch
batch.commit().then(function () {
// ...
});
Note: it might not be clear from the code, but none of the writes are performed on Firestore until the commit method is invoked.
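As a sketch of the multi-path-update analogue the question asks about: suppose a user's display name is duplicated into every post document they authored. You can query for the affected documents and update them with batched writes, splitting into chunks so that no batch exceeds the 500-operation limit. The posts collection and the authorId / authorName field names here are assumptions for illustration, not part of any real schema.

```javascript
// Split an array into chunks no larger than `size`
// (Firestore batches are limited to 500 operations each).
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Sketch: update the duplicated `authorName` field on every post
// written by `userId`. Collection and field names are assumed.
async function renameAuthorEverywhere(db, userId, newName) {
  const snapshot = await db
    .collection("posts")
    .where("authorId", "==", userId)
    .get();

  // One batch per 500 documents. Each batch commits atomically,
  // but the batches themselves are independent of one another.
  for (const docs of chunk(snapshot.docs, 500)) {
    const batch = db.batch();
    docs.forEach((doc) => batch.update(doc.ref, { authorName: newName }));
    await batch.commit();
  }
}
```

Keep in mind that, unlike a single Realtime Database multi-path update, a sequence of batches is not atomic as a whole: if one batch fails, earlier batches have already committed.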