I'm working on an application backed by Core Data. Right now, I'm saving the managed object context every time I add an entity to or delete one from the context. I'm afraid this will hurt performance, so I was thinking of delaying the save; in fact, I could delay it all the way until the application terminates. Is it too risky to save the data only when the application is about to close? How often should I call save on the object context?
I was thinking of having a separate thread handle the save: it would wait on a semaphore, and every time any part of the application calls a helper/util method to save the Core Data context, it would decrement the semaphore. When it reaches zero, the "save thread" would perform a single save, reset the semaphore to, say, 5, and go back to sleep. Something like the sketch below is what I had in mind.
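(A rough sketch of the idea, with a counter on a serial queue standing in for the raw semaphore; the class name and queue label are placeholders.)

```swift
import CoreData

// Illustrative helper that coalesces save requests: every Nth request
// performs one context save instead of saving on every call.
final class CoalescingSaver {
    private let context: NSManagedObjectContext
    private let threshold: Int
    private var pendingRequests = 0
    private let queue = DispatchQueue(label: "com.example.coalescing-saver") // placeholder label

    init(context: NSManagedObjectContext, threshold: Int = 5) {
        self.context = context
        self.threshold = threshold
    }

    /// Called by any part of the app instead of saving the context directly.
    func requestSave() {
        queue.async {
            self.pendingRequests += 1
            guard self.pendingRequests >= self.threshold else { return }
            self.pendingRequests = 0
            self.context.perform {
                guard self.context.hasChanges else { return }
                do {
                    try self.context.save()
                } catch {
                    // In a real app, surface or recover from the error.
                    print("Save failed: \(error)")
                }
            }
        }
    }
}
```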
Any good recommendation? Thanks!
Only save a context that has changes.
hasChanges : true if you have inserted, deleted, or updated any object (a combination of the following properties).
isInserted : true when you insert the object into a context.
isUpdated : true when you change the object.
isDeleted : true when you delete the object from its context.
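As a rough illustration, a save helper can guard on hasChanges so it only writes when something actually changed (the function name is just an example):

```swift
import CoreData

// Illustrative helper: saves only when the context actually has changes.
func saveIfNeeded(_ context: NSManagedObjectContext) {
    // hasChanges is true when any object was inserted, updated, or deleted.
    guard context.hasChanges else { return }
    do {
        try context.save()
    } catch {
        // Handle or log the error appropriately in a real app.
        print("Core Data save failed: \(error)")
    }
}
```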
The next time you need to store data, you should have a better idea of your options. Core Data is unnecessary for random pieces of unrelated data, but it's a perfect fit for a large, relational data set. The defaults system is ideal for small, random pieces of unrelated data, such as settings or the user's preferences.
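For example (the keys here are made up), small preference values can go straight into the defaults system rather than Core Data:

```swift
import Foundation

// Small, unrelated values such as settings belong in UserDefaults.
// The keys below are purely illustrative.
UserDefaults.standard.set(true, forKey: "hasSeenOnboarding")
UserDefaults.standard.set(14, forKey: "preferredFontSize")

let fontSize = UserDefaults.standard.integer(forKey: "preferredFontSize")
```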
Core Data is a framework that you use to manage the model layer objects in your application. It provides generalized and automated solutions to common tasks associated with object life cycle and object graph management, including persistence.
All the variables and constants that you allocate in Swift are stored in memory, so they get lost when users quit the app. Persistence is saving data to a place where it can be re-accessed and retrieved upon restart of the device or app.
You should save frequently. The actual performance of the save operation has a lot to do with which persistent store type you're using. Since binary and XML stores are atomic, they need to be completely rewritten to disk on every save. As your object graph grows, this can really slow down your application. The SQLite store, on the other hand, is much easier to write to incrementally. So, while there will be some stuff that gets written above and beyond the objects you're saving, the overhead is much lower than with the atomic store types. Saves affecting only a few objects will always be fast, regardless of overall object graph size.
That said, if you're importing data in a loop, for example, I would wait until the end of the complete operation to save rather than saving on each iteration. Your primary goal should be to prevent data loss (I have found that users don't take kindly to losing their data!), with performance a close second. You may have to do some work to balance the frequency of saving against performance, but the solution you outline above seems like overkill unless you've identified a specific and significant performance issue.
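As a sketch of that import pattern (the container, entity, and payload keys here are placeholders), you would build up the objects in the loop and save once at the end:

```swift
import CoreData

// Sketch: import many records on a background context and save once at the end.
// `ImportedItem` stands in for your own NSManagedObject subclass.
func importItems(_ items: [[String: Any]], into container: NSPersistentContainer) {
    let context = container.newBackgroundContext()
    context.perform {
        for payload in items {
            let object = ImportedItem(context: context)
            object.name = payload["name"] as? String
        }
        do {
            try context.save()   // one save for the whole batch, not one per iteration
        } catch {
            print("Import save failed: \(error)")
        }
    }
}
```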