I am working on a browser application in which I use a UIWebView to open web pages. I ran the Instruments tool with the Memory Monitor, and I am confused by the terms Instruments uses and why they matter. Please answer a few questions for me, with reasons:
Is Live Bytes what matters when checking memory optimization or memory consumption? Why?
Why would I care about Overall Bytes / Real Memory if they also include objects that have already been released?
When and why is each of these terms used (Live Bytes / Overall Bytes / Real Memory)?
Thanks
On the first screen, tapping a button navigates to the second screen, where we intentionally create a memory leak using a retain cycle. We open the Xcode Leaks instrument and press the record button. When the app launches, we tap the navigate button to present our leaking view controller. Finally, we observe the effect in the Leaks instrument and the memory graph (a minimal version of the retain cycle is sketched below). The source code is available on GitHub.
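As a minimal sketch of the kind of leak described above (the names `LeakyViewController` and `onUpdate` are my own illustrative choices, not taken from the GitHub project), a retain cycle between a view controller and a closure it stores can look like this:

```swift
import UIKit

// Hypothetical LeakyViewController: the controller stores a closure, and the
// closure captures `self` strongly, so the two keep each other alive forever --
// a retain cycle the Leaks instrument and the memory graph can flag.
final class LeakyViewController: UIViewController {

    private var onUpdate: (() -> Void)?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Strong capture of self inside a closure that self retains.
        onUpdate = { self.view.backgroundColor = .red }
    }

    deinit {
        // Never printed while the cycle exists.
        print("LeakyViewController deinit")
    }
}

// The fix would be a weak capture:
// onUpdate = { [weak self] in self?.view.backgroundColor = .red }
```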
Keep in mind that iOS limits both physical and virtual memory. Memory warnings, this footprint value, and so on, are all about physical memory usage. However, it’s not uncommon to bump into the virtual memory limit as well.
No. Calculating the memory impact of a process is a complex job and there’s no API that’s guaranteed to return the same value as the memory debug gauge. What are you planning to use this value for?
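That said, if you only need a rough number at runtime, the `phys_footprint` field of `task_vm_info` is the approximation most people reach for. This is my own sketch, and, as noted above, it is not guaranteed to match the Xcode memory gauge exactly:

```swift
import Darwin
import Foundation

// Returns an approximate physical-memory footprint for the current process,
// or nil if the Mach call fails. Not guaranteed to equal the debug gauge.
func approximateFootprintBytes() -> UInt64? {
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(
        MemoryLayout<task_vm_info_data_t>.size / MemoryLayout<natural_t>.size)
    let kr = withUnsafeMutablePointer(to: &info) { infoPtr in
        infoPtr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) { rawPtr in
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), rawPtr, &count)
        }
    }
    guard kr == KERN_SUCCESS else { return nil }
    return info.phys_footprint
}
```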
"Live Bytes" means "memory which has been allocated, but not yet deallocated." It's important because it's the most easily graspable measure of "how much memory your app is using."
"Overall Bytes" means "all memory which has ever been allocated including memory that has been deallocated." This is less useful, but gives you some idea of "heap churn." Churn leads to fragmentation, and heap fragmentation can be a problem (albeit a pretty obscure one these days.)
"Real Memory" is an attempt to distinguish how much physical RAM is in use (as opposed to how many bytes of address space are valid). This is different from "Live Bytes" because "Live Bytes" could include ranges of memory that correspond to memory-mapped files (or shared memory, or window backing stores, or whatever) that are not currently paged into physical RAM. Even if you don't use memory-mapped files or other exotic VM allocation methods, the system frameworks do, and you use them, so this distinction will always have some importance to every process.
EDIT: Since you're clearly concerned about memory use incurred by using UIWebView, let me see if I can shed some light on that:
There is a certain memory "price" to using UIWebView at all (i.e. global caches and the like). These include various global font caches, JavaScript JIT caches, and stuff like that. Most of these are going to behave like singletons: allocated the first time you use them (indirectly by using UIWebView) and never deallocated until the process ends. There are also some variable size global caches (like those that cache web responses; CFURL typically manages these) but those are expected to be managed by the system. The collective "weight" of these things with respect to UIWebView is, as you've seen, non-trivial.
I don't have any knowledge of UIKit or WebKit internals, but I would expect that if you had a discussion with someone who did, their response to "Why is my use of UIWebView causing so much memory use?" would be two-pronged.

The first prong would be "this is the price of admission for using UIWebView -- it's basically like running a whole web browser in your process."

The second prong would be "system framework caches are automatically managed by the system," by which they would mean that, for instance, the CFURL caches (one of the things that using UIWebView causes to be created) are managed by the system. If a memory warning comes in, the system frameworks are responsible for evicting things from those caches to reduce the memory they consume; you have no control over them, and you just have to trust that the system frameworks will do what needs to be done. (That doesn't help you in the case where whatever the system cache managers do isn't aggressive enough for you, but you're not going to get any more control over them, so you need to attack the issue from another angle either way.)

If you're wondering why the memory use doesn't go down once you deallocate your UIWebView, this is your answer: there's a bunch of stuff going on behind the scenes that you can't control.
The expectation that allocating, using, and then deallocating a UIWebView is a net-zero operation ignores some non-trivial, inherent, and unavoidable side effects. The existence of such side effects is not, in and of itself, indicative of a bug in UIWebView. There are side effects like this all over the place. If you were to create a trivial application that did nothing but launch and then terminate after one spin of the run loop, set a breakpoint on exit(), and looked at the memory that had been allocated and never freed, you would find thousands of allocations. This is a very common pattern used throughout the system frameworks and in virtually every application.
What does this mean for you? It means that you effectively have two choices: Use UIWebView and pay the "price of admission" in memory consumption, or don't use UIWebView.