I’ve been doing a lot of work trying to figure out why, after loading a lot of pages, much of your memory seems to disappear. I’ve tested all sorts of things — disabling extensions, plugins, images, etc. I’ve run leak tools over and over looking for things we might be leaking. Occasionally I’ll find something small we’re actually leaking, but more often than not I don’t see any real leaks. This led me to wonder where our memory went.

Firefox has a lot of caches internally for performance reasons. These include things like the back/forward cache (which helps speed up loading pages when you hit back), the image cache (which keeps decoded images in memory to help load them faster), the font cache, the textrun cache (short-lived, but used to cache computed glyph indices, metrics, and such), etc. In Gecko 1.9 we also introduced the cycle collector, which reclaims cycles of XPCOM objects that reference counting alone can’t free. We’ve also got the JS garbage collector. All of these things mean we could be holding on to a bunch of objects taking up space, so we want to eliminate them from the picture. I released the RAMBack extension earlier this week, which clears most of these things.
So, if it is none of these things, what is going on? Why, after a while, do we end up using more memory than we should if we aren’t leaking and our caches are clear? At least part of it seems to be due to memory fragmentation.
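To make the failure mode concrete before the pictures, here’s a toy simulation — not Firefox code, just a sketch with made-up sizes — of how interleaving short-lived and long-lived allocations leaves plenty of free space that no large request can actually use:

```python
# Toy heap: a sequence of [size, live] segments laid out in address order.
# Sizes are illustrative, not measured from Firefox.
SCRATCH, LIVE = 256, 32  # bytes

heap = []
for _ in range(100):
    heap.append([SCRATCH, True])   # short-lived allocation
    heap.append([LIVE, True])      # long-lived allocation (pins the layout)

# Free every short-lived block; the long-lived ones stay put between them,
# so no two freed blocks are adjacent and nothing coalesces.
for seg in heap:
    if seg[0] == SCRATCH:
        seg[1] = False

free_total = sum(size for size, live in heap if not live)
largest_hole = max(size for size, live in heap if not live)

print(free_total)    # 25600 bytes free in total...
print(largest_hole)  # ...but no contiguous hole is bigger than 256 bytes
```

Lots of free memory, none of it usable for anything big — that’s fragmentation in a nutshell.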
Let me give you some examples (with pictures!):
Loading the browser with about:blank as my homepage:
This represents a heap size of 12,589,696 bytes, made up of 11,483,864 bytes of used blocks and 1,105,832 bytes of free blocks of varying sizes.
Each block in the image represents 4096 bytes of memory. Shades range from solid black (completely used) to white (mostly free).
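The mapping from heap pages to squares can be sketched like this (my actual image-generation code differs; this just shows the idea, with hypothetical per-page byte counts):

```python
PAGE = 4096  # each square in the image is one 4096-byte block

def shade(used_bytes):
    """0.0 = white (mostly free) ... 1.0 = solid black (completely used)."""
    return used_bytes / PAGE

# hypothetical used-byte counts for four pages
pages = [4096, 3072, 512, 0]
print([shade(u) for u in pages])  # [1.0, 0.75, 0.125, 0.0]
```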
Loading a bunch of windows, closing them, and clearing my caches:
Although you can get similar results on many sites, schrep gave me this TripAdvisor hotel search page which opens up lots of windows with lots of pages. To generate this image, I loaded the URL, waited for all of the pages to open, closed them all, loaded about:blank, and then ran RAMBack. At the end of that, here is the result:
Our heap is now 29,999,872 bytes! 16,118,072 of that is used (up 4,634,208 bytes from before… which caches am I forgetting to clear?). The rest, a whopping 13,881,800 bytes, is in free blocks! These are mostly scattered in between tiny used blocks. This is bad.
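Spelling out the arithmetic from the two snapshots:

```python
# Before: about:blank startup heap (numbers from the first snapshot)
before_heap, before_used = 12_589_696, 11_483_864
# After: windows opened and closed, caches cleared via RAMBack
after_heap, after_used = 29_999_872, 16_118_072

growth_in_used = after_used - before_used   # the unexplained cache growth
free_bytes = after_heap - after_used        # free but still part of our heap
free_pct = round(100 * free_bytes / after_heap)

print(growth_in_used)  # 4,634,208 bytes of new "used" memory
print(free_bytes)      # 13,881,800 bytes sitting in free blocks
print(free_pct)        # ~46% of the heap is free yet we still look this big
```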
Light green blocks are completely free pages. I’ve highlighted those because the OS could page them out if it wanted to. You’ll notice there aren’t very many light green squares…
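Why fully free pages matter: when an entire page is free, we (or the allocator) can tell the OS it no longer needs the physical memory behind it. A minimal sketch using an anonymous mapping and madvise — assuming a POSIX system with Python 3.8+; Firefox’s allocator, not code like this, would do the real work:

```python
import mmap

PAGE = 4096
m = mmap.mmap(-1, PAGE)   # one anonymous page
m[:] = b"x" * PAGE        # dirty it, so it now occupies physical RAM

if hasattr(mmap, "MADV_DONTNEED"):   # exposed on POSIX in Python 3.8+
    m.madvise(mmap.MADV_DONTNEED)    # kernel may reclaim the physical page

# The virtual range stays valid; on Linux, later reads of a discarded
# anonymous page see fresh zero-filled memory.
data_after = m[:4]
m.close()
```

A page that is only *mostly* free can’t be handed back like this, which is why the scattered tiny used blocks are so costly.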
So… what does this mean?
Well, it means that any allocation larger than 4k ends up at the end of the heap, because it can’t fit into the scattered free blocks earlier. This is bad for a variety of reasons, including performance. It also makes it very difficult for us to find big chunks of contiguous memory to give back to the OS. This makes us look big!
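To illustrate why large requests end up at the end: with only small scattered holes available, a first-fit scan over the free list finds nothing, and the heap has no choice but to grow (sizes here are made up, not measured):

```python
# Scattered free blocks between live allocations, as in the image above.
holes = [256] * 100   # toy sizes
request = 8192        # anything bigger than the biggest hole

# First-fit: take the first hole large enough for the request, if any.
fit = next((h for h in holes if h >= request), None)
print(fit)  # None -> the allocator must extend the heap at the end instead
```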
Yeah, duh, I already knew fragmentation was bad… Now what?
I’ll be filing bugs and posting more details shortly.
Thoughts, suggestions, and comments welcome!
Edit: I found a small bug in the code I used to generate my images which resulted in fewer light green (empty) blocks than there should have been. I’ve updated the images to show properly.