Less fragmentation coming in Firefox 3

I’ve had a lot of people ask whether the memory improvements I’ve been doing recently will make it into Firefox 3 or only appear in a future release.

The basic answer: many of these fixes will be included in Firefox 3, but not all of them.

The more complicated answer is that we’re still analyzing the problem and working on solutions. At this point, we’re still digging through the data and finding hotspots. We’ve already identified quite a few places where we will be able to make improvements — some big and some small — and we’re evaluating each for overall invasiveness and impact so we can make the best decisions possible about how and where to implement these fixes.

As we’re already in the beta phase for Firefox 3 we have to be very careful not to add too much risk to the process, so we’re prioritizing memory work to get the biggest improvements for the least additional risk. This isn’t to say that we won’t work on fixes with higher potential risk, but we do have to be very careful. What this means is that we won’t be able to address every single issue, but we should be able to knock out the big ones in time for Firefox 3.

Eliminating memory fragmentation entirely is almost impossible, but we’ve got several big issues on our radar that we believe will give us big wins. The current plan of attack is to reduce the number of allocations, group allocations with similar lifetimes together into pools, move similarly sized allocations into their own regions of memory, and look at general malloc replacement solutions. We’re looking at all of these things in parallel and have some data on each, but not enough to report anything useful yet; I hope to have some good data on each of these areas by early next week. We’ve also built some pretty amazing tools for debugging these issues and testing our progress, and will be able to show visually how we’re improving.
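The lifetime-pooling idea above can be sketched in C. This is a simplified illustration of a bump-pointer arena for same-lifetime allocations, not Mozilla’s actual allocator; all names here are invented for the example:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* A minimal bump-pointer arena: allocations that share a lifetime come
 * out of one contiguous chunk and are released together, so they can't
 * leave small holes scattered across the general heap. */
typedef struct {
    char  *base;
    size_t used;
    size_t capacity;
} Arena;

static int arena_init(Arena *a, size_t capacity) {
    a->base = malloc(capacity);
    a->used = 0;
    a->capacity = capacity;
    return a->base != NULL;
}

static void *arena_alloc(Arena *a, size_t size) {
    size = (size + 7) & ~(size_t)7;   /* round up to 8-byte alignment */
    if (a->used + size > a->capacity)
        return NULL;                  /* a real arena would chain chunks */
    void *p = a->base + a->used;
    a->used += size;
    return p;
}

/* Freeing the whole pool at once returns one large region to the
 * allocator instead of hundreds of fragmented slots. */
static void arena_destroy(Arena *a) {
    free(a->base);
    a->base = NULL;
    a->used = a->capacity = 0;
}
```

The point of the design is that individual objects are never freed one by one; the pool dies as a unit, so its pages come back whole.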

With as many of these fixes as possible going into Firefox 3, it should provide a significant improvement in long-term memory use over previous versions.

19 thoughts on “Less fragmentation coming in Firefox 3”

  1. sam

    Don’t take this the wrong way, but why is all the fuss about memory happening so close to release? Is it because you guys wanted to wait until all the new features were finalized?

  2. pavlov Post author

    sam: take a look at my previous post. Basically we’ve been working on memory things for a while now and had hoped that other fixes would reduce the memory use. They haven’t, so here we are. That it happens to be close to release is really nothing more than a coincidence.

  3. Ikim

    I’ve tried one of the FF3 nightly builds and I’m very pleased with the results (I sometimes have over 50 tabs open and FF2 is very memory hungry).

    Thanks for the great work.

  4. Noel Grandin

    – add some debug code that tracks the location of each allocation
    – add some more debug code that prints out the memory map and allows you to dig into the allocation site of each allocation

    Now find those allocation sites that are predominantly responsible
    for preventing pages from being reclaimed and move them into their own arenas/pools.

    It’s the longer-lived items that need to be segregated, not the shorter ones.
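The allocation-site tracking Noel suggests could be sketched roughly like this (a toy illustration; the macro, fixed-size table, and names are invented for the example, not Mozilla’s debug code):

```c
#include <stdio.h>
#include <stdlib.h>

/* Wrap malloc in a macro so every allocation records its call site,
 * then dump a per-site summary to find which sites dominate the heap. */
#define MAX_SITES 256

typedef struct {
    const char *file;
    int         line;
    size_t      bytes;   /* total bytes attributed to this site */
    size_t      count;   /* number of allocations from this site */
} SiteStats;

static SiteStats sites[MAX_SITES];
static int nsites = 0;

static void *tracked_malloc(size_t size, const char *file, int line) {
    for (int i = 0; i < nsites; i++) {
        if (sites[i].file == file && sites[i].line == line) {
            sites[i].bytes += size;
            sites[i].count++;
            return malloc(size);
        }
    }
    if (nsites < MAX_SITES) {
        sites[nsites].file  = file;
        sites[nsites].line  = line;
        sites[nsites].bytes = size;
        sites[nsites].count = 1;
        nsites++;
    }
    return malloc(size);
}

/* Call sites use this instead of malloc directly. */
#define MALLOC(sz) tracked_malloc((sz), __FILE__, __LINE__)

static void dump_sites(void) {
    for (int i = 0; i < nsites; i++)
        printf("%s:%d  %zu bytes in %zu allocations\n",
               sites[i].file, sites[i].line,
               sites[i].bytes, sites[i].count);
}
```

Sites whose bytes stay high across the run are the long-lived ones worth moving into their own arenas.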

  5. Steve Parkinson

    Can you tell me if I have this right:

    Some people complained about memory usage. Some people complained about performance. You say “we use memory to improve performance” (via caches, etc). Now the investigation seems to be that there is some amount of fragmentation, but the upshot is that you’re going to surgically reduce memory usage to cut down fragmentation, and improve performance.

    But the primary goal is to improve performance, right?

    Here’s a theory: Firefox’s heavy use of GDI objects is contributing to the performance problems.

If you use Procmon or CPUmon (or even the Windows task manager, View->Select Columns), you can show GDI object usage for each process. When Firefox starts acting bloated, task manager shows it’s consistently using about 3-10 times as many objects as other processes (except I recently installed iTunes, which is also a GDI hog).

    So, while I notice that memory usage is correlated with performance problems, GDI usage is also correlated. Assuming that the above is true, an explanation might be that whatever windows data structure holds GDI objects doesn’t scale well.

I’ve done some googling and I don’t see anyone giving a rule of thumb, e.g. ‘over 1000 GDI objs is bad’.

  6. pavlov Post author

    Steve: Both memory and performance are primary goals. We want to be as small as possible and as fast as possible. It is obviously a careful balance.

    As for GDI objects, I’ve actually looked quite a bit at GDI usage and haven’t seen much correlation between >1000 GDI objects and performance. We primarily use GDI objects for fonts and images. With images, in most cases we’ve seen that optimizing our image data to be in GDI objects results in faster rendering for roughly the same system resources. That said, we have seen in some cases with some set of video card drivers that things are actually slower. We’ve been unable to figure out exactly what the combination of things is that causes things to be slower, but it doesn’t seem to be common.

In Firefox 3, our average GDI object usage may be a bit higher than in Firefox 2. That said, we’re actually more restrictive about our use, and it shouldn’t grow unbounded. Now that we don’t hold on to decoded images as long, you should also see a decrease in their use if you’ve got lots of tabs open in the background.

XP supposedly has a per-process 10000 GDI object limit, but in reality the limit depends on what the objects are, their sizes and other things (all of which are completely undocumented). We’ve done our best to balance usage and make sure we don’t hit whatever the magic limit is that causes things to stop drawing.

  7. pd

    In the few times I’ve experienced serious memory bloat/performance lag after running Firefox for a while, the browser has actually been reasonably stable under the ‘load’.

    Do you think it’s possible people are experiencing load issues in Firefox that are not triggering crashes and therefore talkback/breakpad incidents?

    Wouldn’t it be very useful to write a tool not unlike Hendrix or Reporter that allowed users experiencing load issues to report it? This could provide both a useful developer tool and a ‘circuit breaker’ for frustrated users. It could also help improve the realism of quality reports to mozilla. For example instead of saying we have X crash conditions/reports, you could say we have X bloat issues and X crash conditions/reports.

  8. Pingback: Firefox 3 Beta 1 « pavlov.net

  9. Jack

    Great news.

    I am currently trialling the beta and have noticed no real problems. The memory footprint was a major problem for me, and am keeping an eagle eye out for a recurrence.

    Thanks for the effort and the astonishingly open communications.


  10. Adrenalin

I think you should try to discard unused info from unused tabs. Stop Flash scripts and JavaScript, etc.

It’s the same as opening 50 MS Word applications and then asking why they use so much memory.

Sometimes I open 50 tabs (because the www is all about link-to-link, site-to-site), and while I’m in the 50th tab I don’t really need the 4th tab, so why not just discard the memory used by the 4th tab? Why must the 4th tab eat memory that could be used for the next, 51st tab? %)

You know, I actually saw this principle in Google Android, cuz mobile phones have limited memory. So when the user opens one more application, the Android platform “hibernates” an unused one. Yeh, so unfortunately it’s not me who invented that ;o)

And why the heck do I need the back caching of 10 pages on all 50 tabs? I think this is overhead; why not just make the number of back-cached pages lower and lower as the number of tabs grows?

I think you forgot to take the number of tabs into account..

Forget about fragmentation and just imagine you have only 64 MB of RAM for everything.

  11. DrkShadow

    I’ve been under the belief for a _long_ time now that images have been the primary culprit. If I browse a great many Photobucket albums with a great many images, memory usage will inevitably go through the roof — if I have a session where I don’t do that, it takes significantly longer.

    Combine that with _animated gifs_ and things just get out of control.. on a machine with 2 gigs ram and 2 gigs swap, I regularly see FF using over 25% of system memory. That’s with about 50 tabs.

    To back up my suspicion, I finally found a setting so that animated gifs animate only once. The memory usage seems to remain about _half_ what it always was before. (image.animation_mode=”once”)

    I’ve been trying to get time to build a debug mozilla and dig in.. and I got some time to work on it today — initial results are various assertions failing when I start my debug build, so I think I need to get the config file from a gentoo build to start with :-/


  12. pavlov Post author

    DrkShadow: I’ve done a lot of testing with images turned on/off. For beta 2 you’ll see animated GIFs taking up much less memory than before. That said, there is no reason I can think of why they would continue to grow while animating. They shouldn’t take up any additional memory resources. I assume you’re on Linux — We’re currently adjusting how we use X resources for images which should help some of your problems as well. Most of these things will be in beta 2.

    That said, I’ve done a lot of testing and concluded that images aren’t really contributing to our fragmentation problems. They do use up a lot of memory though and we’ve done a lot to combat that (expiring the uncompressed version after a while, storing animations more efficiently, etc). We have some additional plans for reducing image memory consumption but they’re much bigger scope changes and probably won’t make Firefox 3.

  13. pavlov Post author

Adrenalin: We don’t back cache 10 pages on all 50 tabs. There is a maximum number of back pages that is shared across all tabs and windows. “Hibernating” applications on mobile devices isn’t a new thing, but we’re a desktop application. Swapping out to disk and having to page back in to memory can be very slow.

In your example, it is very difficult for us to know that you don’t want the 4th tab anymore. You might want to switch back to it immediately, and if it has been swapped out, loading it back in could be very slow. I’m not sure this is the behavior most users want. We’re working on various techniques to reduce memory in documents (tabs, windows) that haven’t been recently accessed, but I don’t think we’ll be completely swapping them out.

    As we look at mobile, we may have no other choice but to do something like this, but for a desktop browser, there are better ways.

  14. DrkShadow

Ok, if fragmentation is the real issue.. then the real solution should be to remove fragmentation.

Let’s suppose we have 1000 items and deallocate 500 at random; then we have a lot of very small gaps. Why not create a routine to allocate new memory, copy the object to the new memory, and deallocate the old memory?

    The point here would be that you grab new memory with only the existing objects, free _all_ of the fragmented memory, and you’ve solved the fragmentation problem for a certain time.

    Of course, you then have to worry about pointers, so the hard part would perhaps be encapsulating everything inside an object — so only the one object has to be modified for the effect to work — or implement a callback system, so everything with a pointer to the object can be told the new address of the object.

The former would still have some fragmentation, as you can’t reallocate the containing objects. This may or may not be worse than what is being seen. The latter would be great, but would likely greatly increase complexity, and require more memory to implement things as objects so that they _can_ receive a callback and change memory references.

.. just some thoughts. I’m curious, is the Boehm garbage collector that I saw recently in use by default? Can I check it? Does it improve fragmentation?
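The handle approach described above, where only one level of indirection has to be updated when an object moves, might look roughly like this in C (a hypothetical sketch with invented names, not anything from the Mozilla tree):

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Clients hold an index into a handle table instead of a raw pointer,
 * so the allocator can move the underlying object and fix up a single
 * table slot rather than chasing every reference. */
#define MAX_HANDLES 64

typedef int Handle;

static void *table[MAX_HANDLES];   /* handle -> current object address */

static Handle handle_alloc(size_t size) {
    for (Handle h = 0; h < MAX_HANDLES; h++) {
        if (table[h] == NULL) {
            table[h] = malloc(size);
            return table[h] ? h : -1;
        }
    }
    return -1;   /* table full */
}

/* Every access goes through the table, never through a cached pointer. */
static void *handle_deref(Handle h) {
    return table[h];
}

/* "Compaction": move the object to fresh memory; clients that deref
 * through the table automatically see the new address. */
static void handle_move(Handle h, size_t size) {
    void *fresh = malloc(size);    /* error handling omitted for brevity */
    memcpy(fresh, table[h], size);
    free(table[h]);
    table[h] = fresh;
}
```

The cost, as the comment notes, is that nothing may hold a raw pointer across a move, which is exactly what makes retrofitting this onto a large C++ codebase hard.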


  15. pavlov Post author

DrkShadow: We don’t use the Boehm garbage collector for anything except possibly some Mac Java stuff. Converting all of our uses of pointers to handles so they could be moved around isn’t really feasible. Using pools, fewer allocations, and a better general allocator, as I mentioned in my post, are probably going to give us the best and most realistic wins. Certainly as more of our front end moves into JavaScript, which is garbage collected, that could help.

  16. Adrenalin

    pavlov, ok, thanks for answers.

Maybe that’s just me, but I really hate to close tabs/applications, cuz I’m lazy. Anyway, once in a while, when my system starts to freeze and I start to lose time waiting, I just stop and close everything, and then it all starts again: it grows and grows, and I run “the garbage collector” again and again %)

    Or, doh, maybe it will be easier just to buy another 2gb of ram, hehe

But I’d really like it if, when I run Photoshop (or any other expensive program), all other processes went to sleep, as I don’t need them at the moment, letting me work very fast at the task I want to do right now.

Maybe I just need the Android thing on my desktop, not sure ;o)

    Ok, thanks for listening, I really trust you guys, hope you’ll make all the best to make the best browser in the world even better.

Anyway, I still think that if our computers were slower than they are today, Firefox would be more optimized..

Anyway, it’s visible that lately there’s more activity, more blogs, more concern about Firefox’s performance, and I’m very glad about that.

