Everyone Loves an Old Fox, Right?

Michael J. Fox, 2012
By Paul Hudson (original), Supernino (derivative work), via Flickr, CC BY 2.0

Everyone loves an old Fox, right?  Well, actually, no.  Unlike the Michael J. variety, Mozilla Firefox has slipped into near obscurity as a desktop browser.  Depending on your source of data, Firefox’s market share has slumped from a 2009 peak of just over 30% to a lowly 7.69% in August 2016.  It has regained some ground since, reaching 12.22% in December 2016, and is holding steady.

“Grandpa Fox, You Smell Funny!”

There is plenty of speculation about why Firefox is losing its user base.  One obvious contributing factor is Firefox’s age.  It’s just not the sprightly, cheeky, youthful fox it once was.

Jen Simmons of Mozilla explains that:

… Firefox runs on Gecko. Which is old. Like really old. I think it’s the oldest rendering engine still in wide use.

Old Fox
by taa on DeviantArt

While Firefox may have the oldest rendering engine (the bit that draws the web pages), it is also wise.  Many of those who contributed to Firefox took that wisdom and used it to build a web browser anew: the one we know as Google Chrome.

Thus, from the ground up, Google Chrome profited from the many lessons learned by Firefox, and of course from others in related fields.  One key advantage of building a brand-new browser at the time was the opportunity to exploit the advancing multi-processor/multi-core computing architecture.  Multi-core processors were becoming commonplace in desktop computers around the time of Google Chrome’s conception, whereas the genesis of Firefox occurred when multiprocessing existed primarily in the realm of server hardware and very high-end workstations.

Thus, Firefox has always executed as a single-process application.  This means that Firefox can only execute its instructions sequentially across all windows and tabs, even if your computer sports a quad-core, hyperthreaded processor.  Essentially, Firefox does a little work on this tab, then a little on that tab, and so on in round-robin fashion, only ever making use of a single processor core at any point in time.  An oversimplification, but I’m sure you get the gist.

By contrast, the likes of Google Chrome create a separate process, or execution pathway, for each tab that you open.  This means that Chrome can perform tasks in parallel: your computer can be composing, rendering, and updating multiple Chrome tabs concurrently.  This is what powers the fresh, responsive experience you have with Chrome.
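The difference between the two models can be sketched in a few lines of Rust (purely illustrative, not actual browser code; `render_tab` is a made-up stand-in for the work of rendering a tab, and threads stand in for Chrome’s per-tab processes):

```rust
use std::thread;

// Stand-in for the work of "rendering" one tab: a small,
// deterministic computation (purely illustrative).
fn render_tab(id: usize) -> usize {
    (0..10_000).fold(id, |acc, x| acc.wrapping_add(x))
}

fn main() {
    // Single-process model: one core services every tab in turn,
    // round-robin style.
    let sequential: Vec<usize> = (0..4).map(render_tab).collect();

    // Multi-process model, approximated here with threads: each
    // "tab" gets its own execution pathway and can run in parallel
    // on a separate core.
    let handles: Vec<_> = (0..4)
        .map(|id| thread::spawn(move || render_tab(id)))
        .collect();
    let parallel: Vec<usize> = handles
        .into_iter()
        .map(|h| h.join().unwrap())
        .collect();

    // Same answers either way; only the scheduling differs.
    assert_eq!(sequential, parallel);
    println!("ok");
}
```

The point of the sketch: the results are identical, but in the second half the work can genuinely overlap in time on a multi-core machine.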

There’s an App Addon For That

An innovative aspect of Mozilla Firefox has been its add-ons framework.  The ability for people to customise their browsing experience, independent of the webmaster’s intent, was truly revolutionary at the time.  The Mozilla Add-ons ecosystem emerged as a result, where people could publish their Firefox add-ons for others to use.  Add-ons that changed the behaviour of web pages, and of Firefox itself, abounded.  Perhaps not quite to the extent of Apple’s App Store concept, but it was not uncommon to hear the expression, “there is a Firefox add-on for that.”

Yet this very innovation would, in time, become Firefox’s Achilles’ heel.  Giving add-on developers access to the internal intricacies of the browser began to constrain Firefox developers in evolving the core of the software to keep up with its competitors.  Andy McKay puts it simply:

Because there is no layer between many of the XUL and XPCOM internals of Firefox, its really hard for Firefox developers to change things without being hampered by the effect on add-on developers. It’s really hard to move Firefox forward when every change might break the experience for an unknown number of users.

Firefox is stuck between a rock and a hard place.  What to do?

Can you teach an old fox, new tricks?

You can’t teach an old dog new tricks…
by Renardette on DeviantArt

Oh my, yes of course.  Mozilla’s achievements haven’t come from complacency.  2017 is going to be a big year for Mozilla and Firefox.  First, let’s consider our single-process dilemma.


What is this electrolysis, you ask?

The removal of hair roots or small blemishes on the skin by…

Oh wait…

Chemical decomposition produced by passing an electric current through a liquid or solution containing ions.

That is a much better metaphor.  In the context of Firefox, Electrolysis, often written as “e10s”, is the decomposition of the traditional monolithic Firefox process into a more modular architecture that can be spread across multiple processes.

I won’t repeat all of the benefits to e10s here, as Dan Callahan of Mozilla has done a great job of that himself when he published, The “Why” of Electrolysis.

This new trick alone should see Firefox gain on Google Chrome and others in terms of performance.  I’m writing this very blog post with e10s switched on.  Firefox has never felt so sprightly. 🙂

Before I move on, however, I would like to draw attention to one difference between Firefox’s implementation of multi-processing and that of Google Chrome.  I mentioned earlier that Chrome starts a separate process for each and every tab you have open.  When a new process starts, its instructions (in computer science parlance, the text of the process) are shared in memory with other processes executing the same code.  However, each process also gets its own private working memory, called the heap, in which to perform its task.  Each time a separate process is started in Chrome, it is assigned its own heap memory.  If you have many tabs open, this memory consumption can grow quite substantially, as a benchmark graph from ghacks illustrates.

Because Firefox has traditionally been a single-process application, every tab shares the same heap memory, avoiding some duplication.  Its overhead is much lower than that of the multi-process architectures used by the others in the benchmark, including Chrome.

So you may wonder: won’t Electrolysis lead Firefox to the same fate?  Actually, no.  Unlike Chrome, Firefox’s multi-process architecture does not map processes to tabs one-to-one.  A single Firefox Web Content process can manage multiple tabs; indeed, the first phase of Electrolysis was simply to “… split Firefox into a UI process and a content process” for rendering tab content.  This cautious first phase was intended to test the reliability of the implementation as it went out to millions of Firefox users.  But the number of Web Content processes Firefox can create is configurable, and will expand as the new multi-process model is validated as reliable.  I presently have my Firefox set to allow up to 8 Web Content processes.
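For the curious, that limit lives in about:config as the `dom.ipc.processCount` preference; the value below is simply my own setting, not a recommendation:

```
// user.js — raise the maximum number of Web Content processes
user_pref("dom.ipc.processCount", 8);
```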

Benchmarks conducted by Eric Rahm, part of Mozilla’s MemShrink group, have shown that 2 Web Content processes yield only a 10–20% increase in memory compared to non-Electrolysis usage.  With 8 Web Content processes, memory usage doubles compared to non-e10s usage.  This holds no matter how many tabs you open.  So Mozilla have taken a sensible approach, balancing performance, stability, and memory consumption rather than optimising for just one.  You can have your cake and eat it too, it would seem.  Eric has also benchmarked multiple browsers, including e10s-enabled Firefox, and found Firefox still using half the memory of Chrome.

So Firefox has learned some new performance tricks.  What has this meant for Firefox Addons?


Firefox previously had two forms of add-ons.  The legacy form uses the XML User Interface Language and Cross-Platform Component Object Model interfaces (XUL/XPCOM for short), which give the add-on developer access to quite deep internals of Firefox.  The other takes the form of an Add-on Software Development Kit (SDK).  Both of these are being retired in favour of a new API called WebExtensions.  According to Mozilla:

WebExtensions are a cross-browser system for developing browser add-ons. To a large extent the system is compatible with the extension API supported by Google Chrome and Opera. Extensions written for these browsers will in most cases run in Firefox or Microsoft Edge with just a few changes. The API is also fully compatible with multiprocess Firefox.

Many have interpreted this description as Mozilla attempting once again to mimic Google Chrome, perhaps in an effort to woo back users.  However, according to Aaron Klotz of Mozilla, this is not actually the case.

When I first heard rumors about WebExtensions in Whistler, my source made it very clear to me that the WebExtensions initiative is not about making Chrome extensions run in Firefox. In fact, I am quite disappointed with some of the press coverage that seems to completely miss this point.

Yes, WebExtensions will be implementing some APIs to be source compatible with Chrome. That makes it easier to port a Chrome extension, but porting will still be necessary. I like the Venn Diagram concept that the WebExtensions FAQ uses: Some Chrome APIs will not be available in WebExtensions. On the other hand, WebExtensions will be providing APIs above and beyond the Chrome API set that will maintain Firefox’s legacy of extensibility.

Please try not to think of this project as Mozilla taking functionality away. In general I think it is safe to think of this as an opportunity to move that same functionality to a mechanism that is more formal and abstract.

Goals of WebExtensions according to Potch of Mozilla include:

  • Cross-browser Interoperability
  • Common addon features made easier/simpler
  • Future-proofing your work from core Firefox changes
  • More secure extensions

There are plenty of online resources that talk about WebExtensions, and what they hope to achieve.  They all seem quite sensible to me.
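To give a flavour of the format, a WebExtension is little more than a manifest plus the scripts it declares.  The sketch below is a minimal, hypothetical example of my own (the extension name and matched site are made up), not something from the Mozilla docs:

```json
{
  "manifest_version": 2,
  "name": "hello-webextension",
  "version": "1.0",
  "description": "Minimal cross-browser add-on sketch",
  "content_scripts": [
    {
      "matches": ["*://*.example.com/*"],
      "js": ["content.js"]
    }
  ]
}
```

The same manifest shape is understood by Firefox, Chrome, Opera, and Edge, which is what makes the cross-browser interoperability goal above plausible.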

Like the initial developers of Chrome, who were able to take lessons from Firefox, Firefox too is taking lessons from others.  But what is Mozilla actually doing with Firefox that diverges from the crowd?

Old Firefox Showing Signs of Rust

Fox In Paso, Plate 1
© Jeremy Brooks – Fox in Paso, Plate 1 (CC BY-NC 2.0)

Since 2009, Mozilla has been working on an ambitious new programming language called Rust.  Before then, it was a pet project of Graydon Hoare, a Mozilla employee.  “So what?”, you might say.  New programming languages are a dime a dozen these days, and their impact varies substantially.  What’s so special about Rust?

Programming software in low-level (or systems-level) languages is a tough gig.  The trade-off for the performance and optimisation these languages offer is their very unforgiving nature: if you screw up, things go horribly wrong.  Most of Firefox is written in C++, a fairly common low-level language these days.  C++ has existed for many years and is still evolving, with a new revision of the language standard, C++17, due for release this year (2017).  However, many of the unforgiving aspects of the language remain.  I won’t go into details, but as an example, if you have ever had a program crash with the error “Segmentation Fault”, that is usually (but not always) a programming error associated with languages such as C++.  Dave Herman from Mozilla explains:

The Rust core team’s original vision—a safe alternative to C++ to make systems programmers more productive, mission-critical software less prone to memory exploits, and parallel algorithms more tractable—has been central to Mozilla’s interest in backing the Rust project and, ultimately, using Rust in production.

So all of this means more reliable software, and easier use of parallelism in modern multi-core computing architectures.  And Graydon Hoare got to scratch one serious itch.
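A small sketch of what that safety looks like in practice (illustrative only, not Firefox code): in Rust, shared mutable state must be wrapped in `Arc` (shared ownership across threads) and `Mutex` (exclusive access).  Forgetting either is a compile-time error rather than a runtime crash or silent data race.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Sum a slice across several threads. The compiler will refuse to
// build this unless the shared total is protected by Arc + Mutex.
fn parallel_sum(values: &[u64]) -> u64 {
    let total = Arc::new(Mutex::new(0u64));
    let mut handles = Vec::new();

    for chunk in values.chunks(2) {
        let chunk = chunk.to_vec();          // each thread owns its slice copy
        let total = Arc::clone(&total);      // shared, reference-counted handle
        handles.push(thread::spawn(move || {
            let partial: u64 = chunk.iter().sum();
            *total.lock().unwrap() += partial; // exclusive access via the lock
        }));
    }
    for handle in handles {
        handle.join().unwrap();
    }
    let result = *total.lock().unwrap();
    result
}

fn main() {
    assert_eq!(parallel_sum(&[1, 2, 3, 4, 5]), 15);
    println!("ok");
}
```

The interesting part is what you cannot write: drop the `Mutex` and mutate `total` from two threads, and the program simply does not compile.  That is the “parallel algorithms more tractable” point from the quote above.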

Dave Herman shared a series of short videos that do a good job of explaining some of the finer points of Rust.

But, getting back to my original question, how does this help Firefox?


Old Rusty Screw by Charles Rondeau

Continuing with Dave’s original article, he goes on to explain how Mozilla is building Rust into Firefox, starting with Firefox’s multimedia stack in version 48.  Mozilla are calling this evolving initiative Oxidation.  Oh how I love Mozilla’s use of metaphors.

Oxidation (or rusting) occurs over time, and this is what Mozilla plans for Firefox: to rust, to oxidise over time.  Oxidation, as a project, is quite ambitious.  As Jen Simmons explains of Firefox’s rendering engine, Gecko:

The trick of the thing comes with figuring out how to switch from the old rendering engine to a new one. You can’t just do it all at once. It’s like figuring out how to replace a jet engine on a jet that’s still flying. I guess we could land the plane, let all the passengers disembark so they can wander over and take other planes, and not provide any service for a while while we change the engines out… but no — no, we can’t do that. Not gonna happen.

We could keep flying the current plane, while starting from scratch and building an entirely new plane on the ground — special ordering every nut, every bolt, designing a new cockpit, and waiting many years before we get to fly that plane. But we don’t want to do that either. We already have a giant plane. And it’s a pretty good plane. We should keep using it. We just want a new engine on that plane. As soon as we can get it.

Enter Quantum, the codename for the project to figure out how to replace the engine on our still-flying plane. One piece of it is called Quantum Style (aka, Stylo) — that’s where we transition from having Gecko render all the CSS, to using Quantum for CSS. Quantum Style morphs Gecko and Servo together, asking each to do the job they do best. Because, actually, even though it’s been around for 20 years, Gecko does some pretty amazing things, and we want to keep leveraging The Good Parts. New isn’t always better.

So Quantum’s goal is to progressively augment the Gecko renderer with something called Servo.  “What the hell is Servo?”  Glad you asked…


According to https://servo.org:

Servo is a modern, high-performance browser engine designed for both application and embedded use.

Sponsored by Mozilla and written in the new systems programming language Rust, the Servo project aims to achieve better parallelism, security, modularity, and performance.

So the longer-term plan for Firefox is to replace parts of its jet engine (Gecko) with shiny new parts from Servo, all while we continue to fly.  A pretty neat stunt, I think.


So far, I have to say, things have been going quite well.

But is catching up technically a good enough reason to switch, or switch back, to Firefox?  I mean, why switch to Firefox when my current browser may already be serving me well?  In a follow-up article, I will address this question by focusing on one of Mozilla’s 10 principles: online privacy.
