Let's face reality: While browser speed is still a theoretically valid criterion for testing browsers, the latest browser generations have improved web performance so drastically that some of these benchmarks hardly translate to real-world scenarios anymore. In fact, I'd argue that even IT pros don't realize how scores on these speed tests translate into their day-to-day browsing.
Amid all those very scientific benchmarks, we forget one factor: the user. The human eye is incapable of distinguishing the 221 ms load time of a JavaScript applet in IE9 from the 220 ms load time in Firefox 17. Do we even notice whether Chrome opens a website in 778 ms versus IE's 953 ms? Humans think in seconds, not milliseconds. Do we really notice that our Facebook timeline appears on screen a fraction of a second sooner in a particular browser? Would anyone ever care? Of course, we geeks love our milliseconds, but we can get lost in the pursuit of perfection, and this obsession has infected the entire industry: today, virtually every browser maker pushes out raw numbers in order to keep out-marketing the others.
And all users fall for it: from tech journalists to IT pros to beginners. Again, scripting and rendering speeds may still be valid criteria for web developers or in certain scenarios (I'm thinking browser-based automation), but we're reaching a point where the user simply can't perceive the differences. Let's take things a bit further: What really determines how fast (or slow) a website is displayed on your screen is much more than the browser. In fact, the browser is just one link in a chain of many technologies, applications, and devices that determine how fast ITworld or Facebook appears on your screen. Among those are:
I'm sure I forgot half a dozen other things that lurk between the web server and your eyes!
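If you want to put rough numbers on that chain yourself, here's a minimal sketch (my illustration, not part of the original argument) using the Navigation Timing API that browsers from roughly the IE9 generation onward expose via `performance.timing`. The stage labels are my own grouping; paste it into the developer console after a page has finished loading:

```typescript
// Minimal sketch: split a page load into stages using the Navigation Timing API.
// Most of these stages happen before the rendering engine ever gets involved.
const t = performance.timing;

const stages: Record<string, number> = {
  "DNS lookup":             t.domainLookupEnd - t.domainLookupStart,
  "TCP connect":            t.connectEnd - t.connectStart,
  "Server response":        t.responseStart - t.requestStart,   // network latency + server load
  "Content download":       t.responseEnd - t.responseStart,    // your line speed, proxies, CDNs
  "Browser DOM processing": t.domContentLoadedEventEnd - t.responseEnd, // roughly what speed benchmarks measure
  "Total until onload":     t.loadEventEnd - t.navigationStart,
};

// Print the breakdown in milliseconds.
console.table(stages);
```

Run it against any real site and the DNS, connection, and server stages will often dwarf the handful of milliseconds that separate one browser's engine from another's.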