How do modern browsers handle their primary responsibility, displaying sites, without forcing users to wait unnecessarily?
I've noticed that pages very often take unreasonably long to display: the window title has already appeared, yet the page stays white. I look at the source code, and it's fully loaded; even the closing HTML tag is there.
I remember how, once upon a time, apologists for the alternative browsers blamed IE for waiting for the full page (tables and all) to load before starting to display it. But years have passed, and browsers have forgotten their revolutionary ideals. Did they ever have them? It seems they did. Or have sites "just" become more complex...
In principle, I don't really need an experiment to confirm this; it's obvious to me both on a good connection at work and especially on the 33.6 modem at home. But I wanted to see:
- how long browsers delay rendering,
- whether all of them do so,
- and whether they do it on all pages.
Besides, a formal experiment also helps work around browser quirks: some of them re-fetch even the source code (Ctrl+U) from the site. Even if they have some right to do so, it gets in the way of assessing how real the problem is.
The Experiment
So I carried out the following experiment: I hacked together a simple local proxy that beeps to tell me when the main document (index.html) comes back; that's the first stopwatch mark. Then, when the page starts to appear, I note the second time.
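The author's actual proxy isn't shown, so here is a minimal sketch of what such a beeping forward proxy could look like. Everything in it (the port, the `is_main_document` heuristic, plain-HTTP-only relaying) is my assumption, not the original tool:

```python
# Hypothetical sketch of a "beeping" local HTTP forward proxy.
# Point the browser's proxy setting at 127.0.0.1:8888; the terminal
# bell rings when the main document starts coming back.
import socket
import sys
import threading

def is_main_document(request_line: bytes) -> bool:
    """Crude heuristic: is the browser asking for the site's main page?"""
    parts = request_line.split(b" ")
    if len(parts) < 2:
        return False
    url = parts[1]                  # e.g. b"http://host/index.html"
    pieces = url.split(b"/", 3)     # [b"http:", b"", b"host", b"rest"]
    path = b"/" + pieces[3] if len(pieces) == 4 else b"/"
    return path in (b"/", b"/index.html")

def handle(client: socket.socket) -> None:
    request = client.recv(65536)
    if not request:
        client.close()
        return
    first_line = request.split(b"\r\n", 1)[0]
    host = first_line.split(b" ")[1].split(b"/")[2].decode()  # plain HTTP only
    with socket.create_connection((host, 80)) as upstream:
        upstream.sendall(request)
        beeped = False
        # Relay the response; beep once the main document starts arriving.
        while chunk := upstream.recv(65536):
            if is_main_document(first_line) and not beeped:
                sys.stdout.write("\a")   # bell = first stopwatch mark
                sys.stdout.flush()
                beeped = True
            client.sendall(chunk)
    client.close()

def serve(port: int = 8888) -> None:
    # One thread per connection, matching the "multi-threaded proxy" note below.
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

The second stopwatch mark (render start) still has to be taken by eye, as in the original setup.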
I took the first time, the load time of the bare HTML, as 100%. The render-start time, which is what interests me, was then counted relative to those 100%.
I used three sites from different categories:
- a site based on WordPress (38 subdocuments: 6 CSS + 15 img + 8 JS),
- an old article on Habr (83 subdocuments: 4 CSS + 59 img + 15 JS),
- a Wikipedia article (30 subdocuments: 8 CSS + 16 img + 6 JS).
Result: [chart comparing Opera, Safari, IE, K-Meleon, Chrome, and Fx (Firefox)]

Findings
- all the test pages were rendered much later than their source code became available;
- none of the browsers thought to start drawing without waiting for the full HTML to load (and 20-30-40 seconds is serious time); here the gap between "as is" and "as would be convenient" becomes even more significant;
- all of them rendered the basic HTML page in one gulp, with CSS already applied (all without images, of course). Were they waiting for CSS and scripts? In any case, I don't like it;
- there are no exceptions among the major browsers studied;
- among web pages, by my estimate, exceptions are also few.
Of course, browsers handle long plain texts on lib.ru just the way I want, but most everyday sites are not that simple, not at all. It turns out I generally visit "slow" sites; it would be nice, of course, to find out exactly how many of each kind there are on the internet.
Technical Details
Browser versions, all under Windows XP:
- Opera 9.6
- Safari 3.2.1
- IE6
- K-Meleon 1.1.5 Gecko / 20080406
- Chrome 1.0.154.36
- Firefox / 3.0.5 Gecko / 2008120122
- apologies to the rest.
I cleared the cache before each load. I ran each site once per browser; I think that's enough, and besides, it's tiring. The proxy is multi-threaded.
The experiments were performed on an extremely slow internet connection, without any artificial throttling :-). But I don't think this hurts the experiment; I'm not even giving absolute numbers here, and if a user agent couldn't cope with the task in 30 seconds, it won't cope in 3 (on the contrary, it helped me operate the stopwatch with minimal error). The error, by my calculations, is about 2.5% (0.5 s / 20 s).
UPD: I'll give the absolute data for the second experiment after all. Without it, the results don't look quite so apocalyptic (-:
browser    HTML loaded, s    render started, s
opera          23                  88
safari         33                 100
ie             27                 102
kmeleon        19                  72
chrome         46                 142
firefox        24                 123
That is, in the first row, the HTML finished downloading at second 23, and rendering began at second 88.
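Since the article expresses render start relative to the bare-HTML load time (taken as 100%), the relative figures follow directly from the absolute data above. A quick way to compute them:

```python
# Relative render-start times from the absolute data above,
# with the bare-HTML load time taken as 100%.
timings = {
    "opera":   (23,  88),
    "safari":  (33, 100),
    "ie":      (27, 102),
    "kmeleon": (19,  72),
    "chrome":  (46, 142),
    "firefox": (24, 123),
}

for browser, (html_s, render_s) in timings.items():
    relative = round(100 * render_s / html_s)
    print(f"{browser:8s} render started at {relative}% of HTML load time")
```

For example, Opera's 88 s render start against a 23 s HTML load works out to about 383%, i.e. rendering began almost four times later than the source became available.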