
Google's web metrics

As part of its “Let's Make the Web Faster” project, Google has published statistics on the size, number of resources, and other characteristics of pages on the World Wide Web. The statistics were collected from a sample of several billion web pages during the search engine's analysis and indexing process.
When processing these pages, the crawler considered not only a site's main HTML pages but also attempted to detect and process the other resources those pages reference: stylesheets, scripts, and images.
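The article does not describe Google's crawler internals, but the resource-detection step it mentions can be sketched with Python's standard-library `html.parser`: walk a page's tags and count the external stylesheets, scripts, and images it references. The class name and the sample markup below are illustrative, not from the original.

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Counts the external resources a page references, split into the
    three categories the article mentions: stylesheets, scripts, images."""

    def __init__(self):
        super().__init__()
        self.counts = {"stylesheets": 0, "scripts": 0, "images": 0}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "stylesheet":
            self.counts["stylesheets"] += 1
        elif tag == "script" and "src" in attrs:
            self.counts["scripts"] += 1
        elif tag == "img" and "src" in attrs:
            self.counts["images"] += 1

# Illustrative page markup, not taken from the study.
html = """<html><head>
<link rel="stylesheet" href="a.css">
<script src="app.js"></script>
</head><body><img src="logo.png"></body></html>"""

counter = ResourceCounter()
counter.feed(html)
print(counter.counts)  # {'stylesheets': 1, 'scripts': 1, 'images': 1}
```

A real crawler would also fetch each referenced URL to measure its size, but the tag-level detection above is the first step.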

The main parameters of the pages

[chart from the original post]

Main disadvantages

[chart from the original post]
During the analysis, popular sites were considered separately from the rest of the sample. It turned out that popular sites have, on average, fewer resources and GET requests per page than the rest, while they use more unique host names but fewer resources per host.
The average page of a top site was about 8 KB smaller when transferred over the network, yet more than 100 KB larger in uncompressed form, and it compressed noticeably worse. This is because resources on such sites were already well compressed to begin with.
Pages of top sites contain on average 2 fewer unique images than regular ones; their images, like their external scripts, are slightly smaller, while their stylesheets are about one and a half times larger than those of regular sites.
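The distinction above between transfer size (what crosses the network) and uncompressed size (what the browser parses) can be made concrete with Python's stdlib `gzip` module. The page body below is made up; real pages in the sample averaged well over 100 KB uncompressed.

```python
import gzip

# Hypothetical page body: repetitive markup, as real HTML tends to be.
page = b"<html><body>" + b"<p>hello world</p>" * 2000 + b"</body></html>"

# Transfer size is what a gzip-enabled server would actually send.
compressed = gzip.compress(page, compresslevel=9)

uncompressed_kb = len(page) / 1024
transfer_kb = len(compressed) / 1024
ratio = len(page) / len(compressed)
print(f"uncompressed: {uncompressed_kb:.1f} KB, "
      f"transfer: {transfer_kb:.1f} KB, ratio: {ratio:.1f}x")
```

Highly repetitive markup compresses very well, which is why a page that is larger uncompressed can still be smaller on the wire; top sites compress "worse" in ratio terms simply because their resources are already minified and optimized before gzip is applied.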

You can analyze the statistics yourself and learn more about the Let's Make the Web Faster project at . The site also offers recommendations for making your applications faster, the latest news on web performance, and an overview of tools that can help improve your site's performance.


Source: https://habr.com/ru/post/98955/

