
Web metrics from Google

by admin

As part of its "Let's Make the Web Faster" project, Google has released statistics about page size, number of resources and other page metrics across the World Wide Web. The statistics were gathered from a sample of several billion web pages as the search giant's indexing engine analyzed and indexed them.
When processing these pages, the crawler looked not only at the sites' main HTML pages but also tried to detect and process the other resources they reference: style sheets, scripts and images.

Main page parameters

  • Average web page size: 320 KB (when transmitted over the network)
  • Only about two-thirds of the data that could be compressed was actually compressed
  • 80% of all pages load 10 or more of their resources from a single host
  • The most popular sites could eliminate more than 8 HTTP requests per page by merging all scripts on the same host into one file and all style sheets on the same host into another (see the sketch after this list).

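As a rough illustration of the request-merging point above, here is a minimal sketch that concatenates all scripts in a directory into one bundle and all style sheets into another. The directory layout and file names are assumptions made for this example, not something taken from Google's data.

```python
from pathlib import Path

def concatenate(src_dir: str, pattern: str, out_file: str) -> None:
    """Merge every file matching `pattern` under `src_dir` into one file,
    so a page can load it with a single HTTP request."""
    parts = sorted(Path(src_dir).glob(pattern))
    with open(out_file, "w", encoding="utf-8") as out:
        for part in parts:
            out.write(f"/* --- {part.name} --- */\n")  # marker for each source file
            out.write(part.read_text(encoding="utf-8"))
            out.write("\n")

# Hypothetical layout: static/js/*.js and static/css/*.css
concatenate("static/js", "*.js", "static/bundle.js")
concatenate("static/css", "*.css", "static/bundle.css")
```

The page would then reference bundle.js and bundle.css instead of the individual files, which is exactly the kind of request reduction the statistic above describes.
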
Main limitations

  • All resources were fetched by Googlebot, so they may be subject to robots.txt restrictions. Some sites (the BBC, for example) block crawling of their CSS and JavaScript.
  • Some sites may serve a different set of resources to Googlebot than to regular users. For example, until recently Google's own servers served CSS and JS uncompressed to Googlebot but compressed for regular browsers (a check along these lines is sketched after this list).
  • The pages were processed with a WebKit-based renderer, so resources served only to Internet Explorer or Firefox are not visible in this data.
  • The selection of pages is not uniformly random or unbiased; for example, pages with higher PageRank were much more likely to be included in the sample.

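To make the compression caveat concrete, here is a small sketch that requests the same resource while advertising gzip support, once with a browser-like User-Agent and once with a crawler-like one, and prints the Content-Encoding the server reports. The URL and user-agent strings are placeholders for the example, not values from the report.

```python
import urllib.request

URL = "https://example.com/static/app.js"  # placeholder resource

def content_encoding(user_agent: str) -> str:
    """Fetch URL with the given User-Agent, advertising gzip support,
    and return the Content-Encoding the server actually used."""
    req = urllib.request.Request(URL, headers={
        "User-Agent": user_agent,
        "Accept-Encoding": "gzip",
    })
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("Content-Encoding", "identity")

# Compare what a browser-like client and a crawler-like client receive.
print("browser:", content_encoding("Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/60.0"))
print("crawler:", content_encoding("Googlebot/2.1 (+http://www.google.com/bot.html)"))
```

If the two runs report different encodings, the site is serving different variants to crawlers and to regular browsers, which is precisely the kind of skew this caveat warns about.
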
The analysis also compared popular sites with all other sites in the sample. It turned out that popular sites average fewer resources and GET requests per page than the rest, while using more unique host names but fewer resources per host.
The average page from a top site was about 8 KB smaller when transmitted over the network but about 100 KB larger in uncompressed form; in other words, it compressed much better, a consequence of the fact that resources on such sites are compressed more thoroughly to begin with.
Pages on top sites contain on average two fewer unique images than other pages, and their images and external scripts are slightly smaller, while their style sheets hold roughly one and a half times as many styles as those on ordinary sites.
You can explore the statistics yourself and get acquainted with the Let's Make the Web Faster project, where you will also find recommendations for making your applications faster, the latest web performance news, and various tools that can help improve your site's performance.
