
Plunging into performance



Hi, my name is Steve Souders and I work at Google on a team that "makes the internet faster." This is incredibly cool: there are more than 50 people on the team, and our job is to make the whole Internet faster, not just Google's services. I also help Airbnb, Twitter, and Facebook, and I recently sent Pinterest a detailed write-up on improving their performance. We want to make the global network faster. Mostly I write code, build various utilities, analyze what happens in web pages and browsers, and much more. Today I will talk about many of the things I have done.

This post is based on a talk by Steve Souders.


There is a lot of detailed material on my site. Here, let's dive into the world of performance.

Metrics


Let's start with metrics. In the context of performance, metrics really means speed: I want web pages and web browsers to be faster. So how do we measure that? If we don't track changes, we can't really understand the effect of what we're doing; we're just working blindly. Another thing I constantly repeat: always record a baseline while things are still bad. Be sure to do this before you start fixing and re-measuring, because everything you do should improve the result. People usually get so caught up in the product itself that they try to fix something and end up forgetting to track the numbers, and from an optimization standpoint they lose a very significant part of the picture. Many people are also afraid to show bad results to their boss. In that case I recommend not showing them right away: just record the baseline, make one or two improvements, and then show how much better it has become and what excellent work you have done. Then you can ask management for a bonus.

When it comes to performance we usually measure speed: how quickly certain things happen, usually anchored to the window's onload event. One way to get these measurements from real users is called RUM, real user monitoring: real people, real metrics. You simply put JavaScript on the page and track load times from actual users.
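As a rough sketch of what such a RUM snippet does, assuming the legacy `performance.timing` Navigation Timing object and a hypothetical `/beacon` endpoint (the calculation is written as a pure function so it works with any timing-shaped object):

```javascript
// Compute total page load time from a Navigation Timing-style object.
// Pure function, so it can be exercised with a plain mock object too.
function pageLoadTime(timing) {
  return timing.loadEventEnd - timing.navigationStart; // milliseconds
}

// In a browser you would wire this up after the load event, e.g.:
// window.addEventListener('load', function () {
//   setTimeout(function () {
//     var t = pageLoadTime(performance.timing);
//     new Image().src = '/beacon?load=' + t; // hypothetical endpoint
//   }, 0);
// });
```

The `setTimeout` matters: `loadEventEnd` is not populated until the load handler itself has finished.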

Google Analytics has had this capability for about a year. At first these measurements were collected only for sites that opted in with a checkbox, and after six months of testing the feature was enabled for everyone. Anyone who has Google Analytics on their site can open the reports, go to the "Behavior" section, and see site load speed data. By default the last month is shown, but you can also look at averages. For example, I am interested in server response time, because lately I keep arguing with my hosting provider about it. You can also see various graphs, peaks over the last month, and the response time of the HTML document. You can monitor a lot with Google Analytics. You can even skip the custom charts entirely and just look at the basic values.

You can also see the breakdown by country (everyone loves world maps with data). We immediately see in which countries our site loads slowly and in which it loads quickly, and how many samples went into each number. A lot depends, of course, on which country the server is in and which country the user is in. But sometimes the speed is inexplicably low.

So now you know about this view; it has long been available in your Analytics dashboard. You can also track custom timings in Google Analytics. For example, if you want to know how long an image takes to load on your site, you can have GA track its load time.
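With the classic analytics.js API this kind of custom measurement is done with "user timings." A sketch, where the image URL and the timing category are made up for illustration and the browser-only parts are guarded:

```javascript
// Milliseconds between two timestamps, rounded, since GA expects integers.
function elapsedMs(start, end) {
  return Math.round(end - start);
}

// Browser-only sketch: time how long an image takes to load and report it
// to Google Analytics as a user timing (category, variable, value).
if (typeof Image !== 'undefined' && typeof ga === 'function') {
  var start = Date.now();
  var img = new Image();
  img.onload = function () {
    ga('send', 'timing', 'images', 'hero-load', elapsedMs(start, Date.now()));
  };
  img.src = '/images/hero.jpg'; // hypothetical image URL
}
```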

Of course, Google Analytics is not a panacea, just one of the options; there are other tools. For example, mPulse from LogNormal, which was acquired by SOASTA, or Insight from Torbit. These companies specialize in data analysis and performance improvement, and there are now plenty of RUM products: some are built into browsers, some are third-party. On my website you can find the code for Episodes, a system I wrote a couple of years ago. It is open source, so you can download it and take a look.

Optimization


If I want to make a site faster, I first try to cut its load time in half. So start by measuring the total load time, then the time spent talking to the server. If the server is fine, move on to the site itself: optimize the database, check the JavaScript and CSS. Something in there is slowing the site down.

There are a number of tools that can help you analyze and optimize load time.

By the way, there is one subtlety: in Safari we cannot see the total load time or the server response time, and the latter is one of the most important numbers. Most often, server response time is about 10% of the total. If yours is much higher, that is where you should focus.
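That 10% rule of thumb is easy to check from Navigation Timing data. A small sketch (field names follow the legacy `performance.timing` object):

```javascript
// Fraction of total load time spent waiting for the first byte of the
// HTML response, from a Navigation Timing-style object.
function ttfbShare(timing) {
  var ttfb = timing.responseStart - timing.navigationStart;
  var total = timing.loadEventEnd - timing.navigationStart;
  return ttfb / total;
}

// Rule of thumb from the talk: if the server's share is well above ~10%,
// look at the back end before anything else.
function serverIsSuspect(timing) {
  return ttfbShare(timing) > 0.1;
}
```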

I love statistics from real users. With enough visitors I can check various parameters and find out why my site loads more slowly in one place than another. I love real data because it reflects the behavior of real people: you can see information about their hardware, connection, geographic location, and much more.

One more thing: what if the page never finishes loading after a click? Suppose a user is on a connection so slow that a page takes 60 seconds to load; in practice, people usually abandon the page after about 20 seconds. In any case, it is better to exclude the very worst load times. I set my own cutoff at 10 or 20 seconds and discard everything longer. That said, look at your overall statistics first: if most of your users load the page in 40 seconds, it would be silly to cut them off at 20. The approach is useful, but don't apply it thoughtlessly.
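The cutoff idea can be sketched as a small helper that discards samples above a threshold before averaging (the threshold and data here are illustrative):

```javascript
// Drop load-time samples above a cutoff (abandoned or pathological loads),
// then average the rest. The cutoff is in the same unit as the samples.
function trimmedMean(samples, cutoff) {
  var kept = samples.filter(function (t) { return t <= cutoff; });
  if (kept.length === 0) return null; // cutoff rejected everything
  var sum = kept.reduce(function (a, b) { return a + b; }, 0);
  return sum / kept.length;
}
```

A `null` result is the signal mentioned above: if the cutoff throws away most of your users, the cutoff is wrong, not the users.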

Synthetic tests


An alternative to measuring real users is synthetic testing. For example, you sit in a lab and use a tool that loads pages every minute, or you use a service like Keynote or Compuware. One of the best synthetic tools today is WebPagetest. It doesn't do everything, of course, but if I had to tell someone where to start, I would point them to this service. It is an open source project.

You can check your site right now; it's all quite simple. It shows the load time, the number of requests, how much the page weighs, and how long individual elements take to load. The waterfall table, with every resource and the time spent on each, is especially good. There is a lot of potentially useful information. It is only a synthetic test, but a very convenient one.


Optimization


One more thing I want to talk about. Open www.airbnb.com in Internet Explorer. What do we see? The site has about five scripts and three stylesheets in the document head, and those scripts are most likely used regardless of which page you are on. And what do you usually do with a head full of scripts, links, and stylesheets like that? Get them out of the way, right? If we do, some things start a little sooner, and that matters. Either way, we have a long page: two stylesheets, a script, a couple of images, another script, another couple of images, and so on. There is blocking here that needs to be eliminated, and then everything will run faster.

Now look at the same site in Chrome. The big image loads quite quickly, and the load order is different: HTML document, script, style, style, script, script, script, script, image. A completely different sequence. The document itself does not specify this order in any way; it is Chrome's "initiative." Since 2008 we have been working on making sites load faster through so-called speculative loading. Previously, things worked like this: once a script starts downloading, nothing else can happen. If there are three scripts in a row, the first one downloads and executes, then the second one downloads and executes, and so on. That is painfully slow. Here is how modern browsers do it: when the first script starts downloading, I cannot keep building the real DOM, because the script may change its state. But I can run a lightweight lookahead parse that only searches for a few specific tags, for example IMG, LINK, and IFRAME, and whenever I see one, I simply fire off that request.
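A toy illustration of the lookahead idea: a lightweight pass over raw HTML that only hunts for a few tags and collects their URLs. Real browsers use a speculative tokenizer rather than a regex; this is only to show the concept:

```javascript
// Toy "preload scanner": scan raw HTML for a handful of tags (img, script,
// iframe, link) and collect their URLs, so those downloads could be started
// while the real parser is blocked on an executing script.
function scanForPreloads(html) {
  var re = /<(?:img|script|iframe)[^>]*\bsrc=["']([^"']+)["']|<link[^>]*\bhref=["']([^"']+)["']/gi;
  var urls = [];
  var m;
  while ((m = re.exec(html)) !== null) {
    urls.push(m[1] || m[2]); // group 1: src=, group 2: link href=
  }
  return urls;
}
```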

This speculative lookahead parsing is now present in many modern browsers, so several things can load in parallel: look at all those scripts downloading at once, along with stylesheets and other files. That is all due to speculative parsing, and on average it has made sites load about 30% faster.

What really matters to the user is being able to interact with the page. Until the big image loads, a person is just looking at a funny little form. In IE that image is request number 6; in Chrome it is number 18. That is how Chrome works: it does not preload any images before the first paint event, and painting is blocked while scripts and stylesheets are downloading. Scripts and CSS are preloaded first, so Chrome ends up with render-blocking resources that in turn delay the start of image downloads. That is why this image starts loading so late. What we really ought to do, and I have already talked to the Chrome developers about it, is make the system smarter. This is new code, so we learn as we go; it is quite interesting and gives about a 3% speed improvement. We should detect really big images and key backgrounds and start loading them early.
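One way a page author can start a big hero image early, in the spirit of the idea above, is to kick off its download from a small inline script before the render-blocking resources get in the way. A sketch (the URL is hypothetical, and the image constructor is passed in so the helper also runs outside a browser):

```javascript
// Kick off downloads of large, render-critical images ahead of the
// render-blocking CSS/JS. The image factory is injected so the helper
// can be exercised with a stub outside a browser.
function preloadHeroImages(urls, createImage) {
  return urls.map(function (url) {
    var img = createImage();
    img.src = url; // in a browser, setting src starts the download
    return img;
  });
}

// In a page, this would go in an inline script near the top:
// preloadHeroImages(['/img/hero-large.jpg'], function () { return new Image(); });
```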

I wrote a useful tool called Cuzillion. It has a great slogan: "'cuz there are a zillion pages to check." Suppose I just need a test page, because I am tired of hand-building piles of test pages in PHP. On this page, a script takes a couple of seconds to load, and then the image takes another couple of seconds: the items load one after the other. They could load in parallel, but because of how the page is ordered, they do not. The total page load time ends up being, say, 4 seconds.

If we put the image above the script, the downloads overlap and the page gets faster. Even if we cannot move the image to the top, we can move the scripts down. We can also create scripts dynamically in JavaScript: if you go that route, the script no longer blocks whatever comes after it. Build the page that way and the downloads happen in parallel, so the total time becomes, say, 2 seconds instead of 4. Now suppose a page has one image that sets the tone for the entire page. What can you do with it? Insert it dynamically at the very top, before all the scripts, setting the IMG source to whatever is relevant for that page.
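The dynamic-script technique mentioned above usually looks something like this (the document object is a parameter only so the sketch can be exercised with a stub):

```javascript
// Load a script without blocking the parser: create a <script> element
// dynamically and insert it before the first existing script. Dynamically
// inserted scripts do not block, so other resources load in parallel.
function loadScriptAsync(src, doc) {
  var s = doc.createElement('script');
  s.src = src;
  s.async = true;
  var first = doc.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(s, first);
  return s;
}

// In a browser: loadScriptAsync('/js/app.js', document);  // hypothetical path
```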



I also want to tell you about HAR, the HTTP Archive file format, which is based on JSON. Today it is an industry standard. The main idea of a HAR file is to capture all the information in a waterfall chart and make it possible to re-create the chart from the file. Saved waterfalls can be emailed or published, and anyone can view them. Anyone building a performance analysis system can also take a HAR file and run their analysis on it. HAR is a very useful tool worth your attention, because it lets you record what is happening on a site, including the timing of the various requests.
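Since HAR is plain JSON, analyzing one is straightforward. A sketch that summarizes a HAR-shaped object (field names follow the HAR spec: `log.entries[].time` is the entry's total time in milliseconds, and each entry carries a `request.url`):

```javascript
// Summarize a HAR object: request count, total time across entries,
// and the URL of the slowest request.
function summarizeHar(har) {
  var entries = har.log.entries;
  var slowest = null;
  var total = 0;
  entries.forEach(function (e) {
    total += e.time;
    if (!slowest || e.time > slowest.time) slowest = e;
  });
  return {
    requests: entries.length,
    totalMs: total,
    slowestUrl: slowest ? slowest.request.url : null
  };
}
```

In practice you would `JSON.parse` an exported .har file and pass the result in.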



Another project is the HTTP Archive, http://httparchive.org/ . It launched about two years ago. It is run alongside the Internet Archive (archive.org), and we run a crawl every two weeks. We now take the top 300,000 URLs in the world, analyze them, and publish statistics. We even have a mobile version, though it is less extensive: we have only two iPhones, and phones are much slower than computers, so the mobile run covers only 5,000 URLs. You can see statistics, charts, and all of each page's resources. Two years ago we looked at only 10,000 pages. I hope this is useful for your work.

By the way, I will very briefly touch on mobile. It is all very simple: on a phone, sites are very slow. How long do they take to load? There was no way to find out, so I made LoadTimer.

Say you open Loadtimer.org on your phone. You can enter different URLs and click Start. It took me about an hour to build: it is just an iframe that loads your URL in itself, measures the time, and shows it to you. There is nothing complicated here, and this "browser" can load any page; you can try it on your phone right now. For now, that is probably the most interesting thing that can be done in the mobile web segment.
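The core of an iframe-based timer fits in a few lines. A sketch (the iframe and clock are parameters so it can run with stubs; in a real page you would pass the iframe element and `Date.now`):

```javascript
// The heart of an iframe-based load timer: record a start time, point the
// iframe at the URL, and report the elapsed time when its load event fires.
function timeIframeLoad(iframe, url, now, report) {
  var start = now();
  iframe.onload = function () {
    report(url, now() - start);
  };
  iframe.src = url; // in a browser this starts the navigation
}

// In a page: timeIframeLoad(document.getElementById('frame'),
//                           'http://example.com/', Date.now, showResult);
```

Note the caveat this implies: pages that use frame-busting code will not load in an iframe, and cached resources can make repeat runs look faster than a first visit.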


That's all. Thank you.

Source: https://habr.com/ru/post/239829/

