Users do not like it when an online resource "slows down". Survey data suggests that 57% of users leave a web page if it takes longer than three seconds to load, while 47% are willing to wait only two seconds. A one-second delay can cost 7% of conversions and cause a 16% drop in user satisfaction.
Therefore, you need to prepare for traffic surges and increased load in advance, and today we will talk about how to do that.
Note: this material is aimed not at system administration gurus but at business site owners, so it is more of an overview.
1. Use caching
The more of your site's content you can cache, so that it does not have to be generated every time a user visits a page, the better. Usually, much of the content is static and simply does not need to be reloaded constantly. Caching it is especially important during traffic spikes: it not only speeds up the site but also saves money.
If you simply run a website on something like WordPress, caching plugins such as Cache Enabler or Cachify will work fine.
2. Process only useful traffic
According to research, almost 40% of all traffic on the modern Internet is generated by bots. Bots can be good, for example search engine crawlers, or bad. The latter include all kinds of scrapers that analyze and extract data.
In the corporate segment of the Internet the situation is even worse: there, the share of bad bot traffic can exceed 42%. For companies this is bad for two reasons. First, bad bots can be launched by competitors to steal content or collect important business data; second, bot traffic creates serious additional load on the infrastructure.
Filtering systems help solve this problem and reduce the load on the site, but to work correctly they must be calibrated individually for each site. To do this, bot traffic can be simulated. A convenient way to simulate such a load is to use services like Infatica, which rent out residential proxies. Many modern bots use residential IP addresses, so a high-quality test will require a large number of such addresses.
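As a toy illustration of what a filtering system does, here is a naive classifier based on the User-Agent header. The bot names are real crawler identifiers, but the rule set is an assumption for illustration only: production systems combine many signals (IP reputation, behavior, request rates), because User-Agent strings are trivial to forge.

```python
import re

# "Good" bots we want to let through (search engine crawlers).
GOOD_BOTS = re.compile(r"Googlebot|Bingbot|YandexBot", re.I)
# Patterns typical of simple scrapers and download tools.
BAD_PATTERNS = re.compile(r"python-requests|curl|scrapy|wget", re.I)

def classify_request(user_agent: str) -> str:
    if GOOD_BOTS.search(user_agent):
        return "good-bot"   # allow: search crawlers index the site
    if BAD_PATTERNS.search(user_agent):
        return "bad-bot"    # block or rate-limit: likely a scraper
    return "human"          # everything else passes by default
```

Running simulated bot traffic (for example, via residential proxies) against rules like these is exactly the kind of calibration the section describes: you check how many bad requests slip through and how many legitimate ones get blocked.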
3. Balance the load
Examine the available load balancing options. There are three types of such solutions: hardware, cloud, and software balancers. The hardware options will most likely prove too expensive for small companies or startup founders, so we will focus on the other two.
Among the popular cloud tools is Cloudflare, which is often used by companies experiencing problems due to traffic surges. Among the software options there is Neutrino, and serious load balancing capabilities are also built into the Nginx web server.
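The most common strategy these balancers apply is round-robin, which is also Nginx's default for an upstream group: requests are handed to backends in rotation so no single server takes the whole load. A minimal sketch of the idea (the backend addresses are illustrative):

```python
import itertools

class RoundRobinBalancer:
    """Hands out backends in a fixed rotation, one per request."""

    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        # Each call returns the next backend in the rotation.
        return next(self._cycle)

# Three hypothetical application servers behind the balancer.
lb = RoundRobinBalancer(["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"])
```

Real balancers layer more on top of this: health checks to skip dead backends, weights for servers of different capacity, and "least connections" policies for uneven request costs.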
4. Optimize content delivery
Another step that will be useful during bursts of network activity is using a CDN, or content delivery network. At its core, it is a set of servers around the world that deliver content to users along the most optimal route.
Typically, a site's content is hosted on one main server in a single location, so users requesting it from different places can get uneven response times, which they perceive as delay. The farther a user is from the server hosting the site, the longer they have to wait for a response.
A CDN also caches files on different servers and serves them to each user from the network node closest to them. This builds an individual delivery route for each user and seriously speeds up the system as a whole.
At this link you can find a whole list of CDNs that are suitable for use with websites.
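The "closest server" logic at the heart of a CDN can be sketched in a few lines. The edge locations and latency figures below are made-up illustrative numbers; real CDNs route users via DNS or anycast based on continuously measured network conditions.

```python
# Hypothetical latency (ms) from each edge node to each user location.
EDGE_LATENCY_MS = {
    "frankfurt": {"berlin": 9, "madrid": 38, "tokyo": 250},
    "singapore": {"berlin": 170, "madrid": 210, "tokyo": 70},
}

def nearest_edge(user_city: str) -> str:
    # Serve the user from the edge with the lowest latency to them.
    return min(EDGE_LATENCY_MS, key=lambda edge: EDGE_LATENCY_MS[edge][user_city])
```

A user in Tokyo would be routed to the Singapore edge rather than waiting on a round trip to Frankfurt, which is exactly the unevenness a single-origin setup suffers from.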
5. Use compression
File compression is another tool for speeding up site loading. Many high-load resources enable Gzip compression to reduce the size of site files before transferring them.
Gzip works like this: the tool searches a file for duplicate strings and replaces each repeat with a pointer to the previous occurrence. When the browser unpacks the resulting file, it follows the pointers and restores the "removed" content. This can reduce the total size of the files by up to 70%. Some hosting providers enable Gzip compression by default, but it is better to check this setting manually.
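You can see the effect yourself with Python's standard `gzip` module. The sample page below is synthetic (deliberately repetitive HTML), so the savings are on the high end; real savings depend on how repetitive your content is.

```python
import gzip

# A repetitive HTML fragment, the kind of content Gzip compresses well.
html = ("<div class='item'>Lorem ipsum dolor sit amet</div>\n" * 200).encode()

compressed = gzip.compress(html)
savings = 1 - len(compressed) / len(html)

# Round-trip check: decompression restores the exact original bytes.
assert gzip.decompress(compressed) == html
```

On text like this, `savings` comes out well above the 70% figure quoted for typical sites, precisely because repeated strings are what the algorithm exploits.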