Until a computer learns to think like a person, it cannot tell a bad site from a good one the way a human would. And yet search engines have data collection and analysis methods in their arsenals that let silicon brains easily outperform flesh-and-blood experts.
Let us clarify right away: by a "good" site we mean one "worthy of a place in the search results for a specific query"; we will not dive into the thickets of web design aesthetics.
So, without going into details, search engines currently combine three approaches: ranking pages by authority (Google's PageRank algorithm is one example), behavioral factors (analysis of the actions of real visitors on real sites), and machine learning (an example is Yandex's MatrixNet, which trains ranking algorithms on sample assessments made by human assessors and, in effect, links and balances the first two approaches).
Authority-based ranking worked very well in the early stages of the Internet's development, but later the "too mathematical" nature of this approach let optimizers exploit weaknesses of the system discovered through experimentation. The quality of search results suffered; search engines introduced corrections, additional formulas and coefficients, filters and sanctions, but the truly major breakthrough came when it became possible to rank sites based on the preferences of their real, live visitors. Analysis of behavioral factors is more objective than any personal predilection (whether expert or amateur), since it works with the preferences of a large sample of the target audience.
How search engines collect data
1. Statistics systems (Google Analytics and Yandex.Metrica). Almost every site owner wants data on traffic and on everything the audience does. The best, and moreover free, tools for this are provided by the search engines themselves, but in return they receive huge amounts of data.
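To make the mechanics less abstract, here is a minimal sketch of what a statistics counter's "hit" boils down to. Everything here is hypothetical: the endpoint URL and parameter names are invented and do not reflect the actual protocols of Google Analytics or Yandex.Metrica, which embed a JavaScript tag in the page.

```python
import time
import urllib.parse
import urllib.request

# Hypothetical collection endpoint; real analytics systems use their own
# protocols and identifiers.
COLLECT_URL = "https://stats.example.com/collect"

def send_pageview(site_id: str, visitor_id: str, page: str, referrer: str = "") -> None:
    """Report a single page view to the (hypothetical) statistics system."""
    params = urllib.parse.urlencode({
        "sid": site_id,        # which site the hit belongs to
        "vid": visitor_id,     # anonymous visitor identifier (usually a cookie)
        "url": page,           # page being viewed
        "ref": referrer,       # where the visitor came from
        "ts": int(time.time()),
    })
    urllib.request.urlopen(f"{COLLECT_URL}?{params}", timeout=5)

# send_pageview("site-42", "visitor-abc", "/pricing", "https://www.google.com/search")
```

Multiply such hits by every page view on every site carrying the counter, and the scale of the data flowing back to the search engine becomes clear.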
2. Browsers. Internet Explorer works for Bing, Chrome for Google, and Yandex has its own product in this niche. True, Chrome, for example, lets you uncheck "Automatically send usage statistics and crash reports to Google", though the checkbox is buried so deep in the settings that only dyed-in-the-wool Habr users will dig that far, and even then we cannot guarantee that this deprives the corporation of access to the data it needs. In general, the data flow from proprietary browsers is quite substantial; it additionally covers the segment of sites that have no statistics system (or, more often, run a competitor's statistics system).
3. Browser add-ons. You can judge how badly search engines need traffic data from the aggressive promotion of Yandex.Bar. By turning any browser into a "proprietary" one, the add-on diligently sends browsing statistics to its home data center.
With the products listed above, search engines receive almost complete information about audience behavior on every indexed site. The next logical step is to rank higher those sites that, other things being equal, provoke a more positive reaction from visitors. There are subtleties here, of course: in some subject areas the time spent on a page is the main indicator of a positive assessment, while in others (for example, when the user only needs one glance at the page to perform the required action) it plays no special role. Somewhere viewing depth matters a great deal, but if a site consists of a single page, that is not always a reason to deny it high positions. This is where interpretation and segmentation of the data, as well as machine learning, come into play: if assessors consistently give high marks to quality one-page promo sites, the search engine will learn to exclude viewing depth from the list of important behavioral factors for similar resources.
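A toy sketch of that idea, with invented segments and weights (no real search engine publishes its formula; this only illustrates how the importance of a factor can depend on the site segment):

```python
# Toy illustration of segment-dependent behavioral scoring.
# Segment names and weights are made up for the example.
SEGMENT_WEIGHTS = {
    "content_site":   {"time_on_page": 0.4, "depth": 0.4, "low_bounce": 0.2},
    "one_page_promo": {"time_on_page": 0.5, "depth": 0.0, "low_bounce": 0.5},
}

def behavioral_score(segment: str, metrics: dict) -> float:
    """Combine normalized behavioral metrics (0..1, higher is better)
    with weights learned per site segment."""
    weights = SEGMENT_WEIGHTS[segment]
    return sum(weights[name] * metrics[name] for name in weights)

# A one-page promo site is not penalized for shallow viewing depth:
print(behavioral_score("one_page_promo",
                       {"time_on_page": 0.8, "depth": 0.0, "low_bounce": 0.7}))  # 0.75
```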
Key behavioral ranking factors
1. Bounce rate: the percentage of visitors who left the site after viewing only the entry page. For sites designed for several transitions to other pages (and that is most of them), it is a very good criterion of quality and relevance to the subject. A visitor leaves a site either because he found what he needed and did what he intended (and what the site owner wanted), or because he did not like the site or it was irrelevant to the search query. Try to reduce the bounce rate: increase relevance, improve design and UX, make landing pages clearer and more attractive, and so on. Of course, one hundred percent "assimilation" of the audience will never happen, but it is worth striving for. And not so much because search engines take behavioral factors into account, but because of conversion: the bounce rate is directly related to the site's ability to turn visitors into buyers.
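For illustration, a minimal sketch of the metric itself, assuming a session is simply a list of viewed pages (the format is invented for the example):

```python
def bounce_rate(sessions: list[list[str]]) -> float:
    """Share of sessions that viewed exactly one page.
    Each session is a list of page URLs in visit order."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return bounces / len(sessions)

sessions = [["/landing"], ["/landing", "/pricing", "/order"], ["/landing"]]
print(f"{bounce_rate(sessions):.0%}")  # 67%
```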
2. Time spent on the site. A good quality criterion in most cases, as long as the high figure is not achieved through confusing content and convoluted navigation. You can increase viewing time using the simplest everyday logic: give visitors what interests them and they will devote their time to studying it. That can be articles, photo galleries, videos, services like mortgage calculators (on-topic for the site, of course), and so on. None of these engagement techniques should harm conversion, so do not mechanically pile everything onto the page.
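A sketch of how average time on site could be estimated from page-view timestamps (the format is assumed for the example; real counters also need heuristics for how long the last page of a session was read):

```python
def average_time_on_site(sessions: list[list[float]]) -> float:
    """Average session duration in seconds.
    Each session is a sorted list of page-view timestamps (Unix time).
    Single-page sessions contribute 0: page views alone carry no signal
    about how long the only page was read."""
    durations = [views[-1] - views[0] for views in sessions if views]
    return sum(durations) / len(durations) if durations else 0.0

print(average_time_on_site([[0.0, 40.0, 90.0], [100.0]]))  # 45.0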
3. Depth of view. An important criterion for content projects. You can increase depth with thoughtful navigation and cross-references and, of course, interesting content. Many sites try to increase depth by splitting large articles into several parts on different pages, but this practice is justified only when visitors are highly motivated to read the whole article (it works well for reviews of computer components, but for the continuation of a "humanities" piece on how we should reorganize Rabkrin, many will refuse to click through).
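Viewing depth is just pages per session; a minimal sketch using the same assumed session format as above:

```python
def average_depth(sessions: list[list[str]]) -> float:
    """Average number of pages viewed per session."""
    if not sessions:
        return 0.0
    return sum(len(pages) for pages in sessions) / len(sessions)

print(average_depth([["/a"], ["/a", "/b", "/c"]]))  # 2.0
```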
4. Return to the search results. If a visitor comes back from the site to the search results, he did not find what he was looking for. This parameter can only be influenced by increasing the relevance of landing pages to queries and by maintaining a competitive price level for goods and services.
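On the search engine's side, this behavior (sometimes called "pogo-sticking") might be approximated as follows; the record format and the 30-second threshold are assumptions for illustration only:

```python
def return_to_search_rate(clicks: list[dict], threshold_s: float = 30.0) -> float:
    """Share of result clicks followed by a quick return to the results page.
    Each click record: {"dwell": seconds spent on the site before returning,
                        "returned": whether the user came back at all}."""
    if not clicks:
        return 0.0
    quick = sum(1 for c in clicks if c["returned"] and c["dwell"] < threshold_s)
    return quick / len(clicks)

clicks = [{"dwell": 8.0, "returned": True}, {"dwell": 300.0, "returned": False}]
print(return_to_search_rate(clicks))  # 0.5
```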
5. Returning to the site other than from search. If a visitor bookmarks the site or memorizes its address, that counts as a significant point in the resource's favor. Still, do not push visitors to bookmark the site intrusively; it has to be done subtly and with taste.
6. Mouse cursor movement and on-site navigation patterns. Statistics systems collect data not only on where a visitor clicked, but also on how he moved the cursor. This is needed to build "attention heat maps" and to weed out attempts to inflate behavioral factors with scripts. The patterns of live visitors are quite hard to emulate, which, incidentally, is why many sites that tried to fake user factors in the first months after their introduction quickly dropped in the rankings or were banned: the search engines noticed that the cursor was being controlled by programs, not living people. Given enough time and meticulousness, analyzing the heat map and recorded viewing sessions lets you identify and remove obstacles along the conversion paths.
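A naive illustration of why scripted cursors stand out: a program that moves the pointer in straight lines at constant speed leaves a trace with almost no variation, while human traces are jittery. The heuristic and threshold below are invented for the example and are certainly not any search engine's real detector:

```python
import statistics

def looks_scripted(trace: list[tuple[float, float]], tolerance: float = 1.0) -> bool:
    """Flag a cursor trace whose step lengths are suspiciously uniform.
    trace is a list of (x, y) cursor positions sampled at equal intervals."""
    if len(trace) < 3:
        return False
    steps = [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
             for (x1, y1), (x2, y2) in zip(trace, trace[1:])]
    return statistics.stdev(steps) < tolerance  # humans are rarely this steady

robot = [(i * 10.0, 0.0) for i in range(20)]                    # constant-speed line
human = [(i * 10.0 + (i % 3), (i * i) % 7) for i in range(20)]  # jittery path
print(looks_scripted(robot), looks_scripted(human))  # True False
```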
7. Snippet click-through rate (CTR). The more often users click your snippet (the short description with a link in the search results), the better the search engine thinks of the site. This is logical: if the snippet is relevant to the query and attracts users' attention, the site is more likely to answer the query well. There are ways to influence the snippet, and they deserve attention: sitelinks, a correct title, and good description text will help raise both traffic and positions.
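CTR itself is a trivial ratio; a one-line sketch for completeness:

```python
def snippet_ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by the times the snippet was shown."""
    return clicks / impressions if impressions else 0.0

print(f"{snippet_ctr(clicks=120, impressions=3000):.1%}")  # 4.0%
```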
8. Social network buttons. If installed buttons get clicked (it is better to use the native buttons of the social networks themselves rather than AddThis scripts), this not only grows the number of subscribers to your pages on those networks, but also serves as an important quality signal for search engines. Install the buttons as early as possible: every subscriber will be a meaningful plus.
Conclusions
Search engines analyze behavioral ranking factors objectively and thoroughly. Do not try to manipulate them directly (with scripts, purchases of untargeted traffic, etc.): that will bring sanctions rather than benefit. It is far more effective, and more important, to genuinely improve the site's quality, attractiveness, and conversion. Then users will behave the way you need, and all the indicators above, and with them the site's positions, will grow.