
Improving search quality, or the subjectivity of Yandex?

We have all heard plenty from Yandex about the constant improvement of search quality: the AGS-17 and AGS-30 algorithms, the fight against doorways, and so on. Many rejoice at the "cleaner search results"; many mourn their networks of junk sites dropping out of the index, sites that brought good profits in the bright past. But are these algorithms really that good? Do they make mistakes? How exactly does Yandex "clean up" the Runet?



I develop the "Litter" service: a network of sites offering a catalog of apartments for rent in 24 Russian cities. The goal of the project is to simplify the interaction between apartment owners and their potential customers. Owners post their rental offers with prices, photos, and so on; visitors can choose suitable housing from a variety of options and contact the owners directly. A project like this helps civilize a chaotic market: owners find their clients, and those looking to rent an apartment no longer have to pay commissions to countless agencies or buy a "pig in a poke."



The project was implemented on 24 domains (one domain per city). Each site lets you switch to another city (the links are right there), but to avoid problems with Yandex's draconian algorithms, the city-selection block was wrapped in a noindex tag.
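For illustration, here is a minimal sketch of what such a block might look like; the markup, class name, and domains are hypothetical, not the project's actual code. Yandex recognizes the non-standard noindex tag, and its comment form keeps the HTML valid:

```html
<!-- City-selection block, hidden from Yandex's text indexer. -->
<!--noindex-->
<div class="city-select">
  <a href="http://spb.example.ru/">Saint Petersburg</a>
  <a href="http://nsk.example.ru/">Novosibirsk</a>
  <!-- ... links to the remaining city domains ... -->
</div>
<!--/noindex-->
```

Note that noindex only excludes the enclosed text from indexing; Yandex can still follow the links inside unless they also carry rel="nofollow".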



However, problems could not be avoided.


It all started back in the fall, when five of the cities fell under the AGS-17 filter: only the main pages of those sites remained in the index. Upon discovering this, we wrote a letter to Yandex and received a boilerplate response:

Not all pages of a site known to Yandex search are included in the index and ranked highly. Their inclusion and position depend on the quality of the site and of its content from the users' point of view, as well as on many other factors. Among the criteria reflecting a site's quality is the placement of SEO links on its pages, which we consider a bad practice that harms search users.


It is worth noting that there are no SEO links on the site and never were. The reason for the filter is more or less obvious: 24 domains with identical designs could not slip past sharp-eyed Yandex. But what is "content from the users' point of view"? How do they determine that it does not suit users? Did they personally talk to each of them? Yandex.Metrika is not even installed on the sites.



But since the main page, which attracted a large share of visitors from Yandex, was still in the index, the filter did not significantly affect the project as a whole.



Yandex struck the next serious blow on February 15, 2010: six sites dropped out of the index completely. It did not look like a ban, since the sites could still be submitted through addurl without complaint. But Yandex's response was stunning:

After analyzing and classifying the pages of your site, our algorithms have decided not to include it in the search. Please note that not all sites known to Yandex search are included in the index and ranked highly. Their inclusion and position depend on the quality of the site and its content. The algorithm's decision can be influenced by the use of search spam, the presence on the site of pages intended for the indexing robot rather than for human readers, the placement of non-unique information, and other factors.


What rules did the site violate? Where are the spam, the content written for the indexing robot, and the other abominations supposedly posted on it? Yandex's response:

We do not enter into correspondence regarding the specific techniques used on a site or the addresses of poor-quality pages. Please read our recommendations carefully:

The recommendations were carefully studied, but, of course, the sites did not violate a single point of them. Another letter was written, in which I quoted and commented on the Yandex rules, asking them either to respond on the merits or to lift the sanctions. The answer:

If you develop your site for users and post unique and useful information on it, it is bound to appear in the search.


Wonderful! "You violated something - we won't say what; now go fix it - we won't say how."

The site is quite complete and convenient as it is: we often received positive feedback on our work, both from owners and from visitors. What other unique and useful information does the site need if it already fully meets all of its visitors' requirements?



Some statistics for one of the domains:

[screenshots: traffic statistics for one of the domains]



As you can see, dropping out of the Yandex index did some damage to traffic, but not so much that we had to padlock the door. The figures speak for the quality of the content: the average time on site is over 5 minutes, with almost 4 page views per visitor on average. For a service like this, these are good numbers; five minutes is enough to choose a suitable offer and make a decision.



Why is a service that runs on 24 domains necessarily poor quality? That is simply its form of organization, nothing more. Yandex apparently treated it as duplication and slapped on the "poor quality" label.

And how is a site's usefulness to users assessed? By a moderator's cursory glance at the resource? Does Yandex hire the world's best marketers and seat them in the search department and customer support?

Why should useful and reasonably high-quality resources fall under filters built to purge junk sites and doorways from the search results, merely because some extremely subjective parameters happen to match?



In my opinion, a search engine should make every resource created for people findable. Killing doorways and junk sites is one thing; throwing normal "white" sites out of the index just because the filters and "Plato" (the alias of Yandex support) deemed the resource "poor quality" and its content uninteresting is quite another.



UPD: The sites are back in the index =)

Source: https://habr.com/ru/post/87018/


