Why not introduce randomness into search engine results? For example, right now, in some search engine, the key phrase “sale of air” returns sites site1, site2, site3, site4, site5, ...
They hold these positions (at least when queried from the same computer) in an order that does not change until the next index update. An element of chance could be introduced: instead of occupying 1st place 100% of the time, site1 would take 1st place with, say, 70% probability, 2nd with 20%, 3rd with 5%, and so on. Likewise, site2 would not be fixed in 2nd place but would occupy a range of positions, with the highest probability assigned to 2nd place, and so on down the list.
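Here is a minimal sketch of one way such randomized output could work, assuming weights roughly matching the 70% / 20% / 5% split above. The offset values, weights, and tie-breaking noise are illustrative assumptions, not a description of any real engine's algorithm.

```python
import random

# Illustrative assumption: each site may slip down 0-3 places,
# with probabilities roughly matching the 70/20/5 split in the text.
OFFSETS = [0, 1, 2, 3]
WEIGHTS = [0.70, 0.20, 0.05, 0.05]

def randomize_ranking(sites):
    """Return the sites reordered so each keeps its exact place only probabilistically."""
    keyed = []
    for rank, site in enumerate(sites):
        # Perturbed sort key: original rank plus a weighted random slip,
        # plus small noise so ties between neighbours resolve randomly.
        slip = random.choices(OFFSETS, weights=WEIGHTS, k=1)[0]
        keyed.append((rank + slip + random.uniform(0, 0.5), site))
    return [site for _, site in sorted(keyed)]

if __name__ == "__main__":
    serp = ["site1", "site2", "site3", "site4", "site5"]
    for _ in range(3):
        print(randomize_ranking(serp))
```

With a scheme like this, the strongest results still cluster near the top on average, but the exact position observed in any single query is no longer stable.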
Why is all this necessary, you ask, if it brings no additional benefit to the user? The fact is that at the moment most SEO optimizers monitor positions and draw many of their conclusions from them. Of course, a spread of ±3 positions will not introduce a significant error into an approximate estimate of a site's standing, but it will cause plenty of confusion among optimizers, who could no longer reliably tell a customer or their colleagues that the site sits at a fixed place.
Besides, positions are in any case determined by algorithms that cannot rank sites uniquely and reliably, so purely methodologically, showing dynamic output (with an element of randomness) would not be wrong.