I think most novice webmasters, like me, have faced the task of determining a site's position in search results for certain keywords.
The first sane solution I found was allpositions.ru, but for some reason its data differed from what customers were shown or what I saw myself when checking the results through a browser. The differences were usually minor (1-3 positions up or down), but they were always there.
A closer look at the ranking algorithms showed that a site's position in Google and Yandex results depends, besides the search engine's domain, on the user's browser language, location, IP, and probably even the phase of the moon. Accordingly, the position of a site for a given query may differ from user to user, and the only thing you can really determine is its average value.
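As a minimal sketch of what "average position" means here, the snippet below takes several snapshots of the same SERP (however you obtain them, e.g. via a parser run from different IPs or locations) and averages the rank at which the target domain appears. All names and URLs (mysite.example, competitor.example, and so on) are placeholders for illustration, not real data.

```python
from statistics import mean
from urllib.parse import urlparse


def position_in_serp(result_urls, target_domain):
    """Return the 1-based position of target_domain in an ordered
    list of result URLs, or None if it is not present."""
    for rank, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        if host == target_domain or host.endswith("." + target_domain):
            return rank
    return None


def average_position(serp_samples, target_domain):
    """Average the positions observed across several SERP snapshots
    (e.g. fetched with different IPs / locations / browser settings)."""
    positions = [
        p
        for p in (position_in_serp(sample, target_domain) for sample in serp_samples)
        if p is not None
    ]
    return mean(positions) if positions else None


# Example: three snapshots of the same query, observed at positions 2, 3 and 2.
samples = [
    ["https://competitor.example/", "https://mysite.example/page"],
    ["https://a.example/", "https://b.example/", "https://mysite.example/"],
    ["https://c.example/", "https://mysite.example/"],
]
print(average_position(samples, "mysite.example"))  # ~2.33
```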
If you need position statistics on search queries for one or two sites, the service solves the problem perfectly, but in my case collecting statistics for 100 domains turned out to be quite expensive (~$136).
The best option I've found is a-parser.com. It is also paid, plus you need to buy proxies every month (~$110) and pay for a droplet on DigitalOcean (~$20), but as a result, besides tracking the positions of my own sites and competitors, I solve many other SEO tasks with it. I consider it must-have software if services like Semrush, Ahrefs, Wordtracker and the like are too expensive for you.
In order not to violate Habr's rules, here is a link to the solution of the positions problem: a-parser.com/threads/2051, which can be tested in the demo: a-parser.com/pages/demo