
Assessment from an SEO standpoint: is the site ready for promotion?

Conditions of the evaluation


Companies that promote websites face the task of "evaluating a resource" every day. In practice this means finding out how great the risks are and whether there are any "pitfalls" associated with the upcoming optimization. The site may turn out to be fully ready for promotion, or it may require a serious investment of money and labor, without which the result will not be achieved. Most often the outcome of the evaluation is a list of recommendations for the site owners.
It is important to keep in mind that at the stage of first acquaintance with a site, the expert is usually very limited in information: there is no access to the CMS, to the server and its logs, or to traffic statistics. There is no way to know whether optimizers have worked on the site before, what exactly they did, or what promotion methods were used. Ideally the site owners should have all this data, but in reality the answer is most often in the spirit of the fairy tale: "the programmer knows, the programmer is in the IT department, the IT department is in the egg, the egg is in the duck, and the duck has flown off to another company."

A client who is not well versed in SEO (search engine optimization) often cannot see the manipulations performed on the site by its creators or by promotion contractors, up to the point where the client does not know that a link catalog has been placed on the site or, which is outright unacceptable, that links are being sold from it. The client can also make mistakes of his own when building the site; the usual one is borrowed content (texts and other materials taken from other sites "without asking"). Many site owners reason simply: the more information on the topic the resource offers users, the more popular it will become. Yet almost no one thinks about the legal side of stealing materials, nor about the technical side: the "pessimization" by search engines of sites with stolen content. The search engines themselves, regularly changing their ranking algorithms, make certain factors explicit that only time can overcome.

With these constraints in mind, let us build a classification of site characteristics that an optimizer is able to obtain and analyze with such input data.

The set of site characteristics


Domain name
It is no secret that promoting a website on a third-level domain requires great effort and expense. There is a dependence on the parent domain: the more authoritative it is, the easier it is to work with the subdomain. Working with third-level domains also has its own specifics, but their number is small compared to the mass of clients who choose the classic second-level name expected of a serious organization, so we will not dwell on such sites; in the end, the general principles apply to everyone.

First of all, check the whois data for the domain. Both Yandex and Google prefer "experienced" domains registered as early as possible: in the "opinion" of the search engines, such a site has had time to earn authority and gather feedback. At the moment it is rare to find a domain registered in 2007 in the Yandex TOP-10 for highly competitive topics; most of the top places are occupied by reputable "old-timers". The optimizer cannot influence this factor, so his task is to warn the client about possible difficulties in promotion and to offer the best option, both financially and ethically (you cannot promise what you cannot deliver). The possible proposals are: either low-frequency (LF) queries, through which the site can still receive its target audience, or an inflated link budget and enormous risk.
Presence in search engine indexes
It is always possible to see which pages of the site a given search engine already "knows". In Yandex, a query of the form url="domain*" lets you look through the indexed pages; the same can be done via webmaster.yandex.ru. If the site is not indexed although the domain has existed for several months, you need to find out why Yandex has still not discovered the resource. Perhaps development dragged on and the site was closed from indexing, or the domain was registered long ago but the site was finished only now. Alongside such harmless cases there are problems that can be identified immediately: the site may be banned (excluded by the search engine from its results) because of link spam on its pages or an unmoderated link catalog; it may be under the "you are last" filter (placed at the very bottom of the results for any query) because of stolen or duplicated content; because of hosting failures it may have dropped out of the search or "caught" the affiliate filter (applied by a search engine to several sites of one company in one subject area); or the search engine may have recognized it as a mirror (a site that completely duplicates the content) of another site.

If the site is not in the search at all, further analysis is pointless; you have to wait until the search engine indexes it, and give the client recommendations on how to speed up indexing: for example, add all the pages of the site via webmaster.yandex.ru, check that the server works correctly, and eliminate duplicate content. If the site is present in the index, then for further work you must check which main mirror the search engine has chosen: with the www prefix or without it.
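One of the "harmless" reasons above, a site accidentally left closed from indexing, can be checked in seconds by reading its robots.txt. A minimal sketch using Python's standard urllib.robotparser; the robots.txt content is inlined here as a hypothetical example rather than fetched from a live site:

```python
import urllib.robotparser

# Hypothetical robots.txt left over from the development stage.
robots_txt = """
User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# If even the front page is disallowed for all robots,
# the site is effectively closed from indexing.
if not parser.can_fetch("*", "/"):
    print("Site is closed from indexing in robots.txt")
```

In a real audit you would point the parser at http://domain/robots.txt and also check every major section of the site, not just the root.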
It often happens that the link budget is spent inefficiently because the question of transferring the weight of all links from one mirror to another is resolved incorrectly. Sometimes the site developers themselves do the wrong thing: in the robots.txt file, which contains directives for search engine robots, they declare the main mirror with the www prefix, while building all the site menus without it. It is also known that Yandex's mirror-gluing algorithm is not ideal; well-known specialists in the field have mentioned this repeatedly, and gluing "in the wrong direction" can cause significant difficulties. Check the navigation right away: if there is a sitemap, look through it. Be sure to check the robots.txt file and formulate the correct version once and for all, minimizing the chance of a search engine error.

When looking at the site in the search results, pay attention to the total number of its pages. Lately there has been a "disease" of sites showing only 10 or 50 pages; in part this can be explained by pruning (discarding completely irrelevant pages) at the stage of generating the results, but it is also common for a site to have far too many pages by mistake. The most frequent causes are duplicated "printer-friendly" pages and an incorrectly written .htaccess that lets robots reach the same page via several URL variants. Pay particular attention to pages with XSS injections: at the moment links from such pages cannot benefit the acceptors (the sites and pages being linked to), but they may well harm the donors. Also check, at least visually, whether the titles (the Title tag and meta tags) are the same across pages: identical titles make it significantly harder for search engines to rank the documents properly.
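To avoid the "wrong direction" gluing described above, the non-main mirror is usually redirected to the main one with a permanent 301 redirect. A common sketch for Apache's .htaccess, assuming mod_rewrite is available and that the bare domain (here the placeholder example.com) has been chosen as the main mirror:

```apache
RewriteEngine On
# Send the www mirror to the bare domain with a permanent (301) redirect,
# so all link weight accumulates on a single main mirror.
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

The chosen main mirror should then be used consistently everywhere: in the site menus, in the sitemap, and in robots.txt.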
You should immediately recommend that the client contact the site's developers to make changes and create unique meta tags with the right keywords on every page of the site.
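The search for identical titles mentioned above is easy to automate. A small sketch that groups pages by the text of their Title tag; the pages are hardcoded hypothetical examples here, while in a real audit they would be fetched from the site:

```python
from collections import defaultdict
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text of the <title> tag of one page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical pages of the audited site.
pages = {
    "/": "<html><head><title>Acme Ltd</title></head></html>",
    "/about": "<html><head><title>Acme Ltd</title></head></html>",
    "/contacts": "<html><head><title>Contacts - Acme Ltd</title></head></html>",
}

by_title = defaultdict(list)
for url, html in pages.items():
    extractor = TitleExtractor()
    extractor.feed(html)
    by_title[extractor.title.strip()].append(url)

# Pages sharing one title are the ones that need unique, keyword-rich titles.
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)
```

Running such a script over the indexed URL list from the search engine gives the developers a concrete to-do list instead of a vague "fix the titles".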
Content of the site pages
The content factor is very important for successful promotion, which is why you must determine whether the content is borrowed from somewhere or is the client's own work. Finding content borrowed from other resources comes first; recommendations on blocking the indexing of printer-friendly pages can be given to the client later, that is less important. If duplicated content is found, be sure to check once more whether the "you are last" filter has been applied to the site. It is also necessary to establish carefully who the original source is: if your ward is so popular that it is being republished, notify the client; he should demand that republications carry a link back to the source, or get his content removed from the other sites.

It also happens that there is no text content at all: only a Flash version of the site is available, or the whole site is an Ajax interface, or, a rarity nowadays, everything is built on frames. Here, of course, you should let the client know right away that text is currently the easiest thing to promote in search engines, and if he does not want to wait three years for results, the site must be changed: full-fledged text pages must be generated on the server for both users and search engines.

There are cases when the site has no real front page at all, and a 301 or 302 redirect to another page, or a JavaScript redirect, is used instead. This is completely unacceptable. The defect, usually a consequence of the CMS (content management system) in use, must be corrected immediately. The front page is the face of the whole site, and if someone still believes that the more it spins and flashes the "cooler" it is, so much the worse. The front page should be a readable, polished, convenient face of the site, capable of characterizing the company by itself; in some cases the "About Us" section will be visited very rarely.
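Borrowed or duplicated text can be detected roughly by comparing word shingles (overlapping word n-grams) of two documents; a high overlap suggests a common source. A minimal sketch of the idea, with two made-up text fragments:

```python
def shingles(text, size=3):
    """Return the set of overlapping word n-grams (shingles) of a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b, size=3):
    """Jaccard similarity of shingle sets: 1.0 = identical, 0.0 = no overlap."""
    sa, sb = shingles(a, size), shingles(b, size)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical texts: the original and a lightly edited copy.
original = "our company produces steel doors of any size to order"
suspect = "our company produces steel doors of any size for your home"

score = similarity(original, suspect)
print(round(score, 2))  # a score well above zero warrants a manual comparison
```

This is only a first filter: it flags candidate pairs for a human to inspect, and it says nothing about which side is the original source.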
As a legacy of ten years ago, the minds of some unlucky developers still produce front pages of the "Enter" type, where the visitor is invited to click a single link or button: "Enter". This greatly hinders search engine optimization and annoys most of the Internet audience.
Navigation
If every page of the site contains links to ALL other sections, that is bad: not only does the search robot constantly stumble over the same code, you also make the pages heavier. The text menu should contain the main sections of the site, from which any user can reach the internal subsections (that is a usability question), but a competent presentation of this element also matters a great deal to the search engine. Do not build a menu of 5,000 links; it does not help the search.
Page Code
Keep in mind that the "lighter" a page is, the faster a search engine will pick it up. If the pages of the site weigh several megabytes, not every search engine will manage to fetch them in one pass, and part of the page will not be indexed. You can often see sites with beautiful Flash animation that "weighs" quite a lot; you should warn the client at once that the Flash may need to be removed, at least from the front page being promoted. It does not improve the ranking of documents and causes plenty of problems for users with weak computers and slow connections.

Next, look at how the code is written. Whether the layout is table-based or block-based is not of fundamental importance, although block layout is preferable. Often extra functionality spoils the HTML so badly that it is hard to make sense of at all. For example, the site uses Ajax somewhere inside a form that is closed from indexing, but why load that library on the front page, where it is not used at all? There are also old-style layouts where styles and JavaScript are written directly in the head. Firstly, the search engine "strains" under this: it has to download more. Secondly, visitors suffer: on every page they have to download a pile of identical code, whereas a simple include would be cached by the user's browser, reduce the load on the server, shorten the wait for the user, and leave the search engine only the information it needs. It is now very popular to use a CMS with WYSIWYG editors, almost like Google Docs for a site, but almost nobody checks what code such an editor inserts into the page. Usually it inserts a heap of unnecessary font style="" or span style="" tags, which not only complicate work with the code but also make the page heavier.
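A crude check for both problems, page weight and styles/scripts embedded in the head instead of cacheable external files, can be scripted. A sketch over a hardcoded sample page (in practice the HTML would be downloaded from the site); the regex approach is deliberately simple and only flags candidates for review:

```python
import re

# Hypothetical page with inline style and script blocks in the head.
html = """<html><head>
<style>body { font: 14px Arial; } /* hundreds of lines in real life */</style>
<script>function init() { /* repeated on every page */ }</script>
</head><body>Hello</body></html>"""

size_kb = len(html.encode("utf-8")) / 1024

# Count <style> blocks and <script> blocks without a src= attribute:
# both are downloaded anew with every page instead of being cached.
inline_styles = len(re.findall(r"<style\b", html, re.I))
inline_scripts = len(re.findall(r"<script\b(?![^>]*\bsrc=)", html, re.I))

print(f"page weight: {size_kb:.2f} KB")
print(f"inline <style> blocks: {inline_styles}, inline <script> blocks: {inline_scripts}")
```

Pages that are both heavy and full of inline code are the first candidates for moving styles and scripts into external .css/.js includes.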
Do not forget about hidden code: sometimes developers wrap half a page in a "comment" tag "for later", and all of it remains in the final version. Remember also the display:none property, which can play a cruel joke: such a block may contain not only hidden "spam" text but also links to external sites! You should immediately check the Title tag and the other meta tags. They are very easy to check and change through the CMS, but at this stage that is impossible for the reasons given at the beginning of the article, so at least a portion should be reviewed manually. There are services that provide comprehensive information on validation and code optimization, and they should be used.
Has the optimization work been done before?
To find out, look at indirect indicators: the PR and CY of the resource and the number of external links, for example via Yahoo, together with the quality of the links pointing to the site. Such a check often makes clear how "real" the site is. A large number of natural links from reputable resources will help greatly in further work, while a large number of spam links can spoil the site's reputation and lead to difficulties in promotion. MSN services (although they are sometimes inaccurate) make it possible to look at the outgoing links from the domain. It is important to check this, because if the site hosts a link catalog, or links are being sold through brokers or by hand, this can significantly affect the site's visibility in search engines. The presence of links from blogs is another indirect indicator of the site's popularity. The site's presence in Yandex.Catalog, in DMOZ or, say, in the Yellow Pages directory shows that it has already passed moderation and is of interest to users in its subject area.

Conclusion


This article does not list every attribute by which a site can be characterized; we have tried to focus on the most important ones. Save your budgets, do not mislead the client, and try to do your job perfectly! P.S. Since we have been talking about quantifiable characteristics, let us try to put everything into a chart.



The example of quantitative scoring presented above lets you get acquainted with a site and sum up the degree of risk or difficulty of promoting the resource. If the site scores the maximum of 50 points, there should be no difficulties with it, and its promotion will be stable. If the site scores 0 points or fewer, the risk of working on the project is high, and as many of the shortcomings as possible must be eliminated before taking up its promotion (for example, if the hosting cannot sustain the load, there is no point in bringing more visitors to the site; change the host first).
Sergey Karpovich, Search Engineer, Internet Agency Matik


Source: https://habr.com/ru/post/38673/

