Which websites do search engines dislike?

When we create a website, we hope it will be interesting to users, whatever their purpose. And if we want people to reach the new site, a search engine must first be able to find it. The position of a newly launched website in the search results, and hence the number of its visitors, depends largely on the search engine's verdict: whether it "likes" the site or not.

A search engine is just a robot: it builds a portrait of each site and evaluates it against a great number of factors. Every search engine ranks highly the resources that, in its view, were made for people, and openly dislikes sites built on the opposite principle. There are many reasons why a search engine may "dislike" a website, but if you deal with the main ones, your site's life will be much easier.

First of all, pay attention to the technical aspects that webmasters face when creating or launching a project.



Web Hosting

Start by choosing your hosting. Search engines really dislike sites that are unavailable: if a project is down, there is no point in showing it to users. One genuinely useful piece of advice: when choosing a hosting provider, pay attention to its reliability rather than its price.


Alternative content

The next problem you may face is elements that search engines cannot read. Flash, frames and JavaScript-rendered content are typically not indexed. It is not recommended to implement significant parts of the site with them: first, search engines cannot reliably see the text or other elements placed inside them; second, such elements noticeably slow down page loading, since they are far heavier than, say, an ordinary image. In short, relying on Flash, frames or JavaScript for important content is a negative factor in the eyes of search engines.
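When a block really does have to be rendered with JavaScript, a simple workaround is to duplicate the essential text in plain HTML. A minimal sketch (the element id and promo text are invented for illustration):

```html
<!-- Content injected by JavaScript; a crawler may never execute this script -->
<div id="promo"></div>
<script>
  document.getElementById('promo').textContent = 'Spring sale: 20% off all plans';
</script>

<!-- Plain-HTML fallback that stays readable without JavaScript -->
<noscript>
  <p>Spring sale: 20% off all plans</p>
</noscript>
```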


Engine (CMS)

Page loading speed also depends on your choice of CMS (content management system). Some "heavy" CMSs generate bloated page code: many scripts, loaded files and other elements are used to build each page. Search engines try to offer their users only the best sites, ones that load quickly and easily and do not force visitors to wait, so preference goes to exactly such sites. It is therefore better to choose a well-optimized CMS whose automatically generated code will not be an obstacle to indexing.

Now let's move on to the content of your pages. What should you pay attention to?



Content

Content is the main part of a site, and quality content is half the battle. If your website has unique content, it is on the right track. A search robot cannot judge the usefulness and informational value of your text on its first crawl, but it will certainly do so once links, likes and other citations start to appear.

From this, a simple conclusion: search engines dislike plain copy-paste. For such borrowed text you can get a drop in rankings or even a ban.

It is no secret that if you want a page to become relevant to a certain keyword, you have to use that query a certain number of times. Sometimes webmasters apply this factor so enthusiastically that the result is text stuffed with keywords. The principle "the more, the better" does not work here: such text will be considered over-optimized and will not be ranked by the search engine. So always watch the ratio between the length of the text and the number of keyword occurrences.

Search engines also treat attempts to hide text on a site as fraud. Content, and text in particular, must be placed openly: do not put it in hidden blocks, make sure the font color differs from the background color, and do not move the text outside the user's viewport.
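To make the point concrete, here are the typical hidden-text patterns search engines penalize, shown purely as examples of what to avoid (the styles and filler text are illustrative):

```html
<!-- Patterns search engines treat as hidden text — avoid all of these: -->
<div style="display: none;">keyword keyword keyword</div>            <!-- invisible block -->
<p style="color: #ffffff; background: #ffffff;">stuffed keywords</p> <!-- text matches the background -->
<p style="position: absolute; left: -9999px;">off-screen text</p>    <!-- moved outside the viewport -->
```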

It should also be mentioned that a search robot will not visit a site very often if its content has not been updated for a long time. The more often you update your content, the better search engines will treat you. By adding thematic articles or news, you signal that your project is developing and staying fresh.

Avoid empty pages: they carry no information for the user and will be excluded by search engines. While a project is in its infancy and not all content has been added yet, access to the unfinished pages should be closed to search robots via robots.txt; otherwise, as you fill the site with extra information later, problems with re-indexing these pages can appear.
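A robots.txt sketch of this advice, with hypothetical directory names standing in for the still-unfinished sections:

```text
# robots.txt — example only: keep unfinished sections out of the index
# (the directory names below are invented for illustration)
User-agent: *
Disallow: /new-section/
Disallow: /drafts/
```

Once the pages are filled with real content, remove the corresponding Disallow lines so robots can crawl them again.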


Meta Tags

Every page of the site should be unique; uniqueness is the sum of its content, meta tags, titles, code and other factors. Identical meta tags, such as title or description, across pages will have a negative impact on your project. Therefore, write unique meta tags for every page that reflect its main content.
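A brief illustration of what unique meta tags for one page might look like (the store name, title and description are invented for the example):

```html
<!-- Each page gets its own title and description reflecting that page's content -->
<head>
  <title>Men's running shoes — Example Store</title>
  <meta name="description" content="Browse 120+ models of men's running shoes with free delivery.">
</head>
```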

You can check whether any pages share the same meta tags in your search engine's webmaster panel.


URL of site pages

Page addresses can also become an obstacle for search robots when they index your content. Pages with dynamic URLs may rank poorly in the SERPs and, what is more, may not even get into the index at all.

If your CMS generates dynamic URLs automatically, it is worth replacing them with human-friendly (readable) URLs. One could argue here, since this factor is gradually losing importance, but there are several reasons why readable URLs are still more effective. First, they are indexed more quickly. Second, they are readable and let users navigate the site easily. And third, a relevant keyword in the URL is taken into account when ranking in the SERPs.
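As a minimal sketch of the idea, here is a small Python function that turns a page title into a human-friendly URL slug (the catalog path and title are invented for the example):

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn an arbitrary page title into a human-friendly URL slug."""
    # Fold accented characters to ASCII where possible (é -> e)
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    # Drop apostrophes so "Men's" becomes "mens", not "men-s"
    text = text.replace("'", "")
    # Lowercase and collapse every other run of non-alphanumerics into one hyphen
    text = re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")
    return text

# A dynamic address like /catalog/?cat=12&id=857 could instead be served as:
print("/catalog/" + slugify("Men's Running Shoes") + "/")  # /catalog/mens-running-shoes/
```

In practice the CMS or web server does this mapping for you; the sketch only shows what "readable" means here.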


Number of links on a page (external and internal)

A large number of external links on a page suggests the presence of spam on it. In this case, the page's weight flows away to external resources. In general, external links to partners' and friends' sites do not damage the site if the subjects are similar, or if the links point to a fairly authoritative resource such as Wikipedia. But if you link to dozens of unrelated sites, you can be sure the search robot will not approve.

Recently, traffic exchange via blocks of partner links to external resources, often loaded with the help of JavaScript, has become popular. Such a block will not be indexed, but it will still affect the page's weight negatively.

There is no single opinion on the acceptable number of external links per page; every search engine gives its own recommendations. The main thing is not to overdo it and not to link from every page to all the others: the more internal links a page contains, the less weight each of them passes on, and the longer the page takes to load.
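As a rough way to keep an eye on this, here is a small Python sketch that counts internal versus external links in a page's HTML using only the standard library (the host name and the sample fragment are made up):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Count internal vs external <a href> links in an HTML fragment."""

    def __init__(self, site_host: str):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # Relative links and links to our own host count as internal
        if not host or host == self.site_host:
            self.internal += 1
        else:
            self.external += 1

# Hypothetical page fragment
html = '<a href="/about">About</a> <a href="https://en.wikipedia.org/">Wiki</a>'
counter = LinkCounter("site.ua")
counter.feed(html)
print(counter.internal, counter.external)  # 1 1
```

Feeding it a full page's HTML gives a quick sanity check before the link count gets out of hand.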


Duplicate Content

Young projects quite often face the problem of duplicate pages, and sometimes even duplicate sites. Search engines punish such mistakes: you end up with two pages at different addresses but with the same content, and neither of them will rank well.
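One common remedy is a canonical tag that points all duplicates at a single preferred address. A sketch, with an invented catalog URL:

```html
<!-- Placed in the <head> of every duplicate version of the page,
     e.g. https://site.ua/catalog/?sort=price -->
<link rel="canonical" href="https://site.ua/catalog/">
```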

Duplication can be caused not only by the user but also by the content management system itself. The most common duplication types:

1)  The main page

There are plenty of situations where the main page is available at several addresses at once, for example:

  • site.ua
  • site.ua/index.html
  • site.ua/main.htm

2) The inner pages

Webmasters often overlook the fact that a page can be available at two or more addresses. The most common mistake is having both versions of a URL in the index, the one ending with "/" and the one without it:

  • site.ua/razdel/stranica/
  • site.ua/razdel/stranica

Such problems are solved by setting up a redirect to a single version of the page.
Pay attention to such details when creating the site in order to avoid penalties from search engines.
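A sketch of such a redirect for Apache's mod_rewrite, assuming the trailing-slash version is the one you want to keep:

```apacheconf
# .htaccess sketch: 301-redirect URLs without a trailing slash to the
# version with one, so only a single address stays in the index
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

The extra condition leaves real files (images, CSS, downloads) untouched.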


3) Creating mirrors of a site

A mirror is an exact copy of another site. Mirrors are most commonly used to provide multiple sources of the same information; large or popular files are often placed on several mirrors to speed up downloads and balance the load.

When building a site, webmasters quite often use this technique to host extra data, but at the same time forget to close the clone from search robots. The search engine then finds two similar sites and may throw the main one out of the index or lower its ranking, and recovery takes a lot of time and effort. So be careful with mirrors and avoid creating uncontrolled copies.
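If the mirror lives on a separate host name, such as the www variant of the domain, a common fix is a 301 redirect onto the main host. An Apache mod_rewrite sketch using the example domain from earlier:

```apacheconf
# .htaccess sketch: collapse the www mirror onto the main host with a 301,
# so search engines see one site instead of two copies
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.site\.ua$ [NC]
RewriteRule ^(.*)$ https://site.ua/$1 [R=301,L]
```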

What conclusion can we draw from all this? Play by the rules, meet the search engines' requirements, learn from the leaders, and search engines will love you. Any website that you create for people and that contains plenty of useful information will be treated well by search engines, even if a few mistakes slip into its optimization and development.



