SEO for web developers

Category: SEO

In this article we will discuss a question that matters to web developers and owners of online projects for whom search traffic is not an empty phrase but a genuinely important factor. We will show how to make a project friendlier to search engines, what to avoid, and how to lay a strong foundation for future high positions in the SERP.

First, we need to understand how search engines work: how they gather information, and which factors matter for high ranking in search results.

Below is a short outline of how a search engine works.

We can pick out the following important stages:
1. Search engine spiders follow links and collect information from the pages they find into a database.
2. The texts are analyzed and an index is built, based on various factors (title, text, backlinks, etc.).
3. The query a user types into the search box is analyzed, and a list of matching pages is shown in descending order of relevance.
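The three stages above can be sketched as a toy in-memory search engine. This is purely illustrative: the page texts are made up, and relevance here is just term frequency, whereas real engines combine many more factors (titles, links, freshness).

```python
# A toy sketch of the crawl -> index -> rank pipeline described above.
# Relevance here is just term frequency, purely for illustration.
from collections import Counter, defaultdict

pages = {  # pretend these texts were collected by a crawler (stage 1)
    "/toronto": "toronto travel guide toronto hotels",
    "/canada": "canada travel tips",
    "/blog": "our company news",
}

# Stage 2: build an inverted index, word -> {url: term count}
index = defaultdict(Counter)
for url, text in pages.items():
    for word in text.split():
        index[word][url] += 1

def search(query):
    """Stage 3: score pages by summed term frequency, best match first."""
    scores = Counter()
    for word in query.lower().split():
        scores.update(index[word])
    return [url for url, _ in scores.most_common()]

print(search("toronto travel"))  # the Toronto page matches both terms and ranks first
```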

Based on this process, the task of developing an online project competently (in terms of search engine optimization) can be divided into two major parts:
1. Ensuring the site is indexed as fully as possible, and writing unique text (the more unique text you write, the better).
2. Providing the best combination of ranking factors to achieve higher positions.

Let's examine the process in practice.

Site indexing

How do sites and pages get into an index? About 8–10 years ago you had to add your project to a crawl queue through a special submission form for each search engine to inform it of your site's existence.

Fortunately, you don't have to do that nowadays. A search spider learns about new sites and new pages from different sources:

1. Links from other pages

2. Information transferred from the Google Toolbar (when a user visits pages in a browser with the Google Toolbar installed)

3. Finding a URL address mentioned in plain text (that is, a bare URL not wrapped in <a rel="nofollow" href=""></a> tags)

Thus, to provide the fullest possible indexing of a newly created project, you should:

1. Build a clear internal linking structure (ideally with plain text links of the form <a href=""></a>; alternatively, you can use image links with a configurable alt attribute: <img alt="">).

2. Get backlinks.

It also makes sense to register your site in Google Webmaster Tools, create a sitemap in XML format, and submit it through your account (see the Google Webmaster Tools Help to learn how). This will make indexing your site easier.
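An XML sitemap of the kind mentioned above can be generated with nothing but the standard library. This is a minimal sketch following the sitemaps.org protocol; the URLs are made up for the example.

```python
# A minimal sketch of generating a sitemap.xml per the sitemaps.org
# protocol. The URLs listed here are invented for the example.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    "http://www.site.com/",
    "http://www.site.com/travel/Canada/Toronto",
])
print(xml)
```

A real sitemap would typically also carry optional elements such as `<lastmod>`, built the same way.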

The most frequent mistakes of web developers:
Flash. This mistake is less common now, but even today we come across projects that use Flash for menus or even for entire pages. Use this technology sparingly and carefully: try not to build hyperlinks or text in Flash, because search engines index such content poorly.

JavaScript / AJAX. These technologies are modern and convenient, but search engines cannot appreciate their convenience. Nevertheless, their use is quite acceptable and justified when done correctly. For example, Digg.com exposes about 9 million pages to Google, and not only the main article text is available for indexing, but many user comments as well.

Content duplication. Frequently the same page is available at several addresses. Internal site search results, sorting results (for example, when a product list can be sorted by several fields), and even administrative sections can also end up indexed. You can spend money and time promoting one version of a page, while the search engine indexes another and ranks it above your target page.

It is also desirable to set up a 301 redirect from "site.com" to "www.site.com" or vice versa (depending on which version search engines consider primary) to avoid having different versions of the same pages in the index.

Make sure that session IDs don't appear in URL addresses either.
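The two duplicate-content fixes above (one preferred host, no session IDs in URLs) amount to computing a canonical URL and 301-redirecting everything else to it. A sketch with the standard library; the parameter names are just common examples:

```python
# A sketch of URL canonicalization: force one host variant and strip
# session IDs from the query string before issuing a 301 redirect.
# The session parameter names below are just common examples.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

SESSION_PARAMS = {"phpsessid", "sid", "sessionid"}

def canonicalize(url, preferred_host="www.site.com"):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in SESSION_PARAMS]
    # Rebuild with the preferred host, cleaned query, and no fragment.
    return urlunsplit((parts.scheme, preferred_host, parts.path,
                       urlencode(query), ""))

src = "http://site.com/catalog?sort=price&PHPSESSID=abc123"
print(canonicalize(src))  # http://www.site.com/catalog?sort=price
```

In production this comparison would live in the web server or application middleware: if the requested URL differs from its canonical form, respond with a 301 to the canonical one.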

Code cleanup. Validate your code, close paired tags, and move CSS and JavaScript into separate files. If possible, place the semantically important parts of a page closer to its beginning.

Incorrect server codes. Once, while working on a client project, we found that the error page returned "200 OK" instead of "404 Not Found". As a result, the Google index was mercilessly flooded with garbage pages.
If possible, use friendly URLs (e.g. www.site.com/…/Canada/Toronto). Search engines index dynamic addresses normally these days, but don't forget about users who link to you from their blogs and sites by copy-pasting from the browser address bar. In that case you also get a link containing a keyword (we'll see shortly why that matters).
Ideally, each page of your project should have unique content, a unique friendly URL, and clean, correct code that is easy to index thanks to a competent internal link structure.
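Friendly URLs are usually produced by "slugifying" a page title, so the keywords end up in the address that users copy-paste when linking to you. A minimal sketch:

```python
# A sketch of building a friendly URL slug from a page title, so
# keywords end up in the copy-pasted link text.
import re

def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics -> one hyphen
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("Travel: Canada & Toronto"))  # travel-canada-toronto
```

A production version would also transliterate non-Latin characters, which this sketch simply replaces with hyphens.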

High page ranking

The next challenge is a little more complex than inclusion in the search engine index. You need to think the project through carefully to maximize the influence of ranking factors.

So, what influences ranking? All factors can be divided into two blocks:

1) Internal factors.

a. Domain age

b. Domain name relevance to a query

c. Relevance of the URL address and the document title to a query (travel/Canada/Toronto.php)

d. <title> tag

e. Presence of the desired words in the text, <h1>/<h2> headings, etc.

f. <img alt=""/> alt attribute

g. Anchor texts of internal links

What should we pay attention to while developing an online project?
1. Flexible management of URL addresses within the site – the ability to change them easily, perhaps even to create them manually.

2. Flexible management of title tags, with the option to set them manually.

3. Making as much text as possible available for indexing, including user comments. This increases the chances that the text will match users' low-frequency queries.

4. Preferring standard HTML tags for page markup (headings <h1>, <h2>, tags like <strong>, etc.).

5. The ability to configure alt attributes for images.
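Several of the points above (a title tag, standard headings, alt attributes) can be sanity-checked automatically by parsing a rendered page. A minimal sketch with the standard library's `html.parser`; the sample page is invented:

```python
# A minimal sketch: checking a rendered page for the on-page elements
# listed above (a <title>, an <h1>, alt attributes on images).
from html.parser import HTMLParser

class SeoCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = self.h1 = ""
        self._in = None                 # tag whose text we are collecting
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._in = tag
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1 += data

page = """<html><head><title>Toronto travel guide</title></head>
<body><h1>Toronto</h1><img src="cn-tower.jpg"></body></html>"""
check = SeoCheck()
check.feed(page)
print(check.title, "|", check.h1, "|", check.images_missing_alt)
```

Such a check fits naturally into a CMS save hook or a CI step, flagging pages with a missing title, a missing <h1>, or images without alt text.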

2) External factors (backlinks).

The more backlinks a page has, the more trustworthy it is considered, and the more trustworthy the referring site is, the more weight a link carries.
If a link to www.ebay.com has the anchor text "Online Auction", a search engine may decide that ebay.com is relevant to the query "Online Auction"; moreover, the more authoritative the linking site, the more trust the search engine gives that signal. Search engines take both internal links and backlinks into account when ranking pages.


Nowadays backlinks are the main page ranking factor, so pay attention to them first of all. Here is some advice:
1. Place the pages you are going to promote closer to the main page in the URL hierarchy (ideally, a user should reach them in a single click). The main page is the most authoritative (it carries the most weight for search engines), and pages lose weight with every click that separates them from the homepage.
2. The anchor text of internal links should contain keywords.
3. It's better when your site has pages with rich content (a blog, an articles section, etc.).

This will attract natural links, and it will also bring additional SERP visits to your articles and blog posts.
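The "one click from the homepage" advice can be checked mechanically: click depth is just the shortest path from the homepage over the internal link graph, found with a breadth-first search. A sketch on an invented graph:

```python
# A sketch of measuring click depth from the homepage over an internal
# link graph (advice 1 above). The site graph here is invented.
from collections import deque

links = {  # page -> pages it links to
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/widget"],
    "/blog/post-1": [],
    "/products/widget": ["/products/widget/specs"],
    "/products/widget/specs": [],
}

def click_depth(start="/"):
    """Breadth-first search: fewest clicks from the homepage to each page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:        # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

print(click_depth())  # '/products/widget/specs' ends up 3 clicks deep
```

Pages that come out deep in this map are candidates for an extra link from the homepage or a section page.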

Conclusion

Think about how users will search for you at the stage when you are shaping the site's concept and planning its architecture. If you address this at the very beginning of development, you will save a lot of time, money, and effort when promoting your site. The best option, of course, is to consult an expert while preparing the technical specification. But if that is not possible, just use the advice from this article.