Friday, December 21, 2012

SEO Basics


1. Introduction
One of the buzzwords of the last decade in internet marketing is SEO. Everyone talks about SEO, and everyone tries to apply it with more or less success. If you are experienced in this field you may skip the first chapters; otherwise, read along to learn the very basics.
1.1. What is SEO
The term "SEO" is the abbreviation of "Search Engine Optimization". This is not about optimizing search engines, though. It is about optimizing websites for search engines. But why does one need to optimize a website? To answer this question we need to understand what a search engine is.
Search engines as a way to find information on the web appeared in the mid-1990s. They crawled websites and indexed them in their own databases, marking each as containing certain keywords in its content. Thus, when someone typed a query into the search box of a search engine, it quickly searched its database and found which indexed pages corresponded to that query.
So, the more keywords of a query a website had, the higher it was shown in the search results. We don't know who was the first person to realize that he could make some changes to the pages of his website to make it rank higher, but he was truly a gem!
So, SEO is something that helps your site rank better in search engines. There are a number of SEO methods: some of them are legitimate, while others are restricted and considered "blackhat" techniques. Search engines don't like blackhat SEO, and the effect of using it may be disastrous for your website. Anyway, we'll cover this material thoroughly later in this SEO FAQ.
1.2. Do I need SEO?
Well, "Yes" is the first answer that comes to mind, isn't it? But let's think a bit more. Does SEO help, say, an oil-extracting company sell its product? Does it help promote a small local grocery in your neighbourhood? Does it help Obama rule his bureaucrats? Well, I guess you've got the idea. SEO is effective mostly for Internet businesses. Do you have one? Then you need SEO. Otherwise, SEO is only one of the possible channels to spread the word about your product or service. And not necessarily the best one.
1.3. Should I hire someone or make it all myself?
One of the most frequent unspoken questions is: should I hire an SEO professional or save a few bucks and do it all myself? There is no universal answer for all situations, so here are some pros and cons:


Hired SEO

Pros:
- You don't have to waste your time;
- You don't have to learn SEO yourself;
- SEO pros can be quite effective.

Cons:
- You still need to supervise the hired SEO yourself;
- SEOs usually don't give any guarantees, so you must be very cautious when choosing one to hire;
- The hired specialist may be an SEO professional, but not necessarily a professional in your field;
- Finally, you have to pay this person.

Do it yourself

Pros:
- If you want it done right, do it yourself. You are the one running the whole show, so you know best what is right and what is wrong about it;
- You really do save some bucks;
- You can constantly monitor the trends and change your SEO strategy on the fly.

Cons:
- You will need to spend some time reading SEO FAQs and tutorials like this one, posting newbie questions on forums and doing the other things beginners always do. It won't kill you, but it still takes time;
- You may get very little SEO benefit for all your effort and time spent. After all, you are not a guru, right?

2. Basic concepts
2.1. Search engines
Before we start talking about search engine optimization we need to understand how search engines work. Basically, each search engine consists of 3 parts:
The Crawler (or the spider). This part of a search engine is a simple robot that downloads the pages of a website and crawls them for links. Then it opens and downloads each of those links to crawl (spider) them too. The crawler visits websites periodically to find changes in their content and modify their rankings accordingly. Depending on the quality of a website and the frequency of its content updates, this may happen anywhere from, say, once a month up to several times a day for high-popularity news sites.

The crawler does not rank websites itself. Instead, it simply passes all crawled websites to another search engine module called the indexer.
The Indexer. This module stores all the pages crawled by the spider in a large database called the index. Think of it as the index in a paper book: you find a word and see which pages mention it. The index is not static; it updates every time the crawler finds a new page or re-crawls one already present in the index. Since the volume of the index is very large, it often takes time to commit all the changes to the database. So one may say that a website has been crawled, but not yet indexed.

Once the website with all its content is added to the index, the third part of the search engine begins to work.
The ranker (or search engine software). This part interacts with the user and asks for a search query. Then it sifts through millions of indexed pages and finds all of those that are relevant to that query. The results are sorted by relevance and finally shown to the user.
2.2. Terminology
Here are the basic terms you need to know. All others will be explained along the way.

Anchor text
This is simply the text of a link. Let's suppose you have a link like this:

<a href="seo-faq-tutorial.htm">The essentials of SEO - a complete guide</a>
The link would look as follows:
The essentials of SEO - a complete guide

The text "The essentials of SEO - a complete guide" is the anchor text in this case. The anchor text is a key parameter in a link building strategy. You should always make sure that the anchor text of a link matches the theme of the linked page. If your page is about dogs, do not link to it with the "cats" anchor text. Obviously, you cannot control each and every link on the web, but at the very least you should make sure all links within your own website have appropriate anchor text.

Inbound link
...or backlink is a link that points to your site. The more you have, the better. There are many exceptions to this rule, though, so read the Off-Page optimization section to learn more.

Keyword
One or more words describing the theme of a website or page. Strictly speaking, we should distinguish keyWORDS from keyPHRASES, but in SEO practice they are all called keywords. For instance, the keywords for this page are: SEO FAQ, SEO tutorial, etc.

Short-tail and long-tail keywords
An easy one. Short-tail keywords are general, common words and phrases like "rent a car", "seo", "buy a toy", "personal loan" and so on. Long-tail keywords, on the contrary, describe a theme precisely: "rent bmw new york", "seo in florida", "buy a plush teddy bear" etc. The more precise a keyword is, the less popular it is, and the fewer people type that exact query into the search box. But! The other side of the coin is: since each such query is highly targeted, once a visitor comes to your website from a search engine query and finds what he is looking for, it is very likely that he will soon become a customer. This part is very important! Long-tail queries are not very popular, but the conversion rate for such queries is much, much greater than for short-tail ones.

SERPs
You may have heard this term without understanding what it is. SERP means "Search Engine Result Page". When a user types a query and hits Enter, he is taken to a SERP. Then he can click one of the results to open that website. Obviously, the results shown in the first positions get many more visitors than the ones on page #2-3 and lower. This, actually, is the purpose of SEO: to make a website move higher in the SERPs.

Snippet
This is the short description shown by a search engine in the SERP listings. The snippet is often taken from the Meta Description tag, or it can be created by the search engine automatically based on the content of a page.

Landing page
A landing page is the page that opens when a visitor comes to your site by clicking a result in a SERP. For example, take the query "google monitor": the page www.cleverstat.com/en/google-monitor-query.htm (which promotes the Free Monitor for Google tool) is the landing page for that query.

Link juice
This funny term means the value that passes from one page to another by means of a link between them. To be precise: the linked page (acceptor) gets link juice from the linking page (donor). The more link juice flows into a page, the higher it is ranked. Let's imagine a page that is worth $10 - this is the value of that page. If that page has 2 links, each one then carries $5 - that is the amount of link juice passed to each linked page. If the page has 5 links, then each one passes only $2 of the initial link juice. Here is a simple picture to illustrate this concept:
[Image: link juice explanation - a $10 page with 2 links, each link passes $5 of value]

[Image: link juice explanation - the same page with 5 links, each link passes only $2 of value]


This means the more links Page A has, the less value each linked Page B gains from it. Obviously, the real link juice value is not measured in dollars.
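The arithmetic above can be sketched in a couple of lines of Python (the dollar values are just an illustration, as noted; real link juice is computed differently):

```python
def link_juice_per_link(page_value, outbound_links):
    """Toy model: each outbound link passes an equal share of the page's value."""
    if outbound_links == 0:
        return 0.0  # nothing is passed if the page has no links
    return page_value / outbound_links

print(link_juice_per_link(10, 2))  # 5.0 -- a $10 page with 2 links
print(link_juice_per_link(10, 5))  # 2.0 -- the same page with 5 links
```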

Nofollow links
A nofollow link is a link that a search engine should not follow. To make a link nofollow you add the rel attribute as shown below:

<a href="somepage.html" rel="nofollow">Some anchor text</a>

Google does not follow nofollow links and does not transfer link juice across such links. You can read more about nofollow links here.

Link popularity
This term designates the number of inbound links pointing to a site. Popular sites have more links. However, the number of inbound links is only half of the pie. Read the off-page optimization section below to learn more.

Keyword stuffing
When you put a long list of keywords in a tag, this is keyword stuffing. For instance, a title tag for this page could look like: <TITLE>SEO guide, SEO FAQ, SEO tutorial, best seo faq, seo techniques, seo strategy guide</TITLE> and so on. That would be keyword stuffing. Instead, the current title of this page (the one you're reading now) looks quite natural and adequately describes its contents. Do not use keyword stuffing, as a) it does not work; b) it is a bad practice that can hurt your rankings.

Robots.txt
robots.txt is a file intended to tell search engine spiders whether or not they are allowed to crawl the content of the site. It is a simple txt file placed in the root folder of your website. Here are some examples:

This one blocks the entire site for GoogleBot:
User-agent: Googlebot
Disallow: /

This one blocks all files within a single folder except myfile.html for all crawlers:
User-agent: *
Disallow: /folder1/
Allow: /folder1/myfile.html
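Python's standard library includes a robots.txt parser, which is handy for sanity-checking your rules locally (the domain and paths below are made up). Note that Python's parser applies rules in first-match order, while Google uses longest-match precedence, so results for mixed Allow/Disallow rules may differ:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks /private/ for all crawlers
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/page.html"))   # True
```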


3. Ranking factors
In general, there are only two groups of them: on-page and off-page ranking factors. It has long been argued which group is more important; we'll answer that question later in this FAQ. For now, you should understand that both are crucial and both need proper attention.
3.1. On-Page ranking factors
There are many on-page ranking factors, and even more have been talked about since the first days of SEO. Some of them are really important, while others are said to be crucial for SEO but are actually useless or even hurt your rankings. You know, search engines are evolving; they change their algos, and something that used to work in 2003 has now become a piece of useless garbage. So, here is the list of on-page ranking factors sorted by their importance and SEO value.
3.1.1. Important stuff
Title
This seems to be one of the most important on-page factors. You should pay close attention to the title tag. Here are some tips on writing a good title:
a) Keep it precise and short enough. There is a popular myth saying the title tag must be short because Google (and others too) does not read it beyond the first 60-70 characters. That's not true (the proof link). Google will read nearly everything you offer it in your title tag, but the weight of each keyword in a long title will be much less. It seems that only the first 10-12 words get the benefit of being in the title, so keep it short. Also, a long and spammy title tag is hard for human visitors to read.
b) Do not stuff it with keywords, instead write in a normal human-oriented style. Instead of "Big gadgets, small gadgets, cheap gadgets, gadgets for sale" use more natural "Cheap gadgets of all sizes for sale". Hope you got the idea.
c) Use a unique title for each page of your website. Each title should accurately reflect the contents of the entitled page. Do not use the same title all over the website.
d) Make your title eye-catching! The title is the first thing a visitor sees when scanning the results of a search. It is the first step towards the sale - don't ignore it.

Content
The next important factor is the content of a page, which seems pretty naive at first glance, right? Wrong! Content is king, as SEOs like to repeat. Quality content not only describes your product or service, it also converts your visitors into customers and customers into returning customers. Quality content increases your ranking in search engines, as they like quality content. Moreover, quality content even helps you get more inbound links to your website (see off-page ranking factors below)!

Basic tips for content are:
a) Write for humans, not for search engines! Remember: you offer your products to humans. It is a human who reads the texts on your website and decides whether or not he is going to purchase from you. Yeah, technically speaking, search engines read your site too, but I have never heard of a search engine buying something.
So you should create content that is interesting and useful for your human visitors in the first place!
b) Suggest something valuable. A text merely describing your product is dull and useless. I don't want to know what features a product has. I want to know what is in it for me! Consider that when preparing the content of your website.
c) Share your experience. Write of something that is interesting to you. Share your experience. Offer some articles or reviews of related products or services (do not borrow them from article sites though - write your own instead). You know - content is the king - so if your site is interesting to your visitors they will link to it on their own.
d) The first three weren't too SEOish, right? Here is a slightly more technical one: keep the text on a page within one theme. Search engines are more about themes now, rather than about keywords as they used to be. So you should think the same way: in terms of themes, not keywords. For each page of your site choose ONE theme related to your business and fill that page with content relevant to that theme. Focusing your efforts with a one-theme-per-page strategy makes it much easier to create landing pages for long-tail queries and also makes the whole website better structured and easier to read.

Navigation and internal linking
Again an important ranking factor. It seems obvious to create proper navigation so the search engine crawler can follow all the links on a website and then index all of its pages. However, this factor is still highly underestimated. Creating clear and simple plain-text navigation helps both search engines and human visitors.

Avoid using JavaScript or Flash links since they are hard to read by search engines. Always provide an alternative way to open any page at your website with simple text links. Do have a sitemap of your website available from any other page with one click.

Also keep in mind that quality internal linking spreads the link juice across the pages of your website, and this strongly helps your landing pages rank better in SERP for long-tail keywords. Use this wisely, though. Link only to pages that really need to be linked to.

Let's suppose you have two pages: one generates $10 of income for every visitor, while the other makes only $0.1. Which one would you link to first? Think of it that way and link to the most important and valuable pages of your website, using relevant anchor text for each link.
3.1.2. Helpful stuff
The below factors and techniques are not as crucial as the ones described above, but still they help a bit in gaining a higher rank in SERPs.
Headings
Once upon a time search engines paid close attention to the heading tags (H1 through H6), but those days are gone. Heading tags are easily manipulated, so their value is not very high nowadays. Nevertheless, you still want to use headings to mark the beginning of a text, to split an article into parts, and to organize sections and sub-sections within your document. In other words, although headings provide merely a small SEO value, they are still crucial for making your texts easily readable by human visitors.

Use the H1 tag for the main heading of the page, then H2 for the article headings and H3 to split the different parts of an article with sub-headers. That is pretty good practice and is enough to make your site readable by humans. It also adds some SEO points, which you should not neglect either.
Bold/Strong and Italic/Emphasized text
Both are nearly useless, but still have some SEO value (very little, though). As with headings, you'd better use them for the benefit of your human visitors, emphasizing the key parts of the text. But do not put every 5th keyword in bold, as it looks ugly while not giving any significant boost to your rankings anyway. Moreover, such a page would be very hard to read.
Keyword placement
The value of keywords in a text depends on their placement across the page. Keywords placed near the top of the document get higher value than ones residing near the bottom. Important: when I say top or bottom I mean the source of the HTML document, not its visual appearance. That is why you want to put your navigation and supplemental texts near the bottom of the source file and all important and relevant content - near the top.

This rule also works in more specific cases: keywords placed in the beginning of the title tag are more important than ones placed 4th or 5th. Keywords placed in the beginning of the anchor text are more important and get more value too.
Keywords in filenames and domain name
An old trick with putting your target keywords into a filename or having them in a domain name. Still works, but don't expect too much boost from this one.
a) Keywords in a domain name do help a little, but it is much better to have a short, easy to remember domain name than something like www.all-of-my-target-keywords-i-so-much-want-to-rank-for.com
b) Keywords in a file or folder name also help a bit, and since you have to name your documents anyway, why not give them appropriate names? Though, as I said before, do not expect any significant ranking boost. For a competitive query it won't help you much anyway. Also, if your page is written in a language other than English (or another European language), it won't help you at all.

Image Alt attribute
This one was very popular in 2003, but now keyword stuffing of the Alt attribute does not give any SEO value to a page. The better use of the Alt attribute would be something like this: <img src="some-pic.gif" alt="accurate description of some pic">

Write a natural description for each image and make sure it reads well. This helps you in two ways: a) your site ranks better in the image search; b) Google often takes the Alt text to create a snippet for the SERP.
Meta Description
One of the most popular and persistent myths (alongside keyword density) is the Meta Description tag. They say it helps you rank better. They say it is crucial to fill it with an appropriate description of the page content. They say you must have it on each page of your website. None of this is true. Nowadays, the only way the Meta Description is used by search engines is that its content is taken to create a snippet for the SERP. That is all! You don't get any other benefit from using the Meta D on your page, nor do you incur any penalty for not using it.

There is an opposite opinion suggesting not to use the Meta D at all, since a search engine creates a snippet based on the content of a page anyway, and you can't do this job better than a search engine. So why waste your time doing it? Personally, I would not agree with this point, since according to Google guidelines the Meta Description tag is still the preferred source of information for a snippet. Though it is up to you to decide whether you want it on your page or not, since, as stated above, it doesn't have any additional SEO impact, neither positive nor negative.
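For reference, the tag itself is a one-liner in the page's <head>; the description text below is, of course, just a made-up example:

```html
<head>
  <title>SEO Basics - a beginner's guide</title>
  <meta name="description" content="A plain-English introduction to SEO: how search engines work, on-page and off-page ranking factors, and common myths.">
</head>
```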
3.1.3. Useless stuff (no pain, but no gain as well)
Meta Keywords
Long ago, the <meta name="keywords" content=""> tag was intended to tell search engines the keywords relevant to a particular page. In modern SEO history, search engines download websites and extract relevant keywords from their content themselves, so the Meta K tag is not used for web ranking anymore. Simply forget it; it is useless for SEO.
Keyword Density
One of the most overestimated web ranking factors is the keyword density. What is keyword density and why this myth lives so long? The keyword density of each particular word on a page is calculated as follows:
KD = Word_Count / Total_Words * 100%


That is, if a page has 150 words and the word "SEO" is mentioned 24 times on that page, its keyword density would be: 24 / 150 * 100% = 16%
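For illustration, the calculation above is trivial to reproduce; here is a minimal Python sketch (a naive whitespace tokenizer, single-word keywords only):

```python
def keyword_density(text, keyword):
    """KD = Word_Count / Total_Words * 100%"""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100 * words.count(keyword.lower()) / len(words)

# The example from the text: "SEO" mentioned 24 times in a 150-word page
page = " ".join(["SEO"] * 24 + ["filler"] * 126)
print(keyword_density(page, "SEO"))  # 16.0
```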

But why is this value useless? Because search engines have evolved and do not rely on keyword density anymore, since it is very easily manipulated. There are thousands of factors that search engines consider when calculating a page's rank, so why would they need such a simple (not to say primitive) way to rank pages as counting the number of times a word appears in the page text? You may hear that a keyword density of 6% is the best rate, or that you should keep it within 7% to 10%, or that search engines like a density of 3% to 7%, and other nonsense. The truth is...

Search engines like pages written in a natural language. Write for humans, not for search engines! A page can have any keyword density from 0% (no keyword on a page at all) to 100% (a page consisting of only one word) and still rank high.

Well, of course you may want to control the keyword density of your pages, but please remember that there is no magic value for this factor. Any value will work if your text is written with a human reader in mind. Why would one still want to check keyword density if it doesn't count any more? Because it is a quick and dirty way to estimate the theme of a page. Just do not overestimate this thing; it is merely a number, nothing more, and it is useless for SEO.

Another interesting question: why is this myth still alive, and why are there so many people still talking about keyword density as an important ranking factor? Perhaps because keyword density is easy to understand and modify if needed. You can see it right there with your naked eye and quickly judge whether your site is doing well or badly. Well, it only seems that way - keyword density is useless, remember?
Dynamic URLs vs. static URLs
Believe it or not, there is no difference. Both are of the same SEO value. The days when search engines had difficulties indexing dynamic-URL websites are gone for good.
www.site.com vs. site.com
No difference either. If you want your site to be accessible both ways, with all traffic consolidated on the www version, add something like this to your .htaccess file:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule (.*) http://www.site.com/$1 [R=301,L]

Underscore vs. hyphen in URLs
Once again, there is no difference from the SEO point of view. You can use an underscore, a hyphen, or even no separator at all - this neither helps nor hurts your position in the SERPs.
Subfolders
Is it better to have a /red-small-cheap-widget.php file rather than /widgets/red/small/cheap/index.php? Does it hurt your rank if you put the content deep into subfolders? The answer is no, it won't hurt your rankings; in fact, it doesn't matter at all how deep in the folder tree a file is located. What matters is how many clicks it takes to reach that file from the homepage.

If you can reach that file in one click, it certainly is more important and will have more weight than, say, some other file located 5 clicks away from the index page. The homepage usually has a lot of link juice to share, so the pages it directly links to are obviously more important than others (well, since they receive more link juice, that is).
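The click depth of every page is easy to measure with a breadth-first search over your internal link graph; here is a small Python sketch (the site structure below is made up):

```python
from collections import deque

def click_depth(links, start="/"):
    """Return the minimum number of clicks from `start` to each reachable page.

    `links` maps each page to the list of pages it links to.
    """
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# A hypothetical site: the deep file is 3 clicks from the homepage
site = {
    "/": ["/widgets/", "/about.html"],
    "/widgets/": ["/widgets/red/"],
    "/widgets/red/": ["/widgets/red/small/cheap/index.php"],
}
print(click_depth(site)["/widgets/red/small/cheap/index.php"])  # 3
```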
W3C validation
W3C is the World Wide Web Consortium - an international consortium where member organizations, a full-time staff, and the public work together to develop Web standards. Basically speaking, they are the guys who invented HTML, CSS, SOAP, XML and other web technologies.

Validation is the process of checking a page or website for compliance with W3C standards. You can run a validation of any website for free here. Note that this validator catches not only trivial things like unclosed quotation marks, undefined tags or wrong attribute values. It also checks for encoding problems, compliance with the specified DOCTYPE, obsolete tags and attributes, and much more.

Why is validation needed? A 100% valid website should display correctly (and identically!) in all browsers that support the standards. Unfortunately, in real life some browsers do not strictly follow the W3C standards, so various cross-browser problems are not a rare thing all over the web. This doesn't belittle the importance of W3C standards, however.

From the SEO point of view, though, validation doesn't look so crucial. Run google.com through the validator and you'll see a bunch of warnings and errors on their website. This example pretty clearly shows that Google doesn't care about W3C validation itself. At least not enough to give a strong rank boost to valid websites or penalize erroneous ones. It simply doesn't care. The recommended W3C validation strategy is: do it to make your site work and be accessible in all common browsers, and don't bother doing it for SEO purposes only. If you don't experience any cross-browser issues, it works fine as it is.
3.1.4. Stuff that hurts your rankings
Keyword stuffing
Google defines that term pretty clearly. Once again: write for humans. Repeating keywords across the page can trigger Google's spam filter, and this will result in a huge loss of positions, if not a total ban of your website. Write naturally, optimize a bit if needed - that's the best way of using keywords nowadays.
Hidden text / Invisible links
First, let's see what Google says about hidden text. Obviously, Google doesn't like it, and if your site uses such a technique it may be excluded from Google's index. You may ask: how would Google know if I use hidden text or not? OK, I can set "display:none" in my external CSS file and limit access to that CSS file with my robots.txt. Will Google be able to learn that a page has hidden text then? Yes and no. This might work in the short term, but in the long run your disguise will fail, sooner or later. Also, it's been reported that GoogleBot doesn't always strictly follow robots.txt instructions, and it actually can read and parse JS and CSS without any problems. Once it does, the consequences for your website and its web rankings will be disastrous.
Doorway pages
About as bad as an SEO method can get. Doorway pages are special landing pages created for the sole sake of obtaining good positions for some particular keyword. A doorway page doesn't have any valuable content; its only purpose is to catch the visitor from the SERP and redirect him to some other, non-doorway page, which, by the way, is usually absolutely irrelevant to the visitor's initial query.
Splogs
Splogs (derived from Spam Blogs) are the modern version of the old evil doorways. The technique was as follows: one created thousands of blogs on some free blog service like blogspot.com, linked them to each other and obtained some backlinks via blog comment spam and other blackhat methods (see below). Splogs themselves did not contain any unique information; their content was always automatically generated articles stuffed with keywords. However, due to the large number of inbound links, such splogs ranked very well in SERPs, dislodging many legitimate blogs. Later, Google implemented some filters to protect itself from the large number of splogs, and now any splog gets banned pretty fast.

If you own a blog - do not make it spammy. Instead focus your attention on writing good and interesting content. This works better in fact.
Cloaking
Not as bad in some particular cases, but still a blackhat technique. The method is based on determining whether a visitor is a human or search engine spider and then deciding which content to show. Humans then get one variant of the website while search engines get another one, stuffed with keywords.
Duplicate content
Being a scarecrow for many webmasters, duplicate content is not actually as dangerous as it is made out to be. There are two types of content that can be called duplicate. The first case is when a website has several different ways to access the same page, for instance:

http://www.somesite.com/
http://somesite.com/
http://somesite.com/index.php
http://www.somesite.com/index.php?sessionid=4567
etc.

All four refer to the same page but are actually treated as different pages having the same content. This type of duplicate content issue is easily resolved by Google itself and does not lead to any penalty from Google.
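If you prefer not to leave this to the search engine, a common approach is a 301 redirect in .htaccess, similar to the www redirect shown earlier. Here is a sketch (somesite.com is a placeholder; test on your own server before relying on it):

```apache
RewriteEngine on
# Redirect direct requests for /index.php to the canonical root URL.
# THE_REQUEST is matched so internal DirectoryIndex subrequests don't loop.
RewriteCond %{THE_REQUEST} \s/index\.php[?\s]
RewriteRule ^index\.php$ http://www.somesite.com/ [R=301,L]
```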

The other type is duplicate content on different domain names. The content of a website is considered duplicate if it doesn't add any value to the original content. That is, if you simply copy-paste an article to your site, it is duplicate content. If you copy-paste an article and add some comments or review it from your point of view, that's not duplicate content. The key feature here is the added value. If a site adds value to the initial information, it is not a duplicate.

There are two other points here worth mentioning. First, if someone copies your text and then posts it on another site, it is very unlikely that you will be penalized for that. Google tracks the age of each page and tends to consider the older one - your website, in this case - as the source of the original text. Second, you can still borrow materials from other websites without a significant risk of being penalized for duplicate content, simply by rewriting the text in your own words. There are ways to produce unique random texts using Markov chains, synonymizers and other methods, but I would not recommend them, since the output looks too spammy and is not natural anyway, so it really can hurt your Google position. Write for humans. Write it yourself.
Frames
The frames technology, while not blackhat SEO by itself, can still hurt your rankings, because search engines do not like frames: they destroy the whole concept of the web - a single page for a single URL. With frames, one page may load and display content from many other URLs, which makes it very hard to crawl and index. Avoid using IFRAME and the other associated tags unless you really, really have to, and if you do, provide an alternative way to index the contents of each frame with direct links, or use the NOFRAMES tag with some backup content shown to search engines.
JavaScript and Flash
Google can read both JS and Flash (well, the text part of it, of course), but it is not recommended to build your site solely on these two. There should always be a way for a visitor (either human or bot) to reach the content of a website via simple plain-text links. Do not rely exclusively on JS or Flash navigation - this will kill your SEO prospects as quickly as a headshot.
3.1.5. On-page factors summary
Well, if you've read the above parts carefully, you can already figure out the summary yourself. Content is king, but only quality content counts. Do not try to trick or cheat search engines: it only works in the short run, and it is always just a matter of time before your rankings drop for good. Providing high-quality, relevant content that is interesting both to you and to your visitors is the key to on-page ranking success and (paradoxically!) half of the way to success with off-page ranking factors.
3.2. Off-page ranking factors
3.2.1. What is it?
At the end of the 20th century, search engines ranked websites based solely on their content. The situation changed after Google's triumph. Google's algorithms were based on link popularity, not only on the content of websites. So, the more inbound links a website had, the higher Google ranked it. The concept hasn't changed much since those days: popular websites get linked to often, so this factor is used in calculating web rankings alongside the content of those websites. These days it is even possible to rank for a keyword that does not appear anywhere in a website's text! (the proof link)

Needless to say, you should pay as much attention to off-page ranking factors as you do to on-page optimization. This SEO tutorial describes all the things you should keep in mind while maintaining your inbound links. Read along.
3.2.2. PageRank
First of all, we must separate two things: the real PageRank and the green PageRank bar shown in the Google Toolbar and in other online and offline PageRank tools. The Google bar PageRank (I'll call it the green PageRank, or gPR) is merely an indicator. The real PageRank of a website (I'll call it just PageRank, or PR, from now on) is a mathematical value reflecting the probability that a visitor randomly following links across websites ends up on this particular website. A value of 1 means 100% probability: a visitor randomly surfing the web will always open the website, sooner or later. Conversely, a value of 0 means that a random visitor never reaches that particular website through a link on some other site.

I won't go deep into the mathematics of PageRank, since that info can easily be found on the web. I'll only highlight the key points of PageRank's statistical nature.
First of all, you must understand the following: the number of websites grows every day, while the overall PageRank value always stays the same: 1 (one). In other words, there is a 100% probability that a visitor opens SOME site on the web, but the odds of each particular website get lower and lower every minute. If you have 3 apples and two of them are maggoty, what are your odds of picking a good one? They are 1/3, or 33%. If you have to choose one apple out of 100, you only have a 1% probability. That's the case with PageRank: it naturally decreases every day.
Because of this, and because of the enormous number of indexed websites, it is not possible to show the exact PageRank value at every moment. That is why we need the green PageRank, which is updated every 3 or 4 months and shows the PageRank value in a more comprehensible form: as a number from 0 to 10. This number correlates with the actual PageRank only loosely; it merely shows the basic trend.

Also, the gPR scale is non-linear. One might think that a website with PR2 is twice as popular (or at least has twice the chance of getting that random visitor we were talking about earlier) as its unlucky brother with PR1, but that is not true. In fact, a PR2 website is more likely to be about 10 times more popular than a PR1 one, and 10 times less popular than a PR3 one. Something like this - but the number 10 is only an example here, since we don't know the exact formula.
PageRank models a random user who surfs the web and follows random links on websites. From a practical point of view this means: the more links across the web point to your website, the higher its PageRank.
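The random-surfer idea can be sketched as a toy power iteration in Python. The three-page link graph and the 0.85 damping factor here are illustrative assumptions (the damping factor models the surfer occasionally jumping to a random page), not Google's actual implementation:

```python
# Toy PageRank: a random surfer follows links, with a damping factor d
# modelling the chance of jumping to a random page instead.
# The 3-page graph below is made up for illustration.

def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform; total is 1
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}   # random-jump share
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)  # juice split among out-links
            for target in outgoing:
                new[target] += d * share
        rank = new
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(links)
print(ranks)  # C ends up highest: both A and B link to it
```

Note how the ranks always sum to 1, which is exactly the "the overall PageRank value always stays the same" point above: adding more pages dilutes everyone's share.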

So, now you know that the key off-page ranking factor is the number of inbound links to a website, and that the green PageRank is an indirect indicator of that number. However, the pure PageRank mathematics considers only the quantity of links, while in fact there is also a quality factor. This is implemented via the various filters and value-damping factors that Google applies to each link before including it in the PageRank calculation.

3.2.3. Important stuff
This part describes the crucial off-page ranking factors you should always pay attention to.
The theme of the linking website
This one is very important, since links from a relevant website are worth much more. In your link-building efforts, try to find websites that are close, or at least similar, to your own site's theme. A link from an unrelated site is not bad by itself, and even Google admits that a webmaster doesn't have full control over who links to their website and how. Nevertheless, avoid links from unrelated websites or from sites with illegal or unethical content (porn, malware etc.).
The theme of websites you link to
On the other hand, you have full control over the links placed on YOUR own website, so if you link to some unrelated website, it is you who is responsible for that, and it is your site that will be penalized. So be careful which sites you link to. Linking to unrelated content does not necessarily lead to a penalty, but you should still be cautious and link only to quality websites.
Anchor text
The anchor text of an inbound link is very important, and if you can adjust it, squeeze everything out of it. First of all, avoid using the same anchor text in all your links: use synonyms, paraphrases, different keywords, whatever else. Second, put the important keywords at the beginning of the anchor text. Finally, do not stuff all of your keywords into the link. There is really no reason to use anchor text longer than 50-55 characters or 10-12 words. Keep it short.
Landing pages
This factor is often ignored even by some professional SEOs and webmasters. It is not enough to simply have a link to your site! The link must be a) relevant and b) of good quality, and you must make sure that both websites - the linking and the linked - meet these requirements. As for the theme of the linking website, see point 1 above. But the landing page on your own website should also be a quality page, relevant both to the donor website and to the anchor text of the link.

Well, it is not strictly necessary, but it helps a lot to have a properly optimized landing page for every link you have. What does this "proper optimization" include?
The landing page must include keywords mentioned in anchor text in its content;
The keywords should appear in all important places like the title tag, the headings etc;
The overall topic of the page must match to those keywords.
If a landing page meets all of these requirements, it gets a significant boost to its rank, because the corresponding inbound links carry much more value.
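As a rough illustration of the checklist above, here is a hypothetical Python helper that checks whether a keyword appears in a page's title, headings and body. The regexes are deliberately naive (real-world HTML would need a proper parser), and `check_landing_page` is an invented name, not an actual SEO tool:

```python
import re

def check_landing_page(html, keyword):
    """Rough checks that a landing page targets a keyword.
    A hypothetical helper for illustration only."""
    kw = keyword.lower()
    page = html.lower()
    title = re.search(r"<title>(.*?)</title>", page, re.S)
    heading = re.search(r"<h[1-3][^>]*>(.*?)</h[1-3]>", page, re.S)
    return {
        "in_title":   bool(title and kw in title.group(1)),
        "in_heading": bool(heading and kw in heading.group(1)),
        "in_body":    kw in page,
    }

html = ("<html><head><title>Orange Gadgets Store</title></head>"
        "<body><h1>Orange gadgets</h1><p>Buy orange gadgets here.</p></body></html>")
print(check_landing_page(html, "orange gadgets"))
# {'in_title': True, 'in_heading': True, 'in_body': True}
```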
PageRank
PageRank doesn't do anything by itself, and the green PageRank does even less - it is simply an EGO-meter. However, the PR of a linking website (or a candidate one) gives you an approximation of what a link from this website is worth, what value it carries. Also, high-PR websites are considered trusted and get some extra value from Google. See below for more trust factors.

A single link from a PR10 website (if you somehow manage to get one, of course) will quickly boost your own website's PR to 7 or even 8, giving you a comparable boost in your search positions. But this factor is the last in the list of important off-page factors, because first you should find a relevant, quality website that is willing to link to you, and only then check its PageRank. Exactly in that order. Quality content is worth more than high PageRank.
3.2.4. Helpful stuff
Reciprocal linking
The basic reciprocal linking is very simple: site A links to site B while site B links to site A.
Reciprocal links. Scheme A to B, B to A


There are other schemes though:
Cross linking. Site A links to site B from page A1, while site B links to site A from page B1.
Reciprocal links. Cross-linking scheme: A1 to B, B1 to A

Circular linking. Site A links to site B, site B links to site C,... site Z links to site A.
Reciprocal links. Circular links.

Three-in-a-row linking. Site A links to site B, site B links to site C. No link back from C.
Reciprocal links. Three-in-a-row links.

Combined.
Reciprocal links. Combined scheme.

There is a strong misbelief that reciprocal linking does not work anymore. That's not true. It does work, but the efficiency of this method is much, much lower than it was in 2003. In 2009 Google greatly reduced the value of reciprocal links, especially for schemes a and c above, but the whole concept still works and really helps to gain rankings in the early and middle stages of SEO promotion, when literally every link counts.

There are some exceptions, though (as always). Needless to say, you still have to choose your partners for reciprocal linking very carefully. Consider the theme of the linking website, its quality, its neighbourhood (the other sites it links to); consider the page that would point to your site; pay attention to the anchor text of the link, etc. You don't want to exchange links with spammy websites, or with websites that use e-mail spam to suggest the partnership. You don't want a link buried 17 clicks away from the homepage.

Usually you don't want a nofollow link, but even a nofollow link from a relevant site can bring a load of targeted visitors to your site, so it is up to you to decide whether it is only the link juice that you expect from the link exchange, or the audience too. By the way, you also don't want a link from a page that already has 50+ links on it. And a final yet important note: do not e-mail website owners all over the web with link exchange proposals! That sucks, man, and no one answers anyway, while you will drive your karma down to zero with such activity and possibly receive a penalty from SpamCop or some other paranoid bastards. Don't do that, I tell you.
Web directories
One more technique that everyone says doesn't work anymore. Well, to be honest, the efficiency of web directories was never that amazing. In fact, there is only one web directory you certainly want to be included in: 
the Open Directory Project (DMOZ), which Google Directory was built upon. It is a free, human-edited web directory of high value. It is a bit tricky to get into, because it often takes months before your submission is approved (if it ever is), but the game is worth the candle: a link from DMOZ is a significant boost to your website's value, and a drop of life-giving link juice too.

If you have some free funds to spend, you may also want to be included in several paid-inclusion directories, starting with 
Yahoo Directory, which seems to be the most respected of them. Also, here is a great article on directory submission you should definitely read. Don't miss the outstanding list of directories to submit to either.
Social bookmarks
They used to work very well, but due to the enormous amount of spam on social bookmarking websites, the method is not as efficient as it was 2-3 years ago. Promoting a website through social bookmarking sites has its pros and cons:
Pro: SB websites get crawled very frequently - every 2 or 3 hours. This means that if you manage to get in there, you will get your share of search engine traffic pretty soon.
Pro: Social bookmarks not only help you raise your link strength, but also bring some amount of pure traffic from the bookmarking sites themselves. Depending on the popularity of the article posted, the traffic to your website can vary from a few grains of sand to a pure avalanche.
Con: Unfortunately, you cannot simply bookmark a link to your website and wait for traffic. That could have worked in the first days of social bookmarking, but not now. First, the number of posts (diggs, reddits, stumbles etc.) per minute does not leave each particular post many chances to get popular. Your bookmark may simply get lost among hundreds of thousands of others. Second, bookmarking websites often have either moderators or some way for other users to flag an inappropriate post or bookmark. So if you post a bookmark to your own website, the link gets deleted and your account gets banned. Too bad.

There are some workarounds for this though.
The whitehat one. Post an article or some other valuable (do you hear me? I said valuable!) content on your website and wait until someone else links to it. Then bookmark that site instead. This won't increase your link popularity or PageRank, but it will still bring you visitors. Oh, did I forget to mention that the other linking site could be yours too? So you may have a commercial website with the article and a non-commercial blog where you mention that commercial article. Or you may even have another blog that mentions the blog that mentions the commercial article (in the house that Jack built).
The blackhat one. Create as many fake accounts as you need to promote your bookmark on every social bookmarking site. This is getting tough now, since the method was revealed ages ago and SB websites are already heavily loaded with such spam.
Con: One more bad thing about social bookmarks: they tend to work for a limited period of time. They bring you a splash of traffic in the short term, but then they simply deplete and only give a few visits a week. On the other hand, even a few visits are still more than no visits at all.
Con: And the worst thing about social bookmarking is the quality of the traffic it produces. Traffic from a social bookmarking website is not well targeted; it is driven by the impulse of curiosity, not by intent. This means you will (or will not - see above) receive a large load of traffic, but if you manage to convert merely 1% of it into customers, you can congratulate yourself - you've done a good job! The conversion rate for this type of traffic is extremely low, so this recipe only works well for a limited class of websites and products. Though it is still good if you want to build a community or simply need many people on your website for some reason (AdSense and so on).

Trust factors
A bunch of ranking factors that you only have limited control over. Each of these factors does not add value directly to the rank of a website. Instead, they increase its trust rating. Google (and other search engines too) prefers trusted websites and gives their rankings a boost. Trust factors include:
Domain name age. Older websites seem more trustworthy than others. If the domain never changed its owner, it gets even more trust points (though don't ask how many).
The number and quality of inbound links, and the PageRank. If many other websites link to this one, it is considered trustworthy. The quality of the inbound links doesn't play a significant role here, though, since you cannot be held responsible for links pointing to your website - you have no control over them. Otherwise it would be possible to hurt your competitors by posting links to them from malware sites.
Website content. If a website uses pop-ups, pop-unders or some of the blackhat SEO methods - its trust rating gets lower.
Outbound links. If a website links to other trustworthy sites it gets a boost to its own trust rating.
I believe there are more, but these are the most important.
Press releases, related resources, word of mouth etc.
That's a bit off the SEO theme, but it can still help you obtain a couple of links. Do you have some exclusive info? Share it with the community on some thematic resource. Do you have astonishing news in your industry? Tell the world about it with a press release. Are you running a special promotion or offering a discount coupon? Let others know about it.

None of these usually require a single cent from you. You can send a press release via PRWeb, you can register on industry forums for free to share your thoughts, you can tell others about your promo coupons at 
Giveaway of the Day, RetailMeNot and other similar sites. Don't neglect the power of word of mouth!
3.2.5. Useless stuff
These off-page ranking factors do not work any more (some never did), or their value is negligible.
Many links pointing to the same page
If a page has several links pointing to the same URL, this won't add any SEO value, since Google only considers the very first link on the page. From the on-page optimization point of view, this means you want to put your navigation menu links somewhere near the end of your HTML source. From the off-page point of view, it means you need only one link from any given URL, because only the first one counts anyway.

Moreover, it may even hurt a bit. Suppose there is an external page with 3 links in total, one of which points to your site. That means one third of that page's overall link juice flows into your website. Now imagine you asked the webmaster to add one more link to your site on that page. It now has 4 links, two of which point to your site. The link juice is now divided into 4 parts instead of 3, but hey - the second inbound link from that page is not counted anyway, so you are now getting even less link juice than before!
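The arithmetic above can be sketched in a couple of lines; the "only the first link to a given URL counts" rule is the author's claim, taken here as an assumption:

```python
def juice_to_my_site(total_links_on_page, counted_links_to_me=1):
    # A page's link juice is split evenly among all its links, but
    # (per the claim above) only the first link to a given URL counts,
    # so duplicate links to the same site merely dilute the share.
    return counted_links_to_me / total_links_on_page

before = juice_to_my_site(3)  # 1 link out of 3 -> 1/3 of the juice
after = juice_to_my_site(4)   # 2 links out of 4, only 1 counted -> 1/4
assert after < before         # asking for the second link made things worse
```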
Nofollow links
The SEO value of nofollow links is close to zero: they don't pass PageRank, and link juice doesn't flow through them either. However, a link is always a link. Would you decline a nofollow link from the homepage of Google? That link would give you no SEO value, but the traffic stream it would generate could smash through any dam.
Links in signature
A forum post signature is a popular place for links, but the SEO value of this method is extremely low. The fact is that nobody reads your signature unless you become a significant figure in that community, and even then signature links are not worth much. How many times have you opened someone's signature link yourself?

The direct SEO impact of such links is also tiny: the links are usually nofollow, and even if they are dofollow, they are still buried in the depths of forum topics. The amount of link juice you could obtain through them is not worth mentioning.

Does this mean "forget links in signatures"? No. If you manage to become a part of the community and gain some authority there, your every spoken word (and your signature links as well) will attract the attention of the whole community. It takes time and effort, for sure, but there is no such thing as a free lunch, you know.
Guestbook links
An old-as-hell technique that never worked.
Blog comment links
Do not post comments on a blog for the sake of the link alone. First, this is SPAM. Second, it doesn't work anyway. Third, most blogs have nofollow links in the comments, so don't waste your time on something that doesn't help you but pisses off everyone else at the same time.
No-PR links
The amount of link juice a no-PR link carries is utterly small and, more importantly, the trust rating it passes to the linked sites is small as well. This means it is crucial to obtain links from high-PageRank sites. Wasting your time on PR0 or no-PR sites is not worth the candle, because you would need a bulk load of such links to produce changes in your website's rankings that you'd probably not even notice.

PageRank by itself (as stated in the sections above) does not directly affect the position of your website, but it does affect the trust rating of the sites that link to you. Since you want links from trusted websites in the first place, you should prefer high-PR links over all others.
Article submission
This one appears in many SEO FAQs and guides all over the web: write an article and submit it to article websites. That doesn't work. Well, OK, maybe it used to work in the past, but now it doesn't. What is an article? It is a piece of useful text that is interesting to its readers. Now imagine a guy who is interested in reading 100,000+ very similar articles on some article website. You can't? What's the problem? The problem is: such a guy never existed. Nobody wants to read an article assembled from the parts of ten other articles, each of which was in turn constructed from some initial article written in early 2003 with keyword-synonym auto-replace software. Who wants to read those articles? Who wants to link to their authors? Nobody.

Surely, articles are good and you definitely want to write some. But submitting them to article websites is useless. Try applying some link bait instead.
Submitting your site to Google
Useless, because if you have some inbound links, Google crawls you anyway, and if you don't, there's no difference whether you submitted the site to Google or not - it won't show up in the SERPs. Though you may need this if your site has been excluded from the index for some reason (usually for some blackhat SEO) - to get it included back once you've fixed the issue.
3.2.6. Stuff that hurts your rankings
Link Farms
Well, even a child knows that link farms are evil, that you should never participate in link farms, and so on and so forth. What is a link farm? It is a group of websites that simply link to each other. The technique worked very well in the early 2000s due to the high influence of link popularity on the SERPs in those days. Then search engines introduced a filter, and now link farms have only negative consequences for your website's position. Do not participate in link farms, and do not create one either.
FFA (Free-For-All) sites
Another old-as-hell example of a futile SEO method that works opposite to the way it was supposed to. The idea is to have a site that links to others, but the number of links is limited, and all links are shifted down every time a new link is submitted to the FFA site. In practice, this means that thousands of webmasters submit their links to an FFA, and each particular link is only displayed for about 5 minutes before it gets dislodged off the site by another load of links. Guess what SEO value such links have?

If you were hoping to get a bunch of human traffic, you are wrong too. All the traffic on an FFA site is generated by other webmasters submitting their links. Most of them do that automatically, so they never visit other links anyway. So this side of the FFA value is negative too. Finally, an FFA is often just a way to collect working e-mail addresses for SPAM purposes. Summarizing all of the above: by submitting to an FFA you get a 5-minute link that nobody visits, and tons of spam in your e-mail inbox. Doesn't sound too attractive, right?
Forum/Blog/E-mail Spam
Simply put - spam is spam. The first rule of the ethical SEO is: do not use spam methods. The second rule of the ethical SEO is: do not use spam methods! Read this carefully and remember: do not use spam methods. Ever. I'll curse you if you do.

Now, leaving the emotions behind, here is a more technical explanation of why spam is bad.
Spam pisses off forum readers, blog readers and owners, and e-mail recipients.
Spam links on forums and blogs are useless in terms of SEO, because forums where you can freely post spam comments are usually of very low quality, so such links won't give you any link juice. Quality resources, on the other hand, are usually human-moderated, and your spam comments won't get through anyway.
Someone can report your spam activity to SpamCop or other anti-spam freaks.
Spamming requires abuse-proof hosting, abuse-proof domain registrar, abuse-proof payment processor and abuse-proof conscience. Do you happen to have all those?

Paid links
Well... It was a hard decision whether to put paid links into the "Harmful stuff" part, or into "Useless", or into "Helpful"... Because paid links are all of these: helpful, useless, and potentially harmful to your rankings, depending on how you use them. Personally, I have never bought any links, and I would not recommend doing so. A paid link breaks the whole concept of the WWW - "I link to it because it is interesting or relevant" - turning it into "I link to it because I was paid to". That's not linking, that's advertising. And that is why many search engines treat paid links very cautiously these days. The value of paid links is very low now, and if Google somehow finds that a website prefers paid links over natural ones, it may get a penalty or get sandboxed.

Still, despite the above, paid links can be useful for promoting your website. However, you should keep in mind that since paid links are advertising, they must be nofollow according to the 
Google paid links guidelines. This way a paid link simply promotes a website and does not transfer any link juice to it. That's OK, but since the link is nofollow now, you should pay even closer attention to where you buy the link - that link must bring you relevant visitors. Choose appropriate websites that are close to your theme, check their trust rating, and don't hesitate to make a phone call or send an e-mail inquiry if you have any doubts about them. One high-quality link is better than 10 garbage links, no matter whether the links are natural or paid.
Inappropriate neighbours
This works both ways: a) some unsavoury site links to you; b) you link to some unsavoury site. Both are bad. The first case is bad because it hurts the trust rating of your site: if a bad guy links to you, you're a bad guy too. The second is no better - it directly hurts your ranking positions.

What do we mean by "inappropriate neighbours"? These are malware, porn, hacking, phishing, casino and other websites of questionable kind. So do not link to such websites, and try not to have any inbound links from them either, though you can't control that directly, of course.
Unrelated websites
Not as bad as the above, but it can still bring your rankings down a bit. Try obtaining links from relevant sources - websites on your own theme, or at least related to it. Why? Because search engines not only consider the anchor text of a link, but also read the surrounding text preceding and following it, and then collate it with the text of your site. If the subjects of the two sites differ significantly, the link is filtered out and its value is ignored by the search engine. Simply put, it is better to have one link from a relevant site than five links from sites as far from yours as the Sun is from the Earth.
3.2.7. Off-page ranking factors summary
Let's summarize the above part of this SEO guide. Here is a quick synopsis of what you have already read:
A good inbound link comes from a relevant, high-PR site closely related to yours.
A good inbound link has target keywords in its anchor text and around the link itself.
A good inbound link points to a landing page made specially for each target keyword.
Reciprocal links and directories still work, but don't expect miracles.
Don't overlook social bookmarking sites.
Article submission is useless.
Do not spam.
Paid links may both hurt and help. To avoid penalty, paid links must be nofollow.
Keep your link neighbourhood clean and relevant.
4. SEO strategies
In this section of the SEO tutorial we'll describe strategies that are irreplaceable for raising your site's position in Google and other search engines. As before, we will talk about on-page and off-page strategies separately; however, you should understand that to achieve the highest efficiency, all of the strategies must work together. There must be no preference for either the content or the link building. Both are crucial, and both require your constant attention.
4.1. On-Page SEO strategies
Content is king. Your main on-site SEO strategy is to build content that is interesting to your visitors, provides info and value, and correctly describes the services you offer, without being a dull piece of text that leaves the memory two seconds after you've finished reading it. At the same time, the content must serve its SEO purpose - to be a landing page for search engine queries. Surely, writing good copy is an act of creation, not hack-work, but nevertheless there are a few steps you can follow along this path:
Discover your keywords
This is the first thing you should think of when starting to optimize your site's content. It doesn't matter what industry you're working in - every industry, every sphere of human work, can be described in hundreds if not thousands of different words! And your potential customers enter many of those words in the search box every day! You can't afford to sit and watch this money river flow past you.

Think of what you're doing. Think of what your customers want from you. Try to express this in simple keywords. Then use synonyms. That will be the basis of your keyword list. Then use the 
Google Keyword Tool or the Search-based Keyword Tool to find more synonyms and related searches that other people type into the search box. Conveniently, you can see the monthly search volume and the approximate competition for each keyword, which helps a lot in filtering out the best terms to target.

Write your keywords down and sort them by demand (the number of searches according to the Google Keyword Tool) and then by relevance. Then run a quick Google search for those keywords to reveal your competitors, and 
analyze their websites to extract a few more keywords for your niche. Select the most relevant and most demanded keywords from the final list, and proceed to the next step.
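A minimal sketch of the sorting step, assuming you have exported your keyword list into a Python structure; the terms, search volumes and competition figures below are invented for illustration:

```python
# Hypothetical keyword shortlist; all numbers are made up.
keywords = [
    {"term": "orange gadgets",     "monthly_searches": 2400, "competition": 0.31},
    {"term": "buy gadgets online", "monthly_searches": 5400, "competition": 0.87},
    {"term": "cheap gadget store", "monthly_searches": 880,  "competition": 0.22},
]

# Sort by demand first; on equal demand, prefer lower competition.
keywords.sort(key=lambda k: (-k["monthly_searches"], k["competition"]))

# Keep only the terms you can realistically compete for.
shortlist = [k["term"] for k in keywords if k["competition"] < 0.5]
print(shortlist)  # ['orange gadgets', 'cheap gadget store']
```

The 0.5 competition cutoff is arbitrary here; the point is simply to rank by demand and then filter by how crowded each term is.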
Prepare the content
Now that you know which keywords people are searching for, you need to give them what they want - that is, you should prepare a landing page for each search term. Obviously, it is not enough to simply write some copy and stuff it with the keyword. Quite the contrary: you should build your copy around the chosen keyword and the theme it describes. Write naturally; don't neglect the headers, or a loud, catchy title that grabs the reader's attention.

Remember the important places where your keywords should appear? We covered this matter thoroughly 
earlier in this FAQ. Write a plain, natural-language title with the target keyword placed somewhere near its beginning, write a couple of headers, name the file accordingly, etc. How many keywords should each page cover? The best strategy here is to make a landing page for 2-3 keyword phrases, no more. You don't want to disperse the focus of your efforts. Each page should concentrate on a few short-tail keywords and a dozen or two long-tail ones.

Repeat this step for all of your keywords. It may take a while to write proper content for each of your target phrases and, more importantly, it may be hard to write unique, non-duplicate content for each of the pages. Do not simply copy-paste one text into another, changing "gadgets" to "widgets" and "foo" to "bar". That won't work. Instead, write in normal language, write for humans (I never tire of repeating this!). You don't have to prepare all of the pages at once - this is a marathon, not a sprint.
Interlink your pages
A very important step! As you already know, relevant links to a page with proper anchor text are one of the most important ranking factors. Most likely, at this point you don't have many backlinks from other websites, so building internal links is crucial: it gives the very first boost to your search engine position.

So, review the pages you have and link them to each other. I do not mean the usual navigation stuff here. I mean: find some keyword on a page and enclose it in an anchor tag pointing to the page most relevant to that keyword. Here is a picture to illustrate the concept:

Interlinking the pages with proper anchor text
You can see that the gadgets page links to the widgets page with some general term, and to a more detailed page (or a product page) with the "orange gadgets" anchor text. Interlinking your pages is extremely important, but please do not overload your pages with links! Too many links hurt your positions and make the page difficult to read. Use links only when necessary - that is, when a link helps a visitor to better understand the info, or to choose the proper product, or simply adds some other value to the page. Also, keep in mind that only the first link counts, so make sure that your navigation menu links are placed at the very bottom of your HTML source, while the links with proper anchor text go somewhere near the top. 
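The interlinking idea can be sketched as a small Python helper that wraps the first occurrence of each keyword in an anchor tag. The `interlink` function, the page text and the URLs are all hypothetical, and a real site would want a proper HTML parser rather than regex substitution:

```python
import re

def interlink(html_text, link_map):
    """Wrap the first occurrence of each keyword in an anchor tag.
    link_map maps keyword -> target URL. A toy illustration only."""
    for keyword, url in link_map.items():
        pattern = re.compile(re.escape(keyword), re.IGNORECASE)
        html_text = pattern.sub(
            lambda m: '<a href="{}">{}</a>'.format(url, m.group(0)),
            html_text,
            count=1)  # only link the first occurrence
    return html_text

page = "<p>Our orange gadgets pair nicely with blue widgets.</p>"
links = {"orange gadgets": "/gadgets/orange.html",
         "blue widgets": "/widgets/blue.html"}
print(interlink(page, links))
```

Linking only the first occurrence keeps pages readable and matches the "only the first link counts" point made earlier.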
Create navigation and site map
Aside from interlinking, your site must provide proper navigation both for humans and for search engines. Make sure all the pages are linked, check the whole site for broken links, create a site map that puts all pages together in one place, and make it accessible with one click from any other page of your site. It is not necessary, though, to create an XML sitemap and submit it to Google - it won't give you any ranking advantage anyway.
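A human-readable site map of the kind described here is just one page linking to every other page. A minimal sketch, with hypothetical page names:

```html
<!-- sitemap.html: every page of the site in one place -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/gadgets.html">Gadgets</a></li>
  <li><a href="/orange-gadgets.html">Orange gadgets</a></li>
  <li><a href="/widgets.html">Widgets</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```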
Keep working
While your newly created pages start obtaining their positions in the SERPs, you should continue working - keep writing content, optimize or change it if needed, monitor your competitors and so on. Things change pretty fast in SEO, so don't rest on your laurels.
4.2. Off-Page SEO strategies
Perhaps you'll be a little surprised, but off-page optimization starts with the content (which is king, as you remember). The very first step you should accomplish before starting any link building is quality content. Remember, it is much easier to get a link to an amazingly creative or informative article than to a scanty set of trivialities. Deliver the content and quality backlinks will be much closer than you think.

Nevertheless, the content alone is not enough. You still need to put some efforts into your link building strategy. Here is the rough step-by-step:

Know your product
Seems stupid at first, doesn't it? But surprisingly many people all over the world simply do not know what they offer! They cannot explain what they sell, what product they deal with, what its features are, why other people need this product and what distinguishes it from the other 2567 products of that kind. They simply do not know. But you must know that! Think of your product or service. What's unique about it? How can it help others? Who are your potential customers? You see, you cannot offer your services or products to average, faceless customers. You must clearly understand who your customers are and why they need your product. Start with that, figure it out for yourself, then put it into words so you can explain it to other people as well.
Set objectives
Now that you know who your target audience is, you should set objectives - what is your purpose in reaching that audience? Money? Fun? Word of mouth? Fame? Popularity? Why do you need those people linking to your site? Spare a few minutes (or maybe a few days) thinking about it - it's worth it. Without that answer you won't succeed.
Research
The next vital step is research. Before diving into link building, you should gather as much information on the market as you can. The key to success is information. But the value of info is inversely related to its age. So investigate the market, find fresh info on your topic, conduct some research or experiments of your own if needed, talk with specialists in your industry - it all counts. What's the point in writing an article if the story it tells your potential buyers is as old as your granny? Well, of course, there is always someone who hasn't heard it yet, but recent news will attract much more interest anyway, and that means many more inbound links too.

You should also probe the market - what is popular, what the alternatives are, how much it costs, what the bottlenecks are, what the ROI is and so on and so forth. Base your link building strategy on the current trends of the market in your niche.
Build content
As said above, quality content opens the door to many inbound links to your site. At this step you should already know who your potential visitors are, whom you target, what objectives you pursue and so on. You also have a handful of great, freshly squeezed information. Now it is time to dress all of this in tasty marketing copy. If you can do it on your own - that's great, do it! If you cannot write persuasive selling copy yourself - hire someone who can.

Use the on-page optimization section as a guide on content writing.
Acquire links
You can refer to the part above to learn what works in the current off-page optimization world and what doesn't. One of today's buzzwords is link baiting. The term describes the way the links are acquired. The technique does not mean asking webmasters to put a link to you; instead it encourages them to do it on their own. Just like a fisherman baits a fish onto the hook, you can bait a webmaster with your website. Why would a webmaster want to link to you? Because your site has some value. Because it provides services that others do not. Because it has unique content. Because it is funny. Because it's authoritative and trustworthy. There are plenty of possible reasons. But what they have in common is this: link bait is never easy money! You need to think it out well. The bait needs to be well researched. It must fill some gap in the market to attract high-value links.

Don't think of acquiring links as a part-time job. Treat it as something big, something that takes all of your effort and something that you will be proud of when it is done. Then it will amaze other people. Then it will attract their attention. Then webmasters will be happy to link to you.

Ok, let's suppose you have written good content, but how do you promote it? How do you make others link to it? Where do you start? The best place to start is other resources on your topic - forums, personal blogs, authority sites in your field etc. You could also research your competitors and find out who links to them and why. Try to suggest a better solution (read: your solution) to those sites. If you have a mass product - try offering it to respected bloggers for free. By the way, free stuff always helps to get a link or two. For instance, simply by reading this tutorial you have already learned the secret coupon code for a 40% discount on any product by CleverStat: CLEV-45S0-SFAQ (simply type it into the order form). Obviously, you'll want to tell your friends about this secret offer, or make a post in your blog: "hey, they hid a nice promo code in this SEO FAQ (by the way, it is good too)". Hope you get the idea. If you made a tool - ask others to test it for bugs; if you wrote an article - ask for a review, etc. People are often eager to help, so why not let them?
Keep working
Once again, as soon as you get some results from your link campaign (or even if you don't get any) - do not stop. Keep working. Repeat steps 1-5. Build content, acquire links and then build content again.

Learn new tricks, read industry news, interview interesting people (why not?), communicate with related sites in your field. SEO is dynamic. Every day something happens, and your website had better reflect the changes - otherwise your competitors' will.
4.3. Tracking the results
One of the most common mistakes a newbie webmaster makes is constantly tracking the SERPs. Yeah, I agree there is something magical, something hypnotic in it - watching your site slowly climb up the SERPs (oh, yeah!) or suddenly fall down (oh, no!) and then rise again. It is very captivating for sure, but an absolutely useless process. Believe me, you don't need to check how well your site is doing every 10 minutes. Moreover, you should not do that even once a day. I am sorry for such a rude comparison, but checking your website rankings is just like masturbation - when you're OK you simply don't need it.

Of course, you do want to know the positions of your website. But that info is not crucial; you'd better spend your time writing another page for your site, or contacting another blogger for links. Checking your SERP positions once a week is fine. And you do not want to check them through all 100 result pages either. Forget it. If your site sits below the first 30 results - nobody will ever open it. There are billions of websites all over the web and the number grows constantly; why would anyone spend valuable time digging into SERPs below the first page? Well, maybe if I were very eager to find something, I'd look two or three pages further, but that's all - afterwards I'd either open one of the top sites, or rephrase my query.

The other popular question here is whether one should use a tool to check search engine positions or do it manually. Both approaches have their strengths and weaknesses. Obviously, you cannot afford to track the results manually if you have a bulk load of keywords. Let's suppose you can check one keyword in 30 seconds: open Google, type the keyword, hit enter, quickly scan the SERPs, proceed to the next keyword. Then 100 keywords would take 50 minutes! Almost an hour of tedious, repetitive work. No, thank you. On the other hand, checking rankings with a tool also has its disadvantages: tools often lie, showing incorrect or inaccurate info, and they may lead to a ban of your IP address if the number of terms to check is significant. So, what to choose? That depends on your vision and the needs of your business. You may prefer rare but accurate manual checks if the list of target keywords is rather small, or choose a tool to save a couple of hours if the list is rather long.

The summary of this part is simple:
Check your web rankings rarely, once a week or so.
Check the first 30-50 results only.
If your website is not found within that depth - you're not performing well and need to double your efforts.
You can check your rankings by hand in a browser, or prefer a tool - that depends on your business needs.
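Whether a check is done by hand or by a tool, its core is the same: find where your domain first appears in the ordered list of results, down to some depth. A minimal sketch in Python (the result URLs and domain names are hypothetical; a real check would obtain the list from a search engine):

```python
from urllib.parse import urlparse

def serp_position(results, domain, depth=30):
    """Return the 1-based position of `domain` within the first
    `depth` result URLs, or None if it is not found that deep."""
    for pos, url in enumerate(results[:depth], start=1):
        host = urlparse(url).netloc.lower()
        if host == domain or host.endswith("." + domain):
            return pos
    return None

# Hypothetical SERP for some query
results = [
    "http://www.bigbrand.example/gadgets",
    "http://blog.example/orange-gadget-review",
    "http://www.mysite.example/orange-gadgets.html",
]
print(serp_position(results, "mysite.example"))  # 3
```

The `depth=30` default mirrors the advice above: anything deeper than the first 30 results is effectively invisible anyway.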
4.4. Help! I've lost my rankings!
First of all - don't panic! There are plenty of possible reasons for a loss of positions, and not all of them are the result of a Google penalty.
Your competitors are not sleeping
This is actually a very common situation. You think your site has lost rank, while in fact your competitors' sites have gained it. Usually the drop is not significant - a few positions or so - but depending on the intensity of your work in previous months it may be larger. Remember, SEO is not a sprint, it is a marathon.
Some of your inbound links were filtered out or devalued
This happens when you have many links and then Google devalues them. Your link strength greatly decreases and your rankings drop. If you read the off-page section carefully, you know that links don't all have the same value. Some are valued more than others, and the value of a link highly depends on the value of the linking website. So if the quality of that website falls for some reason - so does the value of its link to your site.

For instance, you may have had many links from reciprocal partners and all went well, but the partner sites kept placing more and more links on the same pages, and finally your link became simply one of many, receiving only a small portion of the link juice - that's devaluing. Links can also be devalued by Google itself. If it finds that a link is irrelevant or of bad quality, it reduces its initial value or filters it out completely.

Devaluing may result in a significant loss of rankings at once, or in a slow yet constant loss of positions over time. Gathering high-PR links from relevant websites reduces this risk to a minimum.
New ranking algorithm introduced
Very similar to the above. Each ranking factor, whether off-page or on-page, has some value assigned to it somewhere inside Google's algorithms. If Google introduces a new algorithm or makes some changes with the next update, all sites get evaluated according to the new algorithm and their rankings are updated accordingly. Some of them rise during the update while others fall.

The key point here is that Google does not want to put anybody's site down. It has no interest in that. The purpose of such an update is to increase the quality of search, so the best strategy to survive any Google algorithm update is to use high-quality links and whitehat methods of SEO. If you don't mess with doubtful SEO, if you keep a good link neighbourhood, and if you obtain only high-quality links from relevant sources - you will not lose your positions. You may even improve them, thanks to those unlucky guys who didn't read this tutorial carefully, lost their rankings and freed up some space in the SERPs for you.
On-site problems
We all make mistakes, so sometimes the issue is not Google but your own site. The most common case is when you buy a new domain name and transfer your website to it, but do not set up a redirect from the old one correctly. The links then point to non-existent pages and Google has no choice but to remove them from the index. Another common situation happens when your hosting provider changes some script execution rules and your dynamic pages suddenly stop working, returning 404, 503 or other server-side errors. If your rankings are going down - check your website first.
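For the domain-move case, the standard fix is a permanent (301) redirect on the old domain, so both visitors and search engines are sent to the matching new pages. A minimal Apache .htaccess sketch, assuming mod_rewrite is available (the domain name is hypothetical):

```apache
# .htaccess on the old domain - send every request to the new one
RewriteEngine On
RewriteRule ^(.*)$ http://www.newsite.example/$1 [R=301,L]
```

The 301 status tells Google the move is permanent, so the old pages' link value can be carried over to the new URLs.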
Penalty
What is a penalty? It is a kind of damping factor applied to your rankings. How does one know if a site is penalized? There is no sure way to tell. Your site may be penalized right now and you wouldn't know it. You should understand, though, that not every loss of rankings is a penalty! However, Google does penalize sites that use blackhat SEO techniques such as hidden links or keyword stuffing, or that have a bad link neighbourhood or irrelevant links. There are no artificial penalties made for the sole sake of dropping some sites down. Google is about search, and it cares about search, not about putting some site higher or lower in the rankings. So if your site has received a penalty, ask yourself: why does Google think my site is not relevant to that query? Is it something about my content? Is it those links I bought last month? There is always a reason. Here are some more tips from Google:
My site isn't doing well in search.
Site is banned
What is a ban? When your site gets excluded from the index of a search engine - that is a ban. How does one know if a site is banned? Simple. Type the site:mysite.com query into the Google search box, where mysite.com is the URL of your site. If you don't see any results - congratulations, your site is either banned or has never been crawled (true for brand-new sites). Why would Google ban a site? Mostly because that site violates the Google Webmaster Guidelines or uses blackhat SEO methods. How can one bring a site back into Google's index? Step one: remove all the blackhat stuff from your website. Step two: submit your site to Google again.

Reading the points above, you may have already noticed something they have in common. There is close to zero chance of having any problems with Google if you are a) using legitimate, whitehat SEO methods and b) constantly working on improving the site. Indeed, the source of most problems is low (well, maybe not exactly low, but at least not high enough) quality of the website content. Fill your site with unique, informative, must-read content and you won't need to ask others to link to you - that linking strategy works best of all and suffers no difficulties with Google updates.
5. SEO software
Obviously, we need to say a word or two about SEO software: whether it is safe to use, whether it is worth purchasing, and other matters. Since we have two types of SEO activity - on-page and off-page - all the software could be categorized the same way. However, I prefer another classification:
Keyword tools
This includes tools that perform keyword research, whether on the Web or by analyzing some website. Such a tool can gather or estimate the popularity of a keyword, some of the artificial keyword parameters (like keyword density), suggest synonyms and other keyword-related stuff.
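Keyword density, one of the artificial parameters mentioned above, is simply the share of the words on a page taken up by occurrences of the keyword phrase. A rough sketch of the calculation (the exact formula varies from tool to tool):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` occupied by occurrences of
    `keyword` (which may be a multi-word phrase)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = re.findall(r"[a-z0-9']+", keyword.lower())
    if not words or not kw:
        return 0.0
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return 100.0 * hits * len(kw) / len(words)

text = "Orange gadgets are great. Buy orange gadgets here."
print(round(keyword_density(text, "orange gadgets"), 1))  # 50.0
```

A figure that high would be blatant keyword stuffing on a real page, which is exactly why such numbers are only a diagnostic, not a target.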
Web ranking SEO software
These are simply tools that check the position of your site in a search engine. See the section above on web ranking checks.
Link tools
Software for building, managing, structuring, buying and selling links.
Submission tools
This bunch of different SEO tools tries to submit your website or some particular pages to various web resources, like web directories, catalogs, search engines, article sites and others.
All-in-one packages
A Swiss Army knife performing all the tasks: a big, fat SEO application with all (or nearly all) the needed (and sometimes unneeded) functions within.
Automation SEO software
By this category I don't mean automated software. Actually, all of the above kinds of tools work more or less automatically. However, there are some tasks that cannot be fully automated and thus still need human intervention. This includes managing your AdWords or AdSense accounts, finding link partners, tracking interesting blogs etc. Such SEO software simply helps you fulfil those tasks, still requiring you to do most of the work yourself.
Any of the above can be a software tool working on your computer, or an online tool working remotely through a browser. I won't dig deeply into this matter, nor will I review any particular names (some SEO software and web promotion tools can be found at our website, though). Instead, I would like to point out the main idea of this section.

There is no SEO tool or software that will make your life easy! You may read promo texts saying a tool will move you to the top of the SERPs - that's not true. It may say "guaranteed #1 position in Google" - a false promise. You may see testimonials from other people saying "I made $5000 in two weeks! It's amazing!" - it's all fake, or they are simply cheating themselves. No SEO software can guarantee anything. It is a tool, not magic. If you see ad texts like the examples above - know that this particular piece of software is not what it pretends to be. Double your suspicions if they ask you to pay first. And triple them if they do not provide any kind of trial for the tool.

The next question: should you pay for your convenience, or use only free SEO software instead? Actually, there are plenty of quality free SEO tools out there, starting with the truly amazing and really must-have Google Keyword Tool (though its amazingness is arguable). However, someday you may discover you need more power or more automation, and there is nothing wrong with looking at the paid tools as well. Simply keep in mind that a tool is just a tool, nothing more. It won't rank you better by itself. It can only help you make your site rank better. Don't trust anybody who asserts the opposite. Most likely that guy merely wants your money.

Another point we should cover here: is it safe to use SEO software? Can it provoke a penalty from Google? According to Google, only automated ranking checks are against its policy, while all other types of SEO software are OK, at least as long as they do not violate the Google Webmaster Guidelines. As for web ranking checks - we covered this topic earlier in this SEO tutorial.

Summarizing: you should not expect magic from any SEO software you use. Moreover, any software is simply a program that repeats what it was told to do and thus may be wrong in particular cases. You can run an SEO report within a tool and get some recommendations from it, but please keep in mind that software does not know your current situation; it merely knows about the competition, some keyword data and so on, and can easily overlook the more delicate matters that only a human can know about. In general, a hired SEO specialist is always better than the bunch of electronic impulses called SEO software. On the other hand, even SEO specialists still use tools to automate their work, so why can't you? Simply remember the main point of this part: there is no SEO tool or software that will make your life easy.

6. SEO resources that are worth visiting
Here is an unordered list of essential SEO websites, blogs and other resources. Each offers interesting info and/or tools that might aid your SEO efforts.
Google Webmaster Central
A must-have website. Simply register, confirm ownership of your website and see extended statistics on crawling, backlinks, keywords and much other info.
Webmaster Essentials
A must-read collection of crucial info about Google. Quotations from it are scattered throughout the tutorial you're reading now.
Google Analytics
Install it on your site to get the most comprehensive info on your visitors. Combined with server-side logs it becomes an endless source of crucial information to focus your future SEO efforts. It easily integrates with AdWords and AdSense accounts so you can track not only visitors but also ROI.
Matt Cutts blog
Hey! Isn't he a Google guy?! Yes, he is. Matt Cutts has been an engineer at Google since 2000, and he has a blog where he answers many popular questions about Google, reveals some inside information, busts some popular myths about Google and creates some others. A very interesting blog with tons of info to consider; however, some say that since Matt is affiliated with Google, we cannot trust him fully. Well, who knows, who knows... It definitely deserves to be read anyway.


