The basic steps of search engine optimization can and should be carried out by all webmasters as a matter of course. In less competitive markets doing this alone is often all that is required to achieve top search engine rankings.
However, it should be borne in mind that these basic SEO techniques alone won’t be sufficient if you are attempting to rank highly for very competitive search terms (keywords) like “SEO” or “internet marketing.”
To rank well in very competitive searches, detailed analysis of the search engine algorithms and competitor pages is required. There are more variables to consider, pages usually need to be tailored for specific SE’s, and it’s generally too complex for the SEO newbie to tackle successfully. It’s far wiser to target some of the billions of easier keywords.
That said, let’s look at the aspects of SEO that anyone can do:
Keyword Analysis: Identifying The Keywords Your Pages Will Target
You simply MUST get this bit right. Target the wrong words and everything you do from here on out is a complete waste of time.
The first step is to ascertain what keywords people interested in your topic are typing into the search engines. From the different keyword phrases that could apply to your page, you want to choose two or three to target: the main keyword phrase, and one or two closely related secondary keyword phrases.
In deciding which particular phrases to target, you want to compare the number of searches carried out for that keyword, with the number of competing pages listed in Google or Yahoo search results.
How To Do Keyword Research
If you have not yet created the page and want to use free tools, visit the Google AdWords Keyword Tool first. Type in a few two- or three-word phrases that you feel relate to your topic. Tick the box to include synonyms, and then click the “Get More Keywords” button.
By default, results are targeted to English, United States. If you want another region, say English, UK, click the “edit” link, make your selection and run the search again.
Two lists will be returned, one for the keywords you input, and one for synonyms that Google thinks are related to them, under “Additional keywords to consider.”
If you are happy with your lists, click on the “Search Volume” column to sort them with the most searched keywords first, then scroll to the bottom of each set of results and click on the links to download the keyword lists to your PC.
You may find your initial ideas are a bit off base and don’t return the kind of phrases you expected. If that’s the case, simply change some or all of your phrases and get more keyword suggestions.
At this point you may also want to take some of the keyword synonyms and feed them back into the keyword research tool for more ideas.
When you’ve finished, go through your lists, deleting irrelevant phrases and selecting the most appropriate keywords that have anywhere from a low to a medium-high search volume. We’ll call these your “root keywords.”
Google’s keyword generator doesn’t tell us exactly how many searches are performed for each keyword, so now we need to plug these root keywords into a tool that will.
These SEO tools all have free options; if you’re using a paid version you can skip the Google keyword tool and the checking of SERPs mentioned below. I’ve omitted the Yahoo / Overture keyword suggestion tool and others that use its data because, although it’s the keyword tool most commonly referred to, it can be very misleading due to the way it groups plurals and some synonyms. In short, it gives inaccurate results in many cases.
I can’t go into great detail on the next part, because it depends on which keyword analyzer you’re using, but basically you want to run a keyword search on each of your root keywords, which will give you a list of longer keyword phrases that incorporate your root words, together with the number of searches performed.
Comparatively speaking, the more words in a keyword phrase, the easier it will be to rank highly for it. Thus, “free internet marketing articles to download” will be vastly easier to rank for than “internet marketing” or even “internet marketing articles.”
However, there’s no point in having high search engine rankings for keywords that are seldom searched.
You can decide for yourself the minimum number of searches a keyword can have to be considered, and in reality it will also depend on your goals.
For instance, your plan might be to make lots of pages targeting very easy keywords with few searches (known as “long tail keywords”), looking at the overall amount of traffic you’ll get. Nevertheless, bear in mind that conversion rates on most sales pages are only in the order of 1-2 percent, meaning 100 visitors is only likely to result in a single sale, if that.
Your goal is to find keywords that offer the best compromise between high search volume and low competition.
How To Find Out How Much Competition A Keyword Has
This is a simple, if somewhat tedious process (a keyword research tool or service will automatically show keyword competition, making life much easier).
Take the keywords that look promising, put them between quotation marks and search for them on Google, noting the number of competing pages Google lists (where it says, “Results 1 – 10 of about ____”).
The reason to put your keyword phrases in quotation marks is because only those pages containing that exact phrase are directly competing with you, giving you an accurate benchmark.
Generally speaking, the newer your website (both because Google is initially skeptical of new websites, and because as a site ages its pages gain PageRank and develop a theme, providing additional leverage) and the less experienced you are at SEO, the lower the maximum number of competing pages a keyword phrase can have before you should consider it too difficult (for now at least).
Many years ago, Sumantra Roy came up with what he called the Keyword Effectiveness Index (KEI), which you can also use to help you choose the right keywords to target. The formula is KEI = P^2/C*1000. That is, the popularity of the keyword squared, divided by the number of competing pages, and multiplied by 1000 to give a nice number to work with. Keywords with a higher score have a better popularity to competitor relationship, and are therefore more worthwhile to target. If you decide to use KEI, the easiest way is to put all your keyword data into an Excel spreadsheet, and then add a column at the end to automatically perform the KEI calculation for you.
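If you prefer scripting to spreadsheets, the KEI calculation is a one-liner. Here’s a small Python sketch; the keywords and their search and competition figures are made-up numbers, purely for illustration:

```python
def kei(searches: int, competing_pages: int) -> float:
    """Keyword Effectiveness Index: KEI = P^2 / C * 1000,
    where P is search popularity and C is the number of competing pages."""
    if competing_pages == 0:
        return float("inf")  # no competition at all
    return searches ** 2 / competing_pages * 1000

# Hypothetical figures for demonstration only:
keywords = {
    "internet marketing": (450_000, 250_000_000),
    "free internet marketing articles": (1_200, 9_000),
}
for phrase, (searches, competitors) in keywords.items():
    print(f"{phrase}: KEI = {kei(searches, competitors):,.0f}")
```

A higher KEI means a better popularity-to-competition ratio; sort your candidate list by this column and work from the top down.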
I understand that might look like a lot of work, and to be fair, it is.
However, I’ve taken this from the standpoint of someone with absolutely no idea what keywords to target. If you already have a basic list of relevant keywords, and have developed a feel for keyword analysis, some of the above can be skipped, or at least gone into in less detail. The other thing of course is that like anything else, the more you do it, the more proficient you become and the less time it takes.
Optimize Your Pages (On Page SEO)
If you don’t want to go to the trouble of proper keyword research and simply want to do the bare minimum to improve the rankings of existing pages, you can start here (although I recommend you at least take the main keywords of the page and see if you could swap them for better ones. Try the Google keyword tool’s Site-Related Keywords setting).
Once you’ve decided on the keyword phrases for a page:
- Create a title using your main keyword. If one or both of your secondary phrases fit nicely and the title still reads well, include them too. Sometimes your main keyword will be part of one of your secondary keywords, making this easy. Don’t make your title overly long.
- Put your title text in the HTML TITLE tag at the top of the page code, right after the opening HEAD tag. The less clutter the search engine has to go through before finding the important stuff, the better.
<title>My Title Here</title>
- Write a description of the page content that would entice someone reading it to visit your page. Incorporate your keywords, and use your most important keyword phrase first, because the order gives an indicator of relevancy. Put this description into a meta description tag in your HTML code immediately after your TITLE tag.
<meta name="description" content="Learn how main keyword phrase can help you and what keyword phrase2 is really all about" />
- Put your keyword phrases into a meta keywords tag immediately after your meta description tag. Your most important keyword phrase should be first, followed by the second most important and so on.
<meta name="keywords" content="main keyword phrase, keyword phrase2, keyword phrase3" />
I often separate keywords with spaces instead of commas (except on blogs), ensuring search engines find exact matches to more search phrases (Google ignores the commas, and gives little weight to the meta keywords anyway). For example, if your meta keywords tag contains “best SEO, ranking advice” many SE’s won’t match for “SEO ranking.” Bear in mind though that this means a few of the smaller — and consequently, less important — search engines will see your keywords as one big phrase.
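To see why the comma placement matters, here is a toy matcher in Python. It is purely an illustration of the exact-match idea, not how any real search engine works: a phrase is only matched if it appears as a contiguous run of words that doesn’t cross a comma boundary.

```python
def exact_phrase_match(meta_keywords: str, query: str) -> bool:
    """Toy model of exact phrase matching: the query must appear as a
    contiguous word sequence within a single comma-separated segment."""
    # Commas split the keywords into segments; phrases can't match across one.
    segments = [seg.split() for seg in meta_keywords.lower().split(",")]
    q = query.lower().split()
    for words in segments:
        for i in range(len(words) - len(q) + 1):
            if words[i:i + len(q)] == q:
                return True
    return False

# With a comma, "SEO ranking" spans the boundary and doesn't match:
print(exact_phrase_match("best SEO, ranking advice", "SEO ranking"))  # False
# Without the comma, the same words form one phrase and do match:
print(exact_phrase_match("best SEO ranking advice", "SEO ranking"))   # True
```

Dropping the commas lets the same word list satisfy more overlapping phrases, which is exactly the effect described above.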
Avoid repeating any phrase more than two or three times in either the title, meta description or meta keywords tags. Never stuff any of them with lots of keywords or use irrelevant keywords (this is what’s known as “keyword stuffing”).
The fewer the words in your title, meta keywords and meta description tags, the more “relevancy points” each of them gets. For example, take 100% as the maximum relevancy of the title tag to the page: 100% divided by 20 words gives 5% relevancy for each word, while 100% split between just 4 words gives 25%. Whilst this generally isn’t a major issue, it should be borne in mind that the more words you add, the more the importance of each is diluted.
- Put your title text in a H1 or H2 heading at the top of your page. Try and make this the first text on the page whenever possible (perhaps by making any preceding text into images).
<h1>My Title Here</h1>
<p>My first paragraph of text</p>
Tip: Use CSS to style your heading tags so they aren’t huge and suit your page design.
- Use your keyword phrases in the first one or two sentences right after the H1 title.
Also use your keyword phrases naturally and SPARINGLY throughout the content, together with other synonyms. Don’t try to force keywords in where they don’t fit. Take the length of the text as your guide to if, and how often, they should be repeated. You can use one of the free Keyword Density Analyzers for this, or WebCEO’s Density Analysis Report.
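If you’d rather not use a third-party analyzer, a rough density check is easy to script. This sketch counts whole-word occurrences of a phrase as a percentage of the page’s words; it’s a simplified model, and real analyzers differ in how they tokenize text:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return the percentage of the text's words taken up by
    whole-word occurrences of the phrase (a rough approximation)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words or not target:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    return 100.0 * hits * len(target) / len(words)

print(f"{keyword_density('seo tips and more seo tips', 'seo tips'):.1f}%")
```

There’s no magic target number; use the result only to catch obvious over-repetition relative to the rest of your text.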
If it sounds contrived when you read it, you’ve probably overdone it. Better to add more synonyms and other phrases common to the theme (other terms you might expect to find within the topic, which aren’t synonyms of, nor directly related to, your keywords). I suggest you ignore anything you might hear about LSI (Latent Semantic Indexing). It’s far too complex and based on such a massively large data set that it’s a waste of time trying to manipulate the search engines on this score, and far easier just to write quality focused content.
- Use your keyword phrases again at the very end of the page if possible. I mean the last sentence or two of text on the page, before the closing BODY tag, not the end of the article.
If possible, make use of your secondary keyword phrases in H2 or H3 subheadings within your article or content.
An image somewhere near the top of the page with a file name of “main-keyword-phrase-something.gif” and an ALT attribute of “main keyword phrase something” also helps relevancy.
Save the page as “my-main-keyword-phrase.html” or “my-page-title.html”. Use hyphens, not underscores, as word separators. Google reads a hyphen as a space, but an underscore as a character.
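The renaming rule above is easy to automate. This hypothetical helper (the function name is mine, not from any library) turns a page title into a hyphenated, search-friendly file name:

```python
import re

def filename_slug(title: str, extension: str = "html") -> str:
    """Turn a page title into a hyphen-separated file name.
    Hyphens are used because Google treats a hyphen as a space,
    but an underscore as an ordinary character."""
    slug = title.lower()
    # Collapse every run of non-alphanumeric characters into one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    return f"{slug}.{extension}"

print(filename_slug("My Main Keyword Phrase"))  # my-main-keyword-phrase.html
```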
Internal links to the page (links from other pages on your website) should use its main keyword in the anchor text (the part you click).
<a href="my-page-title.html">My Page Title</a>
- Keep related pages in a single directory (web folder) named after the common theme. Usually this will be a keyword applicable to them all.
Each directory should have an index page listing all the pages within it, as per point 11 above. Every page in the directory should link back to this index page.
Your website should consist of a main index page / homepage containing links to the index page of each directory. Ideally, keep to one level of subdirectories, e.g., mysite.com/directory/page.html. Don’t go beyond two levels deep. Although search engines will crawl deeper if given enough incentive, they aren’t overly enthusiastic about it, so you’d just be creating unnecessary difficulties for yourself.
Make a sitemap and link to it from your home page. This will further help Google and the other main search engines find all your pages and monitor updates.
There are many ways you can do this, so your best bet is probably to look on Google for the solution that fits your needs. My suggestion is to go for something that updates automatically, or use one of the free online builders or scripts.
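If you’d rather script it yourself, a bare-bones sitemap in the sitemaps.org XML format takes only a few lines. The URLs below are placeholders; substitute your own pages:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Produce a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        # Escape &, <, > so the XML stays valid whatever the URL contains.
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/seo/seo-tips.html",
]))
```

Save the output as sitemap.xml in your site root; the automatic builders mentioned above generate the same format, just with extra fields like last-modified dates.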
SEO Web Design
Web design is also important to the search engines. Not how the page looks, but what the code is like underneath. Messy, overly complicated, or plain bad code gives the search engine spiders a hard time crawling your pages.
If the spiders (also known as crawlers or bots) can’t crawl your pages properly and retrieve all the data they need, the search engines can’t rank them properly.
Crawlers have very basic text browsing abilities. It’s important to understand that they don’t see your website in the same way as IE or Firefox does. To view your page as a bot sees it, use a text browser like Lynx (or use the SE view report in WebCEO).
Web Design With Search Engines In Mind
- Make sure your HTML code is valid and free from errors. Use the syntax checker in your web page editor, or the free one at W3C. Broken code makes it hard for the spiders to read your page, and can result in information being missed, or the page being skipped altogether if it’s really bad. Take this simple scenario: you miss the closing bracket off a paragraph tag, so your code reads “<pMy keyword is here.” The search engine might ignore your keyword because it thinks it’s part of the tag.
Have a valid Document Type declaration at the top of your page. The DOCTYPE tells the search engine spider what kind of code it can expect to find in your page. Without the Doctype the crawler is forced to guess. Most of the time it will guess right, but do you really want to leave something this important to chance?
Also, if the code has errors there’s a greater chance of confusion these days, because web pages now come in 2 different varieties. Whilst the majority of the web is still in HTML, most new sites are written using XHTML. The Doctype declaration has to be the very first thing on the page.
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
Most scripts that need to be put in the BODY section will work fine out of the way at the very bottom.
- Try to avoid using images as links for internal linking between pages on your site. Use regular text links with keyword anchor text the search engines can read. If you must use images as links, ensure you put the keyword phrase of the page you are linking to in the ALT attribute of the image tag.
If you want fancy button type links, create them using CSS and text links.
- Check for and fix or delete broken links in your pages. Your HTML editor might have a feature to do this, or you can check pages with the free W3C link checker (WebCEO does this plus checks for syntax and other problems). Dead links not only give the spiders a hard time, they indicate to the search engines that the site is not well maintained or up to date, negatively affecting your search engine position.
Make sure that when a page that doesn’t exist is requested, a proper 404 error is returned. A simple check for some kinds of automated spam sites is to request a few made up, nonsensical page names like “jko548fvn2se.html” and see if errors are returned or not. Also, redirecting errors to your homepage and inadvertently sending out 301 or 302 header codes instead of 404 effectively tells the search engine that all those pages are not missing at all, but have the same duplicate content.
Although these days dynamic pages are indexed by the major search engines, spiders still have problems with URLs containing too many parameters. You also want to avoid feeding session IDs to bots, and any other parameters that will end up creating different URLs for the same page. Not only will that lead to problems with duplicate content, but it can make your website seem like a bottomless pit to a bot, which might crawl the same pages at different URLs over and over, but miss half of your site altogether.
Use a robots.txt file at the root of your website to block crawlers from accessing pages that will result in very similar or duplicate content, or which have no valuable (topical) content. Also use robots.txt to prevent search engines from indexing anything you don’t want public. Errors in this file can prevent spiders from crawling your site, so check it carefully.
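For example, a minimal robots.txt might look like this. The directory names are placeholders; substitute the areas of your own site you want kept out of the index:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /print/
Disallow: /private/
```

An empty Disallow line (or no robots.txt at all) allows everything, so only list what you genuinely want excluded, and re-test after every edit.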
Frames aren’t as much of an issue as they used to be, but I’d still avoid using them unless I didn’t care about search engine positioning. The problem is that the content of the page you want to rank isn’t actually on that page, it’s on another page altogether. Whether or not the search engine associates one with the other can be a hit or miss affair. Google advises against using frames.
Off-Page SEO

Off-Page SEO refers to search engine optimization techniques that aren’t carried out on your own website or page, but on other sites.
These days, only optimizing the content of your pages isn’t enough to get them to rank highly. To do that you need help from other websites in the form of incoming links, known as backlinks (links back to your site, “back links”). In fact, you’ll find that Google won’t bother listing your site if it has no backlinks at all.
In essence, off-page SEO is all about getting quality backlinks relevant to your topic that assist the search engines in establishing the value of your page and what it focuses on. You can view each backlink as a vote of approval for your page. The more inbound links a page has, the greater its link popularity. Google’s PageRank measures this, but in a complex way that takes many factors into consideration.
Notes On Linking Strategies & Increasing PageRank / Link Popularity
- A single inbound link from a high-quality site is worth tens of links from different low-quality sites.
Linking out to low value, spammy sites, or those engaged in SEO practices the search engines frown upon can negatively affect your own website’s rankings.
One-way links to your website are of far greater value than reciprocal links obtained from engaging in link exchanges.
Backlinks from pages covering the same or related topics are far more valuable than those from totally unrelated sites. Have links pointing to the most relevant page on your site, not simply the homepage (known as “deep linking”).
Each link to a page should have one of that page’s keywords in the anchor text. Employ numerous variations on this text if you intend to create a high number of backlinks to a page. This leads to better results and looks more natural to the search engines, avoiding throwing up a red flag for possibly attempting to manipulate the listings.
Avoid participating in organized link exchanges or link farms. Most of the time this will end up harming rather than helping your rankings. This is because the search engines see it as manipulation of the SERPs (Search Engine Results Pages) and penalize linked websites once they discover the network. Read Google’s view on this, and note that even “Excessive reciprocal links or excessive link exchanging” is considered to violate Google’s Webmaster Guidelines!
Don’t build backlinks too fast. Hundreds of backlinks appearing for a site in a matter of days send a clear signal to the search engines that you’re probably doing something you shouldn’t be (in their eyes), simply because it appears so unnatural. Grow your links steadily over time.
Links from high PR sites are good, but don’t obsess about getting them. You’ll often get as much value from a highly targeted low PR link as from an untargeted high PR one. The search engines aren’t the only reason to have links, and good links bring traffic themselves. Having said that, if you’re on a link building campaign, unless a site looks particularly good, I wouldn’t bother targeting it if it has no PR, i.e., is PR0.
Good backlinks with targeted keyword anchor text carry a LOT of weight these days. So much so that if done well, it’s possible to rank a page highly for terms that aren’t even on it.
Backlinks are also the way to get your site found and crawled by the main search engines. I wouldn’t bother submitting to the main SE’s, it’s generally better and faster to get links from websites that Google or Yahoo already values, and which are regularly crawled as a consequence. Let them “discover” your pages themselves, by putting links where you know they’ll be found and followed.
How To Get Backlinks?
There are lots of ways to get backlinks, although few are quick or easy (tools like SEOelite, WebCEO or Link Assistant help speed this up). Here are some options:
- Create content that makes people link to it (often termed “link bait”)
- List your website in directories
- Write and syndicate articles
- Get your page mentioned on bookmark sites, Digg.com, etc
- Create content on Hubpages that links to your page
- Post comments to topically related blogs
- Trackback to topically related blogs
- Syndicate your blog feed to announcement and aggregator sites
- Use Tags for links from Technorati.com
- Make forum posts that include your link
- Exchange links with other webmasters
- Buy links
Congratulations! Now you know how to do search engine optimization yourself. Of course, I’d be lying if I didn’t admit there ARE easier ways to do SEO.
SEO Elite might be the more glamorous, but I think WebCEO is the better choice for the average webmaster (and I say that even though I get a bigger commission for recommending SEO Elite).
Quite simply, WebCEO has more useful features, covering more bases. Plus, its analysis of your pages and reports on over 130 parameters that affect your rankings help develop a deeper understanding of SEO as you use it, and that’s in addition to the valuable free SEO course and certification that’s included with WebCEO.
Wonderful as these tools are though, you can still do good SEO without them. If you simply follow the steps above your pages WILL start to get high rankings and quality traffic from the search engines.
As you get more pages ranked, the search engines will value your site more, and it will become easier to get top positioning for more difficult keyword phrases. Naturally, you’ll also get better and better at SEO too!