Installing PHP on Windows 7

PHP is one of the most common web languages out there, and Windows is the most used operating system in the world… So I’ll take a wild guess and say many PHP developers use Windows, and thus have to install PHP on Windows to do their work or test their websites before releasing them online.

Many developers use bundles like XAMPP or WAMP to install an Apache server, a MySQL server and PHP. I personally use Microsoft’s ASP.NET for web development most of the time, but sometimes I do things in PHP – which is still an awesome scripting language – so I tried to cut back on the resource consumption and get PHP to work on the Windows IIS server.

That was a bit difficult on Windows XP; there were too many things you had to do to get it working. But on Windows 7, getting PHP to work on IIS will take a couple of minutes if you have a good internet connection. So, on to the steps:

  1. Download and install Microsoft Web Platform Installer.
  2. Start the web platform installer.
  3. While waiting for it to start, click the “Start” button and type “Turn Windows features on or off” – without the quotes, of course.
  4. After the window opens, go to “Internet Information Services ==> World Wide Web Services ==> Application Development Features”, make sure CGI is checked, then click the OK button.
  5. Go back to the Web Platform Installer, and type PHP in the search box and hit the enter button.
  6. When the search results show up, click the “Add” button next to “PHP Manager for IIS”, and press the Install button.
  7. Follow the simple dialogs until you’re done, and voilà, you have PHP running on IIS now.
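Once the installer finishes, a quick way to confirm that IIS is actually executing PHP is to drop a phpinfo() probe page into the web root and load it in a browser. Here's a small sketch in Python (the web-root path shown is the Windows 7 IIS default; adjust it if yours differs):

```python
from pathlib import Path

def write_php_probe(webroot: str) -> Path:
    """Write a phpinfo() probe page into the given web root."""
    probe = Path(webroot) / "info.php"
    probe.write_text("<?php phpinfo(); ?>")
    return probe

# Default IIS web root on Windows 7:
# write_php_probe(r"C:\inetpub\wwwroot")
# Then browse to http://localhost/info.php -- a PHP configuration
# page means everything is wired up; delete the probe afterwards.
```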

Let me know how it works out for you!


Writing an Essay in a Programming Language

This is how a programmer will write an essay using their favorite programming language:
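For instance, a tongue-in-cheek Python version might look like this (an illustration of the joke, not the original snippet):

```python
# essay.py -- the five-paragraph essay, refactored
class Essay:
    def __init__(self, topic):
        self.topic = topic
        self.paragraphs = []

    def introduction(self):
        self.paragraphs.append(f"In this essay I will discuss {self.topic}.")

    def body(self, *points):
        for point in points:
            self.paragraphs.append(f"Furthermore, consider {point}.")

    def conclusion(self):
        self.paragraphs.append(f"In conclusion, {self.topic} is important.")

    def render(self) -> str:
        return "\n\n".join(self.paragraphs)

essay = Essay("deadline-driven development")
essay.introduction()
essay.body("coffee", "panic")
essay.conclusion()
print(essay.render())
```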


How To Do Search Engine Optimization (SEO) Yourself

Basic SEO

The basic steps of search engine optimization can and should be carried out by all webmasters as a matter of course. In less competitive markets doing this alone is often all that is required to achieve top search engine rankings.

However, it should be borne in mind that these basic SEO techniques alone won’t be sufficient if you are attempting to rank highly for very competitive search terms (keywords) like “SEO” or “internet marketing.”

To rank well in very competitive searches, detailed analysis of the search engine algorithms and competitor pages is required. There are more variables to consider, pages usually need to be tailored for specific SE’s, and it’s generally too complex for the SEO newbie to tackle successfully. It’s far wiser to target some of the billions of easier keywords.

That said, let’s look at the aspects of SEO that anyone can do:

Keyword Analysis: Identifying The Keywords Your Pages Will Target

You simply MUST get this bit right. Target the wrong words and everything you do from here on out is a complete waste of time.

The first step is to ascertain what keywords people interested in your topic are typing into the search engines. From the different keyword phrases that could apply to your page, you want to choose two or three to target: the main keyword phrase, and one or two closely related secondary keyword phrases.

In deciding which particular phrases to target, you want to compare the number of searches carried out for a keyword with the number of competing pages listed in Google or Yahoo search results.

How To Do Keyword Research

If you have not yet created the page and want to use free tools, visit Google’s free keyword tool first. Type in a few 2 or 3 word phrases that you feel relate to your topic. Tick the box to include synonyms, and then click the “Get More Keywords” button.

By default, results are targeted to English, United States. If you want another region, say English, UK, click the “edit” link, make your selection and run the search again.

Two lists will be returned, one for the keywords you input, and one for synonyms that Google thinks are related to them, under “Additional keywords to consider.”

If you are happy with your lists, click on the “Search Volume” column to sort them into most searched keywords first, and then scroll to the bottom of each set of results and click on the links to download the keyword lists to your PC.

You may find your initial ideas are a bit off base and don’t return the kind of phrases you expected. If that’s the case, simply change some or all of your phrases and get more keyword suggestions.

At this point you may also want to take some of the keyword synonyms and feed them back into the keyword research tool for more ideas.

When you’ve finished, go through your lists deleting irrelevant phrases and selecting the keywords you think are most appropriate that have anywhere between a low and medium to high search volume. We’ll call these your “root keywords.”

Google’s keyword generator doesn’t tell us exactly how many searches are performed for each keyword, so now we need to plug these root keywords into a tool that will.

These SEO tools all have free options; if you’re using a paid version, you can skip the Google keyword tool and the checking of SERPs mentioned below. I’ve omitted the Yahoo / Overture keyword suggestion tool and the others that use its data because, although it’s the keyword tool most commonly referred to, it can be very misleading due to the way it groups plurals
and some synonyms — in short, it gives inaccurate results in many cases.

I can’t go into great detail on the next part, because it depends on which keyword analyzer you’re using, but basically you want to run a keyword search on each of your root keywords, which will give you a list of longer keyword phrases that incorporate your root words, together with the number of searches performed.

Comparatively speaking, the more words in a keyword phrase, the easier it will be to rank highly for it. Thus, “free internet marketing articles to download” will be vastly easier to rank for than “internet marketing” or even “internet marketing articles.”

However, there’s no point in having high search engine rankings for keywords that are seldom searched.

You can decide for yourself the minimum number of searches a keyword can have to be considered, and in reality it will also depend on your goals.
For instance, your plan might be to make lots of pages targeting very easy keywords with few searches (known as “long tail keywords”), looking at the overall amount of traffic you’ll get. Nevertheless, bear in mind that conversion rates on most sales pages are only in the order of 1-2 percent, meaning 100 visitors is only likely to result in a single sale, if that.

Your goal is to find keywords that offer the best compromise between high search volume and low competition.




How to find how much competition there is for a keyword?

This is a simple, if somewhat tedious, process (a keyword research tool or service will automatically show keyword competition, making life much easier).


Take the keywords that look promising, put them between quotation marks and search for them on Google, noting the number of competing pages Google lists (where it says, “Results 1 – 10 of about ____”).


The reason to put your keyword phrases in quotation marks is that only those pages containing that exact phrase are directly competing with you, giving you an accurate benchmark.
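Building the quoted query programmatically is straightforward; here's a tiny helper (the URL is the standard Google search endpoint, and the phrase is just an example):

```python
from urllib.parse import quote_plus

def exact_match_query(phrase: str) -> str:
    """Google search URL for an exact-phrase (quoted) competition check."""
    return "http://www.google.com/search?q=" + quote_plus(f'"{phrase}"')

print(exact_match_query("internet marketing articles"))
# http://www.google.com/search?q=%22internet+marketing+articles%22
```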


Generally speaking, the newer your website is and the less experienced you are at SEO, the lower the maximum number of competing pages a keyword phrase can have before you should consider it too difficult (for now at least). Google is initially skeptical of new websites, and as a site ages its pages start to gain PageRank and reflect a theme, providing additional leverage.


The Search Guild search term difficulty checker can help you get an idea of where you stand. I suggest you put in a really high and really low competition phrase that you have looked at on Google to see how they compare, and then the phrase you are considering targeting.

Many years ago, Sumantra Roy came up with what he called the Keyword Effectiveness Index (KEI), which you can also use to help you choose the right keywords to target. The formula is KEI = P^2 / C * 1000. That is, the popularity of the keyword squared, divided by the number of competing pages, and multiplied by 1000 to give a nice number to work with. Keywords with a higher score have a better popularity-to-competition relationship, and are therefore more worthwhile to target. If you decide to use KEI, the easiest way is to put all your keyword data into an Excel spreadsheet, and then add a column at the end to perform the KEI calculation for you automatically.
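As a sketch, KEI is easy to compute yourself without a spreadsheet; the search and competition numbers below are invented purely for illustration:

```python
def kei(popularity: float, competing_pages: float) -> float:
    """Keyword Effectiveness Index: popularity squared over competition, x1000."""
    if competing_pages == 0:
        return float("inf")  # no competition at all
    return popularity ** 2 / competing_pages * 1000

# Hypothetical (phrase, monthly searches, competing pages) triples:
candidates = [
    ("internet marketing", 45000, 38_000_000),
    ("free internet marketing articles", 900, 120_000),
]
for phrase, searches, pages in candidates:
    print(f"{phrase}: KEI = {kei(searches, pages):,.1f}")
```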


I understand that might look like a lot of work, and to be fair, it is.
However, I’ve taken this from the standpoint of someone with absolutely no idea what keywords to target. If you already have a basic list of relevant keywords, and have developed a feel for keyword analysis, some of the above can be skipped, or at least gone into in less detail. The other thing of course is that like anything else, the more you do it, the more proficient you become and the less time it takes.


Optimize Your Pages (On Page SEO)


If you don’t want to go to the trouble of proper keyword research and simply want to do the bare minimum to improve the rankings of existing pages, you can start here (although I recommend you at least take the main keywords of the page and see if you could swap them for better ones. Try the Google keyword tool’s Site-Related Keywords setting).


Once you’ve decided on the keyword phrases for a page:


1. Create a title using your main keyword. If they fit nicely and the title still reads well also include one or both of your secondary phrases. Sometimes your main keyword will be part of one of your secondary keywords, making this easy. Don’t make your title really long.


2. Put your title text in the HTML TITLE tag at the top of the page code, right after the opening HEAD tag. The less clutter the search engine has to go through before finding the important stuff, the better.


For example:



<title>My Title Here</title>


3. Write a description of the page content that would entice someone reading it to visit your page. Incorporate your keywords, and use your most important keyword phrase first, because the order gives an indicator of relevancy. Put this description into a meta description tag in your HTML code immediately after your TITLE tag.




<meta name="description" content="Learn how main keyword phrase can help you and what keyword phrase2 is really all about" />


4. Put your keyword phrases into a meta keywords tag immediately after your meta description tag. Your most important keyword phrase should be first, followed by the second most important and so on.




<meta name="keywords" content="main keyword phrase, keyword phrase2, keyword phrase3" />


I often separate keywords with spaces instead of commas (except on blogs), ensuring search engines find exact matches to more search phrases (Google ignores the commas, and gives little weight to the meta keywords anyway). For example, if your meta keywords tag contains “best SEO, ranking advice” many SE’s won’t match for “SEO ranking.” Bear in mind though that this means a few of the smaller — and consequently, less important — search engines will see your keywords as one big phrase.


Avoid repeating any phrase more than two or three times in either the title, meta description or meta keywords tags. Never stuff any of them with lots of keywords or use irrelevant keywords (this is what’s known as “keyword stuffing”).


The fewer the words in your title, meta keywords and meta description tags, the more “relevancy points” each of them will get. E.g., take 100% as the maximum relevancy of the title tag to the page: 100% divided between 20 words gives 5% relevancy for each word, while 100% between just 4 words gives 25% relevancy. Whilst this generally isn’t a major issue, it should be borne in mind that the more words you add, the more the importance of each is diluted.
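Steps 1 to 4 can be sketched as a small helper that assembles the tags in the recommended order (the function name and sample values are my own, for illustration):

```python
def head_tags(title: str, description: str, keywords: list[str]) -> str:
    """Build the title, meta description and meta keywords tags, in that order."""
    return "\n".join([
        f"<title>{title}</title>",
        f'<meta name="description" content="{description}" />',
        f'<meta name="keywords" content="{", ".join(keywords)}" />',
    ])

print(head_tags(
    "Main Keyword Phrase Guide",
    "Learn how main keyword phrase can help you",
    ["main keyword phrase", "keyword phrase2"],
))
```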


5. Put your title text in a H1 or H2 heading at the top of your page. Try and make this the first text on the page whenever possible (perhaps by making any preceding text into images).




<h1>My Title Here</h1>

<p>My first paragraph of text</p>


Tip: Use CSS to style your heading tags so they aren’t huge and suit your page design.


6. Use your keyword phrases in the first one or two sentences right after the H1 title.


7. Also use your keyword phrases naturally and SPARINGLY throughout the content, together with other synonyms. Don’t try and force keywords in where they don’t fit. Let the length of the text guide whether, and how often, they should be repeated. You can use one of the free keyword density analyzers for this, or WebCEO’s Density Analysis Report.


If it sounds contrived when you read it, you’ve probably overdone it. Better to add more synonyms and other phrases common to the theme (other terms you might expect to find within the topic, which aren’t synonyms of nor directly related to your keywords). I suggest you ignore anything you might hear about LSI (Latent Semantic Indexing) — it’s far too complex and based on such a massively large data set that it’s a waste of time trying to manipulate the search engines on this score, and far easier just to write quality, focused content.
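A rough density check in the spirit of those analyzers can be sketched like this (a naive word-window count of my own, not any particular tool's algorithm):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percent of the page's words taken up by occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words or not target:
        return 0.0
    # Slide a window of len(target) words over the text, counting exact matches.
    hits = sum(
        words[i:i + len(target)] == target
        for i in range(len(words) - len(target) + 1)
    )
    return 100 * hits * len(target) / len(words)

print(round(keyword_density(
    "Dog food reviews: which dog food suits your dog?", "dog food"), 1))  # 44.4
```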


8. Use your keyword phrases again at the very end of the page if possible. I mean the last sentence or two of text on the page, before the closing BODY tag, not the end of the article.


9. If possible, make use of your secondary keyword phrases in H2 or H3 subheadings within your article or content.


10. An image somewhere near the top of the page with a file name of “main-keyword-phrase-something.gif” and an ALT attribute of “main keyword phrase something” also helps relevancy.


11. Save the page as “my-main-keyword-phrase.html” or “my-page-title.html”. Use hyphens, not underscores, as word separators. Google reads a hyphen as a space, but an underscore as a character.


12. Internal links to the page (links from other pages on your website) should use its main keyword in the anchor text (the part you click).




<a href="my-page-title.html">My Page Title</a>


13. Keep related pages in a single directory (web folder) named after the common theme. Usually this will be a keyword applicable to them all.


14. Each directory should have an index page listing all the pages within it, as per point 12 above. Every page in the directory should link back to this index page.


15. Your website should consist of a main index page / homepage, containing links to the index page of each directory. Ideally keep to 1 level of subdirectories, e.g., mysite.com/directory/page.html. Don’t go beyond 2 levels deep. Search engines aren’t overly enthusiastic about crawling down further than that (although they will if given enough incentive), so you’d just be creating unnecessary difficulties for yourself.


16. Make a sitemap and link to it from your home page. This will further help Google and the other main search engines find all your pages and monitor updates.

There are many ways you can do this, so your best bet is probably to look on Google for the solution that fits your needs. My suggestion is to go for something that updates automatically, or use one of the free online builders or scripts.
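The hyphenated, keyword-based file-naming advice above is easy to automate with a small slug helper (my own illustration, not a standard library function):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a hyphen-separated HTML file name."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"{slug}.html"

print(slugify("My Main Keyword Phrase!"))  # my-main-keyword-phrase.html
```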


SEO Web Design

Web design is also important to the search engines. Not how the page looks, but what the code is like underneath. Messy, overly complicated, or plain bad code gives the search engine spiders a hard time crawling your pages.


If the spiders (also known as crawlers or bots) can’t crawl your pages properly and retrieve all the data they need, the search engines can’t rank them properly.


Crawlers have very basic text browsing abilities. It’s important to understand that they don’t see your website in the same way as IE or Firefox does. To view your page as a bot sees it, use a text browser like Lynx (or use the SE view report in WebCEO).


Web Design With Search Engines In Mind


1. Make sure your HTML code is valid and free from errors. Use the syntax checker in your web page editor, or the free one at W3C. Broken code makes it hard for the spiders to read your page, and can result in information being missed, or the page being skipped altogether if it’s really bad. Take this simple scenario; you miss the closing bracket off a paragraph tag, so your code reads “<pMy keyword is here.” The search engine might ignore your keyword because it thinks it’s part of the tag.


2. Have a valid Document Type declaration at the top of your page. The DOCTYPE tells the search engine spider what kind of code it can expect to find in your page. Without the Doctype the crawler is forced to guess. Most of the time it will guess right, but do you really want to leave something this important to chance?


Also if the code has errors there’s a greater chance of confusion these days, because web pages now come in 2 different varieties. Whilst the majority of the web is still in HTML, most new sites are written using XHTML. The Doctype declaration has to be the very first thing on the page.




<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">


<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">



3. Avoid unnecessary Java or Javascript, especially near the top of the page. That’s not to say you shouldn’t use it, just realise large amounts can be a hurdle to search engine crawlers. For example, if you have half a page of Javascript that needs to go into the HEAD section of your page, put the code into another file saved with a “.js” extension and reference it from your page like this instead:


<script language="JavaScript" src="/pathtomy/javascript.js" type="text/javascript"></script>



Most scripts that need to be put in the BODY section will work fine out of the way at the very bottom.


4. Try to avoid using images as links for internal linking between pages on your site. Use regular text links with keyword anchor text the search engines can read. If you must use images as links, ensure you put the keyword phrase of the page you are linking to in the ALT attribute of the image tag.


Don’t use JavaScript links. Spiders can’t follow them and there’s nowhere to put your target keywords. If you really MUST use these kinds of links, repeat the link elsewhere on the page in plain text.


If you want fancy button type links, create them using CSS and text links.


5. Check for and fix or delete broken links in your pages. Your HTML editor might have a feature to do this, or you can check pages with the free W3C link checker (WebCEO does this plus checks for syntax and other problems). Dead links not only give the spiders a hard time, they indicate to the search engines that the site is not well maintained or up to date, negatively affecting your search engine position.


6. Make sure that when a page that doesn’t exist is requested, a proper 404 error is returned. A simple check for some kinds of automated spam sites is to request a few made up, nonsensical page names like “jko548fvn2se.html” and see if errors are returned or not. Also, redirecting errors to your homepage and inadvertently sending out 301 or 302 header codes instead of 404 effectively tells the search engine that all those pages are not missing at all, but have the same duplicate content.
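One way to run that check from a script; `status_of` and `soft_404` are illustrative helpers of my own, not a standard API, and the URL below is a placeholder:

```python
from urllib.error import HTTPError
from urllib.request import urlopen

def status_of(url: str) -> int:
    """HTTP status code for a URL (urlopen raises on 4xx/5xx)."""
    try:
        return urlopen(url).getcode()
    except HTTPError as err:
        return err.code

def soft_404(status: int) -> bool:
    """True if a missing page reports anything other than 404/410."""
    return status not in (404, 410)

# soft_404(status_of("http://www.example.com/jko548fvn2se.html"))
# should be False on a correctly configured server.
```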


7. Although these days dynamic pages are indexed by the major search engines, spiders still have problems with URLs containing too many parameters. You also want to avoid feeding session IDs to bots, and any other parameters that will end up creating different URLs for the same page. Not only will that lead to problems with duplicate content, but it can make your website seem like a bottomless pit to a bot, which might crawl the same pages at different URLs over and over, but miss half of your site altogether.


8. Use a robots.txt file at the root of your website to block crawlers from accessing pages that will result in very similar or duplicate content, or which have no valuable (topical) content. Also use robots.txt to prevent search engines indexing anything you don’t want public. Errors in this file can prevent spiders from crawling your site, so double-check it before uploading.
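A minimal robots.txt along these lines might look like the following (the blocked paths are hypothetical examples, not from the original article):

```
User-agent: *
Disallow: /print/
Disallow: /search-results/
Disallow: /admin/
```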


9. Frames aren’t as much of an issue as they used to be, but I’d still avoid using them unless I didn’t care about search engine positioning. The problem is that the content of the page you want to rank isn’t actually on that page, it’s on another page altogether. Whether or not the search engine associates one with the other can be a hit or miss affair. Google advises against using frames.


Off-Page SEO

Off-Page SEO refers to search engine optimisation techniques that aren’t carried out on your own website or page, but on other sites.

These days, only optimizing the content of your pages isn’t enough to get them to rank highly. To do that you need help from other websites in the form of incoming links, known as backlinks (links back to your site, “back links”). In fact, you’ll find that Google won’t bother listing your site if it has no backlinks at all.

In essence, off-page SEO is all about getting quality backlinks relevant to your topic that assist the search engines in establishing the value of your page and what it focuses on. You can view each backlink as a vote of approval for your page. The more inbound links a page has, the greater its link popularity. Google’s PageRank measures this, but in a complex way that takes many factors into consideration.


Notes On Linking Strategies & Increasing PageRank / Link Popularity

1. A single inbound link from a high quality site is worth tens of links from different low quality sites.

2. Linking out to low value, spammy sites, or those engaged in SEO practices the search engines frown upon, can negatively affect your own website’s rankings.

3. One-way links to your website are of far greater value than reciprocal links obtained from engaging in link exchanges.

4. Backlinks from pages covering the same or related topics are far more valuable than those from totally unrelated sites. Have links pointing to the most relevant page on your site, not simply the homepage (known as “deep linking”).

5. Links to a page should have one of its keywords in the anchor text. Employ numerous variations on this text if you intend to create a high number of backlinks to a page. This leads to better results and looks more natural to the search engines, avoiding throwing up a red flag for possibly attempting to manipulate the listings.


6. Avoid participating in organized link exchanges or link farms. Most of the time this will end up harming rather than helping your rankings. This is because the search engines see it as manipulation of the SERPs (Search Engine Results Pages) and penalize linked websites once they discover the network. Read Google’s view on this, and note that even “Excessive reciprocal links or excessive link exchanging” is considered to violate Google’s Webmaster Guidelines!


7. Don’t build backlinks too fast. Hundreds of backlinks appearing for a site in a matter of days send a clear signal to the search engines that you’re probably doing something you shouldn’t be (in their eyes), simply because it appears so unnatural. Grow your links steadily over time.

8. Links from high PR sites are good, but don’t obsess about getting them. You’ll often get as much value from a highly targeted low PR link as from an untargeted high PR one. The search engines aren’t the only reason to have links, and good links bring traffic themselves. Having said that, if you’re on a link building campaign, unless a site looks particularly good, I wouldn’t bother targeting it if it has no PR, i.e., is PR0.

9. Good backlinks with targeted keyword anchor text carry a LOT of weight these days. So much so that if done well, it’s possible to rank a page highly for terms that aren’t even on it.

10. Backlinks are also the way to get your site found and crawled by the main search engines. I wouldn’t bother submitting to the main SE’s, it’s generally better and faster to get links from websites that Google or Yahoo already values, and which are regularly crawled as a consequence. Let them “discover” your pages themselves, by putting links where you know they’ll be found and followed.


How To Get Backlinks?

There are lots of ways to get backlinks, although few are quick or easy (tools like SEOelite, WebCEO or Link Assistant help speed this up). Here are some options:



  • Create content that makes people link to it (often termed “link bait”)
  • List your website in directories
  • Write and syndicate articles
  • Get your page mentioned on bookmark sites, etc
  • Create content on Hubpages that links to your page
  • Post comments to topically related blogs
  • Trackback to topically related blogs
  • Syndicate your blog feed to announcement and aggregator sites
  • Use Tags for links from
  • Make forum posts that include your link
  • Exchange links with other webmasters
  • Buy links


Closing Thoughts

Congratulations! Now you know how to do search engine optimization yourself. Of course, I’d be lying if I didn’t admit there ARE easier ways to do SEO.


SEO Elite might be the more glamorous, but I think WebCEO is the better choice for the average webmaster (and I say that even though I get a bigger commission for recommending SEO Elite).


Quite simply, WebCEO has more useful features, covering more bases. Plus its analysis of your pages and reports on over 130 parameters that affect your rankings help develop a deeper understanding of SEO as you use it, and that’s in addition to the valuable free SEO course and certification that’s included with WebCEO.


Wonderful as these tools are though, you can still do good SEO without them. If you simply follow the steps above your pages WILL start to get high rankings and quality traffic from the search engines.

As you get more pages ranked, the search engines will value your site more, and it will become easier to get top positioning for more difficult keyword phrases. Naturally, you’ll also get better and better at SEO too!


On Page SEO Optimization Techniques


On Page SEO:

“On Page” SEO simply refers to the text and content on your web site pages. Basically, it means editing your page and content so the search engine can find your webpage when a surfer is searching for your web site’s particular topic.


On Page Search Engine Optimization has been around the longest – since the beginning of search engines. A few years ago, search engines used simpler, less sophisticated technology, and the World Wide Web was a lot smaller, so On Page SEO worked well: ranking was basically a simple comparison. As the World Wide Web grew larger and larger, it became more difficult for search engines to differentiate between your site and other sites. A search on “Autos” may return 100 million+ pages that have the word “Auto” on them. So Off Page SEO began to take off as the web and the search engines grew in complexity.

On Page Elements:

On Page Elements refer to the HTML tags within the page. They include heading tags (<H1>), title tags, bold tags and italic tags on your web page. Below is an example of the phrase “SEO Company” used in a heading (<h1>), in bold (<b>), and emphasized (<em>):

SEO Company (rendered as a heading)

SEO Company (rendered in bold)

SEO Company (rendered emphasized)

Notice the difference?

In the HTML Source, The search phrase “SEO Company” Was placed between <h1> tags.

<H1>SEO Company</H1> HTML Tags

In the second version, It was placed between bold tags.

<b>SEO Company</b> HTML tags.

In the third version, it was placed between emphasis tags.

<em>SEO Company</em> HTML tags.

Natural On Page SEO:

Your search phrases should be emphasized in a natural way for both the visitor and the search engine spider. Do not “keyword stuff” your web page by repeating the search phrase over and over again. Doing so will often result in a search engine penalty and move your site’s ranking lower in the results.

Unethical/Unsavory On Page Techniques:


There are several different On Page techniques known as “black hat” or “unethical.” Some SEO companies engage in these types of activities and should be avoided. Sooner or later the search engines will catch up with these unethical techniques, and the likely result will be your site being demoted or banned from the search engines. We recommend that the following unethical SEO techniques not be used.

Negative ON Page SEO Techniques Include:

  • Avoid using “hidden” or invisible text on your page for the purpose of higher search engine placement. For example, the text for the search phrase “Widget” is present in the HTML, but its font color has been set to white on a white background. The textual content is actually there, yet the words are “hidden” from the surfer. This is frowned upon by search engines and frequently results in your site being penalized.
  • Avoid using negative div tags. Div tags are division tags; unscrupulous SEO services may insert them into your page with negative x/y coordinates, placing content outside of the visible page while the text itself remains in the HTML. The search engine finds the keywords in the text, yet the surfer does not see them. Again, a technique to be avoided and not recommended under any circumstances.
  • Avoid cloaking or sneaky redirects. Cloaking refers to serving up two different types of content depending on who is visiting: a regular web surfer is served one page, while a search engine spider is served another page built specifically for it. The page served to the spider is typically garbled textual content with no meaning to a human, stuffed with various keywords and search phrases. Again, this technique is not recommended and will likely get your site penalized or banned from the search engines.
  • Avoid duplicate content. Duplicate content means creating one web site with content on topic A, and then repeating that content over and over again on multiple websites. In theory you could create one website, achieve a high ranking with it, and then clog up the search engines with the same content duplicated across multiple domains. Again, this is not recommended and should be avoided.

List of Best and Worst Practices for Designing a High Traffic Website

Keywords in <title> tag
This is one of the most important places to have a keyword because what is written
inside the <title> tag shows in search results as your page title. The title
tag must be short (6 or 7 words at most) and the keyword must be near the beginning.
Keywords in URL
Keywords in URLs help a lot – e.g. –,
where “SEO services” is the keyword phrase you attempt to rank well for. But if
you don’t have the keywords in other parts of the document, don’t rely on having
them in the URL.
Keyword density in document text
Another very important factor you need to check. A density of 3-7% for major keywords
is best, 1-2% for minor ones. Keyword density of over 10% is suspicious and looks
more like keyword stuffing than naturally written text.
Keywords in anchor text
Also very important, especially for
the anchor text of inbound links, because if you have the keyword in the
anchor text in a link from another site, this is regarded as getting a vote from
this site not only about your site in general, but about the keyword in particular.
Keywords in headings (<H1>, <H2>, etc. tags)
One more place where keywords count a lot. But make sure your page actually has
text about the particular keyword.
Keywords in the beginning of a document
Also counts, though not as much as anchor text, title tag or headings. However,
have in mind that the beginning of a document does not necessarily mean the first
paragraph – for instance if you use tables, the first paragraph of text might be
in the second half of the table.
Keywords in <alt> tags
Spiders don’t read images but they do read their textual descriptions in the <alt>
tag, so if you have images on your page, fill in the <alt> tag with some keywords
about them.
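A sketch of a properly filled <alt> tag, with a hypothetical image file and description:

```html
<!-- Descriptive, keyword-relevant alt text; the file name is hypothetical -->
<img src="golden-retriever.jpg" alt="Golden retriever puppy eating dry dog food">
```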
Keywords in metatags
Less and less important, especially for Google. Yahoo! and Bing still rely on them,
so if you are optimizing for Yahoo! or Bing, fill these tags properly. In any case,
filling these tags properly will not hurt, so do it.
Keyword proximity
Keyword proximity measures how close in the text the keywords are. It is best if
they are immediately one after the other (e.g. “dog food”), with no other words
between them. For instance, if you have “dog” in the first paragraph and “food”
in the third paragraph, this also counts but not as much as having the phrase “dog
food” without any other words in between. Keyword proximity is applicable for keyword
phrases that consist of 2 or more words.
Keyword phrases
In addition to keywords, you can optimize for keyword phrases that consist of several
words – e.g. “SEO services”. It is best when the keyword phrases you optimize for
are popular ones, so you can get a lot of exact matches of the search string but
sometimes it makes sense to optimize for 2 or 3 separate keywords (“SEO” and “services”)
rather than for one phrase that might only occasionally get an exact match.
Secondary keywords
Optimizing for secondary keywords can be a gold mine because when everybody else
is optimizing for the most popular keywords, there will be less competition (and
probably more hits) for pages that are optimized for the minor words. For instance,
“real estate new jersey” might get a thousand times fewer hits than “real estate”
alone, but if you are operating in New Jersey, you will get fewer but considerably
better targeted visitors.
Keyword stemming
For English this is not so much of a factor because words that stem from the same
root (e.g. dog, dogs, doggy, etc.) are considered related and if you have “dog”
on your page, you will get hits for “dogs” and “doggy” as well, but for other languages
keyword stemming could be an issue because different words that stem from the same
root are considered unrelated and you might need to optimize for all of them.
Optimizing for synonyms of the target keywords, in addition to the main keywords.
This is good for sites in English, for which search engines are smart enough to
use synonyms as well, when ranking sites but for many other languages synonyms are
not taken into account, when calculating rankings and relevancy.
Keyword Mistypes
Spelling errors are very frequent and if you know that your target keywords have
popular misspellings or alternative spellings (e.g. Christmas and Xmas), you might
be tempted to optimize for them. Yes, this might get you some more traffic, but having
spelling mistakes on your site does not make a good impression, so you’d better
not do it, or do it only in the metatags.
Keyword dilution
When you are optimizing for an excessive amount of keywords, especially unrelated
ones, this will affect the performance of all your keywords and even the major ones
will be lost (diluted) in the text.
Keyword stuffing
Any artificially inflated keyword density (10% and over) is keyword stuffing and
you risk getting banned from search engines.
Links – internal, inbound, outbound
Anchor text of inbound links
As discussed in the Keywords section, this is one of the most important factors
for good rankings. It is best if you have a keyword in the anchor text but even
if you don’t, it is still OK.
Origin of inbound links
Besides the anchor text, it is important if the site that links to you is a reputable
one or not. Generally sites with greater Google PR are considered reputable.
Links from similar sites
Having links from similar sites is very, very useful. It indicates that the competition
is voting for you and you are popular within your topical community.
Links from .edu and .gov sites
These links are precious because .edu and .gov sites are more reputable than .com,
.biz, .info, etc. domains. Additionally, such links are hard to obtain.
Number of backlinks
Generally the more, the better. But the reputation of the sites that link to you
is more important than their number. Their anchor text also matters: is there
a keyword in it, how old are the links, etc.
Anchor text of internal links
This also matters, though not as much as the anchor text of inbound links.
Around-the-anchor text
The text that is immediately before and after the anchor text also matters because
it further indicates the relevance of the link – i.e. if the link is artificial
or it naturally flows in the text.
Age of inbound links
The older, the better. Getting many new links in a short time suggests buying them.
Links from directories
Great, though it strongly depends on which directories. Being listed in DMOZ, Yahoo
Directory and similar directories is a great boost for your ranking but having tons
of links from PR0 directories is useless and it can even be regarded as link spamming,
if you have hundreds or thousands of such links.
Number of outgoing links on the page that links to you
The fewer, the better for you because this way your link looks more important.
Named anchors
Named anchors (the target place of internal links) are useful for internal navigation
but are also useful for SEO because you stress additionally that a particular page,
paragraph or text is important. In the code, named anchors look like this: <a
href="#dogs">Read about dogs</a> and “#dogs” is the named anchor.
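Putting both pieces together – the anchor target on the page and a link pointing to it (“dogs” and the heading text are hypothetical):

```html
<!-- The named anchor: the target that links jump to -->
<a name="dogs"></a>
<h2>About Dogs</h2>

<!-- Elsewhere on the site, a link to that anchor -->
<a href="#dogs">Read about dogs</a>
```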
IP address of inbound link

Google denies that they discriminate against links that come from the same
IP address or C class of addresses, so for Google the IP address can be considered
neutral to the weight of inbound links. However, Bing and Yahoo! may discard links
from the same IPs or IP classes, so it is always better to get links from different IPs.
Inbound links from link farms and other suspicious sites
This does not affect you in any way, provided that the links are not reciprocal.
The idea is that it is beyond your control to define what a link farm links to,
so you don’t get penalized when such sites link to you because this is not your
fault, but in any case you’d better stay away from link farms and similar suspicious sites.
Many outgoing links
Google does not like pages that consist mainly of links, so you’d better keep them
under 100 per page. Having many outgoing links does not get you any benefits in
terms of ranking and could even make your situation worse.
Excessive linking, link spamming
It is bad for your rankings, when you have many links to/from the same sites (even
if it is not a cross-linking scheme or links to bad neighbors) because it suggests
link buying or at least spamming. In the best case only some of the links are taken
into account for SEO rankings.
Outbound links to link farms and other suspicious sites
Unlike inbound links from link farms and other suspicious sites, outbound links
to bad neighbors
can drown you. You need periodically to check the status of the sites you link to
because sometimes good sites become bad neighbors and vice versa.
Cross-linking
Cross-linking occurs when site A links to site B, site B links to site C and site
C links back to site A. This is the simplest example but more complex schemes are
possible. Cross-linking looks like disguised reciprocal link trading and is penalized.
Single pixel links
When you have a link that is a pixel or so wide, it is invisible to humans, so nobody
will click on it, and it is obvious that such a link is an attempt to manipulate search engines.
<Description> metatag
Metatags are becoming less and less important but if there are metatags that still
matter, these are the <description> and <keywords> ones. Use the <Description>
metatag to write the description of your site. Besides the fact that metatags still
rock on Bing and Yahoo!, the <Description> metatag has one more advantage
– it sometimes pops in the description of your site in search results.
<Keywords> metatag
The <Keywords> metatag also matters, though as all metatags it gets almost
no attention from Google and some attention from Bing and Yahoo! Keep the metatag
reasonably long – 10 to 20 keywords at most. Don’t stuff the <Keywords> tag
with keywords that you don’t have on the page, this is bad for your rankings.
<Language> metatag
If your site is language-specific, don’t leave this tag empty. Search engines have
more sophisticated ways of determining the language of a page than relying on the
<language> metatag, but they still consider it.
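Taken together, the metatags above might look like this in the <head> of a page (the description and keyword values shown are hypothetical):

```html
<head>
  <meta name="description" content="Affordable SEO services for small businesses in New Jersey.">
  <meta name="keywords" content="SEO, SEO services, search engine optimization">
  <meta http-equiv="content-language" content="en">
</head>
```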
<Refresh> metatag
The <Refresh> metatag is one way to redirect visitors from your site to another.
Only do it if you have recently migrated your site to a new domain and you need
to temporarily redirect visitors. When used for a long time, the <refresh>
metatag is regarded as an unethical practice and this can hurt your rankings. In any
case, redirecting through 301 is much better.
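A minimal sketch of a 301 redirect on an Apache server (assuming mod_alias is enabled; the paths and domain are hypothetical, and other servers have equivalent settings):

```apache
# Permanently redirect the old address to the new one
Redirect 301 /old-page.html http://www.example.com/new-page.html
```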
Unique content
Having more content (relevant content, which is different from the content on other
sites both in wording and topics) is a real boost for your site’s rankings.
Frequency of content change
Frequent changes are favored. It is great when you constantly add new content but
it is not so great when you only make small updates to existing content.
Keywords font size
When a keyword in the document text is in a larger font size than the other
on-page text, this makes it more noticeable and therefore more important than
the rest of the text. The same applies to headings (<h1>, <h2>, etc.),
which generally are in larger font size than the rest of the text.
Keywords formatting
Bold and italic are another way to emphasize important words and phrases. However,
use bold, italic and larger font sizes within reason because otherwise you might
achieve just the opposite effect.
Age of document
Recent documents (or at least regularly updated ones) are favored.
File size
Generally long pages are not favored, or at least you can achieve better rankings
if you have 3 short rather than 1 long page on a given topic, so split long pages
into multiple smaller ones.
Content separation
From a marketing point of view content separation (based on IP, browser type, etc.)
might be great but for SEO it is bad because when you have one URL and differing
content, search engines get confused about what the actual content of the page is.
Poor coding and design
Search engines say that they do not want poorly designed and coded sites. Hardly
any sites are banned because of messy code or ugly images, but when the design
and/or coding of a site is poor, the site might not be indexable at all, so in
this sense poor code and design can harm you a lot.
Illegal Content
Using other people’s copyrighted content without their permission or using content
that promotes illegal activities can get you kicked out of search engines.
Invisible text
This is a black hat SEO practice and when spiders discover that you have text specially
for them but not for humans, don’t be surprised by the penalty.
Cloaking is another illegal technique, which partially involves content separation
because spiders see one page (highly-optimized, of course), and everybody else is
presented with another version of the same page.
Doorway pages
Creating pages that aim to trick spiders into believing that your site is highly
relevant when it is not is another way to get kicked out of search engines.
Duplicate content
When you have the same content on several pages on the site, this will not make
your site look larger because the duplicate content penalty kicks in. To a lesser degree duplicate content
applies to pages that reside on other sites but obviously these cases are not always
banned – i.e. article directories or mirror sites do exist and prosper.
Visual Extras and SEO
JavaScript
If used wisely, JavaScript will not hurt. But if your main content is displayed through
JavaScript, this makes it more difficult for spiders to follow and if JavaScript
code is a mess and spiders can’t follow it, this will definitely hurt your rankings.
Images in text
Having a text-only site is so boring but having many images and no text is an SEO
sin. Always provide in the <alt> tag a meaningful description of an image
but don’t stuff it with keywords or irrelevant information.
Podcasts and videos
Podcasts and videos are becoming more and more popular but as with all non-textual
goodies, search engines can’t read them, so if you don’t have the transcript of
the podcast or the video, it is as if the podcast or movie is not there because
it will not be indexed by search engines.
Images instead of text links
Using images instead of text links is bad, especially when you don’t fill in the
<alt> tag. But even if you fill in the <alt> tag, it is not the same
as having a bold, underlined, 16-pt. link, so use images for navigation only if
this is really vital for the graphic layout of your site.
Frames
Frames are very, very bad for SEO. Avoid using them unless really necessary.
Flash
Spiders don’t index the content of Flash movies, so if you use Flash on your site,
don’t forget to give it an alternative textual description.
A Flash home page
Fortunately this epidemic disease seems to have come to an end. Having a Flash home
page (and sometimes whole sections of your site) and no HTML version is SEO suicide.
Domains, URLs, Web Mastery
Keyword-rich URLs and filenames
A very important factor, especially for Yahoo! and Bing.
Site Accessibility
Another fundamental issue that is often neglected. If the site (or separate
pages) is inaccessible because of broken links, 404 errors, password-protected areas
and other similar reasons, then the site simply can’t be indexed.
Sitemap
It is great to have a complete and up-to-date
sitemap, spiders love it, no matter if it is a plain old HTML sitemap or
the special Google sitemap format.
Site size
Spiders love large sites, so generally it is the bigger, the better. However, big
sites become user-unfriendly and difficult to navigate, so sometimes it makes sense
to separate a big site into a couple of smaller ones. On the other hand, hardly
any sites are penalized because they have 10,000+ pages, so don’t split your
site into pieces only because it is getting larger and larger.
Site age
Similarly to wine,
older sites are respected more. The idea is that an old, established site
is more trustworthy (they have been around and are here to stay) than a new site
that has just popped up and might soon disappear.
Site theme
It is not only keywords in URLs and on page that matter. The site theme is even
more important for good ranking because when the site fits into one theme, this
boosts the rankings of all its pages that are related to this theme.
File Location on Site
File location is important and files that are located in the root directory or near
it tend to rank better than files that are buried 5 or more levels below.
Domains versus subdomains, separate domains
Having a separate domain is better – i.e. instead of having your site as a subdomain
of a free hosting provider, register a separate domain.
Top-level domains (TLDs)
Not all TLDs are equal. There are TLDs that are better than others. For instance,
the most popular TLD – .com – is much better than .ws, .biz, or .info domains, but
(all else being equal) nothing beats an old .edu or .org domain.
Hyphens in URLs
Hyphens between the words in an URL increase readability and help with SEO rankings.
This applies both to hyphens in domain names and in the rest of the URL.
URL length
Generally doesn’t matter, but very long URLs start to look spammy,
so avoid having more than 10 words in the URL (3 or 4 for the domain name itself
and 6 or 7 for the rest of address is acceptable).
IP address
Could matter only for shared hosting or when a site is hosted with a free hosting
provider, when the IP or the whole C-class of IP addresses is blacklisted due to
spamming or other illegal practices.
Adsense will boost your ranking
Adsense is not related in any way to SEO ranking. Google will definitely not give
you a ranking bonus because of hosting Adsense ads. Adsense might boost your income
but this has nothing to do with your search rankings.
Adwords will boost your ranking
Similarly to Adsense, Adwords has nothing to do with your search rankings. Adwords
will bring more traffic to your site, but this will not affect your rankings whatsoever.
Hosting downtime
Hosting downtime is directly
related to accessibility because if a site is frequently down, it can’t be indexed.
But in practice this is a factor only if your hosting provider is really unreliable
and has less than 97-98% uptime.
Dynamic URLs
Spiders prefer static URLs, though you will see many dynamic pages on top positions.
Long dynamic URLs (over 100 characters) are really bad and in any case you’d better
use a tool to rewrite dynamic URLs into something more human- and SEO-friendly.
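One common way to do this on an Apache server is mod_rewrite. This is only a sketch; the product.php script and the URL pattern are hypothetical:

```apache
RewriteEngine On
# Serve the friendly URL /products/dog-food-123 from the dynamic script
# product.php?id=123 (the trailing digits become the id parameter)
RewriteRule ^products/[a-z-]+-([0-9]+)$ product.php?id=$1 [L]
```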
Session IDs
This is even worse than dynamic URLs. Don’t use session IDs for information that
you’d like to be indexed by spiders.
Bans in robots.txt
If indexing of a considerable portion of the site is banned, this is likely to affect
the non-banned part as well, because spiders will come less frequently to a largely
“noindexed” site.
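For reference, a robots.txt that bans only a small, genuinely private part of the site rather than a considerable portion of it (the directory names are hypothetical):

```text
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```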
Redirects (301 and 302)
When not applied properly, redirects can hurt a lot – the target page might not open,
or worse, a redirect can be regarded as a black hat technique when the visitor is
immediately taken to a different page.