September 6, 2011

Intro to SEO

Everyone nowadays seems to be interested in SEO (Search Engine Optimization). In most of the interviews I went to recently I was asked... 'Do you know SEO?'

On many occasions I ended up asking myself: does that person know SEO? You see, the web works in a pretty simple way... There are 2 types of websites:

  1. Applications (Services): A website that provides some sort of a unique service or product (like Facebook, Twitter, Google, MySpace, YouTube... etc).
  2. Content websites: A website that provides articles, discussion threads, and information in general (forums, blogs, online magazines... etc).
Some websites (those with products they want to promote, or that want to control the information you have about their products) use a mix of both, and the best example would be Microsoft.
Their websites (or websites they sponsor) are usually on top of the search results when you're looking for a specific problem in one of their products... And that, ladies and gentlemen, is brilliant.
So the first type of website (apps) usually depends on the hype it has created to get people to know it, and on its unique service to get people to come back.
In some cases the fact that the hype never ends (as with Twitter) gets people to come back.
So what does the second category do to get people to know about them?
Well... They pick a specific topic. And they create a lot of pages about that topic (In cases like forums, they depend on the users to do the writing).
If you don't have hundreds of pages on your website, you will need some specific techniques to get people to find you. And the best way to get people to find you is appearing in their search results for the topic of your focus.

How do you do that?

Well, these are the basic steps you need to take.
  1. You pick a keyword or a bunch of them that you want people to find you with.
  2. You analyze and filter your keywords. You can use Google AdWords to do that.
  3. Incorporate keywords into your website.
  4. Fix your website.
  5. Sometimes people actually forget to submit their websites to search engines. You can either do that yourself, or have a tool do it for you.

Details?

Pick a keyword or a bunch of them that you want people to find you with.
Well, this depends on you, the editor or the person managing the website. For example in my blog I focus on Software, SEO and Entertainment (It's my personal blog so I choose those topics).
But if you're in printing business you might want to focus on Books, Magazines, Articles, Papers... etc.

What's next?

Filtering those keywords.

The best way to do this is the Google AdWords keyword analyzer. For example, I tried analyzing the keyword 'entertainment', opened the 'Advanced Options and Filters', selected United States as the target country, and got this:

[Image: Google AdWords keyword tool results for 'entertainment']

This means the keyword 'entertainment' is searched for 16 million times across the world and 6 million times in the US, and the competition is low. So this keyword is a good choice.

More related keywords are available below; if my website were just about entertainment, I’d be interested in those specific keywords.

Using those keywords

Now that you have your keywords, you need to put them to good use. First of all, you should select the most important ones and try placing them in the title and description of your home page.

Search engines are smart, really smart… Placing those keywords in heading tags (h1, h2, h3… etc) signals that they’re very important to the page, thus giving the page a higher rank for those keywords. Also, the more a keyword appears in your page, the better. So a page on my website titled entertainment gains a good rank for this keyword; if the text heading also contains entertainment, the page gains a better rank; and if the page body mentions entertainment, or entertainment-related keywords, many times, the rank improves further.
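
The weighting described above can be sketched as a toy scoring function. The weights and page fields here are invented for illustration; no search engine publishes its real formula:

```python
import re

def keyword_score(page, keyword):
    """Very rough on-page score: does the keyword appear in the
    title, in the main heading, and how often in the body text?
    (Hypothetical weights, for illustration only.)"""
    kw = keyword.lower()
    score = 0
    if kw in page.get("title", "").lower():
        score += 3          # a title match weighs the most
    if kw in page.get("h1", "").lower():
        score += 2          # a heading match weighs next
    body = page.get("body", "").lower()
    score += len(re.findall(re.escape(kw), body))  # each body mention adds a little
    return score

page = {
    "title": "Entertainment news and reviews",
    "h1": "Entertainment",
    "body": "Entertainment articles... more entertainment coverage.",
}
print(keyword_score(page, "entertainment"))  # 3 + 2 + 2 body hits = 7
```

The point is only the shape of the idea: title, heading, and body mentions all contribute, with the title counting most.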

Simple, right?

Fixing your website

If you’re not a web developer or designer you’ll need help doing this. A good website with minimal HTML errors is a great website for search engines. You can use the W3C Validator to check your website for HTML errors.

You also need to make sure all your pictures and images have their ‘alt’ attribute set. This helps image search engines find your photos, which also increases the rank of your website.
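
A quick way to find images missing their ‘alt’ attribute is to scan the page with Python’s built-in html.parser; this is a minimal sketch:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> tag that has no (or an empty) alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):
                self.missing_alt.append(attributes.get("src", "?"))

html = """
<img src="logo.png" alt="Company logo">
<img src="photo.jpg">
<img src="banner.gif" alt="">
"""
checker = AltChecker()
checker.feed(html)
print(checker.missing_alt)  # ['photo.jpg', 'banner.gif']
```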

One more thing is to check that your page headings use the h1 tag and that your pages have the meta description set.

You have to make sure that each page’s title is less than 70 chars and that its meta description is less than 140 characters.
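
Those two length limits are easy to check automatically; here is a small Python sketch using the 70- and 140-character limits suggested above:

```python
def check_lengths(title, meta_description):
    """Flag titles over 70 characters and meta descriptions over 140 characters."""
    problems = []
    if len(title) > 70:
        problems.append(f"title too long ({len(title)} chars)")
    if len(meta_description) > 140:
        problems.append(f"description too long ({len(meta_description)} chars)")
    return problems

print(check_lengths("Intro to SEO", "A beginner's guide to SEO."))  # []
print(check_lengths("x" * 80, "y" * 150))  # both limits exceeded
```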

What’s Next?

Make sure everything is actually working. Go to Google Webmaster Tools, add your site following Google's instructions and wait for a couple of days.

Remember that Google has to follow up on billions of websites, so it might take a while until they recheck your website. And it can take about 3 months until your website's rank changes.

After you see some activity on your Webmaster Tools account, be sure to check the “Diagnostics=>HTML suggestions” to see if any of your pages need a fix.

What about other search engines?

Most search engines follow the same steps and checks as Google, so if you optimize your website for Google, other search engines will also index it. Keep in mind, though, that Google is the search giant: what takes Google 2 or 3 weeks can take other search engines months.

Need more help?

You can use a service such as Attracta to help you submit your website and follow up on your site's evaluation.

January 28, 2011

4 Killer SEO Tools and How to Use Them

After the fun parts of designing a website comes the often painstaking process of optimizing it for search engines. The good news is that, aside from the proliferation of information on how to do that, there are a number of tools that can make it infinitely easier. Here are 4 SEO tools that can help anyone with the finer points of on-site SEO.
SEO Workers – Analysis Tool
Even though this tool only works on a page by page basis, it gives you a ton of information on any single page. This tool looks at important on page factors like:

  • HTTP Headers
  • Meta tags
  • Keyword Relevancy
  • URLs on the page
  • Keywords in anchor tags
  • Keywords in image ‘alt’ attributes
  • Heading & Phrase Elements

These are all important elements to analyze, of course, because everything from Title tag relevance to the anchor text on internal links can contribute to successful on-page SEO. One of the best parts of this tool is the many instructional videos it provides discussing best practices and search engine policies on all of these aspects of a website. Use this tool to make sure that your keyword phrases exist in all of the important places. It is particularly useful after a website is designed, to take a snapshot of the finished product and ensure that each page is set up properly.
Xenu Link Sleuth
This is an awesome tool, which needs to be downloaded but is well worth it. The Link Sleuth essentially crawls a website and returns a report showing all broken links on a site. Broken links are detrimental from an SEO perspective because they are dead ends. So cleaning up errors on your site or retracting outbound links to 404 pages should always be a regular part of maintenance.
This tool also gives a great overview of internal links for link architecture analysis. Prominent and important pages should be kept closer to the home page, and Link sleuth can help you find any pages which may be buried deeper than they should be.
Web Page Speed Report
One of the newer subjects being discussed in the SEO world is the impact of loading speed on SEO. Patents filed by multiple search engines suggest load times will have a role to play if they are not being factored in already. That’s why this tool is such a helpful little gem. Since speed can affect your overall quality score, it’s important to make sure that all pages are loading at their maximum possible speed. This tool allows you to see object size totals and approximate loading times for various page elements. The report also provides detailed recommendations on how load speeds can be improved, making it a very educational tool.
Title Tag Generators – SEO Book

Title tags are generally regarded as one of, if not the, most important on-page factors when it comes to what you will rank for. That’s why it’s so important to set up your title tags well. While these tools may not help you too much in the way of keyword research, they will help you create all of your necessary Meta data. The informative and educational tips and tutorials provided by SEO Book are also a great help for people who are new to SEO, and good reminders for anyone.
Conclusion
Tools cannot replace human intelligence when it comes to implementing quality SEO strategies, but for designers and SEOs, these tools can be invaluable in making sure that the most vital aspects of on-page are on target.

January 23, 2011

A Complete Guide on How To Do SEO For A Website.

What is SEO? SEO or Search Engine Optimization is the practice of making your website attractive to search engines such as Google, Bing, Yahoo, etc. Search engines regularly read and archive websites so that people can find them easily. For example, a person may be searching for ways to cook salmon. If your website is about salmon and optimized properly, your site should appear within the first page or two of every search engine.

What isn't SEO? SEO is not about tricking search engines such as AltaVista, Ask or Live.com. Search Engine Optimization is about creating clean and detailed web pages that can be easily read by automated robots. By following a basic set of rules and ensuring that you have the correct information in your source code along with keywords and other detailed information throughout a page, a search engine like Google will be able to easily read and catalog (or index) your page.

Checklist for Doing SEO For your Website:

Keyword Research Tools
There are sites and services where you can input keywords and see their search volume, among other data. The most reputable include:
  • Google Adwords Keyword Tool - free service
  • Keyword Discovery - paid service
  • Wordtracker - paid service with free trial

A Specific Overview of Search Engine Optimization
This is a comprehensive guide that explains How to SEO Your Website. This tutorial offers an overview of how to optimize your site's performance in search engine rankings. From how to set up your site, to naming pages, to creating conversation across the web, this page offers strategies, tips, and suggestions that will make your site a success. Search Engine Optimization, more commonly known as SEO, is the process of making your website easier for search engines to understand. The goal of SEO is increased ranking for your website, which will result in more traffic. Engaging in search engine optimization requires a constantly evolving skill set. This guide contains basic practices that have remained relatively constant over time. Many people who don't understand SEO or the goals of SEO consider it to be spam or manipulation. However, if implemented within search engine guidelines, the practice is endorsed by Google and other search engines. Good SEO results in pages set up in a structured and orderly fashion. The pages will be filled with better information and more valuable content.

How Search Engines Work
Search engines have programs called spiders which visit web pages to determine what the content of your site is, and to find other links to scan at a later date.

  1. Spiders, or web crawlers, scan the content of web pages.
  2. They send the results of their scan back to the algorithm to be broken down and analyzed.
  3. If the spiders encounter a link to another page or website these links are stored.
  4. Eventually other spiders crawl the linked-to pages.
  5. Therefore, the more links from other websites and pages your website has, the more frequently your website is visited and crawled.
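
The crawl process above can be sketched as a breadth-first walk over a link graph. This toy version uses an in-memory dict instead of real HTTP fetches, but the logic (scan a page, queue its unseen links, crawl them later) is the same:

```python
from collections import deque

def crawl(link_graph, start):
    """Breadth-first 'spider' over a link graph: a dict mapping
    each page to the list of pages it links to."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        page = queue.popleft()
        order.append(page)                    # "scan" this page
        for link in link_graph.get(page, []):
            if link not in seen:              # store links for later crawling
                seen.add(link)
                queue.append(link)
    return order

site = {
    "home": ["about", "blog"],
    "blog": ["post-1", "post-2"],
    "post-1": ["home"],   # a link back to an already-seen page is skipped
}
print(crawl(site, "home"))  # ['home', 'about', 'blog', 'post-1', 'post-2']
```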

What Search Engines Look At
Search engines look at a combination of over 200 factors to determine what pages should rank for which queries. These factors include:

  1. Information on the web pages (known as on-page factors), such as:
    • Page content
    • Title and headings
  2. Off-site factors, which incorporate:
    • How reputable the page linking to you is
    • What words are being used to link to you
    • How long the link to you has existed

It's the combination of on-site and off-site factors that determines your search engine rankings.

Advantages of Good Site Architecture
Having good site architecture offers benefits beyond aesthetic considerations, including:
    1. Easy Expansion—Because your site is divided into manageable sections it is easy to add new sections and grow in the future.
    2. Easy Navigation—Intermediate and advanced users can manually manipulate the site's URL to change sections.
    3. Easy Maintenance—Because the website is divided up into manageable sections it is easier to maintain than a site with a flat structure.
    4. Well-Defined Hierarchy—Pages with more generalized information are at the top of the tree. As you navigate deeper into the site, pages present more specialized information.
  Good site architecture requires:
    1. A good understanding of your website's subject matter.
    2. Knowledge of how users are likely to search for information.
Keyword Research
Keyword research is valuable because it's a way to learn how your users search for information. Keyword research can also give you a better understanding of the subject your website will be about. When doing keyword research there are a few basic ideas to understand:

  1. Singular vs. Plural—Search engines and keyword research tools handle singular and plural terms differently.
    • Apple will return different search results than Apples.
    • Understand how your research tool displays and reports singular and plural terms.
  2. Word Order and Prominence—The order of the words that are typed into a search box matters, as does the order of the words on a page.
    • Macaroni and cheese has slightly different results than cheese and macaroni.
    • Some keyword tools will always list words in alphabetical order, something you should be aware of when doing your research.
  3. Head Keywords—Head keywords are usually short one or two word concepts that can have a wide range of meanings. They have a high volume of searches, but the variety of possible meanings makes it difficult to know what the user was actually searching for.
    • An example of a head keyword would be golf. Was the searcher looking for shoes, clubs or places to play golf?
  4. Long Tail Keywords—These are multiple keywords (at least four or five words). These keywords are very specific and signal a clear intent on the user's part.
    • An example of a tail keyword would be Mens black Nike golf shoes.
  5. There is a wide range of keywords falling between the head and tail.

When doing keyword research it's important to:
  • Compile a list of all of your keywords.
  • Try to develop clusters around a particular topic or subject.
  • These will become the high-level directories in your website architecture.
Site URLs and Server Technology

  People who build and run websites have a wide array of technology to choose from. Some popular programming platforms include:
  1. Hypertext Preprocessor (PHP)
  2. Active Server Pages (ASP)
  3. JavaServer Pages (JSP)
  Each of these platforms has its own language, but they all serve pages out in HTML format. From a search engine optimization perspective there is no advantage in choosing one over the other. You can choose among the platforms based on cost of operations: hiring developers, designers, programmers, and webmasters. Depending on the platform you pick, your URLs might look like:
  1. example.com/blue-widgets.html
  2. example.com/candy.php
  3. example.com/cars.asp
  Instead, your pages' URLs should look like:
  1. example.com/blue-widgets/
  2. example.com/candy/
  3. example.com/cars/
  Hiding the underlying technology this way allows you to move from one platform to another without altering your site's URLs or their structure.
Canonicalization

  • It's possible to serve your website under both http://www.example.com and http://example.com. However, many search engines will see both and consider it duplicate content (the same content under two URLs), and may penalize your site accordingly. To avoid this:
  1. Pick either http://example.com or http://www.example.com and use it consistently.
  2. Configure your web server to 301 redirect all traffic from the style you are not using to the style you are using.
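
The canonicalization rule can be sketched in Python. On a live site the 301 redirect lives in the web server's configuration, not in application code, and www.example.com here is just an assumed choice of canonical host:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.example.com"   # the style you picked

def canonicalize(url):
    """Rewrite a URL onto the canonical host. A real deployment
    would have the server answer with a 301 redirect instead."""
    parts = urlsplit(url)
    if parts.netloc != CANONICAL_HOST:
        parts = parts._replace(netloc=CANONICAL_HOST)
    return urlunsplit(parts)

print(canonicalize("http://example.com/candy/"))
# http://www.example.com/candy/
```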
Static URLs and Dynamic URL Parameters

  • In many cases programming implementations use parameters instead of static URLs. A URL with a parameter will look like this:
  • example.com/page/?id=widget
  • While a static URL will look like this:
  • example.com/page/widget/
  • In most cases search engines have the ability to index and rank both formats. However, best practices advise the use of static URLs over dynamic ones, for reasons like:
  1. You produce cleaner, easier to understand output.
  2. You capture the keyword value that the URL can add, since the keyword appears as part of the path.
  • As you start to add more than one parameter to a URL search engines have a harder time properly indexing the URL.
  • URLs which are not in the index will never rank or drive traffic from search engines.
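
The dynamic-to-static mapping might look like this in Python; on a live site this would be done with server rewrite rules rather than application code, and the URLs are the hypothetical ones from above:

```python
from urllib.parse import urlsplit, parse_qs

def to_static(url, param="id"):
    """Turn example.com/page/?id=widget into example.com/page/widget/.
    Minimal sketch: assumes the path already ends with a slash."""
    parts = urlsplit(url)
    values = parse_qs(parts.query).get(param)
    if not values:
        return url   # no parameter to rewrite
    return f"{parts.scheme}://{parts.netloc}{parts.path}{values[0]}/"

print(to_static("http://example.com/page/?id=widget"))
# http://example.com/page/widget/
```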
Keywords in URLs
  • Generally speaking, it's beneficial to have keywords contained within your URL structure. Having the keyword in your URL helps search engines understand what your page is about and also helps users know what they are likely to find on the page. Consider these two examples and see which you find more useful:
  1. example.com/123467/9876/
  2. example.com/images/tulips/
Delimiters in URLs
  • Delimiters are used in URLs to separate words.
  1. The best practice is to use a hyphen to separate words.
  2. Search engines do have the ability to understand other characters, such as an underscore, but the hyphen is preferred over an underscore for human usability issues.
  3. For more information, see Matt Cutts' discussion on dashes vs underscores.
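
A hyphen-delimited URL slug can be generated with a few lines of Python; this is a minimal sketch:

```python
import re

def slugify(title):
    """Lowercase the title, strip punctuation, and join the words
    with hyphens (the delimiter recommended above)."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

print(slugify("Blue Widgets & More!"))  # blue-widgets-more
```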
JavaScript and Flash
  • Search engines are very limited as to what forms of information they can read and interpret.
  1. Currently, they understand text-based content which is directly on the page.
  2. If your web application relies on JavaScript, Flash, or some other non-text form of displaying information, search engines will have a difficult time interpreting the content of your pages.
  3. Consequently, your search engine rankings will suffer.
Design Page Structure and CSS
  • Website layout and design usually have more of an impact on usability and marketing than on rankings. However, there are some SEO concerns to be aware of.
  1. While proper semantic markup and W3C code compliance never hurts, it's not a requirement if you want to have a high-ranking page in search engines.
  2. Care should be taken when building pages to eliminate as many errors as possible.
  3. A page with several hundred coding errors is much more likely to trip up a search engine spider than one with fewer errors.
  4. Using proper standards and markup usually means pages are laid out in a more logical fashion.
  5. Using CSS makes it possible to put the main body of a page's content first.
    • Otherwise the top banner and any side navigation appear first.
  6. Search engines still place some weight on the text that comes first on your pages.
  7. More and more website owners are using a Content Management System (CMS) to build their sites.
  8. Using these programs forces you to isolate content from the context, which usually results in cleaner and more streamlined code.
  9. Additionally, these CMS systems make it much easier to build and maintain mid- and large-sized websites.
On Page SEO Factors and Considerations

  • On-page SEO factors deal with the elements that are on the actual web page. Links from other sites are off-page factors.
  1. Most professional SEOs consider the title element the strongest on-page SEO factor, so it's important to pay attention to it.
  2. You want a title that is short and eye-catching, with as many keywords as possible.
  3. Make sure your title still reads cleanly; do not have an unintelligible keyword-stuffed title, as this will display in the search engine listing for your website.
  4. Include your site name in your title for branding purposes.
  5. Whether to place your website name at the front or end of the title can be decided by personal preference.
  6. If you are a large company or well-recognized brand, such as Coca-Cola or Ford, you can place your name at the beginning of the page title. This lets you build on the trust in your brand.
  7. Smaller or less well-known companies should place their names near the end of the title, so that a browser's focus goes to the keywords in your title.
Meta Keywords and Descriptions
  • These factors are largely ignored by search engines due to abuse in the past.
  1. In some cases having identical keywords and descriptions across an entire website has been shown to be a slightly negative factor in ranking.
  2. The meta description will appear under the title when your website shows up in a search engine result.
  3. Therefore, create a unique description that is well-written and eye-catching.
Headlines and Page Headings
  • Page headings (also known as H tags) are structural elements used to divide a page into meaningful sections.
  1. They number from H1 through H6, with H1 being the most important and H6 being the least.
  2. Your page should only have one H1 tag.
  3. You can use as many other H2-H6 tags as you want, as long as you don't abuse them by keyword stuffing.
  4. Many people have their H1 match their title tag.
  5. You can make them different, which allows you to use a wider array of keywords and to create more compelling entries for humans.
  6. By default H tags are large and bold. You can use CSS to make them appear however you'd like on a page.
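
Checking that a page has exactly one H1 can be automated; here is a sketch using Python's built-in html.parser:

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Counts h1..h6 tags so you can verify a page has exactly one h1."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

page = "<h1>Blue Widgets</h1><h2>Sizes</h2><h2>Colors</h2>"
counter = HeadingCounter()
counter.feed(page)
print(counter.counts)                     # {'h1': 1, 'h2': 2}
print(counter.counts.get("h1", 0) == 1)   # True: exactly one h1
```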
Bold and Italics
  • Bolding and italicizing fonts doesn't impact search engine rankings. Use these fonts for visual or other formatting reasons, not to affect your standing in search engines.
Internal Anchor Text and Links

  • Anchor text refers to the words that are clickable in a link. Internal anchor text are the words that link to other parts of your site.
  1. Anchor text is one of the mechanisms search engines use to tell what a page is about.
  2. If you link to a page with the words Blue widgets, search engines think you are trying to tell them the page on the other end of the link is about blue widgets.
  3. By using consistent or similar anchor text every time you link to a page, search engines get a better understanding of what that page is about.
  4. Avoid using anchor text that doesn't contain keywords (i.e., anchor text that reads "click here") whenever possible.
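
To audit your internal anchor text, you can extract every link's href and its clickable words; this is a minimal sketch with Python's built-in html.parser (the hrefs are hypothetical):

```python
from html.parser import HTMLParser

class AnchorTexts(HTMLParser):
    """Records (href, anchor text) pairs for every link on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:   # only collect text while inside a link
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = AnchorTexts()
parser.feed('<a href="/widgets/">Blue widgets</a> and <a href="/about/">click here</a>')
print(parser.links)
# [('/widgets/', 'Blue widgets'), ('/about/', 'click here')]
```

Scanning the output for keyword-free anchors like "click here" makes them easy to spot and fix.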
Content Considerations
  • Content refers to the pages and articles on your website, excluding your template. Creating content that is entertaining, interesting, educational, informative, funny, or compelling in some other way is the best way for you to encourage people to visit your site frequently, link to you, and ultimately improve your rankings. The more unique and interesting your content is, the more value it has to your site's visitors. For example, there are millions of websites about iPods. There are very few websites about putting your iPod in a blender, smashing your iPod, or hacking your iPod to do new things. Think of your content as your point of differentiation.
  • Most content falls into three different categories: boilerplate, news, and evergreen.
Boilerplate Content

  • Boilerplate content is general information.
  1. Your about us page, testimonials, contact information, privacy policy, and terms of service constitute boilerplate content.
  2. These pages exist to help website visitors get to know you, learn to trust you, and feel comfortable sharing information or making a purchase from you.
News Content
  • News content is content that has a short-term lifespan.
  1. News pages can remain relevant for a few hours, or even a few months.
  2. Eventually people stop searching for the term and the page will get very little traffic after that.
Evergreen Content
  • Evergreen content is content which has a long lifespan. An example of long term content is How to Paint a Room.
  1. Techniques for painting your interior rooms aren't going to change much for the foreseeable future.
  2. The number of people who are searching to learn how to paint a room will remain fairly constant from one year to the next.
  • Most websites have a blend of news and evergreen content. This will vary from one industry to the next. Gadgets and technology websites will have more news content, whereas websites about William Shakespeare will have more evergreen content.
  • Creating good content is only part of the equation. Once you have the good content, you have to make sure other people know about it, read it, link to it, or tell other people they know about it. For new websites you will have to engage in more proactive marketing.
Marketing

  • Marketing a website is no different than marketing a business. You have to advertise, send out press releases, engage in viral or word of mouth campaigns, or visit other places and tell them about your website. Though a complete marketing plan demands its own guide, some of the key goals of any website marketing plan should involve:
  1. Getting people to visit your website.
  2. Getting people to link to your site.
  3. Convincing visitors to tell others about your website.
  4. Encouraging people to regularly come back to your pages.
Link Building and Link Development
  • Links are the primary method a search engine uses to discover your website, and a key factor in its rankings. Links help search engines determine how trustworthy and authoritative your website is, and they also help search engines figure out what your website is about.
  1. Links from trusted authoritative websites tell search engines that your website is more reliable and valuable.
  2. Links from websites like CNN, The New York Times, and The Wall Street Journal are more valuable than links from your local neighborhood garage or realtor.
  3. Search engines also look at the anchor text (words that link to your website).
  4. When someone links to you with the words blue widget they are telling the search engines you are about the words blue, widget, and blue widget.
  5. Links also increase in value over time.
  6. The longer a link has been in place, the more effective it is in passing along trust, authority, and ranking power to your website.
Directories
  • Once your website is built you want to try and acquire links from as many trusted sources as possible in your particular industry. Getting links from websites that are related to your industry is usually more helpful than getting links from websites that are not related to your industry, though every link helps.
  1. One of the first places many people start building links is from directories.
  2. Most directories have a fee for inclusion.
  3. Look for directories that are charging fees because they review each site before deciding whether to accept it.
  4. Don't join a directory that lets in every site that applies; you want one that keeps out low-quality sites.
  5. To see if a directory is worth the review fee, check to see how much traffic they are going to send you.
  6. To evaluate potential traffic, check to see if the directory page is listed for its particular search term.
  7. If the directory is listed, this is usually a good indicator it will send you traffic.
  8. If the directory does not rank well for its term, check to see if it's listed in the search engine index, and how recently it was crawled.
  9. You can check the last crawl date by clicking on the cache link on the search engine result page.
  10. Pages that are in the index and have been crawled frequently are usually more trusted and will pass some of that value to you.
  11. Pages that are not in the index or have not been crawled recently are usually not worth the review fee.
Press Releases
  • Press releases are usually used to get the attention of journalists or industry news websites, magazines and periodicals.
  1. Many press release websites have relationships with search engine news feeds, so using them can be a very effective way to put your website in front of the right people.
  2. Most press release websites do not pass along any link value; they simply act as link pointers to your website.
  3. If a journalist, news website or blogger sees your press release and writes about you, you may get a link from them.
  4. Consider press releases in light of how much traffic and secondary links they can bring; ignore the link from the press release service.
Content and Article Syndication

  • Content and article syndication websites allow you to publish your content on other sites. In exchange for the free content these sites are willing to provide you with a backlink.
  1. Most of these article syndication sites are like press release sites in that they do not pass any link value, but instead act only as link pointers.
  2. To decide if this strategy should be a part of your marketing and link-building plan, look at the most popular articles in your category and see how well they rank and how much traffic they are likely to drive.
  3. You can also use article syndication sites to identify third-party websites that would be interested in publishing other articles from you.
Link Exchanges, Reciprocal links, and Link Directories

  • Exchanging links with other related websites is a good practice, if it makes sense for your users. Creating link directories with hundreds of links to other websites that are of very little or no use to the user is a bad practice and may cause search engines to penalize you.
  1. If the link has value to visitors of your website and you would place the link if search engines didn't exist, then it makes sense to put up the link.
  2. If creating the link is part of a linking scheme where the primary intent is to influence search engines and their rankings then don't exchange the link.
Paid Links and Text Link Advertising
  • Paying for links and advertising can be valuable, as long as you follow search engine guidelines.
  1. If a link is purchased for the advertising value and traffic it can deliver, search engines approve of the link.
  2. If the link is purchased primarily for influencing search engine rankings it is in violation of Google guidelines and could result in a penalty.
  3. If you want to buy or sell text link advertising without violating Google guidelines, look for implementations with a nofollow, JavaScript, or intermediate page that is blocked from search engine spiders.
Viral and Word of Mouth Marketing
  • Creating content that is viral in nature and gets you word-of-mouth marketing can help you acquire links. This process is often called linkbaiting.
  1. Content created for this purpose is often marketed on social media sites like Digg, del.icio.us, and Stumbleupon.
  2. As long as your content becomes popular naturally, without artificial or purchased votes, you will be within search engine guidelines.
Blogs and Social Media

Blogs are a relatively new form of website publishing. Their content is arranged, organized or published in a date or journal format. Blogs typically have a less formal, almost conversation-like style of writing, and are designed to help website owners and publishers to interact more with their customers, users, or other publishers within their community.

  1. The journal and conversational format of blogs usually makes it much easier to gain links from your community.
  2. You must create content that members of that community value and are willing to link to.
  3. For a blog to be truly successful the authors must participate in the community and publish frequently.
  4. If this behavior doesn't mesh with your company culture, creating a blog is not going to be effective for you.
Social Media

  • Social media and bookmarking sites like Digg, del.icio.us, and Stumbleupon have community members who function almost like editors. They find and vote on web pages, stories, articles, videos, or other content that is interesting or engaging.
  1. Most social media or bookmarking sites are looking for new content on a regular basis.
  2. The frequent publishing demands of blogs also require a constant flow of new material.
  3. To get the most out of social media you must become involved in the community and submit stories from other sources, not just from your website.
  4. Each social media website has its own written and unwritten rules. Learn these before submitting stories.
  5. Every community frowns upon attempts to "game" the voting procedure. Tactics such as voting rings and paid votes that artificially influence the voting mechanism should be avoided.
Analytics and Tools
  • Once your website is up and running you will want to know how many people are coming to your site, how they are getting there, what pages they are viewing when they arrive and how long they are staying. For this you will need a website analytics package.
    1. There are a wide variety of analytics packages, ranging in cost from free to several hundred thousand dollars each month.
    2. Each analytics package measures data in its own way, so it's not uncommon for two programs to have slightly different results from the same set of data.
    3. Additionally, each package provides a different level of detail and granularity, so you should have some idea what you are looking for before purchasing a package.
    4. The two main methods of implementation are log files and JavaScript tracking.
  • The most commonly used analytics package is Google Analytics.
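The "log file" method mentioned above can be sketched with only the standard library: counting page views per URL from common-log-format lines. The sample log lines and URLs here are invented for illustration; real analytics packages do far more (sessions, referrers, time on page).

```python
# Count page views per URL from (made-up) common-log-format lines.
from collections import Counter

def page_views(log_lines):
    views = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1]            # e.g. 'GET /index.html HTTP/1.1'
        fields = request.split()
        if len(fields) >= 2:
            views[fields[1]] += 1    # fields[1] is the requested path
    return views

logs = [
    '1.2.3.4 - - [06/Sep/2011:10:00:00] "GET /index.html HTTP/1.1" 200 512',
    '5.6.7.8 - - [06/Sep/2011:10:01:00] "GET /about.html HTTP/1.1" 200 300',
    '1.2.3.4 - - [06/Sep/2011:10:02:00] "GET /index.html HTTP/1.1" 200 512',
]
print(page_views(logs))  # Counter({'/index.html': 2, '/about.html': 1})
```

The JavaScript-tracking method works differently: a small script on each page reports the visit to the analytics service as the page loads.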
Linking Strategy

  • A good general overall linking strategy is to slowly acquire links from as many trusted sources as possible, with a wide variety of anchor text, to both your home page and sub-pages. If, over a short period of time, you gain too many links with similar words in the anchor text, from a few or low-trusted websites, to a limited number of pages, this would create an unnatural linking profile. Your website will be penalized or filtered by search engines for such behavior.
Common SEO Problems

  • You can build a website with great content and institute an effective marketing plan, yet still be foiled by technical issues. Here are some of the most common problems:
Robots.txt File

  • A robots.txt file communicates what pages or sections of your website you want search engines to crawl.
  1. A common mistake is blocking search engine spiders from a section, or an entire site, that you want indexed.
  2. You can learn how to create a robots.txt file from Google's guidelines.
  3. Google's Webmaster Central has a tool to let you verify that your robots.txt file is performing as you'd like.
Response and Header Codes
  • When your web server serves a page there is a special code that tells the browser or spider the status of the file served.
  1. A 200 response code means the page serves normally.
  2. If not configured correctly, some web servers will serve a 200 code even when a file is missing.
  3. This can create a problem when search engines index a lot of blank pages.
  4. A 404 is the response code when a page or file doesn't exist.
  5. To improve usability, set up a custom 404 page with a message explaining what happened, a search box, and links to popular pages from your website.
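A quick way to spot the "soft 404" problem above is to fetch a made-up URL from a script and classify the status code you get back. This is a sketch using only the standard library; the example URL is a placeholder, and the classification helper simply restates the points above.

```python
# Fetch a URL's HTTP status and classify it against expectations.
from urllib.request import urlopen
from urllib.error import HTTPError

def status_of(url):
    """Return the HTTP status code for a URL."""
    try:
        return urlopen(url).getcode()
    except HTTPError as err:
        return err.code

def diagnose(status, page_should_exist):
    if status == 200 and not page_should_exist:
        return "soft 404: missing page served with a 200"
    if status == 200:
        return "ok"
    if status == 404:
        return "not found (correct for a missing page)"
    return "other status: %d" % status

# e.g. diagnose(status_of("https://example.com/jko548fvn2se.html"), False)
# should report "not found" on a correctly configured server.
```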
Duplicate Content
  • The content from any page should only exist on one URL.
  1. If the same content exists under multiple URLs, search engines will interpret this as duplicate content.
  2. Subsequently, the search engines will try to make a best guess as to the best URL for your content.
  3. If this condition is true for a large amount of your pages, your website may be judged low quality and be filtered out of the search results.
Duplicate Titles

  • Every page of your website should have a unique title. When a search engine sees duplicate titles it will try to judge the better page and eliminate the other from the index.
Duplicate Meta Descriptions

  • If a large number of pages have identical or very similar meta descriptions, these pages may be filtered for low quality and excluded from the index.
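Duplicate titles and descriptions are easy to detect yourself before a search engine does. Here's a minimal sketch using only the standard library; the file names and HTML are hypothetical, and a real check would parse the HTML properly rather than use a regex.

```python
# Group local pages by <title> and report any title used more than once.
import re
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages: dict of filename -> HTML source."""
    seen = defaultdict(list)
    for name, html in pages.items():
        m = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        title = m.group(1).strip() if m else ""
        seen[title].append(name)
    return {t: names for t, names in seen.items() if len(names) > 1}

pages = {
    "index.html": "<html><head><title>Acme Widgets</title></head></html>",
    "about.html": "<html><head><title>Acme Widgets</title></head></html>",
    "blog.html":  "<html><head><title>Acme Widgets Blog</title></head></html>",
}
print(find_duplicate_titles(pages))
# {'Acme Widgets': ['index.html', 'about.html']}
```

The same grouping approach works for meta descriptions: swap the regex for the meta description tag.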
Poor or Low Quality Content

  • In an attempt to create a large number of pages very quickly, many people will employ automated solutions that end up generating pages with fill-in-the-blank or gibberish content. Search engines are getting better at catching this condition and filtering these sites from the index.
Blackhat SEO and Spamming
  • Some people engage in tactics or methods that violate search engine guidelines to achieve higher rankings.
    • They can employ a wide variety of tactics including (but not limited to):
    • Keyword stuffing
    • Link spamming
    • Paid linking
    • Artificial link schemes
    • Sneaky or deceptive redirects
  1. If you employ a tactic that seems to involve tricks or is done primarily to manipulate search engines and artificially inflate rankings, you can be considered to be engaged in blackhat SEO or spamming. This has repercussions for any site you work with, and should be avoided.
  2. For more detailed information, review Google's guidelines.
Conclusion

  • Good SEO takes time, as you need to develop great content and a strong community voice. But this is just what the Internet needs: high-quality pages that provide a valuable service to users.
January 23, 2011

How To Do Search Engine Optimization (SEO) Yourself

Basic SEO

The basic steps of search engine optimization can and should be carried out by all webmasters as a matter of course. In less competitive markets, doing this alone is often all that is required to achieve top search engine rankings.

However, it should be borne in mind that these basic SEO techniques alone won't be sufficient if you are attempting to rank highly for very competitive search terms (keywords) like "SEO" or "internet marketing."

To rank well in very competitive searches, detailed analysis of the search engine algorithms and competitor pages is required. There are more variables to consider, pages usually need to be tailored for specific search engines, and it's generally too complex for the SEO newbie to tackle successfully. It's far wiser to target some of the billions of easier keywords.

That said, let's look at the aspects of SEO that anyone can do:

 

Keyword Analysis: Identifying The Keywords Your Pages Will Target

You simply MUST get this bit right. Target the wrong words and everything you do from here on out is a complete waste of time.

The first step is to ascertain what keywords people interested in your topic are typing into the search engines. From the different keyword phrases that could apply to your page, you want to choose two or three to target: the main keyword phrase, and one or two closely related secondary keyword phrases.

In deciding which particular phrases to target, you want to compare the number of searches carried out for that keyword with the number of competing pages listed in Google or Yahoo search results.

How To Do Keyword Research

If you have not yet created the page and want to use free tools, visit the Google keyword tool first. Type in a few 2- or 3-word phrases that you feel relate to your topic. Tick the box to include synonyms, and then click the "Get More Keywords" button.

 

By default, results are targeted to English, United States. If you want another region, say English, UK, click the "edit" link, make your selection, and run the search again.

Two lists will be returned: one for the keywords you input, and one for synonyms that Google thinks are related to them, under "Additional keywords to consider."

If you are happy with your lists, click on the "Search Volume" column to sort them into most-searched keywords first, and then scroll to the bottom of each set of results and click on the links to download the keyword lists to your PC.

Note: You may find your initial ideas are a bit off base and don't return the kind of phrases you expected. If that's the case, simply change some or all of your phrases and get more keyword suggestions.

At this point you may also want to take some of the keyword synonyms and feed them back into the keyword research tool for more ideas.

When you've finished, go through your lists, deleting irrelevant phrases and selecting the most appropriate keywords with anywhere from a low to a medium-high search volume. We'll call these your "root keywords."

Google's keyword generator doesn't tell us exactly how many searches are performed for each keyword, so now we need to plug these root keywords into a tool that will.

Note: These SEO tools all have free options; if you're using a paid version, you can skip the Google keyword tool and the checking of SERPs mentioned below. I've omitted the Yahoo / Overture keyword suggestion tool and others that use its data because, although it's the keyword tool most commonly referred to, it can be very misleading due to the way it groups plurals and some synonyms -- in short, it gives inaccurate results in many instances.

I can't go into great detail on the next part, because it depends on which keyword analyzer you're using, but basically you want to run a keyword search on each of your root keywords, which will give you a list of longer keyword phrases that incorporate your root words, together with the number of searches performed.

Comparatively speaking, the more words in a keyword phrase, the easier it will be to rank highly for it. Thus, "free internet marketing articles to download" will be vastly easier to rank for than "internet marketing" or even "internet marketing articles."

However, there's no point in having high search engine rankings for keywords that are seldom searched.

You can decide for yourself the minimum number of searches a keyword can have to be considered, and in reality it will also depend on your goals. For instance, your plan might be to make lots of pages targeting very easy keywords with few searches (known as "long tail keywords"), looking at the overall amount of traffic you'll get. Nevertheless, bear in mind that conversion rates on most sales pages are only in the order of 1-2 percent, meaning 100 visitors is only likely to result in a single sale, if that.

Your goal is to find keywords that offer the best compromise between high search volume and low competition.

 

 

 

How do you find out how much competition there is for a keyword?

This is a simple, if somewhat tedious, process (a keyword research tool or service will automatically show keyword competition, making life much easier).

 

Take the keywords that look promising, put them between quotation marks, and search for them on Google, noting the number of competing pages Google lists (where it says, "Results 1 - 10 of about ____").

 

The reason to put your keyword phrases in quotation marks is because only those pages containing that exact phrase are directly competing with you, giving you an accurate benchmark.

 

Generally speaking, the newer your website (Google is initially skeptical of new websites, and as a site ages its pages start to gain PageRank and reflect a theme, providing additional leverage) and the less experienced you are at SEO, the lower the maximum number of competing pages a keyword phrase can have before you should consider it too difficult (for now at least).

 

The Search Guild search term difficulty checker can help you get an idea of where you stand. I suggest you put in a really high and a really low competition phrase that you have looked at on Google to see how they compare, and then the phrase you are considering targeting.

Many years ago, Sumantra Roy came up with what he called the Keyword Effectiveness Index (KEI), which you can also use to help you choose the right keywords to target. The formula is KEI = P^2/C*1000. That is, the popularity of the keyword squared, divided by the number of competing pages, and multiplied by 1000 to give a nice number to work with. Keywords with a higher score have a better popularity-to-competition relationship, and are therefore more worthwhile to target. If you decide to use KEI, the easiest way is to put all your keyword data into an Excel spreadsheet, and then add a column at the end to perform the KEI calculation for you automatically.

 

I understand that might look like a lot of work, and to be fair, it is. However, I've taken this from the standpoint of someone with absolutely no idea what keywords to target. If you already have a basic list of relevant keywords, and have developed a feel for keyword analysis, some of the above can be skipped, or at least gone into in less detail. The other thing, of course, is that like anything else, the more you do it, the more proficient you become and the less time it takes.

 

Optimize Your Pages (On Page SEO)

 

If you don't want to go to the trouble of proper keyword research and simply want to do the bare minimum to improve the rankings of existing pages, you can start here (although I recommend you at least take the main keywords of the page and see if you could swap them for better ones. Try the Google keyword tool's Site-Related Keywords setting).

 

Once you've decided on the keyword phrases for a page:

 

1. Create a title using your main keyword. If they fit nicely and the title still reads well, also include one or both of your secondary phrases. Sometimes your main keyword will be part of one of your secondary keywords, making this easy. Don't make your title really long.

 

2. Put your title text in the HTML TITLE tag at the top of the page code, right after the opening HEAD tag. The less clutter the search engine has to go through before finding the important stuff, the better.

 

For example:

 

<head>

<title>My Title Here</title>

 

3. Write a description of the page content that would entice someone reading it to visit your page. Incorporate your keywords, and use your most important keyword phrase first, because the order gives an indicator of relevancy. Put this description into a meta description tag in your HTML code immediately after your TITLE tag.

 

Example:

 

<meta name="description" content="Learn how main keyword phrase can help you and what keyword phrase2 is really all about" />

 

4. Put your keyword phrases into a meta keywords tag immediately after your meta description tag. Your most important keyword phrase should be first, followed by the second most important, and so on.

 

Example:

 

<meta name="keywords" content="main keyword phrase, keyword phrase2, keyword phrase3" />

 

I often separate keywords with spaces instead of commas (except on blogs), ensuring search engines find exact matches to more search phrases (Google ignores the commas, and gives little weight to the meta keywords anyway). For example, if your meta keywords tag contains "best SEO, ranking advice" many SE's won't match for "SEO ranking." Bear in mind though that this means a few of the smaller -- and consequently, less important -- search engines will see your keywords as one big phrase.

 

Avoid repeating any phrase more than two or three times in either the title, meta description, or meta keywords tags. Never stuff any of them with lots of keywords or use irrelevant keywords (this is what's known as "keyword stuffing").

 

The fewer the words in your title, meta keywords, and meta description tags, the more "relevancy points" each of them will get. E.g., take 100% as the maximum relevancy of the title tag to the page. 100% divided by 20 words gives 5% relevancy for each word; 100% between just 4 words gives 25% relevancy. Whilst this generally isn't a major issue, it should be borne in mind that the more words you add, the more the importance of each is diluted.

 

5. Put your title text in an H1 or H2 heading at the top of your page. Try to make this the first text on the page whenever possible (perhaps by making any preceding text into images).

 

Example:

 

<h1>My Title Here</h1>

<p>My first paragraph of text</p>

 

Tip: Use CSS to style your heading tags so they aren't huge and suit your page design.

 

6. Use your keyword phrases in the first one or two sentences right after the H1 title.

 

7. Also use your keyword phrases naturally and SPARINGLY throughout the content, together with other synonyms. Don't try to force keywords in where they don't fit. Take the length of the text as your guide to whether, and how often, they should be repeated. If possible, use your keyword phrases again at the very end of the page -- the last sentence or two of text before the closing BODY tag, not just the end of the article. You can use one of the free Keyword Density Analyzers for this or WebCEO's Density Analysis Report.

If it sounds contrived when you read it, you've probably overdone it. Better to add more synonyms and other phrases common to the theme (other terms you might expect to find within the topic, which aren't synonyms of, nor directly related to, your keywords). I suggest you ignore anything you might hear about LSI (Latent Semantic Indexing) -- it's far too complex and based on such a massively large data set that it's a waste of time trying to manipulate the search engines on this score; it's far easier just to write quality, focused content.
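A bare-bones version of the density check those analyzers perform can be sketched as follows. Real tools strip HTML and handle stemming; this sketch just counts a phrase in plain text, and the sample sentence is contrived.

```python
# Percentage of words in the text that belong to occurrences of the phrase.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / total if total else 0.0

text = "Internet marketing tips: learn internet marketing the easy way."
print(round(keyword_density(text, "internet marketing"), 1))  # 44.4
```

A figure that high only arises in a contrived one-liner; in real page copy you'd expect low single digits.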

 

8. If possible, make use of your secondary keyword phrases in H2 or H3 subheadings within your article or content.

 

9. An image somewhere near the top of the page with a file name of "main-keyword-phrase-something.gif" and an ALT attribute of "main keyword phrase something" also helps relevancy.

 

10. Save the page as "my-main-keyword-phrase.html" or "my-page-title.html". Use hyphens, not underscores, as word separators. Google reads a hyphen as a space, but an underscore as a character.

 

11. Internal links to the page (links from other pages on your website) should use its main keyword in the anchor text (the part you click).

 

Example:

 

<a href="my-page-title.html">My Page Title</a>

 

12. Keep related pages in a single directory (web folder) named after the common theme. Usually this will be a keyword applicable to them all.

 

13. Each directory should have an index page listing all the pages within it, as per point 11 above. Every page in the directory should link back to this index page.

 

14. Your website should consist of a main index page/homepage containing links to the index page of each directory. Ideally keep to 1 level of subdirectories, e.g., mysite.com/directory/page.html. Don't go beyond 2 levels deep. Search engines aren't overly enthusiastic about crawling down further than that (although given enough incentive they will), so you'd just be creating unnecessary difficulties for yourself.

 

15. Make a sitemap and link to it from your home page. This will further help Google and the other main search engines find all your pages and monitor updates. There are many ways you can do this, so your best bet is probably to look on Google for the solution that fits your needs. My suggestion is to go for something that updates automatically, or use one of the free online builders or scripts.

 

SEO Web Design

Web design is also important to the search engines. Not how the page looks, but what the code is like underneath. Messy, overly complicated, or plain bad code gives the search engine spiders a hard time crawling your pages.

 

If the spiders (also known as crawlers or bots) can't crawl your pages properly and retrieve all the data they need, the search engines can't rank them properly.

 

Crawlers have very basic text-browsing abilities. It's important to understand that they don't see your website in the same way as IE or Firefox does. To view your page as a bot sees it, use a text browser like Lynx (or use the SE view report in WebCEO).

 

 

Web Design With Search Engines In Mind

 

1. Make sure your HTML code is valid and free from errors. Use the syntax checker in your web page editor, or the free one at W3C. Broken code makes it hard for the spiders to read your page, and can result in information being missed, or the page being skipped altogether if it's really bad. Take this simple scenario: you miss the closing bracket off a paragraph tag, so your code reads "<pMy keyword is here." The search engine might ignore your keyword because it thinks it's part of the tag.

 

2. Have a valid Document Type declaration at the top of your page. The DOCTYPE tells the search engine spider what kind of code it can expect to find in your page. Without the DOCTYPE the crawler is forced to guess. Most of the time it will guess right, but do you really want to leave something this important to chance?

 

Also, if the code has errors there's a greater chance of confusion these days, because web pages now come in two different varieties. Whilst the majority of the web is still in HTML, most new sites are written using XHTML. The DOCTYPE declaration has to be the very first thing on the page.

 

Examples:

 

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

 

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"

"http://www.w3.org/TR/html4/loose.dtd">

 

3. Avoid unnecessary Java or JavaScript, especially near the top of the page. That's not to say you shouldn't use it, just realise large amounts can be a hurdle to search engine crawlers. For example, if you have half a page of JavaScript that needs to go into the HEAD section of your page, put the code into another file saved with a ".js" extension and reference it from your page like this instead:

 

<script language="JavaScript" src="/pathtomy/javascript.js" type="text/javascript"></script>

</head>

 

Most scripts that need to be put in the BODY section will work fine out of the way at the very bottom.

 

4. Try to avoid using images as links for internal linking between pages on your site. Use regular text links with keyword anchor text the search engines can read. If you must use images as links, ensure you put the keyword phrase of the page you are linking to in the ALT attribute of the image tag.

 

Don't use JavaScript links. Spiders can't follow them and there's nowhere to put your target keywords. If you really MUST use this kind of link, repeat the link elsewhere on the page in plain text.

 

If you want fancy button type links, create them using CSS and text links.

 

5. Check for and fix or delete broken links in your pages. Your HTML editor might have a feature to do this, or you can check pages with the free W3C link checker (WebCEO does this plus checks for syntax and other problems). Dead links not only give the spiders a hard time, they indicate to the search engines that the site is not well maintained or up to date, negatively affecting your search engine position.

 

6. Make sure that when a page that doesn't exist is requested, a proper 404 error is returned. A simple check for some kinds of automated spam sites is to request a few made-up, nonsensical page names like "jko548fvn2se.html" and see if errors are returned or not. Also, redirecting errors to your homepage and inadvertently sending out 301 or 302 header codes instead of 404 effectively tells the search engine that all those pages are not missing at all, but have the same duplicate content. Use Rex Swain's HTTP Viewer to check your site returns proper 404 header codes. A 301 redirect should be used instead of a 404 error if a page has simply been moved.

 

7. Although these days dynamic pages are indexed by the major search engines, spiders still have problems with URLs containing too many parameters. You also want to avoid feeding session IDs to bots, and any other parameters that will end up creating different URLs for the same page. Not only will that lead to problems with duplicate content, but it can make your website seem like a bottomless pit to a bot, which might crawl the same pages at different URLs over and over, but miss half of your site altogether.

 

8. Use a robots.txt file at the root of your website to block crawlers from accessing pages that will result in very similar or duplicate content, or which have no valuable (topical) content. Also use robots.txt to prevent search engines indexing anything you don't want public. Errors in this file can prevent spiders from crawling your site, so don't make any.
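A minimal robots.txt along these lines might look like the following. The blocked paths are hypothetical examples of the kinds of sections described above (site-search results, printer-friendly duplicates, an admin area) -- adjust them to your own site layout.

```
# Applies to all crawlers
User-agent: *

# Near-duplicate pages generated by the site's own search
Disallow: /search/

# Printer-friendly duplicates of existing pages
Disallow: /print/

# Nothing here is worth indexing
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```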

 

9. Frames aren't as much of an issue as they used to be, but I'd still avoid using them unless I didn't care about search engine positioning. The problem is that the content of the page you want to rank isn't actually on that page; it's on another page altogether. Whether or not the search engine associates one with the other can be a hit-or-miss affair. Google advises against using frames.

 

 

Off-Page SEO

Off-Page SEO refers to search engine optimisation techniques that aren't carried out on your own website or page, but on other sites.

These days, optimizing only the content of your pages isn't enough to get them to rank highly. To do that you need help from other websites in the form of incoming links, known as backlinks (links back to your site, "back links"). In fact, you'll find that Google won't bother listing your site if it has no backlinks at all.

In essence, off-page SEO is all about getting quality backlinks relevant to your topic that assist the search engines in establishing the value of your page and what it focuses on. You can view each backlink as a vote of approval for your page. The more inbound links a page has, the greater its link popularity. Google's PageRank measures this, but in a complex way that takes many factors into consideration.

 

Notes On Linking Strategies & Increasing PageRank / Link Popularity

1. A single inbound link from a high quality site is worth tens of links from different low quality sites.

2. Linking out to low-value, spammy sites, or those engaged in SEO practices the search engines frown upon, can negatively affect your own website's rankings.

3. One-way links to your website are of far greater value than reciprocal links obtained from engaging in link exchanges.

4. Backlinks from pages covering the same or related topics are far more valuable than those from totally unrelated sites. Have links pointing to the most relevant page on your site, not simply the homepage (known as "deep linking").

5. Links to your pages should have one of the page's keywords in the anchor text. Employ numerous variations on this text if you intend to create a high number of backlinks to a page. This leads to better results and looks more natural to the search engines, avoiding throwing up a red flag for possibly attempting to manipulate the listings.

 

6. Avoid participating in organized link exchanges or link farms. Most of the time this will end up harming rather than helping your rankings. This is because the search engines see it as manipulation of the SERPs (Search Engine Results Pages) and penalize linked websites once they discover the network. Read Google's view on this, and note that even "Excessive reciprocal links or excessive link exchanging" is considered to violate Google's Webmaster Guidelines!

 

7. Don't build backlinks too fast. Hundreds of backlinks appearing for a site in a matter of days send a clear signal to the search engines that you're probably doing something you shouldn't be (in their eyes), simply because it appears so unnatural. Grow your links steadily over time.

8. Links from high-PR sites are good, but don't obsess about getting them. You'll often get as much value from a highly targeted low-PR link as from an untargeted high-PR one. The search engines aren't the only reason to have links, and good links bring traffic themselves. Having said that, if you're on a link-building campaign, unless a site looks particularly good, I wouldn't bother targeting it if it has no PR, i.e., is PR0.

9. Good backlinks with targeted keyword anchor text carry a LOT of weight these days. So much so that if done well, it's possible to rank a page highly for terms that aren't even on it.

10. Backlinks are also the way to get your site found and crawled by the main search engines. I wouldn't bother submitting to the main SE's; it's generally better and faster to get links from websites that Google or Yahoo already values, and which are regularly crawled as a consequence. Let them "discover" your pages themselves, by putting links where you know they'll be found and followed.

 

How To Get Backlinks?

There are lots of ways to get backlinks, although few are quick or easy (tools like SEOelite, WebCEO or Link Assistant help speed this up). Here are some options:

 

 

  • Create content that makes people link to it (often termed "link bait")
  • List your website in directories
  • Write and syndicate articles
  • Get your page mentioned on bookmark sites, Digg.com, etc
  • Create content on Squidoo and Hubpages that links to your page
  • Post comments to topically related blogs
  • Trackback to topically related blogs
  • Syndicate your blog feed to announcement and aggregator sites
  • Use Tags for links from Technorati.com
  • Make forum posts that include your link
  • Exchange links with other webmasters
  • Buy links

 

 

Closing Thoughts

Congratulations! Now you know how to do search engine optimization yourself. Of course, I'd be lying if I didn't admit there ARE easier ways to do SEO.

 

SEO Elite might be the more glamorous, but I think WebCEO is the better choice for the average webmaster (and I say that even though I get a bigger commission for recommending SEO Elite).

 

Quite simply, WebCEO has more useful features, covering more bases. Plus its analysis of your pages and reports on over 130 parameters that affect your rankings help develop a deeper understanding of SEO as you use it, and that's in addition to the valuable free SEO course and certification that's included with WebCEO.

 

Wonderful as these tools are though, you can still do good SEO without them. If you simply follow the steps above, your pages WILL start to get high rankings and quality traffic from the search engines.

As you get more pages ranked, the search engines will value your site more, and it will become easier to get top positioning for more difficult keyword phrases. Naturally, you'll also get better and better at SEO too!

December 2, 2010

On Page SEO Optimization Techniques

Definition:

On Page SEO:

"On Page" SEO simply refers to the text and content on your web site's pages: basically, editing your page and content so the search engine can find your webpage when a surfer is searching for your web site's particular topic.

History:

On Page Search Engine Optimization has been around the longest, since the beginning of search engines. Search engines used simpler, less sophisticated technology a few years ago, and the World Wide Web was a lot smaller, so "On Page" SEO alone worked; ranking was basically an easy comparison. As the World Wide Web grew larger and larger, it became more difficult for search engines to differentiate between your site and other sites. A search on "Autos" may return 100 million+ pages that have the word "Auto" on them. So Off Page SEO began to take off as the World Wide Web and search engines grew in complexity.

On Page Elements:

On Page Elements refer to the HTML tags within the page. They include Heading tags (<h1>), Title tags, Bold tags, and Italic tags on your web page. Below is an example of the phrase "SEO Company" used in a Heading (<h1>), in Bold (<b>), and with Emphasis (<em>):

SEO Company

SEO Company

SEO Company

Notice the difference?

In the HTML Source, The search phrase "SEO Company" Was placed between <h1> tags.

<H1>SEO Company</H1> HTML Tags

In the second version, It was placed between bold tags.

<b>SEO Company</b> HTML tags.

In the third version, it was placed between emphasize tags.

<em>SEO Company</em> HTML tags.

Natural On Page SEO:

Your search phrases should be emphasized in a natural way for both the visitor and the search engine spider. Do not "keyword stuff" your web page by repeating the search phrase over and over again. This will often result in a search engine "penalty" and move your site's ranking lower in the results.
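As a rough sketch, a naturally optimized page for a phrase such as "SEO Company" might look something like this (the company name and wording here are purely illustrative, not a prescription):

```html
<!-- Illustrative sketch only: the phrase appears in the title, one
     heading, and once in emphasized body text -- a handful of natural
     mentions, not dozens of repetitions. -->
<html>
<head>
  <title>Acme SEO Company - Search Engine Optimization Services</title>
</head>
<body>
  <h1>Welcome to Our SEO Company</h1>
  <p>As an established <b>SEO Company</b>, we help your pages rank for
     the phrases your visitors actually search for.</p>
</body>
</html>
```

The point is that the phrase reads naturally to a human visitor while still appearing in the tags the search engine spider weighs most heavily.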

Unethical/Unsavory On Page Techniques:

There are several different techniques known as "black hat" or "unethical" On Page techniques. Some SEO companies engage in these types of activities and should be avoided. Sooner or later the search engines will catch up with these unethical techniques, and the likely result will be your site being demoted or banned from the search engines. We recommend that the following unethical SEO techniques never be used.

Negative On Page SEO Techniques Include:

  • Avoid using "hidden" or invisible text on your page for the purpose of higher search engine placement. For example, the text for the search phrase "Widget" is present in the HTML, but its font color has been set to white on a page whose background is also white. The textual content is actually there, yet the words are "hidden" from the surfer. This is frowned upon by search engines and frequently results in your site being penalized.
  • Avoid using negative div tags. Div tags are division tags; unscrupulous SEO services may insert them into your page with negative x/y coordinates, placing content outside the visible page. The surfer never sees the text, but it is still in the HTML, so the search engine finds the keywords. Again, this is a technique to be avoided and is not recommended under any circumstances.
  • Avoid cloaking and sneaky redirects. Cloaking refers to serving up two different versions of content depending on who is visiting: a regular web surfer is served one page, while a search engine spider is served a different page built specifically for it. The page served to the spider is typically garbled text with no meaning to a human, stuffed with various keywords and search phrases. Again, this technique is not recommended and will likely get your site penalized or banned from search engines.
  • Avoid duplicate content. Duplicate content means you create one web site with content on topic A, and then repeat that same content over and over again on multiple websites. In theory you could create one website, achieve high rankings with it, and then clog up the search engines with the same content duplicated across multiple domains. Again, this is not recommended and should be avoided.
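To make the first two bullets concrete, here is a sketch of what hidden-text and negative div positioning typically look like in a page's HTML source. It is shown only so you can recognize it when auditing a page (for example, one touched by an SEO service you hired); it is an example of what NOT to do:

```html
<!-- Both snippets below are black hat examples -- do not use them. -->

<!-- Hidden text: white text on a white background -->
<body bgcolor="#ffffff">
  <font color="#ffffff">Widget Widget Widget Widget Widget</font>
</body>

<!-- Negative div positioning: the content sits far outside the
     visible page, so the surfer never sees it but it remains in
     the HTML for the spider to index -->
<div style="position: absolute; left: -5000px; top: -5000px;">
  cheap widgets best widget deals widget sale
</div>
```

If you find markup like this on your pages, remove it: search engines detect both patterns and routinely penalize sites that use them.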