SEO Basics

SEO (Search Engine Optimization) is one of the most cost-effective sources of traffic available to you. However, I rarely hear of anyone hiring SEO talent who was happy with the return on investment. The key to successful SEO, for most webmasters, is to do it yourself. Spend a few hours of your time learning how to incorporate SEO into your pages and you'll get the benefits without the expense. This page explains the basics and is all you need to get started.


If you're young and impatient, you can start with this simple 5-step recipe: Search Engine Optimization in a Nutshell

Search Engines vs Directories

Search engines and directories are often
confused. It is important to understand the differences.

Search Engines use robots
Search engines use web crawling robots and computer algorithms to figure out how to list sites. These algorithms rely on keyword density and placement calculations as well as analysis of "link popularity" to determine your rank. Google is a popular search engine.
Directories use people
Directories have human editors who categorize your site and write its link title and description. A listing on a major directory will influence your link popularity significantly and increase your rank on search engines. Yahoo is the largest directory.
Search engines and directories both supply targeted traffic because the surfer usually arrives at your site after doing a keyword search. If you are optimizing your site correctly, this traffic is more likely to buy your product.

Overview of Search Engine Submission

Search engine submission is a three
step process:

1. Design your page to include search engine optimization
Read this page for details
2. Submit your pages to major directories
For link popularity and traffic
3. Submit pages to search engines or just wait for them to find you
Many people believe strongly in submitting their pages to search engines. Others believe that pages are ranked better if spiders find them on their own. I don't submit pages and I've never been hurt by it.

Basics of Search Engine Ranking

Designing pages that rank highly is part art, part technology. Before you can begin you need information on what the search engines want to see.  

Most search engines consider the page title and the visible text on the page to be the most important "on page" ranking criteria. (Off-page ranking criteria are mostly about inbound links, or PageRank.) More weight is given to headlines and to the first few words. With that in mind, be sure to put your keywords in the title, headline, and body of the page.

Keyword Research

Experts spend more time on keyword research than any other aspect of search engine optimization. A good keyword research tool will tell you exactly what words and phrases surfers are actually using. Overture (formerly GoTo) provides keyword research data free of charge, but its data is skewed because it ignores plurals and spaces. It's a good start though. WordSpot sells superior data, but at a high cost: their "WordSpot Now" Keyword and Keyphrase Report gives you 100 reports for $425. To save money I use Overture to get a feel for the market, then WordSpot to get the details.


Titles

Titles are extremely important for success with most search engines. A title like "Charlie's cool site" may be fun, but "Boston Golf and Caddie Shop" will get 200 times the search engine hits. Usability hint: the text in your title is saved as a bookmark and should make sense to the surfer when seen there or on a search engine results page. It also helps people figure out where they are on your site. A well-designed title has relevant keywords but is also a navigation aid.
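As a sketch of the point above, a keyword-rich title lives in the head section of your page. The shop name comes from the example above; the "Club Repair" page topic is invented here for illustration:

```html
<head>
  <!-- keywords first; the title doubles as a bookmark and a navigation aid -->
  <title>Boston Golf and Caddie Shop - Club Repair</title>
</head>
```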

META tags

There are many meta tags, but most of them are useless and can even hurt your rank. You only need one META tag: the meta description tag. For an example of meta tag usage, view the source of this web page to see how I have used them.

  • Meta tags belong in the header
    section of your web page (between the <head> and </head> tags).
  • Use all lowercase letters! (A lowercase keyword like golf
    matches both golf and Golf, but GOLF matches only GOLF, not golf.)
The meta description tag allows you to control how a search engine describes your site. Without a META description tag the search engine will display the site title followed by the first 200 characters of visible text on the page. Since your first 200 characters probably won't be the best description of your site, it's much better to tell the search engine what you want it to say. Be sure your description contains keywords and is appealing to the surfer. Remember, this will often be the first thing the surfer sees when deciding whether to click through to your site.
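A minimal sketch of a META description tag, with hypothetical content (note the lowercase tag names, per the rules above, and the keyword-rich, surfer-friendly wording):

```html
<head>
  <title>Boston Golf and Caddie Shop</title>
  <!-- what the search engine shows under your title on its results page -->
  <meta name="description" content="Discount golf clubs, bags, and caddie services in Boston. Visit our pro shop for club repair and custom fitting.">
</head>
```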

The meta keyword tag is now ignored by many search engines and some engines even reduce your rank for using it. I use it because I'm an old timer but it is safe to omit this tag.

The meta robots tag is useful if you want to prevent search engines from indexing individual pages.
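For example, placing this tag in a page's head section asks spiders not to index that page (noindex) and not to follow its links (nofollow):

```html
<meta name="robots" content="noindex, nofollow">
```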

Remember that nothing in a META tag is visible to a surfer on your page (unless he views your source). The description and keywords you put in META tags are only used by search engines. The only time a surfer sees the content of your META tags is when he sees your listing on the search engine's results page.

Some Rules of Thumb
  • Do not use "text same color as background." Invisible text will get your page banned.
  • Do not repeat a key word or phrase more than 3 times per tag. It's considered "spamdexing" and can get you banned. (Some optimizers believe that up to 7 repetitions is ok.)
  • On link popularity: The best links are text links that have your keywords in them. Links from sites that
    are listed in Yahoo and other large directories are the most valuable. If you run a site for webmasters please exchange links with us.
  • If you manage to get one of your pages into the top 20, leave it alone. Additional tweaking of a successful
    page is much more likely to decrease its rank than raise it.
  • "Keyword density" is a weighted factor on many search engines. Density can be too high: many search engines set an upper threshold to eliminate spam sites. 1-3% keyword density is generally good, although some search engines will favor pages with a density as
    high as 7%. Study the individual search engines.

Search Engine Submission

Many experts do not submit their pages

An AltaVista engineer recently said "99% of all submissions are spam." Given that, many people believe that allowing a search engine to find your site with its spider is far better than submitting the page. This is the philosophy I use, and I have many high rankings for competitive terms. The trick is to make sure your site is listed in Yahoo. If it is, you'll be indexed without submission. But if you insist on submitting your pages, read on:

Your first shot is your best shot at submitting your site, so give some thought to your site title, keywords, and description before you submit your site the first time. Use the Search Engine Watch list of the major search engines to submit your page.

Do not use a search engine submission service. Manual submission may take longer, but it's much more reliable and you will know immediately if something goes wrong. Do not submit your pages often, and expect most search engines to take weeks to add your site.


Search Engine Optimization

Advanced optimization is only for people with good technical skills. Advanced optimization techniques change
constantly and take two forms, depending on where you get your data:

1. Public data
Becoming a search engine expert with public data is a full time job at first, but after a few months of hard work it only takes a few hours per week to stay on top of the subject.
2. Proprietary data
Proprietary data is derived by reverse engineering the search engines. Reverse engineering the engines is an extremely time consuming and technical endeavor and not something to undertake casually. The advantage is that you will be in a small, elite group of optimizers who know things the public doesn't know. (This is how Planet Ocean gets their data.) That's a huge advantage if SEO is your primary method of getting traffic. Reverse engineering is a complex, fast changing subject which is beyond the scope of these pages.

Worth Mentioning:


Cloaking software presents the search engines with one page while serving a different page to surfers.

Generally, cloaking is done by sites in highly competitive markets, such as adult entertainment or gambling, or by sites targeting valuable terms such as "insurance." Cloaking does two things for you: it prevents competitors from ripping off your highly optimized source code, and it allows you to create a separate page for surfers which is both visually appealing and optimized for selling your products rather than for ranking your site in the search engines.

Warning: Search engines may ban cloaked sites. Some engines will tolerate cloaking so long as the subject of the cloaked page and the surfer version are the same; other engines will ban you for cloaking, period. If you cloak, you're playing very close to the fire.

If you're going to cloak, you must use quality cloaking software that includes a constantly updated database of search engine spiders. Beyond Engineering offers such a service.

Further Reading

For more information: after you learn the basics, one of the best sources of breaking news is Search Engine Forums.