SEO and Your Web Site

In: Columns > The $ & Sense of IT

By Alan K’necht

Published on August 4, 2004

We all know the importance of having a Web site rank well in search engine results for specific keywords and phrases. If your Web site doesn’t have a page appearing in the top 10 positions on the search engine results pages (SERPs), the chances of someone clicking on your listing, and actually visiting your site, drop dramatically. If you’re not in the top 20, there is almost no chance that someone will scan that far through the SERPs to find your page.

Optimizing your site and content for better rankings in the SERPs is known as Search Engine Optimization (SEO), yet many Web developers/designers either don’t take the time to code a site properly or don’t know how to do proper SEO. The basics of code optimization are just sound HTML coding practices; when followed, they go a long way toward SEO.

There is a lot you can do to optimize your Web site for search engines from the code level. Where you can also affect things, and this is beyond the work of the developer/designer, is in the actual content. Understanding how to tag the content, and where to place it in the HTML, is critical. Here is a basic outline of SEO best practices.

Understand the Search Engines and Search Engine Spiders

So how does your site get into a search engine? A search engine obtains your URL either by you submitting your site directly to the search engine or by others linking to your site. Then, at a time of its choosing, a search engine sends out its spider (or “bot”) to visit your site.

Once there, the spider starts reading all the text in the body of the page, including markup elements, all links to other pages and to external sites, plus elements from the page head including some meta tags (depending on the search engine) and the title tag.

It then copies this information back to its central database for indexing at a later date, which can be up to two or three months afterward.

The spider then follows the links on the page, repeating the same process. Spiders are, for lack of a better term, dumb. They can only follow the most basic HTML code. If you’ve encased a link in a fancy JavaScript that the spider won’t understand, the spider will simply ignore both the JavaScript and the link. The same thing applies to forms; spiders can’t fill out forms and click “submit.”

To get an understanding of what a spider sees, try accessing your site with the Lynx browser from a Unix server. Lynx is non-graphical, does not support JavaScript, and will display only text and regular <a href> links. This is what the spider can see and therefore index. Does your page work without graphics or JavaScript? If not, then the spidering won’t work either, and you’d better head back to the drawing board.

Once the SE has all your content in its database, it runs an algorithm (a mathematical formula) against the content. These algorithms are unique to each SE and are constantly changing, but, in essence, all the search engines are looking for the important words on your page (based on word density—how often a word or phrase is used in relation to the total amount of text) and they assign a value to these words based on the code surrounding the words.

In addition to content, the search engine looks for what other sites, or pages on the same site, are linking to that page. The more links to a given page, the more important that page is. Getting other sites to link to your site is very important, but not part of optimizing your site and will be covered in a future column. From a site optimization standpoint, make sure you link to your important pages from more than just the index page (e.g., create a primary navigation that appears on all pages.)

Tip 1

The first rule of SEO is not to design your site in such a way that the code prevents a spider from indexing it. This means avoiding pages that are 100% graphics with no text, such as all-image or Flash-only pages. Furthermore, if the first thing a user encounters is a log-in page before being able to see the site’s content, then that’s all a spider will see, and it won’t go any further, either.

If you’re planning to build a Web site entirely in Flash, DON’T. If you have no choice, then read my previous column, Search Engine Optimization and Non-HTML Sites.

Tip 2

To find out what a spider sees on your site, run a spider simulator on a given page. The simulator will show you what text the spider sees and what links it finds. There are many good ones on the market at various prices. If you’re looking for something that’s free, I’d suggest Search Engine Spider Simulator.

Tip 3

Each Web site should have a file called robots.txt. This file tells the spiders what directories they should not spider. Make sure this file is present and that it gives the appropriate permissions to the spiders. This includes access to content and to CSS.
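As a minimal sketch (the directory name is hypothetical), a robots.txt that welcomes all spiders but keeps them out of a single private directory looks like this:

```
# Allow all spiders; block only the /private/ directory
User-agent: *
Disallow: /private/
```

An empty Disallow: line grants access to everything; double-check that you haven’t accidentally blocked the directories holding your content or your CSS.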

For more information on the robots.txt file, see: Guide to the Robots Exclusion Protocol.

Page Structure

Once you’ve built an SE-friendly Web site, you then need to be sure each page is also SE-friendly. As I said earlier, good HTML structure is the foundation for building an SEO-friendly Web page. There are two primary areas of a Web page: the area contained between the <head></head> tags and the area contained between the <body></body> tags. What information you place in these areas has a huge impact on how a page is indexed and, to a certain degree, what will appear in the SERPs.

When designing your page, or placing content on your page, remember that spiders read like people: they go from left to right and from top to bottom (though this may differ for other languages). They also assume that the most important information is located at the top of the page; if it’s important, why would you place it at the bottom? When reading specific tags (title, h1, h2, etc.), search engines value words on the left more highly than words on the right.

The Title Tag

Let’s start with one of the first elements in a Web page: the title tag (<title></title>). This is one of the most important, if not the most important, SEO elements on the entire page. All too often, the information contained in this tag is left blank, contains a default value (e.g., “insert title here”), or is simply the company name.

Why is this tag so important? First, it is used by every major search engine as a key indicator of the page’s content, and, second, it is used by the search engine as the first line of your listing in the SERPs.

Give this tag the consideration it deserves.

Tip 4

Determine the main topic of the page and use it as the title. A page about high-performance running shoes from manufacturer XYZ shouldn’t have the title “XYZ”—it should have a title something like “High-performance Running Shoes.” If the brand is important, then add it to the end of the line like this: “High Performance Running Shoes – XYZ.”
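Continuing with the hypothetical XYZ example, the markup would be:

```html
<head>
  <!-- Topic first, brand name last -->
  <title>High Performance Running Shoes - XYZ</title>
</head>
```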

The Meta Tags

Over the years, various meta tags have come into and gone out of favor with search engines. One that has lost its value is the “keywords” meta tag. Most search engines say they no longer look at it, but if you have time to create one, go ahead and do so; it doesn’t hurt.

The only meta tag that all search engines presently acknowledge is the “description” meta tag. Once again, this tag should be unique to each page and match the content on the page itself.

The proper format for the description meta tag is, for example:

<meta name="description" content="High-performance running shoes for men and women.">

Tip 5

Write a unique description for each page. If you use the same meta tag across all pages, the search engine will pick up on this and potentially ignore the content of the meta tag or possibly the entire page.


We’re all familiar with loading the top of the HTML page with all sorts of JavaScript functions needed for various page features, including, but not limited to, mouseovers, form validators and cookie checkers. To search engine spiders, this is clutter, and, while they ignore it, they still need to wade through all that code to find the real content of the page. Many spiders have timeouts or maximum character counts; if they have to wade through too much junk, they’ll abandon their spidering and move on to another site. So avoid making your pages too top-heavy by placing too much code between the <head> tags.

Tip 6

Put all your JavaScripts in external files and link to them. You’ll be creating an SE-friendly page while also making your markup cleaner and your Web site management easier.
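As a sketch (the file name is hypothetical), a head full of inline script functions shrinks to a single line:

```html
<head>
  <title>High Performance Running Shoes - XYZ</title>
  <!-- One line replaces dozens of lines of inline JavaScript -->
  <script type="text/javascript" src="/scripts/rollovers.js"></script>
</head>
```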

The Page Body

This is the part of the Web page that your visitors will be seeing and yes, you can make pages both eye-pleasing and, at the same time, well-optimized for search engines.

Page Headings and Other Word Graphics

For stylistic reasons, many of us have chosen to display page headings as graphics. By turning to our favorite graphical editor, to create unique and creative headings, we’ve removed important words from our Web pages.

Your Web site users may not really care that it took you four hours to create a groovy page heading that says “Yellow Widgets.” They just want to know that they’re on a page about Yellow Widgets.

From the perspective of a search engine spider, the graphic about yellow widgets is just a graphic, and spiders won’t read it. One option is to fill in the “alt” attribute of the “img” tag with the actual words. However, search engines give very little value, if any, to “alt” content these days. The attribute is still a requirement for accessibility, but it won’t do much toward getting your page ranked well in a search engine.

The same thing applies to all those great keywords on your site that form your site navigation menu. Perhaps you’ve created graphics of the words for a mouseover effect, but, once again, they’re graphics, and a spider couldn’t care less about them.

Instead of spending all that time creating graphics of words, use real text. They are words, after all. If you must use graphics, consider a form of CSS image replacement; the spider should still be able to access the text of your heading.
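One well-known sketch of CSS image replacement (the id, dimensions and image file are all hypothetical) keeps real text in the heading and swaps in the graphic with CSS:

```html
<style type="text/css">
  h1#masthead {
    width: 300px;
    height: 50px;
    background: url(yellow-widgets.gif) no-repeat;
  }
  /* Hide the text; the background image displays in its place */
  h1#masthead span { display: none; }
</style>

<h1 id="masthead"><span>Yellow Widgets</span></h1>
```

Be aware that some screen readers skip text hidden with display: none, so weigh the accessibility trade-off before using this technique.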

Tip 7—Page Titles

Search engines love content that appears in header tags (h1, h2, etc.), yet very few Web sites actually use them. Their original intention was to be the visible title of the page (long before Web browsers actually supported graphics), with the primary title using h1 and subsections of the page encased in h2 tags, and so forth. In the early days of Web design, we had little to no control over these elements, and they simply appeared as big black text on the page. This all changed with the introduction of Cascading Style Sheets (CSS).

Take time to define your header tags in your CSS and use the header tag for the titles and secondary titles of your content.


<h1>Page Title</h1>

To avoid spamming search engines, a Web page should have only one h1 tag. It can have as many h2 tags as necessary.
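A sketch of defining the header tags in CSS so they fit your design (the fonts and colors are arbitrary):

```html
<style type="text/css">
  h1 { font: bold 22px Georgia, serif; color: #336699; margin-bottom: 10px; }
  h2 { font: bold 16px Georgia, serif; color: #336699; }
</style>

<h1>High Performance Running Shoes</h1>
<h2>Trail Models</h2>
```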

Tip 8—Mouseovers

Instead of spending all that time creating mouseovers, try using the hover feature of CSS.

If, for specific reasons, you can’t use CSS (perhaps you must support really old browsers), then repeat the menu options at the bottom of the page with plain text.
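A minimal sketch of a CSS rollover (the class name and colors are arbitrary), with no JavaScript or graphics required:

```html
<style type="text/css">
  a.nav { color: #336699; text-decoration: none; }
  a.nav:hover { color: #cc0000; text-decoration: underline; }
</style>

<a class="nav" href="shoes.html">Running Shoes</a>
```

Because the link is plain text in a regular a href tag, the spider reads both the keywords and the link.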


Graphic designers love using tables to slice and dice a graphical design for use on the Web. Unfortunately, many of these designers never really understood that the Web is not a printed page and that designs should be easy to code into Web pages.

The problem with tables is that all the slicing and dicing can create Web pages containing tables nested four or more levels deep to accommodate the design, with all the good content ending up inside the innermost nested tables.

From a technical perspective, search engine spiders can read tables, and even embedded tables, but once a design gets to be more than about three tables deep, most spiders run into problems. Either it’s simply too much code for them to keep track of, or the search engine thinks you placed that content deep in the page because it’s not important, and so the engine gives it little or no value.

Update: Correctly formatted tables can be an excellent way to land featured snippets (also known as “position zero”), which are sections of content Google extracts from your website and displays directly in the search results. In most cases, the winning format is a table, or an h2 heading followed by four to seven bullet points or numbered items (whether to use an ordered or unordered list depends on what Google is displaying for the snippet). According to Kyle Sanders at CWR SEO, “From our testing, we typically see websites need to rank organically at position 5 or higher to land the featured snippet. In many cases, they’ll test or audition certain sections of content and display the results that receive the most clicks. Format your content accordingly and tweak if it’s not working. HubSpot has a decent guide for beginners, and SEMrush published a study last year that’s definitely worth reading.”

Tip 9

Avoid unnecessary tables where possible. Limit your table embedding to a depth of three.

Where possible, avoid tables for layout altogether and start using XHTML with div tags and CSS for positioning. This makes for a much cleaner design and has the bonus of being easier to manage.
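As a sketch of the idea (the ids, widths and file names are all hypothetical), two divs and a few lines of CSS can replace a nest of layout tables:

```html
<style type="text/css">
  /* Navigation column on the left, content beside it */
  #nav  { position: absolute; top: 0; left: 0; width: 150px; }
  #main { margin-left: 160px; }
</style>

<div id="nav">
  <a href="index.html">Home</a>
  <a href="shoes.html">Running Shoes</a>
</div>
<div id="main">
  <h1>High Performance Running Shoes</h1>
  <p>Content the spider reaches without wading through nested tables.</p>
</div>
```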

Using Bold and Strong

If there is an important phrase in your content, be sure to tag it appropriately. This is good for the user experience—and since you’re telling your users that the words are important, the search engines are likely to think the same way.

Tip 10

Use either <b></b> or <strong></strong> to mark up important words on your page. While most people use bold (<b></b>), according to the W3C the correct markup for important words is <strong></strong>.
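For example:

```html
<p>Our <strong>high-performance running shoes</strong> are built for marathon training.</p>
```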


By following this basic outline, you will have created search engine-friendly pages. Your pages will be easily indexed by the search engine spiders, and, with important words and phrases appropriately tagged, those words will receive proper valuation by the search engines. All that’s left is to identify the appropriate words in the Web site copy and to find out if they are the words people actually search for, and then to develop an appropriate linking strategy. Those are lessons for another day.

Related Topics: HTML, Search Engine Optimization (SEO), Search, Content

Alan K’necht operates K’nechtology Inc., a search engine optimization and marketing and web development company. He is also a freelance writer, project manager, and accomplished speaker at conferences throughout the world. When he’s not busy working, he can be found chasing his small children or trying to catch some wind while windsurfing or ice/snow sailing.