Toronto Search Engine Strategies Conference

By Rudy Limeback

Published on May 25, 2005

Earlier this month, JupiterMedia’s two-day Search Engine Strategies Conference came to Toronto, giving me a “don’t miss” opportunity to learn the latest in search engine optimization without incurring travel expenses (always attractive to a freelancer). For me, it was a valuable conference and a great experience.

My Web development background began with writing HTML for Netscape 1, and I’ve used search engines since the first days of Lycos, Excite and AltaVista. Even if you haven’t been on the Web as long, you’ve likely collected search engine information and advice—perhaps some of it conflicting—on how to get good search engine results for your Web sites.

But how much of what we’ve picked up over the years is really useful today? Search engines have certainly evolved; have we? My objective in attending this conference was to catch up on the “state of the art” of search engine optimization (SEO), as well as to review the conference for Digital Web Magazine readers, so that those of you who, like me, aren’t search engine specialists can decide whether to go. (The conference is presented in several cities around the world. Check the Search Engine Strategies site for details.)

The first thing you need to do when entering the search engine marketing (SEM) world is learn the jargon. The topic itself is usually broken down into two components:

  • SEO, which deals with steps you must take to make your site “search engine friendly,”
  • Pay-per-click (PPC), which covers all aspects of paid listings.

You should also know about search engine results pages (SERPs), and “organic” listings, which are listings the search engines find by themselves, as opposed to paid listings, which are usually shown separately from organic listings on the SERPs. Finally, PageRank (named after Larry Page, a Google founder) indicates the importance of a page. PageRank has gotten a lot of attention in recent years, but according to at least one speaker at the conference, it is losing its relevance.
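
If you haven’t run into the math behind PageRank, the original algorithm is simple enough to sketch in a few lines. What follows illustrates only the published formulation, not how Google ranks pages today; the tiny link graph and the damping factor are made up for the example.

```python
# Minimal PageRank sketch: power iteration over a tiny, made-up link graph.
# links maps each page to the pages it links out to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

damping = 0.85  # probability the "random surfer" follows a link
ranks = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_ranks = {}
    for page in links:
        # Sum the rank contributed by every page that links here,
        # divided by how many outgoing links each of those pages has.
        incoming = sum(
            ranks[other] / len(outgoing)
            for other, outgoing in links.items()
            if page in outgoing
        )
        new_ranks[page] = (1 - damping) / len(links) + damping * incoming
    ranks = new_ranks

for page, rank in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {rank:.3f}")
```

Page “c” comes out on top here simply because three pages link to it, which is the whole idea: importance flows along links.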

SEO Strategies

The individual sessions provided numerous handy tips and insights. Here are samples from my notes:

  • Your primary audience is human. Search engines should receive secondary consideration. Fortunately, SEO will make your site friendly not only to search engines, but also to your human visitors.
  • Search engine robots look for two things: text and links. Obviously, if your text is not relevant and your links aren’t usable by the robot, your site is not going to show up very high in the SERPs.
  • Be honest. Don’t hide text, don’t hide links, don’t misuse alt text, don’t stuff keywords, don’t pad pages with irrelevant text, and don’t use link farms.
  • “People will click up to 25 times as long as they think they are making progress.”
    Shari Thurow, GrantasticDesigns.com
  • Search engines will index only the text on your page. To see what they see, highlight your entire page and paste it into a text editor (or automate the check; see the first sketch after this list).
  • “[SEO] is not about the keywords you want to be found on, it’s about the keywords your users use to find you.”
    Christine Churchill, KeyRelevance
  • Mine your log files. Infrequently used search queries are a great source of ideas for improving your site (see the second sketch after this list).
  • Use keyword tools like Wordtracker, NicheBOT, and Trellian, but be aware that some results may be skewed by automated rank-checking queries.
  • Search engines measure popularity: both the number and the quality of the links pointing to your site matter, with quality carrying far more weight than quantity. Getting listed in directories such as DMOZ is the quickest way to earn a high-quality link.
  • Choose keywords carefully. Select no more than three phrases per page, avoid single words, and base your choices on popularity, relevance, and user intent. User intent is important, because these are the keywords people are actually using to find what they want, and not necessarily the keywords you would choose to describe your own content.
  • Submit pages to the search engines’ add URL pages—once. Be very careful with automated submission programs.
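
If you’d rather script Thurow’s copy-and-paste test than do it by hand, a short program using only Python’s standard library can reduce a page to its indexable text. The URL below is a placeholder, and this is a rough sketch only; a real robot also weighs markup such as title tags and headings.

```python
# Minimal sketch: reduce a page to the plain text a search engine robot indexes.
from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip_depth = 0  # nonzero while inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())

html = urlopen("https://example.com/").read().decode("utf-8", "replace")
parser = TextExtractor()
parser.feed(html)
print("\n".join(parser.chunks))  # what the robot "sees"
```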
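
The log-mining tip is just as easy to automate. Search engines pass the visitor’s query in the HTTP referrer (for example, …search?q=blue+widgets), so a script can tally queries straight from an Apache combined-format access log. The log file name and the q parameter name are assumptions you’d adapt to your own server.

```python
# Sketch: tally search queries found in the referrer URLs of an Apache
# combined-format access log, then list the infrequently used ones.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Combined log format ends with: "referrer" "user-agent"
referrer_re = re.compile(r'"([^"]*)" "[^"]*"$')

queries = Counter()
with open("access.log") as log:  # path is an assumption
    for line in log:
        match = referrer_re.search(line.strip())
        if not match:
            continue
        params = parse_qs(urlparse(match.group(1)).query)
        for query in params.get("q", []):  # "q" is the common query parameter
            queries[query.lower()] += 1

# Infrequently used queries are a great source of ideas for new content.
for query, count in queries.most_common():
    if count <= 2:
        print(count, query)
```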

Future Directions

The major search engines are becoming smarter and their algorithms more intelligent. Google recently became an accredited domain name registrar, giving it access to the domain name registration database. Google can now capture the following information:

  • Length of the current domain registration (number of years)
  • Names and addresses of site owners, administrators, and technical contacts
  • Hosting company
  • How frequently this data changes

Combined with data obtained by the Googlebot (number of pages on the site, content, and links to and from those pages), this domain information could be quite revealing. Authoritative, stable domains are often paid for several years in advance, while “doorway” or “throwaway” domains are typically registered for a year or less. Presumably, Google could draw conclusions about what a site is actually being used for, and rank it higher based on its stability. In fact, Google has filed a patent application for this kind of historical analysis, details of which you can read in the article Google’s Patent: Information Retrieval Based on Historical Data.
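
To make the registration-length signal concrete, here is a rough sketch that pulls a domain’s expiry date from public WHOIS data. It shells out to the standard Unix whois client; the “Registry Expiry Date” field is what .com records expose, so treat the pattern as an assumption to adjust for other registries.

```python
# Sketch: estimate how far in advance a domain's registration is paid up,
# one of the signals the historical-data patent describes.
import re
import subprocess
from datetime import datetime, timezone

def years_until_expiry(domain):
    # Requires the standard Unix whois client on PATH.
    output = subprocess.run(
        ["whois", domain], capture_output=True, text=True
    ).stdout
    # .com records expose e.g. "Registry Expiry Date: 2030-01-01T00:00:00Z".
    match = re.search(r"Registry Expiry Date:\s*(\S+)", output)
    if not match:
        return None
    expiry = datetime.fromisoformat(match.group(1).replace("Z", "+00:00"))
    return (expiry - datetime.now(timezone.utc)).days / 365.25

print(years_until_expiry("example.com"))
```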

Case Study

The presentation by National Instruments offered a fascinating account of SEO in practice. Because most sites have competitors, not only for products and services but also for the same keyword space, maintaining a high ranking for your chosen keywords over the long term takes work. And that’s the problem. To compete with the “algorithm chasers” as they tweak their SEO techniques and occasionally overtake you in the rankings, you have to expend time and energy. You have to become an expert on what works in which search engines. And when the search engines change their algorithms, you have to change your techniques and start over.

The strategy National Instruments adopted was not to play that game. Instead, they decided to treat all search engines as black boxes and concentrate on creating the most relevant content. They implemented an in-house enterprise content grading system based on the FAST engine. Authors place their chosen keywords into META tags, submit pages to the grading system, and receive a relevancy score. If the score is unacceptable, authors refine the content and resubmit until the score reaches the desired level, and only then publish the page to the site.
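
National Instruments’ grader was built on the commercial FAST engine, so the scoring below is not theirs. This is only a toy sketch of the same submit-score-refine loop, with a naive phrase-frequency score standing in for a real relevancy engine and a made-up acceptance threshold.

```python
# Sketch of a submit-score-refine content grading loop. The scoring is
# naive phrase frequency, standing in for a real relevancy engine.
import re

def relevancy_score(text, keywords):
    """Fraction of words in the text that belong to a chosen keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = 0
    for phrase in keywords:
        parts = phrase.lower().split()
        # Count occurrences of the whole phrase, words in order.
        for i in range(len(words) - len(parts) + 1):
            if words[i:i + len(parts)] == parts:
                hits += len(parts)
    return hits / len(words)

THRESHOLD = 0.05  # made-up acceptance level

draft = "Data acquisition hardware connects sensors to your PC..."
keywords = ["data acquisition"]  # taken from the page's META keywords

score = relevancy_score(draft, keywords)
if score >= THRESHOLD:
    print(f"score {score:.2f}: ready to publish")
else:
    print(f"score {score:.2f}: refine the copy and resubmit")
```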

The benefit? Significant cost savings from not having to train authors to be SEO specialists, as well as pages that consistently rank higher than competitors’ pages for the chosen keywords, simply because the content has been optimized for those keywords.

Conclusion

The best part of attending a conference like this is the swag. Uh oh, did I just say that out loud?

In addition to the information-packed sessions, a key benefit of attending a conference is the opportunity to meet peers and presenters informally. After presentations, between sessions, at coffee break and lunch times, and—if you’re friendly enough—in the evening of the first day, you can compare notes and ideas, ask questions, and generally get a feel for where things are going. There’s a lot of opportunity, and let’s face it, it’s more fun than digging up the same information, via, um, you know, search engines…

The Search Engine Strategies Conference gets my recommendation.



Rudy Limeback is Digital Web Magazine’s Database Consultant and was Technical Editor for about six years, responsible for posting articles to the site using valid HTML.