Danny Sullivan and Avi Rappoport
Published on July 16, 2002
Search engine listings and submissions
Digital Web: What are the basic essentials for submitting to search engines?
DS: You should submit your site to the key search engines that offer free submit forms, consider the important ones that require mandatory fees and build a site to be “search engine friendly,” so as to capture traffic from search engines “naturally.” These are fully described in “Essentials of Search Engine Submission.”
Digital Web: At the least, what should designers understand about how search engines work?
DS: They need to understand that the way they build a site can have a severe impact on whether crawler-based search engines will send that site free traffic. Build a graphics-heavy site complete with a splash page done in Flash, frames, and little HTML text, and you’ve constructed a search engine’s nightmare. The site will be rendered almost invisible to search engines. In contrast, it is possible to make attractive yet textual sites that bring in search engine traffic while remaining pleasing to humans.
Digital Web: In an article on SearchEngineWatch.com, you recommend avoiding search engine spamming. What is search engine spamming and why is it not recommended other than the fact that it’s called “spam?”
DS: There are many things considered to be spam, such as excessively repeating a word many times in a row to try to boost the relevancy of a page for that word. Rather than try to list everything, there’s another way to avoid problems: if you are making a change solely for a search engine, one that won’t benefit a human, you might be stumbling into spam. For example, changing the title of a web page helps both humans and crawler-based search engines. Inserting “invisible” text at the bottom of the page, which humans can’t see, is done solely for spiders and is thus likely to be seen as spam.
Web design in searching
Digital Web: You said, “Meta tags are not a solution.” Why is this the case? How should meta tags be used?
DS: You can provide keywords and descriptions in meta tags on pages that for various reasons lack text, such as splash and frames pages. They might boost a page’s relevancy. However, simply including a meta tag doesn’t guarantee a page’s leap to the top of every search engine listing. They are useful, but not a magic solution.
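As an illustration (a hypothetical example, not from the interview), the two meta tags discussed here would sit in a page’s head section like this:

```html
<head>
  <title>Stamp Collecting for Beginners</title>
  <!-- Some engines display this as the listing's summary text -->
  <meta name="description"
        content="A beginner's guide to stamp collecting: supplies, catalogs, and how to value your first stamps.">
  <!-- Largely discounted by major crawlers, but harmless to include -->
  <meta name="keywords"
        content="stamp collecting, philately, stamps">
</head>
```

Note that the description tag is the more useful of the two, since it can supply the summary text for an otherwise text-poor splash or frameset page.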
Digital Web: Users often don’t get the results they want when searching. Web designers can’t make them better searchers, but they can help them get better results. How can designers improve the search tool?
AR: If the search engine is rigid and requires an exact phrase, SQL, or Boolean operators to find a match, get rid of it! Everyone’s used to nice, flexible Web search, like Google and AltaVista, so something complex is not going to fly. Other common problems include search tools that don’t recognize the semi-structured nature of Web pages: if they don’t give extra weight to phrase matches in the title, they’re just not paying attention to the real world.
Cleaning up the contents of web sites can also help a lot: make sure that the pages are accessible to the indexer, that they all have unique names, and that the meta description tag is filled in, especially if you have lots of navigation text.
Digital Web: Designers question whether there really are users who always search when arriving at a site, yet they spend enormous amounts of time trying to perfect site search capabilities. What do the stats say about how frequently users search?
AR: Jakob Nielsen did a study and found that about 50% of users at that time went directly to the search field–that’s a lot! I think that it’s not so much the users–it’s context. If I go to find something specific, I go to a search box. When I’m browsing, I click links. Everyone does a bit of one and a bit of the other. It’s worth it to clean up search, to make it reasonable and helpful and mostly good, but I would never suggest that any designer spend hundreds of hours trying to “perfect” search. You can never entirely anticipate what’s in site visitors’ brains.
DS: There have been some interesting studies by Jared Spool who suggests that sites without any search at all can be successful if they have good navigation. Usability expert Jakob Nielsen would probably disagree, as he sees site search as essential. I, too, think having search is important. People do expect it and I think it can be helpful. But I would agree with Spool that you don’t want to depend solely on it.
Searchtools.com shows how to approach the development of site search capabilities. It also offers designers advice on how to be more effective in helping visitors find what they want.
AR: The factors you mention are mainly issues when setting up robots to follow links and find pages to index. This matters both for local search tools and for web-wide public search engines like Lycos or Excite, and also for low-end browsers, such as those on slow dial-up connections, and for visually impaired people using speaking browsers. So it’s worth fixing all the way around. It’s pretty easy to set up an alternate <noscript> tag section with links and text–if you do that, everyone can read the words and traverse the site.
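A minimal sketch of that alternate section (a hypothetical example; the file name and links are invented for illustration): the plain-text links inside <noscript> are readable by spiders and by browsers with scripting turned off, while script-capable browsers get the scripted menu.

```html
<!-- Scripted navigation for capable browsers -->
<script src="menu.js" type="text/javascript"></script>

<!-- Plain HTML fallback that robots and low-end browsers can follow -->
<noscript>
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/support.html">Support</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</noscript>
```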
DS: It is true. The design of the page can help it come up for particular terms.
Tips for structuring a Web document:
- Put target keywords in the page title–most important.
- Put keywords “high” on the page and in the first paragraphs, if possible.
- Watch out for tables, which can “push” text further down the page, making keywords less relevant because they appear lower on the page.
- Use HTML text whenever possible.
- Consider “expanding” your text references–for example: don’t use just “collecting” when it’s a page about stamp collecting. Use both words, if possible.
- Avoid having only image map links from the home page to inside pages; include HTML links as well.
- Work around frames, because major search engines cannot follow frame links.
- Avoid symbols in URLs, especially the ? symbol, because search engines choke on it.
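Pulling those tips together, a crawler-friendly page might be skeletoned like this (a hypothetical example; the topic, file names, and text are invented for illustration):

```html
<html>
<head>
  <!-- Target keywords in the title: the most important placement -->
  <title>Stamp Collecting: Rare Stamp Values and Catalogs</title>
</head>
<body>
  <!-- Keywords "high" on the page, in the first heading and paragraph -->
  <h1>Stamp Collecting Basics</h1>
  <p>Stamp collecting starts with a good catalog and a sense of
     which rare stamps hold their value.</p>

  <!-- Plain HTML links alongside any image map, with simple URLs
       that avoid the ? symbol -->
  <a href="/catalogs.html">Stamp catalogs</a>
  <a href="/values.html">Rare stamp values</a>
</body>
</html>
```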
Digital Web: What roadblocks does Flash (including the newest MX) put on search engines? How can those be overcome?
DS: Flash content is graphic and not read by search engines. Imagine you make a TV ad that has no sounds, just pictures. Now broadcast that on radio. The pictures aren’t seen, no sound is heard, so there’s just nothing. Search engines don’t see pictures. The “sound” they do get is text. No text, then nothing.
AR: If the Flash content is just interactive games, marketing fluff or branding, it doesn’t really matter. If Flash contains a lot of text content, though, you need to find a way to make it visible to search engine indexers. Exporting from Flash means that you get the links and text in comment tags–if you remove the comment markers and surround the content with <noscript> tags, the search indexer can read them. Or store your information in a database or content management system and export to both Flash and simple HTML files.
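The approach AR describes might look like this in the page source (a hypothetical sketch; the movie name, text, and link are invented, and the exact embed markup varied by browser in this era):

```html
<!-- The Flash movie itself, invisible to text-only indexers -->
<object data="tour.swf" type="application/x-shockwave-flash"
        width="550" height="400">
</object>

<!-- The movie's text and links, duplicated in plain HTML so
     spiders and non-Flash browsers can read and follow them -->
<noscript>
  <h2>Product Tour</h2>
  <p>Our widgets ship in three sizes and install in minutes.</p>
  <a href="/widgets.html">Widget details</a>
</noscript>
```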
Digital Web: What are the most common mistakes you see with regard to search capabilities?
AR: People forgetting to put unique titles on their pages. It seems so simple! The most common cases are frameset pages and PDF documents, but you can fix those with very little trouble.
Another very common problem is to make long multi-topic pages instead of short focused ones. FAQs and blogs have this problem. Search engines just have a hard time with this, because they can only match words and count frequencies. This is one of those things that the Search Engine Optimization people talk about a lot, but it applies to site search tools as well.
DS: Using the same title on every page. Depending too much on graphics. Poor internal link structure within a site.
Digital Web: Are Open Source search engines the answer?
AR: It depends on the expertise of the site development team. If you have folks who can handle compiling code and working with command lines and config files, some of the Open Source search engines, such as SWISH-E, ht://Dig, Lucene, mnoGoSearch and ASPSeek, work very nicely. They have supportive communities and there is no fee. If you want to reserve your most technical folks for other aspects, I’d recommend getting a search engine with a browser administration interface–that way editors and librarians can maintain the system. Phantom and dtSearch are my current favorites on Windows.
Digital Web: Do remote ASP search engines really work?
AR: The remote model works pretty well–all the server functionality stays in their server farm so you don’t have to worry about uptime or disk space. The ASP search engine sends a robot to follow links and spider your Web site, and then stores the index on their server. When one of your site visitors types the search in the field and clicks the button, the form goes to the remote search server; they look in the index, find the matches, format the results page and send it back. The links go to your site, and it works seamlessly. These search engines have browser interfaces so you can control them remotely, but the interface quality, search quality and response time varies, so it’s a good idea to test several of them.
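The form-to-remote-server handoff AR describes could be as simple as this (a hypothetical sketch; the service URL and the `site` and `q` parameter names are assumptions, since each ASP vendor defined its own):

```html
<!-- Search form on your pages; the query is submitted to the
     remote search service rather than to your own server -->
<form method="get" action="http://search.example-asp.com/query">
  <!-- Tells the remote service which site's index to search
       (parameter name is an assumption for illustration) -->
  <input type="hidden" name="site" value="www.example.com">
  <input type="text" name="q">
  <input type="submit" value="Search">
</form>
```

The remote server looks up the query in its copy of your index, formats a results page whose links point back at your site, and returns it to the visitor.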
Digital Web: What are your favorite high-end search engines?
AR: There are a lot of good ones out there; my recommendations tend to be very specific to each site or intranet. I like Inktomi Enterprise Search, because it’s a good search engine, has a lot of configuration and control options right in the browser admin interface, and lets you set up several search servers and query them all at once.
The Google Search Appliance is pretty nifty as well: it comes pre-installed on hardware, so you don’t have to worry about the server. It’s got a fast robot indexer and lots of control options, also in a browser admin interface–gives you wonderful real-time indexing reports. The search is fast and the relevance is generally excellent. Other high-end engines have specific strengths: FAST Search does some cool stuff with indexing news feeds, AltaVista has an SDK that is quite impressive in ecommerce, Hummingbird is nicely integrated into their CMS, Atomz scales up very well for remote search, and so on.
Digital Web: What is cloaking technology? Should it be used? Why or why not?
DS: Cloaking means that you show a search engine spider something different than a human would see. It can be a useful way to feed a product database to a spider or to show it content better optimized for its algorithm. I tend to think it’s something ecommerce merchants, more than anyone else, may want to use, sending their databases to search engines via “trusted feed” programs. However, anyone who thinks it’s useful may want to do it, but they should never do it if the search engines haven’t approved it. Google absolutely does not like it. Others with paid inclusion programs may allow it, especially in their trusted feed programs, but you should check before paying to submit a cloaked page.
Search from the Business Perspective
Digital Web: Why do search engines play an important role in the online marketing mix?
DS: Because so many people use them that ignoring them is like ignoring TV, radio or print in the offline world. They help increase sales leads on the Net, obtain a larger share of the viewing audience, out-position the competition, and increase qualified traffic. One company traced US$760,000 to search engine referrals, proving the power of search engine marketing.
Digital Web: At a minimum, what can designers do to help the client’s site get off to a good start with search engine placement?
DS: A minimal budget can go to Yahoo! For an annual flat fee of $300, Yahoo! pays for itself in traffic. This is the absolute minimum a search engine budget should include. Yes, you can get listed for free, but it won’t give you higher traffic than paid placement ads.
Digital Web: What are the steps for successful planning and execution of a search engine optimization campaign?
DS: Search engine optimization means ensuring that web pages rank well for particular terms, especially with crawler-based search engines, and that they are focused in ways that improve the chances they will be found.
Think of it like a lottery. Search engine submission is akin to purchasing a lottery ticket. Having a ticket doesn’t mean that you will win, but you must have a ticket to have any chance at all. Let’s assume there was a way to increase the odds of winning by picking your lottery numbers carefully. Search engine optimization is like this. It’s making sure that the numbers you select are more likely to win than purchasing a set of numbers at random.
Digital Web: How does search engine optimization help with advertising?
DS: It can help you get free traffic or highly qualified paid traffic. Be prepared by writing a description of the entire web site in 25 words or fewer. That description should make use of the two or three key terms that will lead people to the Web site. If you have time, you should research which terms are best for the site, rather than guessing. The What People Search For page has a list of resources that will allow you to do such research.
Digital Web: How can you measure traffic and ROI?
DS: Looking at log files or using more sophisticated tools such as those provided by Inceptor.com, to name one example.
Digital Web: Search engines seem like a logical area–all math and facts, not very creative. Does creativity come into play for you? What does it mean to you?
AR: The math part of search engines–the retrieval algorithm–turns out to be a very small part of the whole search engine. I’m very user-focused and believe in testing and search log analysis. In my experience, it’s much more important to be clear about what people are getting, than to twiddle the relevance algorithm. There’s a lot of creativity in understanding what people really want when they type in a one-word search, and how to give it to them in the best way possible.
Digital Web: You’ve been covering search engines for a long time. What inspires you to keep going?
DS: It’s an interesting subject where there’s still a lot of confusion. It’s nice to help clarify things for people.
AR: I got into this partly because I was dismayed to see that companies were spending a lot of money and time on terrible search engines (Excite for Web Servers was one of the worst). I wanted to provide unbiased information to set against all the grandiose claims from vendors. New search engines and new versions of old search engines pop up all the time, with innovative and creative approaches, but there are still some pretty awful search engines out there, and marketing hype has just gotten more outrageous. So there’s plenty of scope for me!
Meryl K. Evans, content maven, is a WaSP member even though she’s far from being a WASP. The content maven writes a column for PC Today and blogs for the Web Design Reference Guide at InformIT. Meryl provides the home for the CSS Collection and she’s the editor of Professional Services Journal, meryl’s notes :: the newsletter as well as other newsletters, so tell all your friends, families and animals to subscribe. Her ancient blog keeps cluckin’ since its arrival on the web in 2000.