Book Review: High Performance Web Sites

By Matthew Pennell

Published on October 29, 2007

After blogging a couple of weeks ago about a forthcoming Apress book on performance from a couple of Yahoo! employees, what should fall into my lap this week but another book on website performance—by yet another Yahoo! engineer. But, this is not just any Y! hacker—Steve Souders is the man behind the YSlow Firefox/Firebug plugin, and project lead for Yahoo!’s recent widescale overhaul of the performance of their front-end websites.

With all the recent work Souders has put into investigating and solving site performance problems, it would have been foolish to ignore the possibility of a book deal at the end of it, and the result is this slim volume from O’Reilly. At only 146 pages, it seems to have been optimized for performance, too, and it packs quite a punch for its size. “If everyone would implement just twenty percent of Steve’s guidelines, the Web would be a dramatically better place … there’s really no excuse for having a sluggish website anymore,” says Firebug creator Joe Hewitt. Past the introductory look at HTTP traffic, the structure of the main part of the book is very simple: fourteen chapters, one per rule, in priority order, built on one simple premise: implement these techniques and your sites will be faster. They won’t be just a little zippier; we’re talking orders of magnitude here.

What’s it all about?

So what problems does the book equip us to tackle? According to Souders’ research, “less than 10–20% of the end user response time is spent getting the HTML document from the web server to the browser. If you want to dramatically reduce the response times of your web pages, you have to focus on the other 80–90% of the end user experience.” That other eighty to ninety percent, spent fetching images, scripts, and stylesheets and following redirects, is the target of the rules Souders presents. Clear graphs, of the sort familiar to anyone using Firebug’s Net tab, illustrate this particularly galling application of the Pareto Principle (here re-imagined as the Performance Golden Rule), as dozens of scripts and images are downloaded before the page is complete.

If you’re familiar with Yahoo!’s work in this area, and/or an avid reader of the Yahoo! Developer Network blog, you’re probably wondering “what do I get in the book that I can’t already find online?” Well, quite a lot, given the small size of the book; each recommendation is fleshed out with an explanation of what is going on, links to demo pages that show the real-world performance enhancements, and discussion of the pros and cons of each technique.

There are several areas where the recommendations conflict with either accepted wisdom, common sense, or maintainability: Image maps, in this day and age? Enormous, non-modular scripts and stylesheets? External scripts that don’t go in the <head>? And just who do you think is going to have to maintain this unholy mess? Luckily, Souders agrees with all your concerns, and presents clear, practical suggestions on ways to both apply his performance rules and create an easy-to-maintain website, although talk of build processes may go over the heads of some non-corporate web developers.
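The build-process idea is less daunting than it sounds: keep your scripts and stylesheets modular in source control, then concatenate them into a single file at deploy time, so the page needs only one HTTP request per asset type. A minimal sketch of such a step (the file and function names are hypothetical, not taken from the book):

```python
import tempfile
from pathlib import Path

def combine(sources, target):
    """Concatenate modular source files into one deployable file,
    so the browser makes a single HTTP request instead of many."""
    combined = "\n".join(Path(s).read_text() for s in sources)
    Path(target).write_text(combined)
    return combined

# Hypothetical modular scripts: separate files for maintainability,
# merged into one file for deployment.
workdir = Path(tempfile.mkdtemp())
(workdir / "menu.js").write_text("function initMenu() {}")
(workdir / "forms.js").write_text("function initForms() {}")

combined = combine([workdir / "menu.js", workdir / "forms.js"],
                   workdir / "site.js")
```

Real build tools add minification and versioned file names on top, but the principle is exactly this simple.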

The most impressive statistics are found in the chapter on compression, where gzipping components can deliver a saving in response time of seventy percent. Web 2.0 also doesn’t go unmentioned—the final chapter tackles performance problems with Ajax, specifically caching responses—and the solutions are simpler than you might think.
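That gzip figure is easy to get a feel for with Python’s standard gzip module; the sample markup below is invented, and because markup is so repetitive, it compresses even better than the typical seventy percent:

```python
import gzip

# Repetitive markup standing in for a typical HTML payload; real pages
# compress well for the same reason (lots of repeated tags and classes).
payload = ("<div class='product'><span class='price'></span></div>\n" * 200).encode()

compressed = gzip.compress(payload)
saving = 1 - len(compressed) / len(payload)
print(f"original: {len(payload)} bytes, "
      f"gzipped: {len(compressed)} bytes, saving: {saving:.0%}")
```

On the wire, the same effect comes from the server setting `Content-Encoding: gzip` for clients that advertise `Accept-Encoding: gzip`.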

The second half of the book takes a detailed look at a good cross-section of the web’s top sites and, with the help of the YSlow plugin, analyzes potential performance improvements that could be made in line with the fourteen recommended rules—and it’s this section that you’ll want to show your boss when you’re recommending a front-end performance review instead of optimizing database queries.

Who is it for?

While some of the recommended rules will not be relevant to smaller sites (and, realistically, if you’re not serving billions of page views a year, are you really going to invest in a Content Delivery Network?), the key takeaway from the book is that it is unquestionably worth your time and effort to learn the ins-and-outs of frequently overlooked subjects, such as HTTP, compression, redirects, DNS, and ETags—and to intelligently apply the fourteen rules to best effect, depending on the requirements of your own site(s).

While clearly written and explained, the book is likely to prove heavy going for the beginner, although a handy overview of the HTTP protocol is included at the start of the book, explaining concepts such as Conditional GET Requests and Keep-Alive headers, for those who have never delved too deeply into this foundation of the web.
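For readers new to those concepts: in a Conditional GET, the browser sends the timestamp of its cached copy in an If-Modified-Since header, and the server answers 304 Not Modified, with no body at all, if the resource hasn’t changed since. A small sketch of the server-side check, with hypothetical timestamps (not an example from the book):

```python
from email.utils import parsedate_to_datetime

def respond(if_modified_since, last_modified):
    """Server-side half of a Conditional GET: compare the client's
    If-Modified-Since header with the resource's Last-Modified time."""
    if parsedate_to_datetime(if_modified_since) >= parsedate_to_datetime(last_modified):
        return 304, b""  # Not Modified: the client reuses its cached copy
    return 200, b"<html>...</html>"  # Changed: send the full body again

last_modified = "Mon, 29 Oct 2007 00:00:00 GMT"  # hypothetical timestamp
status, body = respond("Tue, 30 Oct 2007 00:00:00 GMT", last_modified)
print(status)  # 304: the cached copy is still fresh
```

The saving is the entire response body, which is why the book pays so much attention to making components cacheable in the first place.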

If your job is to deliver great websites, this book might just change the way you approach that next build. Even if it all sounds far too technical for right-brained creative types, you should definitely buy this book—even if it’s for the network manager in your life.


Matthew Pennell works as a senior designer for one of Europe’s leading hotel booking websites, writing semantic XHTML, bleeding-edge CSS and JavaScript that usually works. He is the former Managing Editor and former Editor in Chief of Digital Web, and blogs at The Watchmaker Project.