Measuring User Experience
Published on September 24, 2001
As a hardworking Web developer who wants to please or appease the client, you've built the best site you could under the circumstances. You've built it even though the clear goals, defined vision, and missing content you asked for were mysteriously absent from every client meeting.
A couple of months after completing the site, things go back to normal. The remnants of quality all-nighters are gone - the empty cola cans and greasy fast food cartons have been thrown away, and the rolled-up carpet under your desk that served as a pillow has been flattened. That's when the same client comes strolling back into your life - looking well rested, but also worried. The client asks very politely if her site is a success. You laugh for a second (using your inner voice, so as not to offend the client) until you realize you don't know either. Or worse yet, you don't know where to even begin evaluating the success of a web site.
The first place many Web developers and businesses look when evaluating a site's success is the log files. Log files are text files in which the server constantly and diligently records numerous aspects of a visitor's trail through a web site: the pages and images requested, the type of browser, the IP address, and more, all tallied in lifeless lines of information. It's not until a software application renders that information into a "human-friendly" context that it becomes understandable - and even then, you still need to know your log file terminology:
Log File Terms
Hit - a single server request. Individual GIF, JPG, and HTML files can each be counted as a "hit," so a typical web page generates numerous hits. When talking about traffic, hits are a useless measure: a site with hundreds of images on one page can rack up a very sizable (and misleading) hit count.
Total Sessions - using a timeout setting (say, 10 minutes), your web log analyzer can guesstimate how many times people have visited your web site.
Total Unique Visitors - the number of unique clients (e.g., browsers) that visited your site
Total Repeat Visitors - number of clients who have visited your site more than once
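To make these terms concrete, here is a minimal sketch of how a log analyzer might derive them from Common Log Format lines. This is not any particular analyzer's logic; the sample entries, the 10-minute timeout, and the use of IP address as a stand-in for a unique client are all illustrative assumptions.

```python
# Sketch: deriving hits, sessions, unique and repeat visitors from
# Common Log Format lines. IP address stands in for a unique client,
# which real analyzers know is only an approximation.
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=10)

sample_log = [
    '10.0.0.1 - - [24/Sep/2001:10:00:00 -0700] "GET /index.html HTTP/1.0" 200 5120',
    '10.0.0.1 - - [24/Sep/2001:10:00:01 -0700] "GET /logo.gif HTTP/1.0" 200 1024',
    '10.0.0.2 - - [24/Sep/2001:10:05:00 -0700] "GET /index.html HTTP/1.0" 200 5120',
    '10.0.0.1 - - [24/Sep/2001:10:30:00 -0700] "GET /index.html HTTP/1.0" 200 5120',
]

def analyze(lines):
    hits = 0
    last_seen = {}   # ip -> timestamp of that client's last request
    sessions = 0
    visits = {}      # ip -> number of distinct sessions
    for line in lines:
        hits += 1    # every request, page or image, counts as a "hit"
        ip = line.split()[0]
        stamp = line.split('[')[1].split(']')[0]
        when = datetime.strptime(stamp, '%d/%b/%Y:%H:%M:%S %z')
        prev = last_seen.get(ip)
        if prev is None or when - prev > SESSION_TIMEOUT:
            sessions += 1   # a gap longer than the timeout starts a new session
            visits[ip] = visits.get(ip, 0) + 1
        last_seen[ip] = when
    unique = len(visits)
    repeat = sum(1 for n in visits.values() if n > 1)
    return hits, sessions, unique, repeat

hits, sessions, unique, repeat = analyze(sample_log)
print(hits, sessions, unique, repeat)  # 4 hits, 3 sessions, 2 unique, 1 repeat
```

Note how four hits collapse to just two unique visitors - exactly why hit counts are so misleading as a traffic measure.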
A log file evaluation based on traffic only helps if you are comparing the traffic to some other point in time. For example, if you are pulling in higher traffic than you were a year ago, the site is doing better. However, a site can attract traffic and still fail to meet a specific goal you or your client has in mind.
If site traffic alone can't indicate a site's success, how do you determine whether a site is successful? The first thing you have to do is define a goal or goals for the site.
Goals and Objectives
Normally, defining site goals before you build the site is a very good thing. But so is getting content from the client, and that tends to slip once development starts. It's better to start late than not at all.
A goal is qualitative. You can't assign hard numbers to it, but goals are great for laying the groundwork for more specific measures. Examples of qualitative goals for a web site: "increase sales through your site's Amazon.com affiliate program," "have more users register for the sales promotion," or "get visitors to fill out the warranty registration form."
With goals like these, you can tell whether your site is a success or not.
Determining how successful and by how much leads us to the next step: objectives.
An objective is a quantitative measurement of a goal. Examples of objectives: "a $200 increase in Amazon.com commissions," "250 users sign up for the sales promotion," or "a thousand visitors fill out the warranty registration form." In other words, you assign hard numbers to your goals. With these hard numbers, you are in essence building a strong measuring stick.
Formulas and Feedback
Now that we've set up goals and objectives, the next step is to look at how we can gather numbers or other feedback to measure. Along with the log report that you've already analyzed, we can do a couple of quick number crunches.
Goal Percentage Formula:

Goal Success Percentage = (Unique Visitors performing the task successfully ÷ Total Unique Visitors) × 100
For example, over the past 24 hours, 30 people filled out a sample request form, but a thousand people viewed that page. That means the form was 3% effective. There's plenty of room for improvement, but the percentage itself isn't the valuable part: I now know I have a concrete goal to help my client meet by getting those sample request submissions up.
Also note that if your site is anything beyond marketing brochureware, it's good to have more than one goal. If you do, don't forget to average the individual goal percentages into a site average. That way, you can summarize the site's growth to your manager or client as a single value of average site success.
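The formula and the site average are trivial to script. In this sketch, the first goal uses the sample-request numbers from the example above; the second goal's name and counts are made up for illustration.

```python
# Goal Percentage Formula: successes as a share of unique visitors,
# then an average across goals for a single "site success" number.

def goal_success(successes, total_unique_visitors):
    """Percentage of unique visitors who completed the goal."""
    return 100 * successes / total_unique_visitors

goals = {
    "sample request form": goal_success(30, 1000),      # the 3% example above
    "sales promotion signup": goal_success(250, 5000),  # hypothetical numbers
}

site_average = sum(goals.values()) / len(goals)
print(f"{site_average:.1f}% average site success")  # (3.0 + 5.0) / 2 = 4.0
```

Track the same numbers month over month and the "measuring stick" from the objectives section writes itself.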
Visitor Value Formula:

Value of each visitor = How much money your site made (or lost) last month, a.k.a. profit (or loss) ÷ Total Visitors
For example, if your site made two hundred dollars profit and six thousand visitors viewed your catalog of Café Press t-shirts, the value of each visitor is a bit more than 3 cents. This formula is great if your business is strictly a "click and marketing" site without a physical store.
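The t-shirt example works out as a one-liner; the profit and visitor figures below are just the numbers from the example above.

```python
# Visitor Value Formula: monthly profit (or loss) spread across
# every visitor who came to the site that month.

def visitor_value(monthly_profit, total_visitors):
    """Dollars of profit (or loss) attributable to each visitor."""
    return monthly_profit / total_visitors

value = visitor_value(200.00, 6000)  # $200 profit, 6,000 visitors
print(f"${value:.4f} per visitor")   # a bit more than 3 cents
```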
If you do have a physical store, your business needs to look beyond how much a site visitor is worth to you. You have to think of your web site as multifunctional: able to perform marketing, brand awareness and selling (among other things). And that requires more factors about your business and a lot more math. Other methods of getting results from user experience aren't as math intensive.
If you can't seem to increase that three percent success rate on the sample request form, focus groups or usability lab tests are a way of seeing how visitors are fumbling through your site. Focus groups are a way of getting input by having a small sample of users from your target audience interact with your web site, your product, or your company's new tagline.
The beautiful thing about focus groups is that they let users who aren't close to the project review the work. Instead of hearing your boss or a co-worker cite their own surfing habits as the basis for a site's navigation, you get unbiased feedback.
This is one method I wish more sites used. Creating and hosting focus groups and usability testing labs can get expensive. However, by putting up a simple web form survey that asks users questions about your company and its services and/or products, you will get feedback. True, it won't be a scientific sampling of your audience, but you are giving your site's visitors a means to speak beyond the trail they leave in the log files. Your log files may not lie to you, but they can't tell you anything as eloquently as a visitor typing something like this:
Your personal web site's puke green background color is a joke and you should consider a second career choice. Oh, and if you would print more Web Design mousepads I would buy them in a heartbeat for me and all my loved ones.
Since the form requires the visitor to type in a legitimate email address (which gets checked by a script on the server), I have a chance to follow up with her about my site's background color - as well as send a pre-order form.
The trap of focus groups and even online surveys is taking them verbatim. As with a teenager, you need to pay attention to what participants are doing and give little weight to what they are saying.
For example, the focus group for a company called ComputerLiteracy.com disliked one of the new names the company was considering, almost to the point of repulsion. However, the focus group's recall of that name was outstanding compared to the others. A little wary, but knowing the name had great staying power in the minds of potential customers, ComputerLiteracy.com changed its name to fatbrain.com.
Justifying a web presence as a success or a failure is a skill that every web developer should have. Building Web pages is not rocket science, and the same goes for tracking the sites you develop. The hard part in measuring a site's success is often determining what the goals of the site should be: what you should be looking for.
Chances are that the set of metrics you used for previous sites won't work for the next one. New goals need to be set, and that means you and your client need to be honest about the motivation behind the site (along with asking, again, for the missing content).
The easy part, for once, is doing the math and separating the noise from the feedback.