Digital Web Magazine

The web professional's online magazine of choice.

Dollars & Sense of Web Analytics: Comments

By Alan K'necht

October 3, 2005



October 4, 2005 1:17 AM

More of a question. I find it harder to analyse a Content managed website (since pages are created when they are requested), how do I overcome this challenge?

Mark Halliday

October 4, 2005 5:41 AM


That's a limitation of your log-file-analyzing metrics tools. Most of these apps will truncate any parameters on the URL – the result is that dynamic pages whose URLs differ only in their parameters show up aggregated in the app as a single entry. What a lot of us rely on now are metrics apps that leverage spotlight tags, where you place a small image on the page that really isn't an image, but a call-back to the metrics app with page-specific information. In this way, you can apply some meta information to each page (you can give dynamic pages specific names, not just URLs), and they are then measured properly. Takes a little planning out of the box, but it's worth it in the long term. Programs from Omniture and WebSideStory use spotlight tags, whereas programs such as WebTrends (log analyzer) and Accrue (network sniffer) work on raw log data.
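A minimal sketch of the spotlight-tag idea in Python: the page embeds a 1×1 "image" whose URL carries a logical page name instead of the raw dynamic URL. The collector endpoint and parameter names here are hypothetical, not any vendor's actual API:

```python
from urllib.parse import urlencode

# Hypothetical collector endpoint; real vendors use their own.
COLLECTOR = "https://metrics.example.com/beacon.gif"

def beacon_url(page_name, extra=None):
    """Build the src for a 1x1 'image' that reports a human-readable
    page name (plus optional metadata) back to the metrics app."""
    params = {"page": page_name}
    if extra:
        params.update(extra)
    return COLLECTOR + "?" + urlencode(params)

# A dynamic page like /catalog?id=234 can report a specific name:
print(beacon_url("Catalog: Widgets", {"section": "catalog"}))
```

The page would then embed something like `<img src="…" width="1" height="1" alt="">`, so every request to the beacon records the metadata regardless of how the URL was generated.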

Jeff Adams

October 4, 2005 6:52 AM

What analysis program are the screenshots in this article from?

Paul Irish

October 4, 2005 4:26 PM

Jeff, those screenshots are from WebTrends.

Mark, though WebTrends initially worked from log files, you can now use tagging OR log files (tagging is the smarter solution, imo). Regardless, you can still read in query arguments in many logfile-based analytics programs.

Alan K'necht

October 4, 2005 4:57 PM


Depending on your web analytics program, you can modify the way it handles dynamic URLs and not have it truncate the URI stem at the parameters.

This is a must for any web analytics tool that is going to be used to analyze dynamic web site content.
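The underlying parsing is straightforward; here is a sketch in Python of pulling a request URL out of a Common Log Format line and keeping the query string rather than truncating at it (the log line itself is made-up sample data):

```python
import re
from urllib.parse import urlsplit, parse_qs

# One Common Log Format line (hypothetical sample data).
line = ('10.0.0.1 - - [04/Oct/2005:16:57:00 -0400] '
        '"GET /course.php?id=234&lang=en HTTP/1.1" 200 5120')

match = re.search(r'"(?:GET|POST) (\S+) HTTP', line)
url = match.group(1)

parts = urlsplit(url)
# A truncating analyzer keeps only parts.path, so every id collapses
# into one entry; keeping parts.query lets /course.php?id=234 and
# /course.php?id=567 be counted as distinct pages.
print(parts.path)              # /course.php
print(parse_qs(parts.query))   # {'id': ['234'], 'lang': ['en']}
```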

Alan K'necht

October 4, 2005 5:01 PM

Mark Halliday,

Jeff is correct that WebTrends offers two types of solutions: traditional log analysis, which has its pros and cons, and a tagging solution (called Smart Data Collector), which has its own pros and cons.

From my experience, there are shortcomings regardless of whether you analyze traditional log files or log files generated by a tagging solution.

One of the benefits that many companies like in tagging solutions is that many web analytics companies offer to host the solution (no need to purchase hardware and software and then manage the server). This is my least favourite option, as I no longer have control of the data; it's on someone else's server. Yet for some organizations this is a great solution.

Alan K'necht

October 4, 2005 5:02 PM


Paul is correct once again that those screenshots are from WebTrends. I use and consult on several different web analytics programs. I chose WebTrends because I really like the look of their graphics.

Julian Taverner

October 4, 2005 5:34 PM

"More of a question. I find it harder to analyse a Content managed website (since pages are created when they are requested), how do I overcome this challenge?"


We use phpMyVisites (OSS too) alongside a Nielsen//NetRatings package and have found it quite comparable (both use spotlight tags). phpMyVisites is quite up to the task of identifying different URLs based on the different GET parameters (e.g. ?id=234). We have a large number of pages that need these parameters, and we like to track them to get feedback on the various courses that we offer.
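Distinguishing pages by their GET parameters can be sketched like this in Python (the URLs are made-up examples, not phpMyVisites internals):

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qs

# Hypothetical requested URLs pulled from a log or tag collector.
urls = [
    "/course.php?id=234",
    "/course.php?id=234",
    "/course.php?id=567",
    "/about.php",
]

views = Counter()
for u in urls:
    parts = urlsplit(u)
    page_id = parse_qs(parts.query).get("id", [""])[0]
    # Count path and id together so dynamic pages stay distinct.
    views[(parts.path, page_id)] += 1

print(views[("/course.php", "234")])  # 2
print(views[("/course.php", "567")])  # 1
```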

Cheers Julian

Anthony Ettinger

October 5, 2005 9:13 AM

I’ve been satisfied with AWStats (see their online demo). It’s GPL open source software.

Mark Halliday

October 6, 2005 7:37 AM


I enjoyed the article and appreciate the feedback. I think the solution rests on the needs of the individual or company and the goals they’ve established. That, combined with the value they expect (and what they can realistically afford), will point them in the right direction.

I would agree that both log files and spotlight tags have limitations (for one opinion on these limitations, see the ClickTracks article), but the benefit of tagging pages far outweighs the limitations (which can largely be overcome with server-side programming, like PHP).

Looking at the industry as a whole, the trend seems to be towards tagging (not that I endorse following trends), as traditional log-file apps, like the aforementioned WebTrends, have added tagging capability, and I don’t see the inverse happening. Perhaps you could write a follow-up article on tag-based solutions?

For anyone who

Alan K'necht

October 6, 2005 4:24 PM


Thanks for the link to the ClickTracks article. It’s for the most part unbiased, considering it’s from a vendor.

WebTrends, like ClickTracks, allows for both log analysis and JavaScript solutions, as do a few other vendors. Which one is right for you? Well, as Mark suggested, perhaps that’s another article.

My biggest problem with the JavaScript solution (unless you are hosting your own) is who owns the data (referenced in the article). I had a client who was very unhappy with their JavaScript 3rd-party hosted solution. When they moved to another solution, they still paid their monthly fee to the original company, just so they could access their data a little longer. Eventually they exported what they wanted, but that was a lot of additional work.

That being said, I’ve hooked companies up with the JavaScript solution and they’re thrilled. It was just the right solution for them.

Just remember, when using a JavaScript solution, as the article pointed out, there are problems with the use of 3rd-party cookies, and what the article didn’t point out is that some people and organizations block JavaScript as well as cookies.

So, boys and girls, there is no 100% right answer, and no web analytics tool is 100% accurate. They help measure trends and, with some tweaking, show how well your site is performing (conversion, SEO, etc.).


October 7, 2005 7:29 AM

Interesting reading indeed!

First of all, concerning content management systems and dynamic sites with URLs that include ? and non-usable IDs, note that WebTrends lets you choose whether or not URLs are truncated. By default, they are indeed bundled, but you need to uncheck the box in order to unbundle the information.
Furthermore, IDs can be translated in the program through the use of translation files in Excel. Ideally, bridges are created between the CMS and your web analytics software so that when analysis starts, the translation file is up to date.
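The translation-file idea is just a lookup from raw IDs to readable names; here is a minimal Python sketch, with made-up IDs and course names standing in for a real CMS export:

```python
import csv
import io

# A hypothetical translation file exported from the CMS: id -> readable name.
translation_csv = """id,name
234,Introduction to Web Analytics
567,Advanced Log Analysis
"""

id_to_name = {row["id"]: row["name"]
              for row in csv.DictReader(io.StringIO(translation_csv))}

def translate(page_id):
    # Fall back to the raw id when the translation file is out of date,
    # so new CMS pages still show up in reports.
    return id_to_name.get(page_id, f"Unknown page (id={page_id})")

print(translate("234"))  # Introduction to Web Analytics
```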

Now, concerning log files and tagging. I was surprised to see that it is shown as one solution or the other. In my experience, we often start with log files during the first phase of the Web Analytics (WA) project and then redefine the use of some tags for specific reports.
Clearly the use of tags has advantages (smaller log files, customized information, etc.), but you can’t lose the log files altogether for some more “technically driven” information such as client or server errors, bandwidth, etc.
We often end up, after usually 4 phases of a WA project, with 80% tags and 20% log files.

Last but not least: cookies. They must indeed be persistent, but there is a difference between 1st- and 3rd-party cookies.
Today’s configurations require more and more 1st-party cookies, but we keep 3rd-party cookies for reporting across different (sub)domains.
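The 1st- vs 3rd-party distinction comes down to which domain the cookie is set on; a small Python sketch using the standard library (the domains here are invented for illustration):

```python
from http.cookies import SimpleCookie

# First-party: set on the site's own domain, readable only by that site.
first = SimpleCookie()
first["visitor_id"] = "abc123"
first["visitor_id"]["domain"] = ".example.com"

# Third-party: set on the analytics vendor's domain, so the same
# visitor id is visible across every site embedding the vendor's tag --
# which is also why browsers increasingly block these.
third = SimpleCookie()
third["visitor_id"] = "abc123"
third["visitor_id"]["domain"] = ".metrics-vendor.example"

print(first.output())
print(third.output())
```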

Amrit Hallan

October 8, 2005 7:25 PM

I remember in the early 2000s some clients set their own websites as their home page to increase the number of hits they got. It took me some effort to drive home the point that it was not raw traffic that mattered, or how many pages were being loaded per day, but how many visitors were actually willing to do business with them; that was the kind of traffic they really needed.

Luckily, now the situation has tremendously improved, at least among those clients that are seeking quality content for their websites because good content means relevant visitors.

