

January 12, 2009


Jacques Warren

Hi Phil,

Good remarks. I also believe that fee-based solutions should make the data from their SaaS version (sometimes it's the only version) readily available, but at least it's possible to ask for it.

Haven't I seen it! Companies using GA to validate the paid version! If GA is accurate enough to "validate" other solutions, why not just use it instead? The logic escapes me as well.

I think we have to take into account the fact that, since GA is free, a webmaster can easily install it without going through the budget approval process; I suspect that this has been the case in many organisations. So, you've got webmasters who got creative, tagged the site with GA, and started telling their colleagues that it's better than the paid version, since most of them are just in awe of Google.

Again, at the end of the day, it's not a matter of being pro or con GA, it's a matter of doing professional work. And to this effect, I don't believe GA fully qualifies. It's a darn good tool, and it's got the best quality/cost ratio out there, but I don't think it should become the tool of record for everybody.

And to that extent, Stephane's concerns are very real.

Stephane Hamel

Double tagging is definitely increasing! When I did the study of the Top 500 retail sites, some paid vendors showed between 30% and nearly 50% of their clients also using GA. For low-end vendors... it's catastrophic! That was last October. I might redo the same analysis for the next eMetrics in Toronto, and again for San Jose; for the first time, we will be able to see real market shifts.

As Jacques mentions, being able to double-check the raw data being collected is often considered a must (especially by more IT-oriented people). However, in my opinion, the first source of "garbage in" is wrong tagging, not wrong storage or manipulation by the vendors.


Jacques Warren

Stéphane, obviously, if data collection is wrong, the whole building crumbles. It's the cornerstone of it all. But once it's done (and properly, constantly maintained), data ownership and access become core questions/concerns/assets.

Web Analytics Management

Stephane and Jacques - Thanks for your insight on this. A few thoughts to throw back out.

Jacques, regarding the scenario:
"a webmaster can easily install it without going through the budget approval process; I suspect that it has been the case in many organisations."

Fair enough at face value, but I could argue that the scenario is part of a bigger issue in that the business case for web analytics hasn't been made, so "free" always wins out.

Stephane, regarding the catastrophe for low-end vendors in the face of competition from Google and Yahoo: while this may be the case in the study on retail sites, I know that there are many agencies out there that like using low-end solutions like Clicktracks/Lyris and VisiStat for their clients because of administration control, as well as functionality. So, I'm hopeful that there will continue to be scenarios where they are appropriate.

David T. Smith

I've seen the dual tagging in many places as well, and in many cases the 'free' version is incorrectly tagged or tagged inconsistently compared with the 'fee' version. That is a serious issue and often is politically based. However, once the data collection issues are corrected, I still find some value in dual tagging:

1) Different geographic translations used by different vendors
2) Providing different kinds of reporting to different audiences (system logs and GA page-level tagging to sysadmins, custom tagging/reporting to marketing).

I don't think it's going to go away, and if the web analytics team in the organization does not have a way to explain the inevitable differences in reporting numbers, their numbers can be discredited. This brings the "data ownership and access" issue raised by Jacques back to the intra-organization conflict, rather than the client-vendor relationship.
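To make the dual-tagging discussed above concrete, here is a minimal sketch of what such a page might look like: the classic ga.js snippet of that era alongside a second vendor tag. The vendor script URL, object name (`vendorTracker`), and both account IDs are hypothetical placeholders, not any real vendor's API.

```html
<!-- Google Analytics (classic ga.js snippet); "UA-XXXXXXX-1" is a placeholder account ID -->
<script type="text/javascript">
  var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
  document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
  var pageTracker = _gat._getTracker("UA-XXXXXXX-1");
  pageTracker._trackPageview();
</script>

<!-- Hypothetical fee-based vendor tag on the same page -->
<script type="text/javascript" src="http://tag.example-vendor.com/collect.js"></script>
<script type="text/javascript">
  vendorTracker.init("ACCT-0000");  // hypothetical API
  vendorTracker.trackPage();
</script>
```

When both snippets sit on every page, each vendor counts page views independently, which is exactly why the tool-to-tool differences described in these comments become visible.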


In some cases (not sure if it's a large or small percentage, but I've done it myself), the double-tagging is done either for professional development purposes -- the analyst wants exposure to another tool -- or because the site used the free solution first and its analytics practice then matured into the paid one. I always recommend leaving the old tags on for a couple of months to understand the differences between the old numbers and the new. Then, after the new tool is adopted, people just forget to take the old ones off: it's free anyway, so why bother.

In my case, the overlap is not a "validation" of the fee-based tool, but rather a way to understand what is different and why: we need to know which metric changes over time were because of different visitor behavior, and which were just tool-to-tool differences. Almost every time I've investigated major differences in the numbers, I've walked away with a greater appreciation for our paid tool.
