In his Web Analytics 2.0 keynote at X Change, Eric Peterson references “The Good Old Days” of analytics in a Web 1.0 world. Since I started doing web analytics in 1995 and was an early evangelist for the use of Internet technology, I found this reference interesting and somewhat telling about the state of web analytics practice.
In the Web Analytics 1.0 world, when log file tools ruled the web, let’s say 1995-2000 (this is arguable, of course, but let’s not get distracted), it was clear that you needed more than tool-driven analytics to understand whether your web site was accomplishing its business objectives. In my first web site management role, I relied on WebTrends and NetAnalysis plus usability testing, surveys, focus groups, and interviews, as well as costs, to determine whether the web sites were making money or saving money - I had to in order to justify the existence of the sites. I considered this part of a web site measurement and testing methodology of which web analytics was a component, but not the sole one.
So, when Eric suggested that Web Analytics 2.0 demands these methodologies, rather than only looking at the numbers, I was rather surprised that this should be considered groundbreaking.
On one hand, there is already activity in combining quantitative and qualitative data. Interestingly, Federal agencies have been doing this for some time by combining survey results from companies such as ForeSee and Optimost, panel data from Hitwise, and web analytics. It could be argued that Federal agencies are in a Web Analytics 1.0 world, but as I learned from my own experience, that might make for potentially more complete analysis if managed effectively. At the DC Web Managers Roundtable in December, we had a panel discussing how to use qualitative and quantitative measurement. You might find the podcast an interesting listen for a perspective on how folks are approaching this.
On the other hand, in many organizations the practice of web analytics is driven simply by the purchase and use of tools, rather than by analysis. Just look at the job reqs that come out of organizations today…consider how many focus on x years of experience with a tool. How many reqs ask for someone with experience in quantitative and qualitative measurement and analysis? How many look at holistic web site measurement and testing as part of a comprehensive web channel strategy? With all of the tool sophistication we now have compared to 10 years ago, is it possible that web site measurement has gotten narrower? Have the tools driven the analysis rather than the analysis driving the tools? I'd have to say yes; the focus on numbers and stats has become something of a distraction from seeing the whole picture.
If you manage a web channel, quantitative and qualitative testing should always be part of your web site measurement and testing strategy. While this is not new in theory (I discussed this in my book, The Executive’s Guide to Web Site Measurement and Testing), if you haven’t yet put it into practice, you’ll find that it’s a method that will stand the test of time from Web 1.0 going forward.
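To make the idea of combining quantitative and qualitative data a little more concrete, here is a minimal, purely hypothetical sketch in Python. The page URLs, metrics, and satisfaction scores are all invented for illustration; real sources would be exports from your analytics tool and your survey vendor.

```python
# Hypothetical sketch: joining quantitative web analytics data with
# qualitative survey results by page URL. All figures and field names
# are invented for illustration only.

# Quantitative: page-level metrics from a web analytics export
analytics = {
    "/apply":   {"pageviews": 12000, "exit_rate": 0.62},
    "/contact": {"pageviews": 4500,  "exit_rate": 0.30},
}

# Qualitative: mean satisfaction scores from an on-site survey
survey = {
    "/apply":   {"satisfaction": 2.1, "responses": 180},
    "/contact": {"satisfaction": 4.3, "responses": 75},
}

def combine(analytics, survey):
    """Merge the two sources on page URL so each page's traffic
    numbers can be read alongside what visitors said about it."""
    combined = {}
    for page in analytics.keys() & survey.keys():  # pages present in both
        combined[page] = {**analytics[page], **survey[page]}
    return combined

report = combine(analytics, survey)
for page, row in sorted(report.items()):
    print(page, row)
```

Even a toy join like this shows why the pairing matters: a page with high traffic and a high exit rate reads very differently once you see a low satisfaction score next to it. The hard part in practice, as discussed above, is keeping the sources in sync and in context.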
Phil,
In a funny way I agree with you about how, um, un-groundbreaking the idea of Web Analytics 2.0 is ... mostly because as soon as you think about it for just a wee bit you're inevitably thinking to yourself, "well, duh!" But as I've been exploring this with my clients (technology and practitioner), it is clear that despite organizations ** having ** the technology to gather both quantitative and qualitative data, few are going so far as to integrate the two (or three, or four).
Recently I was lucky enough to be invited to present at ForeSee Results' "Customer 2.0" client event in Michigan. At the event I talked about the Web Site Optimization Ecosystem and the need to bring ForeSee data together with whatever web analytics data you have at your disposal. Some of ForeSee's customers are doing this today, but most are just now giving serious consideration to what Web Analytics 2.0 could mean to them.
I'll have to listen to the podcast, but I'd love to talk to some of the organizations you've spoken with that are actively combining data. I suspect that most have confused "having" and "integrating", the latter requiring some type of data integration capability, a relatively modern application, etc.
So yeah, maybe not groundbreaking, but I think there is still a pretty substantial gap between "wanting to do" and "doing".
Welcome to the multilith!
Eric T. Peterson
http://www.webanalyticsdemystified.com
Posted by: Eric T. Peterson | October 03, 2007 at 10:24 PM
Eric - Thanks for the comment. I totally agree with your observation that there's still a gap...you'll notice that I wrote "might make for potentially more complete analysis if managed effectively." I wrote this with that gap in mind, and with the additional challenge of really putting all data sources into context. This is a big issue...basically, when organizations are using qualitative and quantitative data, they're challenged with syncing it all together.
Posted by: Phil Kemelor | October 04, 2007 at 04:17 AM