When Measurement Fails, Analytics Become Useless

I have a terrible habit. Every morning I get on the scales to check whether I've lost weight or fat. In fact, the habit is so bad that sometimes I'll get on two or three times a morning, and even through the day. Measurement is all fine and good, but daily measurement of small or difficult-to-measure changes can be frustrating and inaccurate (unless it says I lost weight – then it's accurate).

The other day I noticed that by simply allowing 30 minutes to lapse without doing anything, my body fat percentage dropped by four percent and I lost 100g (apparently). Well, I thought, forgo food and activity and by lunch I should be at a decent body fat percentage and a pound lighter! Obviously my instrument of measurement was not reliable.

I had an experience with a web analytics tool which seemed to be under-reporting for a site I was working on. I was making changes and seeing the effect in other ways (sign-ups), but the analytics refused to budge. Analysis of the code showed some variations, but apparently it was all OK. In my opinion something was wrong, though, and an audit of our raw log file stats illustrated in tangible terms what I had been struggling with for weeks – the measurement was indeed flawed.
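
To give a sense of what that kind of raw log audit can look like, here is a minimal Python sketch (the file name, log format and filters are assumptions, not the exact audit I ran). It tallies successful page requests and distinct client IPs straight from the server's access log, so the totals can be set against what the analytics tool reports for the same period:

    import re
    from collections import Counter

    # Count pageviews and unique client IPs from a Common Log Format access log,
    # skipping obvious non-page requests, so the totals can be compared with the
    # numbers the analytics tool reports for the same period.
    LOG_LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)[^"]*" (?P<status>\d{3})')
    SKIP = ('.css', '.js', '.png', '.gif', '.jpg', '.ico', '/robots.txt')

    def audit(log_path):
        pageviews = 0
        ips = Counter()
        with open(log_path) as log:
            for line in log:
                m = LOG_LINE.match(line)
                if not m or m.group('status') != '200':
                    continue
                path = m.group('path').lower()
                if path.endswith(SKIP):
                    continue
                pageviews += 1
                ips[m.group('ip')] += 1
        return pageviews, len(ips)

    if __name__ == '__main__':
        views, visitors = audit('access.log')   # path is an assumption
        print(f'{views} pageviews from roughly {visitors} distinct IPs')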

Relying on web analytics has become something of an addiction among some businesses. With constant feeds of web traffic, minute-by-minute tracking of where visitors are going on the site, and pages ranked by popularity in real time, web analytics are as essential as their other addictions – coffee, toast and antacid tablets. The problem is that most web analytics tools misreport statistics. Add to that a misunderstanding of some of the analysis of the data, and it's a recipe for stress.

There was a comparison of the various statistics packages, published by Stone Temple, which showed the variation among the packages available in 2007 and how differently they reported on the exact same website. In the report Stone Temple assert that “as Jim Sterne is fond of saying, if your yardstick measures 39 inches instead of 36 inches, it’s still great to have a measurement tool”. I would argue that some statistics packages are not consistently measuring 36 inches as 39 inches, due to cookie reliance, poor implementation and other factors, and thus are not a proper measurement tool in those conditions.
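
To put the yardstick point in concrete terms, here is a tiny worked example in Python with made-up traffic figures: a package with a constant bias still shows the true week-on-week trend, but a package whose error changes from one week to the next can report the opposite of what actually happened.

    # Illustrative numbers only: a constant-bias tool (the "39-inch yardstick")
    # preserves the real trend, while a tool whose error varies (cookie loss,
    # blocked scripts) can hide or even reverse it.
    true_visits = {'week 1': 10_000, 'week 2': 11_000}           # real traffic, up 10%

    constant_bias = {w: int(v * 0.80) for w, v in true_visits.items()}   # always 20% low
    variable_error = {'week 1': int(10_000 * 0.95),              # loses 5% one week...
                      'week 2': int(11_000 * 0.80)}              # ...and 20% the next

    def growth(counts):
        return (counts['week 2'] - counts['week 1']) / counts['week 1'] * 100

    print(f"true growth:    {growth(true_visits):+.1f}%")    # +10.0%
    print(f"constant bias:  {growth(constant_bias):+.1f}%")  # +10.0%, trend preserved
    print(f"variable error: {growth(variable_error):+.1f}%") # -7.4%, trend reversed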

How can a web business operate in an environment where reliance on web analytics has to be tempered with something else? Using more than one measurement tool can help: with the former Urchin product, now Google Analytics, available for free, it is easier than ever to run two statistics packages side by side. I would also argue that someone in your organization should make web analytics their project. If possible, train someone up to dedicate at least an hour a day to web analytics. The more this web analytics person understands, the more appropriate use your company can make of sometimes flawed analytics.
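
If you do run two packages, the cross-check can be as simple as the hypothetical sketch below, which assumes both tools can export daily pageview counts to CSV files with 'date' and 'pageviews' columns (the file names and the 15% threshold are my own illustration, not a standard):

    import csv

    # Flag any day where the two packages disagree by more than 15%, so a human
    # investigates the gap rather than trusting either number blindly.
    def load(path):
        with open(path, newline='') as f:
            return {row['date']: int(row['pageviews']) for row in csv.DictReader(f)}

    tool_a = load('google_analytics_daily.csv')   # file names are assumptions
    tool_b = load('raw_log_daily.csv')

    for date in sorted(tool_a.keys() & tool_b.keys()):
        a, b = tool_a[date], tool_b[date]
        gap = abs(a - b) / max(a, b)
        if gap > 0.15:
            print(f"{date}: tools differ by {gap:.0%} ({a} vs {b}), worth a closer look")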

Web analytics are not the be-all and end-all of measurement online. I'd argue that a holistic approach needs to be taken and that online is the same as offline – it's simply not conducted face to face. Footfall is not the only measure of a shop's success, and nor are unique users the only measure of a website's. Sales are not the only measure of a company's success, and nor are online sales. Websites are more than just online brochures; they are a way for people to engage with your brand.

I'll still get on the scales every morning, but Sunday is now the day I use to benchmark my weekly weight loss. Diet, exercise and health levels all cause variance in my scales' ability to measure me correctly. Your website's health, code implementation and use of first- or third-party cookies (the code kind; websites don't eat cookies) will determine the accuracy of your web analytics. Don't rely on analytics alone: find an internal champion and make sure your website code is correct for a happy, healthy web analytics experience ;-)


3 Responses to “When Measurement Fails, Analytics Become Useless”

  1. Alex says:

    Oh, terrible habit! I think it's never too early to think about the eventual search engine presence of your business. This article covers some basic Do's and Don'ts for choosing a business name with an SEO mindset.

  2. Jo says:

    Hi good post.

    Indeed, we need to know about effect size and variability. Being good at maths, etc., I began my career as a psychometrician. The most important thing to learn in psychology is that we deal with weak effect sizes and massive errors of measurement. You can always spot an amateur because they will start reporting a massive effect size (claiming that A massively affects B), when all they are seeing is chance variation in the measurement.

    The trick in the practice of psychology is knowing how to put weak effect sizes to work.

    The amateurism drives us all nuts, I think. I can talk to you about the numbers underlying dieting too! I found it a lot easier to control my weight when I understood what could be controlled and by how much!

    Looking forward to your session at http://medicacamplondon.pbwiki.com

  3. @Alex – Ummm…. I’m cookie-less this morning and so not quite clear on how that relates to web metrics. :-( DeCabbit is sad when she is cookie-less

    @Jo – *laughs* Not that I hated maths (though the media would love to teach girls to hate/fear math), but I found it all much more tedious than pondering the deeper meaning of things :-) as that often involved snacks and math didn't.
    That kinda leads nicely into why I'm on a diet :-P
    I'm doing OK at the moment on Atkins – after that initial 8lbs weight loss in a week, which left me a bit woozy at MediaCampBucks, it's gone to a steady loss (thankfully!), which is why I'm always on the scales now :-D
