When Measurement Fails, Analytics Become Useless

I have a terrible habit. Every morning, I get on the scales to check whether I've lost weight or fat. In fact, the habit is so bad that sometimes I'll get on two or three times a morning, and even through the day. Measurement is all well and good, but daily measurement of small or difficult-to-measure changes can be frustrating and inaccurate (unless it says I lost weight, in which case it's accurate).

The other day I noticed that by simply letting 30 minutes lapse without doing anything, my body fat percentage dropped by four percent and I lost 100g (apparently). Well, I thought, if I forgo food and activity, by lunch I should be at a decent body fat percentage and a pound lighter! Obviously my instrument of measurement was not reliable.

I had an experience with a web analytics tool that seemed to be under-reporting for a site I was working on. I was making changes and seeing the effect in other ways (sign-ups), but the analytics refused to budge. Analysis of the code showed some variations, but apparently it was all OK. In my opinion there was something wrong, though, and an audit of our raw log file stats illustrated in tangible terms what I had been struggling with for weeks: the measurement was indeed flawed.
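That kind of raw log audit doesn't need a full stats package. As a minimal sketch (the log path, the Combined Log Format assumption, and the asset-extension filter are all hypothetical; real logs may need a more robust parser), something like this gives an independent pageview and visitor count to hold up against the analytics tool's figures:

```python
import re
from collections import Counter

# Matches the start of a Combined Log Format line:
# client IP, timestamp, then the request method and URL.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*"')

def audit_log(path):
    """Count raw pageviews and distinct client IPs as a sanity check."""
    pageviews = 0
    ips = Counter()
    with open(path) as log:
        for line in log:
            match = LOG_LINE.match(line)
            if not match:
                continue  # skip malformed lines rather than crash
            ip, _timestamp, method, url = match.groups()
            # Skip static assets so the count approximates page requests.
            if method == "GET" and not url.endswith((".css", ".js", ".png", ".gif")):
                pageviews += 1
                ips[ip] += 1
    return pageviews, len(ips)
```

Unique IPs are a crude proxy for unique visitors (proxies and shared connections muddy them), but a large gap between this count and the analytics report is exactly the sort of red flag that prompted my audit.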

Relying on web analytics has become something of an addiction among some businesses. With constant feeds of web traffic, minute-by-minute tracking of where traffic is going on their site, and pages ranked by popularity in real time, web analytics are as essential as their other addictions: coffee, toast and antacid tablets. The problem is that most web analytics are wrong and are misreporting statistics. Add a misunderstanding of some of the analysis of the data to the mix, and it's a recipe for stress.

There was a comparison of the various statistics packages available in 2007 which showed the variation among them and how differently they reported on the exact same website. In the report, Stone Temple asserts that, as Jim Sterne is fond of saying, if your yardstick measures 39 inches instead of 36 inches, it's still great to have a measurement tool. I would argue that some statistics packages are not consistently measuring 36 inches as 39 inches, due to cookie reliance, poor implementation and other factors, and thus are not a proper measurement tool in those conditions.

How can web businesses operate in an environment where reliance on web analytics has to be tempered with something else? Using more than one measurement tool can help. With the former Urchin product, now Google Analytics, available for free, it is easier than ever to run two statistics packages. I would argue that someone in your organization should make web analytics their project. If possible, train up someone to dedicate at least an hour a day to web analytics. The more this web analytics person understands, the more appropriate use your company can make of sometimes flawed analytics.
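Running two packages is only useful if someone actually compares their numbers. As a rough illustration (the daily counts, day labels and 10% threshold below are made-up values, not from any real report), a sketch like this flags the days where two tools diverge too far to trust either figure on its own:

```python
def flag_discrepancies(tool_a, tool_b, threshold=0.10):
    """Compare daily visit counts from two analytics tools and
    return the days whose relative difference exceeds the threshold."""
    flagged = []
    for day in sorted(set(tool_a) & set(tool_b)):
        a, b = tool_a[day], tool_b[day]
        # Measure the gap against the larger figure; the "or 1"
        # guards against dividing by zero on a no-traffic day.
        denom = max(a, b) or 1
        diff = abs(a - b) / denom
        if diff > threshold:
            flagged.append((day, a, b, round(diff, 2)))
    return flagged

# Hypothetical daily visit counts from two packages:
report = flag_discrepancies(
    {"Mon": 100, "Tue": 200, "Wed": 150},
    {"Mon": 105, "Tue": 120, "Wed": 150},
)
# Tuesday's 40% gap is flagged; Monday's 5% gap is within tolerance.
```

Days the tools agree on can be taken with some confidence; flagged days are where your dedicated analytics person earns their hour, digging into cookies, tagging and log files to find which yardstick bent.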

Web analytics are not the be-all and end-all of measurement online. I'd argue that a holistic approach needs to be taken, and that online is the same as offline; it's simply not conducted face to face. Footfall is not the only measure of a shop's success, and nor are unique users. Sales are not the only measure of a company's success, and nor are online sales. Websites are more than just online brochures; they are a way for people to engage with your brand.

I'll still get on the scales every morning, but Sunday is now the day I use to benchmark my weekly weight loss. Diet, exercise and health levels all cause variance in my scales' ability to measure me correctly. Your website's health, code implementation and use of first- or third-party cookies (the code kind; websites don't eat cookies) will determine the accuracy of your web analytics. Don't rely on analytics alone: find an internal champion and make sure your website code is correct for a happy, healthy web analytics experience. ;)
