Troubleshooting Common Google Analytics Problems & Questions

I wonder how many questions we all ask every day? I imagine it must be quite a lot. From ‘how are you?’ to ‘do you like this song?’, ‘would I want a link from this site?’ to ‘would you like to sign the deal?’. We are inquisitive and would never succeed in anything if we didn’t ask questions. So I wanted to take some of the common questions I get asked about Google Analytics and answer them for you.

My bounce rate is fantastic, is it because my website is so good?

Usually, a very low bounce rate is a sign that there is a problem with the tracking code. A rate around or under 10% suggests two pageviews are being tracked on the affected page. The simple way to check this is to look at the source code and use the Find function to search for UA-. If there is more than one tracker, they need to use what’s called ‘roll-up reporting’, where each tracker is labelled so it sends data into a different account. If there are two unmodified versions of the code, then removing one set, or implementing roll-up reporting, should fix your unusually low bounce rate.
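If you want to make that source-code check less error-prone, you can script it. Here’s a rough sketch (the function name and the ID pattern are my own, not from any Google library) that pulls the distinct UA- property IDs out of a page’s HTML source:

```javascript
// Rough sketch: extract the distinct UA- property IDs from a page's
// HTML source. Finding more than one distinct ID suggests two trackers
// are firing on the same page.
function findTrackingIds(html) {
  // Classic GA property IDs look like UA-XXXXXX-Y
  var matches = html.match(/UA-\d{4,10}-\d{1,4}/g) || [];
  // De-duplicate while preserving order
  return matches.filter(function (id, i) {
    return matches.indexOf(id) === i;
  });
}
```

Running it over a page carrying two different trackers would return both IDs, which is your cue to set up roll-up reporting or remove the duplicate.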

Additionally, bounce rate can be affected by Event Tracking as this counts as an interaction on the page, so always bear this in mind when installing Event Tracking code.
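If you want an event to fire without turning a bounce into a non-bounce, ga.js lets you pass an optional non-interaction flag as the last argument to _trackEvent. A minimal sketch (the category, action, label and value here are made up):

```javascript
// ga.js async command queue; in a real page this is created by the
// standard tracking snippet.
var _gaq = _gaq || [];

// The final 'true' marks this as a non-interaction event, so firing it
// will not stop the visit counting as a bounce.
_gaq.push(['_trackEvent', 'Video', 'Play', 'Homepage', 0, true]);
```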

Can I track form submissions?

There are different methods you can use to track form submissions. The best is to have the form redirect to a thank-you page that confirms completion – it thanks your user for filling in the form, and it loads on its own URL, which you can use to set up a URL-specific goal.
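For forms submitted via JavaScript, the redirect itself is trivial; this sketch assumes a hypothetical success handler and a /thank-you URL of your choosing:

```javascript
// Hypothetical success callback for an AJAX form submit. Redirecting to
// a dedicated URL lets you define a URL destination goal for /thank-you.
function onFormSuccess() {
  window.location.href = '/thank-you';
}
```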

If you can’t set up a thank you page then you can use Event Tracking to track when people click the submit button. You can set it out as simply as this and place it within the code for the submit button:

onClick="_gaq.push(['_trackEvent', 'Contact Form', 'Submit']);"

Can I track clicks on this link?

Again, nice and easy, tracking clicks on links to external websites should be done using Event Tracking, as above, but this time you could use code like this:

onClick="_gaq.push(['_trackEvent', 'Advertisement', 'External Link Clicked', 'SEO Chick Link']);"
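One caveat: on an outbound link, the browser may leave the page before the event is sent. A common workaround with ga.js was to cancel the default navigation, record the event, and navigate after a short delay. A sketch (the function name, category and delay are arbitrary choices, not part of the GA library):

```javascript
var _gaq = _gaq || [];

// Cancel the default click, record the event, then navigate after a
// short delay so the tracking request has a chance to fire first.
// Usage: <a href="http://example.com/" onclick="return trackOutboundClick(this);">
function trackOutboundClick(link) {
  _gaq.push(['_trackEvent', 'Outbound', 'Click', link.href]);
  setTimeout(function () {
    document.location = link.href;
  }, 100);
  return false; // suppress the browser's immediate navigation
}
```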

What is (not set)?

This usually means that some of the ads running are not tracked correctly. If you’re using AdWords, check that you have set up auto-tagging, or manually tagged the ads with the correct information, and that you have linked AdWords and Analytics correctly. If you are using another advertising platform, the ads will likely need to be tagged with tracking parameters manually. These tags can be created using Google’s URL Builder tool.
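For illustration, manual tagging just means appending the standard campaign parameters to the landing page URL; the domain and parameter values below are invented:

```javascript
// Building a campaign-tagged URL by hand, as Google's URL Builder would.
// Domain, source, medium and campaign name are all placeholders.
var landingPage = 'http://www.example.com/offer';
var taggedUrl = landingPage +
  '?utm_source=newsletter' +
  '&utm_medium=email' +
  '&utm_campaign=spring_sale';
```

With these parameters in place, Analytics can attribute visits from the ad to the right source, medium and campaign rather than lumping them into (not set).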

Why is my website on the list of referral sites?

Your website will appear on the list of referral sites when one or more pages of your site do not have the tracking code on them. By clicking the site where it is listed you can sometimes find out which pages have the problem; however, only the pages receiving traffic will be shown, so it’s best to run a full check of your site to confirm the correct UA code is on every page. One way to do this is to crawl a list of your site’s URLs with Screaming Frog SEO Spider, applying a custom filter to find the pages with your UA code and those without.

Where has all my traffic gone?!

If you see a drop in activity, there are a number of factors that could have caused this. You will need to rule out each one to identify the cause of the decrease in activity.

First, I would recommend you check for whether it is a decrease from all traffic sources, or one in particular. If it is just one, you can then look for why this might have happened – did a Google algorithm update reduce your Google organic traffic? Did you run out of money in your AdWords account? Did traffic from a referral site suddenly reduce?

If all traffic sources decreased relatively equally, I would then go on to check whether all pages of the site saw a similar decrease to each other or whether it was a certain set of pages that were affected.

Once you know which pages were affected, check that the tracking code is correct across the site. Ask the developer whether any changes have been made to the site that might have affected the tracking. If the tracking all looks perfect now, it may be that duplicate code has just been removed, or that another domain was also using that UA code.

To check which domains are using the UA code for your account follow these navigation steps:

  1. Audience
  2. Technology
  3. Network
  4. Hostname (the link above the data)

You can then see a list of the sites that have recorded visits with your UA code. Translation websites are normal to see here, as are your various subdomains if you have any. Check, however, that there are no other sites on this list.

If you cannot identify the cause from the above, start looking through the other reports, such as device, browser and location, to spot any anomalies. On more than one occasion I have found that the drop-off is specific to one location and browser (by creating an advanced segment for the visits that decreased together). Sometimes there will also be a low new-visit rate, meaning that one computer had been accessing the site a significant amount and then changed its habits.

Why is the data from Analytics different to AdWords?

Never fully trust data you didn’t go out and collect yourself. I always work on the assumption that Analytics and AdWords PPC data will be different, and there are many reasons for this:

  • Analytics can only track visits from browsers with JavaScript and cookies enabled
  • AdWords tracks clicks, not visits
  • Clicks in AdWords could result in no page loading, or the user leaving before it loads
  • Invalid clicks in AdWords are filtered out of the reports, but may have tracked as visits in Analytics
  • Conversion data will be different because Analytics attributes conversions to the last (non-direct) click, while AdWords credits the last AdWords click

So there you have the answers to some common analytics problems. I’ve not linked to a guide to each solution otherwise I might as well have just listed them to start with (nothing to do with the fact that I wrote this on the train with no internet connection, honest!). I hope this helps answer some of your queries and get you the information you need from your Google Analytics.

Overly Long Post About Why You Should Talk More To Non-SEOs

I’ve always struggled with explaining what I do to people who aren’t in the SEO industry, and generally just say that I have an SEO company (blank stare or, as Paul Madden likes to call it, the “dog stick stare”) or work in online advertising. Sometimes I do get more specific and tell them that I build those clickable bits of text on websites. If I’m lucky, I get a nervous slow nod intended to shut me up so they can move on to talking about how awesome they are.

While this is slightly challenging, it’s nothing compared to doing an hour-long presentation to people who need to be taught a basic outline of what I’ve been doing for years, which I did recently at the Parenting Media Association conference in Chicago. I could have spent a semester on this information, yet I had to condense it, relate it to what these guys are doing, and make it interesting. For a person like me (wordy, annoyingly wordy, prone to using long-winded sentences to make a short story long, etc.) this was seriously challenging, but I realized that it’s probably one of the most beneficial work experiences I’ve ever had.

As wordy as I am, I’m highly annoyed when other people are the same way. I try to blame it on my earlier social work training (which was solution-focused: bigger words and longer sentences don’t get someone helped faster) but in reality, I think it’s because I am highly impatient and have trouble being a concise and efficient user of words. Being online so much is causing my social skills to further erode. Just ask my husband, who tries to tell me about his latest interesting dream and is rewarded with a sharp “what’s the bottom line here?? Could you actually scream or not??”


I always hated Ernest Hemingway anyway.

I’m devolving further as you can see so let’s get back on track and I’ll explain why I loved having the chance to organize my thoughts in such a way that someone who had no clue about link building could start to successfully build links. (And as it turned out, they had way more of a clue than I thought they would, which was a lovely and fun surprise.)

When you can’t assume that someone will understand what you’re talking about, you tend to slow down and think about it more clearly. You can better see the breakdowns in logic and actually think about terms instead of spouting off technical acronyms and catch phrases.

When you’re dealing with people who aren’t just sitting there waiting for you to make a mistake, you realize that if you do say something that’s incorrect, you might cause a lot of problems. For example, if you don’t keep up with Google’s latest updates and you stupidly tell a group of people that using exact match anchor text for 75% of their links is the best idea, you could really screw them. Obviously if you said this to a group of SEOs, you’d get smacked, but when SEOs are not there to check you, you will hopefully research what you’re saying and make sure it’s as correct as possible.

You might realize that what you think is the most important upcoming social network isn’t. I am not a Google + fan really, but I do understand that to do well in Google, I have to play by their rules so I’ve sucked it up and tried to use it. However, I fell into the same trap I fall into a lot, and that’s thinking that everyone knows what I know and thinks the way I think. Most people don’t give a flip about G+ but they’ll use the hell out of Pinterest, a platform that I personally detest. Witness my Bald Board, which hasn’t even been updated recently!! I can’t keep up with all the sheer amazing baldness in the world but based on the last James Bond flick, you may be seeing Ralph Fiennes there soon. In my session with the parenting media group, I asked who used G+. I didn’t see a show of hands. I asked about Pinterest and almost everyone used it. Think about it though…for that niche, Pinterest is going to be much more important since their audience wants to see recipes, crafts they can make with the kids, etc. Recipes and crafts don’t translate as well for G+ and by and large, moms who want parenting content aren’t all jumping on G+ like a duck on a june bug.

I’ve always believed that if a completely clueless site owner emails me and isn’t a condescending asshole, I should do my best to at least point him or her in the right direction as far as finding a competent SEO with openings is concerned. There are so many loudmouthed know-nothings out there, and it’s frustrating to just say no, I don’t have room, but best of luck finding someone who does, so I generally try to send these potential clients to someone that I know isn’t going to screw them over further. Most good SEOs are busy as hell these days so it’s getting harder to do that, which has caused me to do a bit of quick and free digging just to see if there is anything I can immediately point to as being a reason these sites are suffering. Many times it’s obvious, as they have backlink profiles full of nothing but spammy footer links on irrelevant sites, or they have an IT guy who forgot to remove the noindex/nofollow tags from the whole site. These site owners don’t understand when I say “you really need unique titles and you have some messy 301s going on” so I have to break it down into extremely simplistic little bits of instruction and explanation. You’ve no idea how beneficial that’s been for me: since I came from a programming background, a lot of technical jargon is an ingrained part of me and I forget that not everyone immediately knows what a crawl issue is.

I have also made the mistake of not always educating my link builders about things that may not immediately impact them on the job. To be honest, some of them probably don’t care, and if they’re still doing well and building great links, I’d rather not annoy them. Some of them do care though, and while I feel pedantic explaining why a link one of them got is so good in my eyes, the response to that feedback is always very positive, and it makes me realize what a disservice I’ve done them when I haven’t always explained myself. Again, it helps me, too.

Also I’d like to thank Rae Hoffman from Pushfire for allowing me to use her fantastically efficient socialization plan in my presentation, which saved me from having to write one, gave me a better version of my own plan for my own work, and elicited knowing nods when I mentioned her name. You can read that here:

Why Google+ Fails to Blow my Skirt up

Of late it seems I can barely go a day without bad-mouthing Google+.

Not only does G+ fail to blow my skirt up, I rather enjoy poking fun at it and on occasion* being out-and-out nasty about it.

*These ‘occasions’ are actually pretty frequent.

Want to know what my problem is?

Hold on to your skirts hats :)


Google are making you their bitch.

Google are using a pretty common marketing tactic – engaging with influencers in order to gain traction.


What would you do if you had the Google Algorithms for a day?

Wouldn’t you love to have access to the algorithm that makes or breaks your website’s success?

I’m sure we’ve all dreamt of getting our hands on the algorithm, while trying to improve results or understand a sudden change. Unfortunately, with only a few people in the world actually having access to the full algorithm, and none of those people being SEOs, we all have to try and work it out for ourselves.

But why should we stop dreaming? I asked my fellow SEO Chicks what they would do if they had access to the Google Algorithms, just for a day.

thief stealing google algorithm

The ideas are fantastic. I can imagine Googlers reading this and rushing to make the algorithms even more secure, knowing what we might get up to if we could!

I hope you enjoy these and would love to hear your ideas in the comments.

To kick it off, it’s a strategic approach from Julie Joyce:

First of all, I’d change it so that a good link would weigh more in terms of importance than a horrible link. A footer link on a 2 page site that hasn’t had more than 40 visitors in 3 years should not count as much as a link on a relevant site that is well worked into the content, shows great social signals, and encourages clicks. Since we’re dreaming here, I’d also remove the webmaster guidelines that say you shouldn’t buy links. Tons of people buy links but if they buy good ones, those should be fine. I wouldn’t let bad links penalize a site though. I’d just not count them.

Secondly, I’d separate the QDF signals for various industries. I don’t think that freshness is as important for certain niches as it is for others and I think that it can encourage some sites to continually produce new content for that reason alone, which can, in some cases, simply create a bloated site.

I’d also roll out updates in a manner that wouldn’t have so much collateral damage. Turn the dial low and then crank it up a bit instead of “accidentally” blowing away a non-EMD site when we’re targeting EMDs. In conjunction with this, if a site was indeed unfairly penalized, I’d create a much more efficient method of evaluating it and restoring it.

Last of all (and yes, I know this isn’t 100% relevant), I’d fix the AdWords system so that once I was doing well and the account was running really smoothly, it wouldn’t suddenly crash and cause me to have to jack up the budget just to do well again.

Next up, the incredibly imaginative Hannah Smith:

The easy answer would of course be to ensure all of my lovely clients rank 1st and their nasty old competitors languish in post-page-10 obscurity. That would just be selfish of course, plus as it would only last a day it wouldn’t really make much of a dent in real terms.

As such I think I’d like to break Google, albeit for just a day. Google has a monopolistic stranglehold on search and I’d like to try to change that – (NB this is pretty unrealistic given I’ve just a day, but whatever).

We’ll call it Hannah’s Hostile Takeover day. On Hannah’s Hostile Takeover day Google will become worse than useless.

The SERPs will appear unchanged.
However, regardless of whether you click on an organic or paid result you’ll be auto-magically redirected here.
Should you elect to type in a search query related to this then the SERP will be compiled entirely of links to sites devoted to jokes about ‘your mum’.

rick astley

I expect this will cause people to hit the back button to return to the SERP. When they hit ‘back’ they’ll see the following message:

“Whoops-a-daisy – looks like that wasn’t what you were looking for. Try Bing instead.”

Many will elect not to do this. Instead they will try again with Google.

Still failing to find what they’re looking for they’ll hit back and see this message:
“Oh horlicks! Looks like that wasn’t what you were looking for. Try Bing instead.”

If they are particularly tenacious and still refuse to visit Bing they’ll be forced to try to complete an impossible captcha in order to search again. Try as they might they won’t be able to solve said captcha.

I would hope that the net result is that many people will try Bing. Some might even like it. Some might continue to use it because they are still a bit sore about Google messing them around like that – after all, they are very busy and important. Google lose a little market share.

The End.

And if anyone is left using Google, Nichola Stott says:

To be honest, I’d probably leave well enough alone. Google employs some extremely smart engineers who are immersed in particular component aspects of the algorithms, day-in-day-out and it takes these engineers some time to build a strong update. It would take a hell of a lot more than a day for me to get to grips with even the simplest components.

Instead I’d like to see what factors really are included within the more opaque signals, such as engagement metrics. Is in-SERP CTR a factor? Are dwell times and page views per referred visit a factor? If so, to what extent are known characteristics identified and extrapolated to make meaningful inferences for sites without Google Analytics?

Oh and at 4pm I’d throw Meta keywords back in, just for shits and giggles.

Nichola’s final idea definitely appeals to Judith Lewis:

*hahaha* That would be so funny – from 4pm you would throw the world into chaos.

Which is, in fact, what I would do if I had the algorithms for a day. I’d create a table called “friends” and one called “enemies” and friends would get automatic 1st page rankings whereas enemies would find a rotating penalty based on search volume from -3 to -90.

I’d then have a “random weirdness” list which took obscure conspiracy theory sites and gave them top billing on one in every 3 relevant searches.

conspiracy theory meme

Then I’d add in a dash of social metrics, forcing in a random twitter profile every 7th search.

At the end of it all, any SEO who only chases the algo would be in hospital and the rest would be laughing down the pub.

But SRSLY I agree with Nichola. Not only is that sucker the size of a small planet in complexity but the guys who are at the coalface are working their butts off trying to make things work properly and with the best intentions for user experience.

And finally, my day would go something like this:

First off I would put on a witches hat and let out an evil cackle.

lego witch


Next, I would try to understand the algorithm so that I can learn from it and use this knowledge in future. Because once I’ve understood it enough for my liking, I would delete it! I would then replace it with a simpler algorithm that is much more aligned to how my mum thinks it works.

Results would be displayed according to which letter of the alphabet the domain starts with. There would be 26 pages to choose from each time you search and you would be shown results from a random letter of the alphabet each time you search, in order to keep things fair for all letters.

On each results page the websites would be sorted according to ‘popularity’ (because that’s how my mum thinks it works!). I would get some awesome developers in (and maybe some technology from the future) to take into account popularity based on the sentiment of everything written online, everything written offline and every single thing spoken out loud across the world.

If I had time, I’d even make this location based so that those websites with positive discussion locally to you would perform better than they might in other locations.

Websites could then improve their results by improving their popularity – this could take into account the other common myth, that results are based on how much you pay Google: those who start giving loads of money to charity could become more popular and get better results.

One other thing I’d like to do is slightly more frivolous. I’d like to see the pages livened up a bit with a random image on every results page, preferably a Lego minifigure, or a rubber duck. If I was feeling generous, I’d let users pick from a category of images that they want to see on their SERPs – from lolcats to memes to pretty landscapes!

At the end of the day, I hope I will have learnt a lot, had some fun and made the general internet user think about Google from another perspective and not just lap up everything the internet says.

Oh, and Google would have to start from scratch again, which would hopefully mean some things don’t get put back into the algorithm, it wouldn’t have so many elements to it, and they could work in new and improved metrics.

I think it’s safe to say that we could cause five days of trouble for Google, their users and SEOs between us. But what would you do?

Image credits: me, youtube, memegenerator and cjdc.

Yes Virginia, There Is Negative SEO

I would dearly love to believe that so-called ‘negative SEO’ was not possible in this day and age of human editors, link checkers and auditors. Given my recent experience with a museum who were manually penalised, I can say ‘Negative SEO’ is alive, well and easily executed.


From thisseolife:

Proving Negative SEO Works Is Like


It all started when a museum launched a longer-than-average exhibition. They gave this exhibition its own domain and featured blogs from the curators. This was a significant enterprise for them and they worked quite hard on promoting it. My company gave them some support so we had a relationship with them and started helping ensure they could be found but we were not engaged to do SEO at this point or any sort of search support work. Everything was going along well until one day they were hacked.


The hack wasn’t too sophisticated, so it was immediately visible that it had happened and they were able to flag it with their IT department. As these things go it was dealt with moderately quickly, but the hack was up for a while. Once it was resolved, things went back to business as usual. Lurking in the background, unbeknownst to them or us, was a ticking time bomb.


Eventually we were contacted by them in a panic as they didn’t know what to do. They had dropped out of the rankings completely for their key terms (a phrase they invented). Upon examination we found they had dropped for their own brand term, their key terms and once we got Webmaster Tools access we found they had dropped for almost every key term and there was a message. They had received the dreaded “un-natural links” warning email. Disaster, but why?


Looking back over what they had done, we could find nothing they had actively done which would explain the problem. They were a museum engaging in only the normal PR work – so what had caused it? Well, a few brilliant minds came together and one spotted it: the spammy links causing the manual penalty. Apparently once the hack happened, a bunch of pharma links were pointed at the museum site.


The weird thing was, those pharma links were irrelevant and should have been passing no value at all. So why did the museum, who were not selling anything like those links indicated, get a *manual* link penalty? The links were irrelevant and the anchor text was irrelevant so the links should have been passing no value.

What we have is an excellent example of how ‘Negative SEO’ works.


It seems manual reviewers aren’t applying common sense when checking sites. How a museum could have been mistaken for a site benefiting from pharma links is unknown to me, but it did clearly demonstrate that ‘Negative SEO’ is still possible.


It only worked for a short time – eventually Google saw the lack of wisdom in the decision and when we contacted them they removed the manual penalty. This seems to me to be a really effective seasonal ping – throw the spammy links at a site just before a major shopping holiday, rake it in as your competition disappears and voila, you’ve won.


What should you do if you think you’ve been hit? First, check Webmaster Tools. We got a note; if you received a penalty, you’re likely to as well. Next, check your backlinks. You can do this with Majestic SEO, Google Webmaster Tools, SEOmoz, and a number of other tools. Look at the anchor text of your backlinks – this is super easy in SEOmoz, Majestic SEO and a few others – and see if there is anything untoward or suspect. You should *never* have more keyword links than brand links, and your keyword links should be on the theme of your site. If there are spikes on the Majestic SEO graphs around the time just before you got your penalty, it could be due to the volume of acquisition, and in that case you might not have gotten an alert – we didn’t when we accidentally tripped a volume flag. Just dig using free or paid tools and see what is going on.


Once you find the errant links, if they are irrelevant as these were, use a WHOIS lookup and use the email contact to request that the links be removed. If it goes like it did for us, they will all bounce or be ignored. Once done, use that spreadsheeted list of the bad URLs and include them in your reconsideration request, stating that not only did you clearly not build them, but that the sites were contacted to ask for removal and the emails bounced or went unanswered. It, of course, helps to be able to email a Google engineer, but hardly anyone can. The above should work without extraordinary intervention.


Negative SEO is still possible, poisonous links do still exist, and you can still harm an irrelevant site with bad links, even though such attempts don’t always succeed.

Predatory Thinking for SEOs

I was fortunate enough to attend BrightonSEO a couple of weeks ago – big love to Kelvin and the team for organising another fantastic event. For me the stand out presentation of the day came from Dave Trott on Predatory Thinking.

Don't panic, this is not Dave Trott.

Dave Trott is the Executive Creative Director for CSTTG. He trained on Madison Avenue at the end of the Mad Men era, when the three-martini lunch and golf course advertising was for dinosaurs, and the creative revolution was just starting. After 4 years, to avoid getting drafted for Vietnam, Dave came back to London. Some of the advertising he worked on included: Ernie the Milkman for Unigate; Aristonandonandonandon (quoted in a speech by Margaret Thatcher); Red Rock Cider with Leslie Nielsen; the Holsten Pils campaign featuring Griff Rhys Jones and dead Hollywood film stars; and a controversial multi-media campaign, made entirely for free, that helped get the Third World Debt discussed by the world’s governments.

There are a whole bunch of round ups you can read for further info on Dave’s talk and the conference in general, but I wanted to focus this post purely on predatory thinking, and what that might look like for SEOs.