May 29, 2018

6 Misleading Google Analytics Metrics You Need to Stop Worrying About

Written by
Eric Siu
Tags: Marketing

Whether your website is primarily used for marketing, serving customers, or really anything at all, you should be carefully monitoring your Google Analytics account. That’s the best way to keep a pulse on your visitors – where they’re coming from, what they do while they’re on your site, and many other metrics.

But this wealth of information can also be a curse. Unless you’re a seasoned professional, there’s a good chance that you won’t know what to look at or what the data you’re looking at means.

Over many years of experience with clients at my digital marketing agency, I’ve seen a lot of people get distracted by misleading or uninformative metrics. This not only wastes time, but can lead you to make bad decisions, too.

Today I’m sharing a few of the most common metrics that regularly mislead people.

Misleading Metric #1 – Average Time on Page

[Image: Average Time on Page in Google Analytics]

A lot of people think that the average time on page is a metric to judge engagement or attention – in other words, how long people are spending on a page.

The problem is, this isn’t exactly what’s being measured. You see, Google tracks time by calculating the time between pageviews or actions on your site. So if a customer comes to a page and then just leaves your site, Google doesn’t record time on page for that visitor.

Average Time on Page is calculated as: Time on Page / (Pageviews – Exits)
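The formula above can be sketched in a few lines of Python. This is an illustration of the arithmetic, not Google's actual pipeline – the point is that exiting visitors simply drop out of the denominator:

```python
# Illustrative sketch of how Average Time on Page is derived. Visitors who
# exit without triggering another hit contribute no time at all, so the
# average only reflects the pageviews that were followed by another hit.

def avg_time_on_page(total_time_on_page, pageviews, exits):
    """Average Time on Page = Time on Page / (Pageviews - Exits)."""
    non_exit_views = pageviews - exits
    if non_exit_views == 0:
        return 0.0  # every visit exited, so no time was ever recorded
    return total_time_on_page / non_exit_views

# 100 pageviews, 60 of which were exits; the 40 remaining views
# logged 4,000 seconds of measurable time between hits.
print(avg_time_on_page(4000, 100, 60))  # 100.0 seconds per counted view
```

Notice that the 60 exiting visitors could each have spent twenty minutes reading, and the reported average would not change.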

This is especially bad for landing pages and help resources, which aim to give visitors everything they need in a single visit. Imagine that a prospect finds your blog post on Google, reads it from start to finish over 15 minutes, and then closes the tab. That's exactly the behavior you wanted, but because no second pageview or event was ever sent, those 15 minutes are never recorded – the visit doesn't factor into Average Time on Page at all.

On the other hand, that same metric keeps accumulating time for as long as the page sits open between hits. It doesn't matter if the tab is minimized or covered by another window.

This doesn’t mean that the metric is entirely useless, but it’s often very misleading, especially when evaluating engagement with content. Unless you have a specific reason to need it, you’re usually better off ignoring it.

Misleading Metric #2 – Conversion Rate (By Channel)

Before you start to think I’m crazy, let me reassure you that conversion rates are super important to monitor in Google Analytics. The problem is with how most people track them.

By default, Google uses last-click attribution, which means that when a visitor converts, that transaction is credited to the last channel source from which they came to your website.

So if a visitor reached your blog three times via organic search, then clicked a Facebook retargeting ad and filled out your lead form, the conversion would be credited entirely to Facebook CPC. Organic search – the channel that actually introduced them to your brand – would get no conversion credit at all.

Instead, try using a multi-channel attribution model like Time Decay or Linear. These will distribute the credit across the different channels with which a user interacts.
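To make the difference concrete, here's a toy Python sketch of the three models. The channel names and the single conversion path are made up, and the weighting is simplified (GA's Time Decay model defaults to a 7-day half-life):

```python
# Toy comparison of last-click, linear, and time-decay attribution.
# touchpoints: list of (channel, days_before_conversion), oldest first.
from collections import defaultdict

def attribute(touchpoints, model="linear", half_life_days=7.0):
    credit = defaultdict(float)
    if model == "last_click":
        # 100% of the credit goes to the final touchpoint.
        credit[touchpoints[-1][0]] = 1.0
    elif model == "linear":
        # Equal credit to every touchpoint on the path.
        for channel, _ in touchpoints:
            credit[channel] += 1.0 / len(touchpoints)
    elif model == "time_decay":
        # Touchpoints closer to the conversion weigh more,
        # halving every `half_life_days`.
        weights = [2 ** (-days / half_life_days) for _, days in touchpoints]
        total = sum(weights)
        for (channel, _), w in zip(touchpoints, weights):
            credit[channel] += w / total
    return dict(credit)

path = [("organic", 14), ("organic", 7), ("organic", 3), ("facebook_cpc", 0)]
print(attribute(path, "last_click"))  # {'facebook_cpc': 1.0}
print(attribute(path, "linear"))      # organic: 0.75, facebook_cpc: 0.25
print(attribute(path, "time_decay"))  # somewhere in between
```

Under last-click, organic search looks worthless; under linear or time decay, its role in the path becomes visible.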

[Image: Time Decay attribution model]

Misleading Metric #3 – Direct Traffic

Direct traffic is supposed to represent all the visitors who go straight to your URL without being referred by any source. For example, a person types your web address into the browser's address bar or clicks a saved bookmark.

But the truth is that Google dumps all kinds of traffic in there – essentially anything that it can’t place elsewhere. SEO experts call this “dark traffic” because you can never be sure where it’s coming from.

In fact, back in 2014, Groupon found that 60% of their “direct traffic” was actually organic search! They discovered this by de-indexing their site for a day and seeing what happened.

[Image: Groupon's direct traffic attribution experiment]

That experiment is a bit old, and the percentages will vary a lot, but it demonstrates the inaccuracy of the direct traffic metric.

While a successful TV campaign or a strong brand can legitimately drive a spike in direct visits, more often the "direct" label is just a tracking failure. It's safer to assume that a good portion of your direct traffic actually belongs to your other channels.
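As a back-of-envelope exercise, you could redistribute a share of "direct" sessions across your tracked channels in proportion to their size. The 60% dark-traffic share below is an assumption borrowed from the Groupon experiment, and the channel counts are invented:

```python
# Hypothetical reallocation of mislabeled "direct" sessions.
# dark_share is an assumption (Groupon measured ~60% for their site);
# your real figure will differ.

def redistribute_direct(sessions, dark_share=0.6):
    """sessions: dict of channel -> session count, including 'direct'."""
    direct = sessions.get("direct", 0)
    dark = direct * dark_share  # sessions we assume are mislabeled
    tracked = {c: n for c, n in sessions.items() if c != "direct"}
    total_tracked = sum(tracked.values())
    # Spread the dark traffic proportionally to each channel's share.
    adjusted = {c: n + dark * n / total_tracked for c, n in tracked.items()}
    adjusted["direct"] = direct - dark
    return adjusted

print(redistribute_direct(
    {"organic": 5000, "social": 2000, "email": 1000, "direct": 2000}
))
```

This won't give you the true numbers, but it's a more honest mental model than taking a 2,000-session "direct" bucket at face value.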

Misleading Metric #4 – Bounce Rate

This metric is a bit different: Google actually records and reports it fairly accurately. The problem is that many people still treat it as a purely negative metric and optimize to minimize it.

In reality, a high bounce rate isn’t necessarily something to be alarmed by. It might mean that users found everything they needed on the page and didn’t continue further into the site. For example, if a visitor found one of your landing pages and downloaded the white paper you offered before leaving, that should be considered a success, not a failure.

Unlike the other metrics presented, I don’t suggest that you completely ignore bounce rate. It can be useful to look at and see how users are behaving. Just be sure to compare it to your expectations and goals to evaluate the page’s effectiveness.

Misleading Metric #5 – Isolated Metrics

Moving away from specific metrics, I want to talk about a broader problem: looking at any metric in isolation.

Oftentimes clients will ask us if their numbers look good. The problem is that "good" is very subjective. I can't tell you whether 10,654 sessions in May is good for your site. For some companies in a narrow niche, that would be fantastic – it could translate into a million dollars in revenue. For others, 10,000 wouldn't even cover a single day's traffic.

So I always recommend that you evaluate your metrics in comparison to others.

The most important evaluations are the month-over-month and year-over-year reports.

[Image: time-over-time comparison reports]

Google also allows you to compare your metrics against industry data to see how you're stacking up. Just navigate to Audience > Benchmarking in the left-hand menu, then select any of the three benchmarking reports: Channels, Location or Devices.

Once you’re in the report, you’ll be able to filter aggregate data based on Industry, Region and Size.

[Image: filtering a Google Analytics benchmarking report]

Misleading Metric #6 – Any Metric from Today

Data in Google Analytics takes up to 8 hours to process, so if you’re looking at any report from today, it isn’t very reliable.

In addition, if you’re running a comparative report, you should never include the current day. Even if the data has been processed, it will not show you a complete set of numbers to compare since the day isn’t a full 24-hour period yet. You’ll end up comparing 7 days last week against 6.5 days this week.
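The skew is easy to quantify. Assuming (purely for illustration) a flat 50 sessions per hour, a week-over-week comparison run at noon looks like this:

```python
# Toy illustration of the partial-day skew in week-over-week comparisons.
# The flat 50 sessions/hour rate is an assumption for the sake of arithmetic.

hourly_sessions = 50
full_week = 7 * 24 * hourly_sessions            # last week: all 168 hours
hours_elapsed_today = 12                        # report pulled at noon
this_week = (6 * 24 + hours_elapsed_today) * hourly_sessions

print(full_week, this_week)  # 8400 vs 7800
drop = (full_week - this_week) / full_week
print(f"{drop:.1%}")  # a ~7% "decline" with identical underlying traffic
```

Traffic didn't actually fall at all – the missing 12 hours manufacture the decline. Ending comparison windows on the last complete day avoids this entirely.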

This should also help you resist the urge to check your analytics every single day – a bad habit, because daily noise makes it harder to spot real changes. If something genuinely needs closer monitoring, note that this processing delay doesn't apply to the Real-Time reports, which Google processes with priority so that they're accurate almost immediately.

Conclusion

Unfortunately, I can’t cover every misleading and distracting metric in this article, but these are the most common that we run into with clients at Single Grain. Hopefully, this will save you some time and help you make better decisions.

Whenever you want to add a new metric to your analysis, just do a quick search to see if there are any common problems or objections to the way Google Analytics collects and reports the data. It helps if you always start your analysis with a set of questions and goals. Then focus only on the information that helps address those specific questions and goals.

Eric Siu is the CEO of digital marketing agency Single Grain, which has helped such companies as Amazon, Uber and Salesforce acquire more customers. He also hosts two podcasts: Marketing School with Neil Patel and Growth Everywhere, an entrepreneurial podcast where he dissects growth levers that help businesses scale.