To put it simply, Google is using a method only available in HTML5, and only certain browsers support that method. This grid lists which browsers are supported; it means Google is covering only 73.57% of browsers. Unfortunately, the unsupported browsers include a large and growing audience: desktop Safari and mobile iOS Safari. That means your traffic from Safari users on desktop computers, iPads, and iPhones is not being tracked for these metrics: Avg. Page Load Time, Avg. Domain Lookup Time, Avg. Redirection Time, Avg. Server Connection Time, and Avg. Server Response Time.
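For reference, the HTML5 method in question is the Navigation Timing API (window.performance.timing). Here is a minimal sketch of how those five metrics map onto it; the mapping is my approximation of what Google computes, not their actual code:

```typescript
// Minimal sketch of the HTML5 feature Google Analytics relies on for
// these metrics: the Navigation Timing API (window.performance.timing).
// A browser that lacks it, like Safari at the time of writing, has
// nothing to report.
function getPageTimings() {
  const perf = window.performance;
  if (!perf || !perf.timing) return null; // unsupported browser: no data at all

  const t = perf.timing; // all values are timestamps in milliseconds
  return {
    pageLoad: t.loadEventStart - t.navigationStart,        // Avg. Page Load Time
    domainLookup: t.domainLookupEnd - t.domainLookupStart, // Avg. Domain Lookup Time
    redirection: t.redirectEnd - t.redirectStart,          // Avg. Redirection Time
    serverConnection: t.connectEnd - t.connectStart,       // Avg. Server Connection Time
    serverResponse: t.responseStart - t.requestStart,      // Avg. Server Response Time
  };
}
```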
For some of us, and maybe most, Safari users may account for more than 26.43% of your website’s traffic. The excluded browsers are not “evenly” distributed in any way, so your traffic may include a lot of the other excluded browsers as well. Think you can just exclude these users and still have accurate data? Well, no. Not only would you be excluding a large portion of your customers, but those 0’s are ruining the page timing averages. When you look at Avg. Page Load Time, Google Analytics is averaging across all sessions, including your 0’s, which is not accurate.
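To see how much those 0’s distort the average, here is a toy example with made-up numbers, where three Safari sessions out of ten report 0 (roughly the excluded share mentioned above):

```typescript
// Toy illustration (made-up numbers): ten sessions, where the three
// Safari sessions report 0 because the browser can't be measured.
const loadTimesSec = [4.2, 3.8, 5.1, 4.5, 3.9, 4.8, 4.1, 0, 0, 0];

const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;

// What the report would show: zeros included in the average.
console.log(mean(loadTimesSec).toFixed(2)); // "3.04" seconds

// What the measured users actually experienced: zeros excluded.
console.log(mean(loadTimesSec.filter((t) => t > 0)).toFixed(2)); // "4.34" seconds
```

A full second of difference, and no real visitor ever saw a zero-second page load.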
Numbers are not what they seem…
To make matters worse, they actually removed past page timing data, too. Let’s take a deeper look…
The report we’re looking at here uses all the metrics mentioned above, plus the Browser dimension and then the Region dimension for a particular client. Looking at Browsers first, you can clearly see what Safari looks like now by comparison: big fat zeros. Worse yet, if you drill down into the versions of some other excluded browsers, that data is now also a big fat zero.
I’m comparing the same time period, year over year, in this report using the Region dimension. We can see a few states have data in one row and not the next, going both forward and backward in time. Strange, right? Especially backwards! Because of those 0’s going forward, and the old data being replaced with 0’s going backward, we can see more clearly how they are changing the averages. Even where numbers are available, there are huge differences in timings for a site that hasn’t changed in a year. If you drill down into one of these states with 0’s and look at the browsers in that region, you can see that there is data. But the available data has more of those excluded users than not, making the averages ZERO. Going back to the regional overview, the average is using those zeros. This means any individual page timing report you may be looking at has inaccurate average timing metrics; aggregated data built on a lot of zeros tends to throw things off.
Those (not set) folks in the Region report, by the way: 70% of them are Safari users. All zeros…
Page load speed is so important though…
What to do? You can drive yourself crazy digging into the data yourself, thinking you can get a better average, but that’s almost impossible, because it isn’t an accurate average of your customers. The numbers you use will mainly come from Chrome and Firefox users over time, that’s about it, and I know that isn’t a good sample of your customers. I did it for a couple of hours the other night, and when I redid the numbers, the percentage changes over time came out wildly different anyway. Larger sample sizes will of course produce better results, but that only means you are excluding a larger audience, as well.
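If you still want to try it, the core of what I was doing is below: a rough sketch that recomputes a session-weighted average from exported per-browser rows, skipping the zero rows. The row shape here is just an assumption about what an export might look like, not an actual GA format:

```typescript
// Hypothetical shape of an exported per-browser row; adjust to
// whatever your actual export contains.
interface BrowserRow {
  browser: string;
  sessions: number;
  avgLoadTimeSec: number; // 0 for excluded browsers like Safari
}

// Session-weighted average over only the rows that actually measured
// something. Still biased toward Chrome/Firefox users, as noted above,
// but at least the zeros stop dragging it down.
function weightedAvgExcludingZeros(rows: BrowserRow[]): number {
  const measured = rows.filter((r) => r.avgLoadTimeSec > 0);
  const totalSessions = measured.reduce((n, r) => n + r.sessions, 0);
  if (totalSessions === 0) return 0;
  return (
    measured.reduce((sum, r) => sum + r.avgLoadTimeSec * r.sessions, 0) /
    totalSessions
  );
}
```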
So, take it with a grain of salt until Google stops removing your data and changing its algorithms when it comes to page load time. In the meantime, I will be looking for an alternative tracking method for this measurement. If you have any suggestions, please email me!
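For anyone who wants to experiment in the meantime, here is one hedged idea, not a finished solution: time the page yourself with plain Date.now() timestamps, which work in every browser, Safari included, and send the result to Google Analytics as a User Timing hit via analytics.js. Note this only measures from when your inline script starts running, not from the real navigation start, so it undercounts network time, but at least it measures the same thing for every visitor:

```typescript
// As early as possible in the page <head>, an inline script records:
//   (window as any).__pageStart = Date.now();

declare function ga(...args: unknown[]): void; // analytics.js, assumed to be loaded

window.addEventListener("load", () => {
  const start = (window as any).__pageStart as number | undefined;
  if (typeof start !== "number") return; // inline timestamp missing

  const loadTimeMs = Date.now() - start;

  // User Timing hit: shows up in GA's User Timings report and does not
  // depend on the Navigation Timing API, so Safari gets counted too.
  ga("send", "timing", "Fallback", "pageLoad", loadTimeMs);
});
```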