Google Analytics Can Now Exclude Traffic From Known Bots And Spiders
Exclude bots and spiders from your user stats
Google made a small but important update to Google Analytics that finally makes it easy to exclude bots and spiders from your user stats. Traffic from search engines and other web spiders can easily skew your data. Unfortunately, Google is only filtering out traffic from known bots and spiders, even though generating fake traffic from bot networks is big business and, according to some reports, accounts for almost a third of all traffic to many sites. For this, Google is using the IAB’s “International Spiders & Bots List,” which is updated monthly.
If you want to know which bots are on the list, though, you will have to pay somewhere between $4,000 and $14,000 for an annual subscription, depending on whether you are an IAB member. Once you have opted in to excluding this kind of traffic, Analytics will automatically start filtering your data by comparing the user agent of each hit to your site against those on the list. Until now, filtering this kind of traffic out was mostly a manual and highly imprecise job. All it takes now is a trip into Analytics’ reporting view settings to enable the feature and you’re good to go.