Bots & Analytics, II: From Filtering out Bots to Filtering in Humans

“Hard” & “Soft” Bots, Human Signals — and Vacation in Cleveland

Recap: Why common Bot-fighting methods aren’t good enough for Analytics

The Bot Rush and the Desperation

The Bot rush before and after the “fix”

IT: “These Bots are not our focus”

The 2 Layers of Human/Bot Filtering: Layer 1 lives in the Tag Management System, i.e., in the browser, and ensures that as much Bot traffic as possible never gets tracked into Analytics & Co. in the first place; the “Soft Bot” logic is its most important part. Layer 2, the filtering logic inside Analytics, is still needed to clean up the Bots that slip through Layer 1.
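
To make the division of labor concrete, here is a minimal sketch of the Layer 1 gate as it could sit in a TMS custom-code block. Everything here is illustrative: the classification regex, the `maybeTrack` helper, and the `track` callback are assumptions for this sketch, not a real TMS API.

```typescript
// Layer 1 sketch: decide in the browser whether the Analytics tag may fire.
// All names and heuristics are illustrative assumptions, not a TMS API.

type Verdict = "hard-bot" | "soft-bot" | "human";

function classify(hasHumanSignal: boolean): Verdict {
  // "Hard Bots" declare themselves in the user agent string.
  if (/bot|crawler|spider|headless/i.test(navigator.userAgent)) {
    return "hard-bot";
  }
  // "Soft Bots" look like normal browsers but never behave like humans.
  return hasHumanSignal ? "human" : "soft-bot";
}

function maybeTrack(hasHumanSignal: boolean, track: () => void): void {
  if (classify(hasHumanSignal) === "human") {
    track(); // only presumed humans ever reach Analytics (Layer 1)
  }
  // Anything that still slips through gets cleaned up by the filtering
  // logic inside Analytics itself (Layer 2).
}
```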

Layer 1: Soft Bots, Hard Bots, and Human Signals

The Consent Manager as an inadvertent Bot Prevention System

Impact of an opt-in Consent Manager on the share of human traffic
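
The mechanism behind that chart is easy to replicate deliberately: load the Analytics tag only after the visitor has interacted with the consent banner, which most Bots never do. The event name and loader below are hypothetical; every consent manager exposes its own callback.

```typescript
// Sketch: an opt-in Consent Manager as a de facto Bot filter. Most Bots
// never click "Accept", so tracking that waits for consent never fires
// for them. "cmp-consent-given" and loadAnalytics() are hypothetical.
window.addEventListener("cmp-consent-given", () => {
  loadAnalytics();
});

function loadAnalytics(): void {
  const script = document.createElement("script");
  script.src = "/analytics.js"; // placeholder path to the Analytics library
  script.async = true;
  document.head.appendChild(script);
}
```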

What do Humans do that Bots don’t?

Let’s analyze: First, find the “botty by default” candidate!

Traffic by Country, split into Direct/SEA/All, with the Bottiness Rate (% of Visits made by Bots) and the counter metrics “% Visits with Login” and “Conversion Rate”.
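
For readers who want to reproduce such a table from a raw visit export, the sketch below shows the arithmetic. The `Visit` shape and its field names are assumptions about what an export might contain.

```typescript
// Sketch: computing Bottiness Rate and the two counter metrics per country
// from raw visit records. The Visit shape is an assumption for illustration.
interface Visit {
  country: string;
  channel: "Direct" | "SEA" | "Other";
  isBot: boolean;     // verdict of the Bot detection
  hasLogin: boolean;  // counter metric: logins are a strong human signal
  converted: boolean; // counter metric: Bots do not buy
}

const pct = (part: number, total: number): number =>
  total === 0 ? 0 : (100 * part) / total;

function countryReport(visits: Visit[]) {
  const byCountry = new Map<string, Visit[]>();
  for (const v of visits) {
    const bucket = byCountry.get(v.country) ?? [];
    bucket.push(v);
    byCountry.set(v.country, bucket);
  }
  return [...byCountry.entries()]
    .map(([country, vs]) => ({
      country,
      visits: vs.length,
      bottinessRate: pct(vs.filter((v) => v.isBot).length, vs.length),
      loginRate: pct(vs.filter((v) => v.hasLogin).length, vs.length),
      conversionRate: pct(vs.filter((v) => v.converted).length, vs.length),
    }))
    .sort((a, b) => b.bottinessRate - a.bottinessRate); // most "botty" first
}

// For the Direct/SEA splits, pre-filter the input, e.g.:
// countryReport(visits.filter((v) => v.channel === "Direct"));
```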

Americans are Bots — but what about Cleveland?

Second, find the human signals

How is this implemented in the TMS?
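
As one possible answer, here is a sketch of the human-signal check in a TMS custom-code block: hold the page view back until the visitor produces a human signal, and silently drop it otherwise. The chosen events and the 5-second timeout are assumptions; real TMS setups would wire this into their own rule engine.

```typescript
// Sketch: detecting a "human signal" in a TMS custom-code block.
// The chosen events and the timeout are assumptions for illustration.
function detectHumanSignal(timeoutMs: number): Promise<boolean> {
  return new Promise((resolve) => {
    const signals = ["mousemove", "scroll", "touchstart", "keydown"];
    const onSignal = () => {
      signals.forEach((e) => window.removeEventListener(e, onSignal));
      resolve(true); // the later resolve(false) from the timer is a no-op
    };
    signals.forEach((e) =>
      window.addEventListener(e, onSignal, { passive: true })
    );
    setTimeout(() => resolve(false), timeoutMs); // silence = presumed Soft Bot
  });
}

// Usage: hold the page view until the verdict is in.
detectHumanSignal(5000).then((isHuman) => {
  if (isHuman) {
    // fire the Analytics page view here, e.g. via the TMS's rule trigger
  }
});
```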

Don’t we lose real human traffic this way?

Dry-running the system takes away the unknown unknowns

Monitor how often the Bot Detection fails, i.e., how often presumed Bots turn out to be humans after all. A peak of just 0.38% (was it Swiss tourists in Cleveland?) suggests the detection is pretty solid.
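
One way to compute that failure rate during the dry run, where presumed Bots are still tracked but carry a flag: count how many flagged visits later show unmistakably human behavior. The record shape below is an assumption.

```typescript
// Sketch: quantifying Bot-detection failures during the dry run, where
// presumed Bots are still tracked but flagged. The shape is assumed.
interface DryRunVisit {
  flaggedAsBot: boolean;
  hasLogin: boolean;  // a flagged "Bot" that logs in was human after all
  converted: boolean; // same for a flagged "Bot" that buys something
}

function detectionFailureRate(visits: DryRunVisit[]): number {
  const flagged = visits.filter((v) => v.flaggedAsBot);
  const falsePositives = flagged.filter((v) => v.hasLogin || v.converted);
  return flagged.length === 0
    ? 0
    : (100 * falsePositives.length) / flagged.length;
}

// Monitored daily, a tolerance of e.g. 0.5% would let the observed
// peak of 0.38% pass.
```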

And what about Bots from Switzerland?

Layer 2: Fall-Through Filtering in Analytics
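
Inside the Analytics tool this layer is typically a segment, Bot rule, or processing filter rather than code, but the logic can be sketched as a filter over exported hit data. The heuristics below are assumptions; tune them to your own traffic.

```typescript
// Sketch of Layer 2 as a filter over exported hit data. In practice this
// lives in Analytics itself (segments / Bot rules); heuristics are assumed.
interface Hit {
  userAgent: string;
  flaggedAsBot: boolean; // Layer 1 verdict, if it was recorded at all
  pageViews: number;
  visitDurationSec: number;
}

function isFallThroughBot(hit: Hit): boolean {
  if (hit.flaggedAsBot) return true; // trust the Layer 1 flag where present
  // Residual Bots often still declare themselves in the user agent...
  if (/bot|crawler|spider/i.test(hit.userAgent)) return true;
  // ...or bounce with zero engagement (an assumed, deliberately
  // conservative heuristic; real humans bounce too).
  return hit.pageViews === 1 && hit.visitDurationSec === 0;
}

const cleanHits = (hits: Hit[]): Hit[] =>
  hits.filter((h) => !isFallThroughBot(h));
```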

Epilogue

Monitor your Engines

A little help from the vendors would be nice…

Share your approach!
