Reason # 9001 to be very careful A/B testing – Bot Traffic


An e-commerce client requested a higher-converting version of their product page. Their sales cycle is very seasonal (remember this; it’s important later), so time was of the essence.

We took a data-driven approach to improving conversion rates and came up with a series of recommendations based on:

  • Analytics data
  • User replay sessions
  • Heat maps
  • User surveys

Our client spent the time and money to quickly code the new product page and place it on the site as the default version, with the option to redirect users to the previous product page.

Testing Time:

We set up a split test in Visual Website Optimizer and waited for results.

Our first version broke even:


So we made some more improvements based on further analysis and started another test.

Our newest version was getting crushed. 


We had made some small tweaks to what seemed to be a break-even page, and now our new page was converting significantly worse and, while still early in the test, losing significantly. We knew traffic was slowing down and conversion rates would decrease as we got out of season, but this was completely unexpected.

It didn’t make sense.

We started going through the list of possible reasons.

  • Confirmed the numbers in analytics
  • Checked across segments
  • Ran device-specific analysis

All the numbers lined up: we were getting killed.

Enter Bot Traffic:

Bot traffic is a growing pain in the butt in modern analytics. Bad bot traffic is growing every year, and the bots are getting smarter: they change user agents and IP addresses, execute JavaScript (thus pinging your analytics), and use other tricks that make them harder to detect.

Our friends at Distil Networks estimate that in 2015 only about 54% of web traffic came from human visitors.

The remaining site visits are broken down into two categories:

Good Bots and Bad Bots

Good bots are your friends. They include search engine crawlers and performance monitoring software.

Bad bots differ from many other security threats in that their manifestations can be as varied as the businesses they target. Bots enable high-speed abuse, misuse, and attacks on websites and APIs. They let attackers, unsavory competitors, and fraudsters perform a wide array of malicious activities, including:

  • Web scraping
  • Competitive data mining
  • Personal and financial data harvesting
  • Brute force login and man-in-the-middle attacks
  • Digital ad fraud
  • Spam
  • Transaction fraud

Often, through no fault of your own, your website gets hit by some of these bots. In most cases your website is not the target, but your analytics data suffers.

However, we did our research and were already excluding bad bot traffic.

[Screenshot: bad bot traffic in Google Analytics]

Good Bots – the culprit skewing our tests:

This website had a performance monitoring service check their product pages about thirty times per hour to verify that add to cart was working correctly. These visits came from different IP addresses and looked like real users. However, they never left the product page.
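In cases like this, the giveaway is the clockwork cadence: human arrivals are irregular, while a monitor pings on a near-fixed schedule. Here is a minimal sketch of that check; the session log, the ~two-minute interval, and the jitter threshold are all hypothetical illustrations, not our actual data or tooling:

```python
from statistics import pstdev

# Hypothetical arrival times (in minutes) of single-page sessions on the
# product page. A monitor checking ~30 times per hour lands roughly every
# two minutes with only tiny jitter.
arrivals = [m + 0.1 * (m % 3) for m in range(0, 120, 2)]

def cadence_is_robotic(arrival_times, jitter_threshold=0.5):
    """Human traffic arrives irregularly; a monitor's inter-arrival gaps
    have near-zero spread. Flag if the gap standard deviation is tiny."""
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    return pstdev(gaps) < jitter_threshold

print(cadence_is_robotic(arrivals))                # True: near-constant gaps
print(cadence_is_robotic([0, 5, 7, 20, 21, 40]))   # False: irregular humans
```

A behavioral check like this works even when the bot rotates IP addresses and user agents, which is exactly what made ours look like real users.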

How our test was ruined: 

Our good bot friends were never redirected to the Old Product Page BUT were counted as visitors for the New Product Page (the control in this test, because of how the development team implemented it). We’ve reported this to Visual Website Optimizer as an issue.

In response, our split test redirected more of the traffic it could control to the Old Product Page. This was real traffic that could convert. After review, we discovered that our good bot performance monitor accounted for about 400 visits per day to the New Product Page, 0 visits to the Old Product Page, and 0 conversions. As traffic decreased due to seasonality, this had a larger impact on our second test and skewed the results from break-even to losing significantly.
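To see why shrinking seasonal traffic made the skew worse, here is the arithmetic in a short sketch. The 400 daily bot visits come from our data; the real-visit counts and the 3% conversion rate are illustrative assumptions:

```python
def observed_rate(real_visits, true_conv_rate, bot_visits=0):
    """Conversion rate as the testing tool reports it: bot visits
    inflate the denominator but never convert."""
    conversions = real_visits * true_conv_rate
    return conversions / (real_visits + bot_visits)

BOT_VISITS_PER_DAY = 400  # what the monitor added to the New Product Page

# Daily real visits shrinking as the season winds down (illustrative)
for real in (2000, 1000, 500):
    skewed = observed_rate(real, 0.03, BOT_VISITS_PER_DAY)
    print(f"{real:>4} real visits/day: true 3.00%, reported {skewed:.2%}")
# 2000 real visits/day: true 3.00%, reported 2.50%
# 1000 real visits/day: true 3.00%, reported 2.14%
#  500 real visits/day: true 3.00%, reported 1.67%
```

The bot load is fixed, so the fewer real visitors there are, the larger the share of the variant's traffic that can never convert, and the further the reported rate falls below the true one.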

Finding More Conversions:

We used VWO’s custom visitor targeting to exclude the Performance Monitor bot and re-ran the test.
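VWO's custom visitor targeting handled the exclusion for us through its interface. As a rough server-side illustration of the same idea, one could refuse to enroll known monitoring agents in the test at all. The user-agent substrings below are examples of real uptime services, and a monitor like ours may present a browser-like user agent anyway, which is why behavior-based exclusion was the actual fix:

```python
# Hypothetical enrollment guard, not VWO's implementation: keep known
# monitoring services out of the experiment entirely.
MONITOR_UA_SUBSTRINGS = ("pingdom", "uptimerobot", "statuscake")

def eligible_for_test(user_agent: str) -> bool:
    """Enroll a visitor only if the user agent isn't a known monitor."""
    ua = user_agent.lower()
    return not any(token in ua for token in MONITOR_UA_SUBSTRINGS)

print(eligible_for_test("Mozilla/5.0 (Windows NT 10.0) Chrome/45.0"))  # True
print(eligible_for_test("Pingdom.com_bot_version_1.4"))                # False
```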

[Screenshot: improved A/B test results after excluding the bot]

Reminder for your tests:

Treat your winners like losers and double confirm your numbers. In this case our “control” was the new product page and we had good reason to believe it was going to win. The fact that it lost significantly forced us to double check for testing issues.

However, imagine this was reversed and your new page absolutely crushed the old page because you had a flaw in your testing method. Odds are you are less likely to question those results because you want your new page to win.

Always check for issues such as:

  • Bot traffic that impacts only one variant
  • Redirect issues in split tests that alert your visitors that something is up

