A/B tests

If you're going to be a data scientist at a big tech web company, this is something you're definitely going to be involved in. People need to run experiments to try different things on a website and measure the results, and that's actually not as straightforward as most people think it is.

What is an A/B test? Well, it's a controlled experiment, usually run on a website (it can be applied to other contexts as well), where we test the performance of some change to that website against the way it was before.

You basically have a control group of people who see the old website, and a test group of people who see the changed website. The idea is to measure the difference in behavior between these two groups and use that data to decide whether the change was beneficial or not.

For example, I own a business that has a website, we license software to people, and right now I have a nice, friendly, orange button that people click on when they want to buy a license as shown on the left in the following figure. But what would happen if I changed the color of that button to blue, as shown on the right?

So in this example, I want to find out whether blue would be better. How do I know?

I mean, intuitively, maybe blue might capture people's attention more; or maybe people are more used to seeing orange buy buttons and are more likely to click on those. I could spin it either way, right? So, my own internal biases or preconceptions don't really matter. What matters is how people react to this change on my actual website, and that's what an A/B test measures.

A/B testing will split people up into a group who sees the orange button and a group who sees the blue button. I can then measure how the behavior of these two groups differs, and decide what color my buttons should be based on that data.
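In practice, that split is usually done deterministically rather than by a coin flip on every page load, so a returning user keeps seeing the same variant. Here's a minimal sketch of one common approach, hashing the user ID; the function name `assign_group` and the experiment label are hypothetical illustrations, not part of any particular framework:

```python
import hashlib

def assign_group(user_id: str, experiment: str = "buy-button-color") -> str:
    """Deterministically bucket a user into 'A' (control) or 'B' (test).

    Hashing the user ID together with an experiment name means each
    user always sees the same variant on repeat visits, and different
    experiments get independent 50/50 splits.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Once users are bucketed, measuring behavior is just a matter of
# tallying views (and, in a real system, clicks) per group.
views = {"A": 0, "B": 0}
for i in range(1000):
    views[assign_group(f"user{i}")] += 1
```

The advantage of hashing over picking a group at random on each visit is consistency: a user who saw the blue button yesterday won't be confused by an orange one today, and their behavior is cleanly attributable to one variant.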

You can test all sorts of things with an A/B test. These include:

  • Design changes: These can be changes in the color of a button, the placement of a button, or the layout of the page.
  • UI flow: So, maybe you're actually changing the way that your purchase pipeline works and how people check out on your website, and you can actually measure the effect of that.
  • Algorithmic changes: Let's consider the example of doing movie recommendations that we discussed in Chapter 6, Recommender Systems. Maybe I want to test one algorithm versus another. Instead of relying on error metrics and my ability to do a train/test split, what I really care about is driving purchases or rentals or whatever it is on this website. The A/B test lets me directly measure the impact of an algorithm on the end result that I actually care about, and not just my ability to predict movies that other people have already seen.
  • Anything else you can dream up: Really, any change that impacts how users interact with your site is worth testing. Maybe it's even making the website faster, or it could be anything.

  • Pricing changes: This one gets a little bit controversial. In theory, you can experiment with different price points using an A/B test and see whether the extra volume offsets the price difference, but use this one with caution. If customers catch wind that other people are getting better prices than they are for no good reason, they're not going to be very happy with you. Keep in mind that pricing experiments can provoke a negative backlash, and you don't want to be in that situation.