Why Are We So Obsessed With A/B Testing?
Okay, we need to test things and see how our users react to them – but the way many of us carry out A/B testing has become a huge problem for the growth and development of our organizations.
Over the past few months, I’ve been toying with the idea of writing this article. I suspect after people read the title, no one will be offering me a job any time soon – but I do hope you keep reading and let me convince you I’m right by saying… STOP A/B testing everything!
A/B testing is a method where we release two variants of a feature and split our audience between them – this enables us to understand which one works better. The whole thing sounds great up to this point, right? Well, the problem with this method is that it requires a lot of preparation, and sometimes the results aren’t worthwhile. Let’s face it, it’s very rare that we actually get the black-and-white answer we’re looking for.
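To be fair, the mechanical part of the split is the easy bit. A common way to implement it (not something this article prescribes – just a typical approach) is deterministic bucketing on a hash of the user ID, so each user always lands in the same variant:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "new_checkout") -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing (experiment name + user ID) keeps the assignment stable
    across sessions, and independent between different experiments.
    The experiment name here is purely illustrative.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # 0..99
    return "A" if bucket < 50 else "B"

# The same user always sees the same variant:
assert assign_variant("user-123") == assign_variant("user-123")
```

The hard part, as the rest of this article argues, is everything around that function: the planning, the coordination, and the contamination.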
The Problem with A/B Testing
I love hearing the ‘big numbers rule’ that people are always telling me: “we need a threshold of, like, a lot, a lot of users”. Just like saying Android is good for growth and iOS for revenue, we don’t live in a black-and-white world – you get my point, right? Having more or fewer users doesn’t necessarily help determine which feature is the clear winner.
Now let’s talk about those pesky bugs and errors. We all live in the real world. No new feature comes out free of bugs and errors, and yes, the minute you test it, they’ll come out to bite you and your users, contaminating the test in the process. So then we consider starting over, but by now, A LOT of users have already been exposed to the feature – so we ask ourselves, is this the right thing to do? Is it working?
But the thing that makes me even angrier is the belief that you need to A/B test new features. Why? If you developed it to meet your users’ needs, then they will love it. So why do you need to test it? Anyway, users will always struggle with a new layout… so save your money, time and energy and don’t test it.
The time it takes to plan a test, set everything up and prepare everyone for it requires an enormous amount of resources and people.
And think about the opportunity cost: instead of A/B testing, could you have tested other things? Could you have launched a new feature?
What happens if marketing didn’t receive the message? Or the product team wasn’t made aware of the test by the marketing team? That’s a lot of time spent on coordination – and on hoping nothing contaminates the test.
Contamination… there will always be some contamination. It doesn’t matter what smart software you use or how smart your data scientist is – you will very quickly find yourself in the contamination gray area, because something went wrong in the setup, the release, or something to do with your users. Either way, it’s a lot of effort that ends up nowhere!
The concept behind your design will always be key, but let’s face it, anyone can create a fake advertisement that sells your product as X – and then the user arrives and finds it’s actually Y. So does it really work? What are your acquisition costs here? How will your users feel, and how will you bring them back? If your only job is to build a big bank of IDFAs (I used a dangerous word there!), then good for you! Keep doing it! But if we’re talking about retention – about creating an experience – you should treat your users better and give them a reason to come back again and again, because that’s how you make money.
Also, marketers comparing apples to peaches doesn’t really work. Neither does comparing Apple Search to Google Search; they’re not the same thing. Comparing videos to images will never work out either, and we only try because we’re required to explain why X works better than Y. The thing is, testing doesn’t really mean we can explain; some things can’t be explained, and that’s totally fine! Really! Snapchat ads always work differently from Facebook ads, and there are so many other factors in play – settings we choose ourselves, and black-box settings we have no control over at all.
What’s Next?
After reading all this about why we shouldn’t bother with A/B testing, you’re probably asking yourself: “so what does this smartass dude think is the right thing to do?” Here are my recommendations when it comes to product and marketing:
Product teams: take the time you would have spent on A/B testing and invest it in testing how well whatever you’re about to develop fits your users’ needs. I already explained my thoughts on product development in my article, ‘You serve users, not the other way around’. I mean it: you should only be developing products where the user is at the center of your thought process, not your next euro. It will never work out for you otherwise. Release to 10% of users to check the stability of the new feature, but don’t try to quantify it by ad revenue alone – that’s not the right indicator.
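A 10% stability release like this is usually implemented as a stable percentage gate on hashed user IDs, so the same users stay in the rollout as you ramp it up. A minimal sketch (the feature name is made up for illustration):

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: int) -> bool:
    """Gate a feature to a stable percentage of users.

    Each user hashes to a fixed bucket 0..99 per feature, so raising
    `percent` from 10 to 50 only *adds* users – nobody flips back out.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

# e.g. a 10% stability release of a hypothetical "new_feed" feature:
enabled = in_rollout("user-123", "new_feed", 10)
```

Note this is a rollout gate for checking stability, not an experiment – there is no control group to compare revenue against, which is rather the point.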
Marketers? Stay true to your users’ experience. Build a conversation of trust with them, because with all these privacy changes coming in, you will have big, big problems if you don’t. You want to test 500 colors to see which one converts best? Great! Set a testing budget of 25% of your total budget and use A/B testing based on multi-armed bandits so you converge on results quickly. Then you can run tests for new things every three months, or whenever you see that new users no longer match your product.
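One common way to run such a bandit is Thompson sampling: instead of a fixed 50/50 split, traffic shifts automatically toward the variant that appears to convert better. A minimal sketch – the variant names and conversion rates below are invented for the toy simulation, not real data:

```python
import random

def thompson_pick(arms: dict) -> str:
    """Pick the arm with the highest sampled conversion rate.

    `arms` maps variant name -> (conversions, non_conversions).
    Beta(1 + s, 1 + f) is the posterior over each variant's true
    conversion rate under a uniform prior.
    """
    return max(arms, key=lambda a: random.betavariate(1 + arms[a][0],
                                                      1 + arms[a][1]))

def record(arms: dict, arm: str, converted: bool) -> None:
    s, f = arms[arm]
    arms[arm] = (s + 1, f) if converted else (s, f + 1)

# Toy simulation: "orange" truly converts better than "purple",
# so the bandit drifts most of the traffic toward it on its own.
random.seed(42)
true_rates = {"purple": 0.02, "orange": 0.10}  # assumed, for illustration
arms = {"purple": (0, 0), "orange": (0, 0)}
pulls = {"purple": 0, "orange": 0}
for _ in range(5000):
    arm = thompson_pick(arms)
    pulls[arm] += 1
    record(arms, arm, random.random() < true_rates[arm])
```

The appeal for a fixed testing budget is exactly this: the losing variant stops eating spend long before a classical test would have declared a winner.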
If you’ve managed to read this far, I really hope to hear less about A/B testing and more about users and their activity. I want to hear less about why the purple color converted better than the orange one, and more about why the orange color attracts more active and engaged users… stop talking in impressions/clicks/installs and talk more about user engagement and the likelihood of users coming back.
Use your user data, track it better and build better archetypes. In general, create a more user-focused approach supported by the right budget and you will have more users. You can get into a bidding war and test your creatives endlessly, but that doesn’t always mean you’re spending your money better. In fact, it’s just another tool that says doing is better than talking.
Now, I get that you may want to shoot the messenger, but please do tell me what you think. Let’s have a conversation. What’s your best A/B testing method? How long does it take you to plan a test?