Optimized Ecommerce EP 075 – How to Use A/B Testing Efficiently and Improve Conversions
This week on the Optimized Ecommerce Podcast, our returning guest from the BGS team, Eric Kwoka, joins Tanner Larsson to talk about the importance and benefits of A/B testing, as well as some great results BGS has had running A/B tests. Listen to this week's episode as Eric discusses efficient ways to use A/B testing to improve conversions.
Welcome to Episode 075 of Optimized Ecommerce – How to Use A/B Testing Efficiently and Improve Conversions. I’m your host, Tanner Larsson, CEO of BGS.
BGS means Build Grow Scale! It is a community that we founded where eCommerce entrepreneurs and physical product sellers come to learn how to take their businesses to the next level.
Eric Kwoka has been a part of the BGS team for a long time. He is one of our Revenue Optimization Experts, works on all the Amplified stores, and is literally in the trenches doing testing, optimization, and tweaking on different stores every single day.
Eric is a big part of BGS; he has one of those amazing brains that can take test data and turn it into huge wins for our Amplified stores.
Here’s just a taste of what we talked about today:
Eric discussed what A/B testing is and why it is important.
The basic idea of A/B testing is that when users come to your site, they see either an "A" version or a "B" version of a single piece of the page. Each visitor gets one or the other, and through your analytics you can see whether that change has affected user behavior.
For example, on the product page, one version has those PayPal buttons and the other doesn't. Then you see, over tens or hundreds of thousands of users, whether this actually translates into differences in conversion rate, add to carts, email signups, and overall revenue.
The most important detail of A/B testing is that both versions run at the same time: users who are literally sitting next to each other could go to the store and see different versions with these small differences. Running both at once helps smooth out any outside factors that might otherwise skew the results.
And then, Eric talked about the need to limit what people are testing.
The test should ideally be a single change, and something pretty impactful. Especially if the store has lower traffic, you want to test something big that is most likely to be seen and to have an impact.
As your business grows in size, it also changes what you’re able to do with your testing, as well as what your main focus is with the testing. When you’re a small store, a lot of your testing is going to be risk-averse.
Say you read a blog post or listen to a podcast and it says, "Hey, do this to your store," and you're not so sure about it. Instead of implementing the change right away, run a test to make sure you're moving in the right direction.
With low traffic, tests are going to take a long time, and the purpose of this method is to double-check that something isn't hurting you, as opposed to just implementing everything. That can work if the source is good, but you won't know which of those changes actually help in the long run.
We also discussed a few other fun topics, including:
- How do people get started with A/B testing?
- Some great examples of BGS A/B testing wins and results.
- Why does one test often lead people to what the next test should be?
But you’ll have to watch or listen to the episode to hear about those!
How To Stay Connected With Eric Kwoka
Want to stay connected with Eric? Check out his social profiles below.
Also, Eric mentioned a few items on the show. You can find them below:
Tanner Larsson 0:07
What's up everybody, Tanner Larsson here, and welcome back to The Optimized Ecommerce Podcast. Super excited to have you guys join us and to talk about today's episode. I love this stuff, it's all optimization, all ecom, all the time. And today we are joined by a special guest and a repeat guest from the BGS team, Eric Kwoka. And Eric is going to be rocking out with the most exciting topic you've ever heard of: A/B testing. Now, some of you guys are like, oh my god, that's so boring. But I promise you, when you actually get into A/B testing and learn what it can do, and the wins and the money it can make your store, it becomes quite a sexy topic. So we're going to get down and dirty with the A/B testing type stuff. And then Eric is also going to be showing you some existing and new wins and tests that we've run, and the results, and what's happened there. So Eric, thanks for being on.
Eric Kwoka 0:59
I’m always happy to be here.
Tanner Larsson 1:02
Yeah, if we had to pick one person on the BGS team who is like the Brainiac of Brainiacs when it comes to this stuff, that would be Eric. He's basically become a walking encyclopedia of all things optimization, testing, and user experience. So really excited to have him share what's in his head, because he's always educating us on the team with all the new stuff that he's coming up with. So, Eric, why don't we just jump into this a little bit? I mean, we don't really need to intro you, I guess, because they all know you. And if they don't, listen to an earlier episode. Eric has been a part of the team for a long time. He runs a part of our Amplified Partnership Program, working with our Amplified stores, and is literally in the trenches doing testing and optimization and tweaking on these stores every single day. He works with stores in the US, works with international stores, does all of it. But, Eric, let's go ahead and just jump into: what is A/B testing? And how can someone understand what it is and why they would want to do it?
Eric Kwoka 2:02
Yeah, the basic idea of A/B testing is that as users come to your site, they get one version of something, like the A version or the B version, normally of a single piece of the page. So they get one or the other, and you see through your analytics whether that actually has an effect on user behavior. For example, say on the product page you have those little PayPal buttons: you have them on one version, and you don't on the other. And then you see over tens of thousands, hundreds of thousands of users whether this actually translates into differences in conversion rate, add to cart, email signups, and overall, at the end of the day, revenue, as that North Star metric. But the main important detail of this is that it is happening at the same time, where you have users literally sitting next to each other who go to the store and could end up seeing different versions with these small differences. And that, over time, helps to smooth out any of the issues that might arise if you instead look at the data, make a change, and then look again, because you don't know what else is going on in the world. If your new version happens to run on Black Friday, obviously it's going to do better than whatever you had before, but that has nothing to do with the change you actually made. So the benefit of A/B testing is that both versions are running at the same time. So even if the test runs over Black Friday, theoretically each side is getting that lift equally.
Tanner Larsson 3:42
And guys, if you're thinking, how does this work? Well, you're using a piece of software or a script that's basically using a rotator to rotate, like, every other person, or 50%, or whatever percentage you set the split test to; it will rotate the view. So visitor number one sees A, visitor number two sees B, visitor number three sees A, visitor number four sees B, and it goes back and forth like that. That's the very basic version of how it operates. So it is a bit geeky, but it's also super powerful, in that it's the only real way to know if something's winning or outperforming something else. Now, Eric, we'll talk about how to do it in a second. But like, how you need to limit what you're testing, your variables. Because I think that's something people get confused with.
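The rotation Tanner describes can be sketched in a few lines. This is an illustrative toy, not the code any real testing tool ships; in practice, most tools hash a visitor's cookie ID rather than strictly alternating, so a returning visitor always lands in the same variant:

```python
import hashlib

def assign_variant(visitor_id: str, split: float = 0.5) -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'.

    Hashing the visitor ID (e.g. a cookie value) instead of simple
    alternation keeps each returning visitor in the same variant,
    while still splitting traffic roughly 50/50 overall.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 10_000 / 10_000  # roughly uniform in [0, 1)
    return "A" if bucket < split else "B"

# Over a large pool of visitors, roughly half land in each variant,
# and any one visitor always sees the same side of the test.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
```

The key property, as Eric points out, is that both variants run simultaneously, so an outside event like a Black Friday spike hits A and B equally.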
Eric Kwoka 4:35
Absolutely, the test should ideally be a single change. We'll get into this a little bit later with some of the example tests that I have. But you want to try to have something that is pretty impactful. Especially if you have lower traffic, you want something that's big, that's most likely to be seen, most likely to have an impact, and ideally atomic, in the sense that it's just a single thing on the page, that you're not changing up a bunch of random things that may not even be connected, because then your data's gonna be all over the place and you'll have no idea what thing is contributing in which direction. But a big thing is, I know a lot of our listeners are probably small stores, as well as possibly some bigger ones. As you grow in size, it really changes what you're able to do with your testing, as well as what your main focus is with the testing. When you're a small store, a lot of your testing is gonna be just about being risk averse. Like, you read a blog post, you listen to a podcast, and they said, oh hey, do this to your store, and you're kind of like, I'm not so sure on that. You might do a test on that instead of just straight implementing it, to make sure you're moving in the right direction. Because you're gonna have low traffic, tests are going to take a long time, and you're not necessarily going to be able to do super great science; you can at least double-check that something isn't hurting you, as opposed to just implementing stuff, which can be great if the source is good, but then we don't know which of those things might actually be helpful in the long run.
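Eric's point that low-traffic stores should test big, visible changes falls straight out of the statistics. A common rule of thumb for the visitors needed per variant is n ≈ 16·p(1−p)/d², where p is the baseline conversion rate and d the absolute difference you want to detect (roughly 80% power at 5% significance). The function name and the example numbers below are mine, purely to illustrate the shape of the trade-off:

```python
def visitors_needed(baseline_cr: float, relative_lift: float) -> int:
    """Rough visitors needed per variant to detect a given lift.

    Rule of thumb: n ~= 16 * p * (1 - p) / d**2, where d is the
    absolute difference in conversion rate we hope to detect.
    """
    p = baseline_cr
    d = p * relative_lift  # absolute difference to detect
    return round(16 * p * (1 - p) / d ** 2)

# A store converting at 2% needs vastly more traffic to detect a
# subtle 10% relative lift than a big, obvious 50% one:
subtle = visitors_needed(0.02, 0.10)  # roughly 78,000 per variant
big = visitors_needed(0.02, 0.50)     # roughly 3,100 per variant
```

That's about a 25x difference in required traffic, which is why a small store is better off testing one impactful change than many subtle ones.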
Tanner Larsson 6:16
And contrary to what's kind of the layman's or popular opinion about A/B testing, it's not about testing two completely different pages. If you have a product page layout, you're typically not testing a completely different layout; you're testing and changing one element on the existing page. I hear a lot of people talking about split testing like, I'm going to whip up a completely different page to test. The problem when you do something like that is you don't know what on that different page is actually working. You can do different layout tests, but those are not your typical A/B tests. So don't get into the mindset that you have to come up with a completely new page design or a completely new thing to test. Literally, on your product page, you might test adding a wizard that walks visitors through choosing a product; the test could be your static, image-based page versus the wizard. That's one element we're changing. Or you're testing a new product title, or something along those lines. The nitty-gritty of it could be button testing, like testing button colors, but we don't recommend that unless your store has massively high traffic and you've tested everything else under the sun first.
Eric Kwoka 7:27
If you're Amazon, you can test a button and probably get a good result in an hour. But for most stores it's never going to be meaningful; as long as your buttons stand out, whether they're blue or red is probably not going to make a huge difference.
Tanner Larsson 7:42
Yeah. Cool. So in terms of the why: is there anything else you want to touch on about why they should be using it? I know we covered some of it already, but is there anything else to add there?
Eric Kwoka 7:52
Just the main things. I mentioned smaller stores and risk aversion, kind of double-checking some of those things you might have implemented otherwise, but that you want to run through something first instead of just assuming it's going to work, so you know whether it does or doesn't. Sometimes it's just gut-checking yourself: you think this idea is so fantastic and amazing, and then you test it and you're like, oh, not quite what I was expecting. But then as you get bigger and have more traffic, you can start doing more fine-tuning and optimization. And when you get into super big stores, you get into automation, and personalization, and all those different factors where you really start micro-adjusting the experience. At a huge level, you can also get into major rethinks, where you can test larger things. But the main thing is about making changes to your store that aren't just following a fad or somebody saying to do this, so I'm going to do it, and instead making more data-informed decisions, so that this thing is actually getting more users to buy, as opposed to just being the hot thing.
Tanner Larsson 9:07
And if you think about it, guys, a lot of times you'll have, like, four things you do to your site. Let's say you make four changes. And let's say three of them actually give you a lift, but one of them takes away from your lift and actually hurts you enough that it negates the other three. You would think those four things made no impact, but you have no idea that one change you made is actually causing a problem. That's where the A/B testing comes in: you can test one element at a time and then figure out what's actually winning. Because it happens a lot, guys. Even with us, if we put too many things on a site at one time, we may have something that's hurting us, negating the gain we would have had if we had known what was working. So do this step by step. We use a lot of testing to figure this out so that we don't make those mistakes, because a lot of times you just don't know. And honestly, Eric, when you say that, our gut is usually wrong. We do the test, and a lot of times when we think we're gonna win, we don't win.
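Tanner's four-changes scenario is easy to see with made-up numbers. These per-change lifts are hypothetical, chosen only to show how one hidden loser can mask three real wins:

```python
# Hypothetical per-change effects: three modest wins and one hidden loser.
lifts = [+0.05, +0.04, +0.06, -0.13]

# When the changes ship together, their effects compound multiplicatively.
combined = 1.0
for lift in lifts:
    combined *= 1 + lift

# combined comes out very close to 1.0: the site looks flat overall,
# even though three of the four changes were genuinely helping.
# Testing one element at a time is what exposes the -13% change.
```

Shipping all four at once reads as roughly zero net lift, which is exactly the trap Tanner describes.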
Eric Kwoka 10:12
Yeah, there’s definitely a few where it’s, this is a sure thing. But we’ll just test it anyway. And then we go back, and just like, what happened? What could have gone wrong? What do I have to change about my whole belief system to make this fit in now?
Tanner Larsson 10:29
Cool, A/B testing is a scary subject. So how can the average person or even not the average person, how do people get started with it without having to become a data geek?
Eric Kwoka 10:41
Yeah, so right now the best tool, especially for getting started, is Google Optimize. It's free, it links in with your Google Analytics, and the editor and everything are pretty good. You can get a lot of those entry-level tests done without having any coding knowledge at all. As long as you can get through the very simple guide for installing it on your store, everything actually works pretty quickly and easily from there. And it does do a lot to simplify some of the statistics that come out the other end. It doesn't do the most powerful job of really explaining the intricate details of the results you're getting, but unless you're super into making your own spreadsheets and formulas, it'll get you most of the way there. Especially when you're just getting started, you don't necessarily have a bunch of energy and resources to put towards a serious testing program, and it can at least give you some of that 80/20: the simple 20% of effort that will get you most of the results.
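For anyone who does want to sanity-check results by hand with "your own spreadsheets and formulas," the classic frequentist check is a two-proportion z-test on the raw conversion counts. This is a generic textbook sketch, not what Google Optimize itself computes (Optimize reports Bayesian probabilities instead), and the example numbers are invented:

```python
from math import erf, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 2.0% vs 2.5% conversion over 10,000 visitors per variant:
z, p_value = two_proportion_z(200, 10_000, 250, 10_000)
# z lands around 2.4 with a p-value under 0.05, so this difference
# would usually be read as statistically significant.
```

With smaller samples the same 0.5-point gap would not clear significance, which is the "tests take a long time on low traffic" point from earlier in the episode.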
Tanner Larsson 11:48
And guys, we use heavier-duty testing platforms like Convert.com and some of the other tools out there that are super heavy duty. But we also use Google Optimize on a lot of our stores, because for a lot of the testing that needs to be done, it's completely acceptable, it provides great stuff, it's way easier to use, and it's a hell of a lot more affordable, since it's free. So we use a mix of different platforms and testing tools, but for sure, Google Optimize is where you should start. And on top of that, Eric is actually in the process of finishing up a Google Optimize course that we're going to release, on how to use Google Optimize specifically for ecommerce stores, so that everybody can test the way that Eric does and the way that our team does. That'll be something we release as soon as he's got it finished. It'll be available to our Ecom Insiders for free, and it'll also be available for sale if someone just wants to buy the course and isn't an Ecom Insider. If that's something you want, I highly recommend you look at it if you want to get into testing. Because the one downside I'll give about any kind of Google tool, Google Analytics or whatever, is that when they teach you how to use it, they're teaching the generic web process that would work for any website; it's not specific to a store. Whereas what Eric has put together in this course is how we use it specifically on ecom stores, and how we test specifically for ecom. Because there are nuances that are different from other platforms and other types of sites.
Eric Kwoka 13:23
It's actually even worse than that. Google's official Optimize lessons on their YouTube channel are totally awful; you will be going in the totally wrong direction. They don't teach you how to install it correctly, and they show tests that logically don't make any sense, like they don't work, even in what they demonstrate. I'm not sure who wrote that stuff, but even when I was getting more familiar with it, I was like, that doesn't make any sense. Hopefully our thing can help solve that. And it's designed in a way where we take you through tests start to finish: simple ones when you're really just getting started, when maybe you don't have a lot of traffic or a lot of statistical knowledge or coding experience, and then more intricate stuff as it goes on, until it's built up in that kind of layered way, like a cake. A beautiful A/B testing cake.
Tanner Larsson 14:17
Awesome. So guys, not a pitch for the program or anything, but if you are using Google Optimize and you'd like to get the most out of it, that course will be released very, very soon. So check the BGS website, or if you're an Insider, just log in; it'll be there. Moving on from that, though, Eric, why don't we start showing some examples? Because this will help kind of drive it home for people, why it's important. And everybody always loves test results, because they're like, oh, they figured it out, I can go do that.
Eric Kwoka 14:43
Yeah, awesome. I have a selection of five tests here. I don't have screenshots and stuff, because I didn't have the time to get that cleared with clients, but we'll go through them; I'll describe everything, go over the data, why we made the decisions that we did, and how we chose to analyze them, so you can understand the thought process that goes into a test, as well as enjoying some of these interesting test results. The first one is a test that I actually ran on one of the stores that I manage, on the express checkout buttons in the Shopify checkout. I think by now Shopify has fully rolled things out so that you can't suppress these without having Shopify Plus. In the past, suppressing them worked well in our own app, then Shopify stopped us from being able to use it; we found another way, and then Shopify stopped that way too. But before that happened, I was able to get in and run a nice test to see how it performed. Suppressing the buttons was common practice for us, and some widespread usability testing pointed out that these express checkout buttons can be a bit distracting on the first step of the checkout, when people are just putting in their shipping information. They can also take away from getting people to submit their email for marketing when they don't then finish the checkout. Because if you start into that PayPal express checkout flow and begin putting in your information but then stop, the store gets nothing; as a store owner, you don't get their email or anything to follow up with. Whereas if they at least do that first step in the Shopify checkout, you might have their email to work with. So those are the ideas behind why this might be important to get done. Anyway, we ran this test for quite a while, and we found that in terms of the actual effect on revenue, it was basically nothing; it didn't really affect it at all.
Interestingly enough, a lot of people say, oh, you need to have these, because people use them; they just want to buy with their Apple Pay or Google Pay checkout. We found customers were still getting through pretty much equally. The difference is so small that you can't even say there is a difference; it's a coin toss, really. That was extremely surprising for me. I really expected it to do something: either distract people, as a lot of the past research had shown, or help people get through faster. I imagine a lot of this has to do with the changing landscape, because a few years ago these things were new and could be a lot more mysterious, especially for older audiences, which this store happens to have. But eventually I would expect them to become more and more natural for people to use, and maybe we're at that tipping point. What was especially interesting here was that the effect on email captures was the opposite of what we expected. We would have expected email capture to suffer for people going through PayPal and those options, because it's actually more difficult to collect emails that way, at least from the UX standpoint. But it turned out that visitors who had those express checkout buttons were significantly more likely to have signed up for marketing emails by the time they got through the whole process. So that was especially surprising. Now, this was not a Shopify Plus store, so we didn't get to dive into the full event tracking. But the effect was pretty significant, and not in a way I had expected; it really went against some of the knowledge we had assumed to be true. You test those sacred cows, and you find out they're just not so anymore.
Tanner Larsson 18:51
Or that it's more on a case-by-case basis, as well as a store-by-store basis.
Eric Kwoka 18:57
Yeah. It ended up actually being a 10% difference in email captures. And for this specific store, that translates into about 60 pounds per 100 customers who actually buy, based on projecting that into the future, because this store has a fairly low average cart value. So that's the difference: had we not had the buttons, adding them would get us that kind of value. Well, luckily, Shopify is making us do that anyway, so that kind of kicked it off.
Tanner Larsson 19:35
Cool. And that’s a perfect example of a test that we would have assumed would never win.
Eric Kwoka 19:41
Tanner Larsson 19:42
Yeah. What else have you got?
Eric Kwoka 19:46
So the next one was a collection of changes on the product page. This is for a natural therapeutic jewelry product, and it was a short list of changes. Normally you wouldn't want to do multiple changes in a single test, but sometimes, depending on traffic, or how big the individual things are, you might almost need to lump them together just to make sure you have an impact. Because if you make one small change, you might spend three weeks testing and it does nothing; but if you test four small things together, you're more likely to get an impact you can see statistically. In this case, it was primarily about truncating the description: instead of the full thing right on the page, which on this store's mobile view was many viewports' worth of description, it would be about one viewport of description with a read-more button to expand it. And what we found was that getting users to the review section, which was right underneath that description, translated into a 10% boost in revenue for this store. Customers like those reviews, and this seemed to give them a much better way to get there in the first place. We tied this back to cognitive load issues: before, users were seeing a bit too much, not knowing what was important to them, scrolling through, getting scroll fatigue, and then just moving on. Whereas now they get to that review section, they see a bit more of an overview of the product, and if they want the details, they can go in. And that was really good for this store, because that 10% boost was 100 grand over six months when we project it out, just for this simple test of essentially making the description smaller by default on the page, plus adding a size chart link, though that was a smaller thing.
Tanner Larsson 22:04
Right, it’s also a good example of it’s not all about the massive tests or massively changing things. The product information was still there, it was just set up in a way that the customer didn’t have to view it if they didn’t want it right away. And it showed them what they wanted, which was the reviews faster.
Eric Kwoka 22:19
Absolutely. Sometimes just cleaning up that path for users, making it so they can choose their own adventure, what information they want and when they want it, as opposed to you strictly dictating it, can really give users more confidence and keep them from getting frustrated with the store.
Tanner Larsson 22:40
And another example of that is a type of progressive disclosure. Using tabs or accordions or things like that is a similar way you can control the display of information at the right time, put it in the user's hands, and not force them to scroll quite as much. So that's something you guys can all test as well. Alright, Eric, what's the next one?
Eric Kwoka 23:01
The next one is on the same products, these therapeutic necklaces, but it's all about the wizard. I think there's been a podcast in the past talking about the effect that wizards can have. In this case, this store already had a wizard, but it was tucked away somewhere up in the navigation, somewhere you had to go looking for it. And they were finding that the collection pages, the main places this store was sending traffic, were the most common landing pages, and some of the top landing pages were actually having the worst overall performance for the store. It looked to us like people coming into the store were just not sure where they wanted to go. When it comes to these kinds of natural remedy products, people might not know their way around the landscape or how to interpret the store right away. User tests were showing that customers were having trouble differentiating between a lot of the products; when we did the user testing and they told us their feelings and thoughts as they went through, that came up a lot. The wizard had been there before but was not being used very often. But when it was used, people were buying; it was a significant boost. If somebody made it to the wizard, they were much more likely to buy. So the change was just adding a small link underneath the collection title, a small link saying, do you need help finding your perfect product? And it was just on the top five landing pages. And this had a rather large boost: on mobile alone, this increased revenue by over 11%. This is a store that was already doing pretty well; in the testing data, for the people involved on these landing pages, the conversion rate was already over 10% during this period, and then they still got an additional 11%.
That's really huge for a store like that.
Tanner Larsson 25:11
That’s awesome, yeah.
Eric Kwoka 25:12
So this translates into 75,000 for this store, just for adding a link to a wizard they already had. It would have been nice to test the full wizard, because setting up a whole wizard is a bit difficult. But sometimes, once you have it, getting it into the right places can have a very meaningful impact on the customer experience.
Tanner Larsson 25:35
Yeah, and as he said, it was a big lift. But also, now we've got some information: putting these links here works. And like Eric said, maybe the next step would be to show the actual wizard, or build it into those pages. So that can be the next test. This validated that having the link here gave us an 11% boost; what would happen if there was a way to access the wizard on the page and fill it out right there? Would that give us an even bigger lift? So one test often leads you to what the next test should be.
Eric Kwoka 26:06
Yes, and that is the perfect segue; I almost can't believe you didn't know what I was going to talk about next. Because there was a second follow-up test to this. When you find a test that has a big impact, good or bad, you know there's something important there to your users. So something pretty common that I love to do is: if I have a big, impactful test, do something else in that same area, because we know it's going to have an impact, and you keep doing that until the difference becomes too small and you're at your local maximum for that area. In this case, before, the link was just on the top five landing pages, and we thought, why not put it on all the collection pages? So they went through with exactly the same implementation, but now on every other collection page. And the big shocker here is that it had the opposite effect. Once it was no longer on the major landing pages people were arriving at, but on pages they saw once they were already in the site, it actually had a negative impact, a loss of 6%. So luckily they ran this as a test and didn't just push it out to everything, because it would have been really easy to say, oh, it won on the five, let's throw it on everything else, and we'll see that win. But in this case it started to violate users' expectations. Once they were in the store and going through things, they had more of a lay of the land, and now this link was a distraction. For people just landing who maybe weren't familiar with the products yet, it was very helpful; but once they were in, not as helpful anymore.
Tanner Larsson 27:58
Yeah, and that’s, again, a perfect example of why testing is so important or further testing as well, because, man, even me, I would be like, yeah, let’s put it on everything. But clearly, that doesn’t always work.
Eric Kwoka 28:14
Yeah, absolutely. And it’s great that they did this test because that could have easily undone most of the work that they had just gotten through in that last test.
Tanner Larsson 28:23
Absolutely, for sure. What else? What’s the next one?
Eric Kwoka 28:28
So the next one is a health supplement store, which has a really high average order value, because supplements aren't cheap, and people like to buy big orders in a single go. In this case, a lot of users were adding stuff to their cart and then abandoning: normal cart abandonment, but they were seeing over 50%, which is pretty high. Not necessarily extremely high for high-value carts, but certainly higher than you'd want it to be. And they were finding that on the cart page (on this store, add to cart sends you straight to the cart page), once users were there, pretty much all of our click maps and other tools were showing that people weren't ready to move forward; they were clicking continue shopping and going back to collections. So the basic thought was that maybe users aren't really ready to go further down the funnel right after adding to the cart. Initially they tested how to make the cart page better, because maybe the cart itself was the problem, and they did some tests there, but it wasn't resolving the issue. So this test specifically was: instead of using a full cart page, use the little Ajax cart, the cart drawer that slides out from the side, where you don't have to go to another page. You're still wherever you were, and it doesn't fully interrupt your browsing experience. Some interesting parts of this: on the cart drawer, there was no express checkout button, like the PayPal button, which was on the cart page and had been winning on the cart page in past tests. And there were also no UVPs in the cart drawer, which there were on the cart page, the little UVP icons saying free shipping and so on. So just implementing this cart drawer, which updates as you add stuff and slides out, had an insane impact on the store.
Just on mobile, revenue was up over 40% from having the cart drawer instead of the cart page, and on desktop it was up 100%. There's much lower traffic on desktop, but that's a massive increase. I can't imagine anybody was expecting that big of a change, even if they thought it was going to be better. Overall this ended up translating into 250,000 in revenue over the next six months, assuming everything else stays the same, and of course as the store grows it becomes bigger and bigger and keeps making money, so it just snowballs further and further. So just this one change was worth 250,000 in six months, which is massive for what can seem like such a small conceptual change to the process. The main insight to take away is that pushing users down the funnel isn't always what they really want, because they're still collecting things up and deciding. Imagine being in a store where someone keeps trying to get you to go over to the checkout after you pick up one thing. It would be the worst experience ever, and you'd never want to go back to that store. It works just the same here: the cart drawer was much less interruptive to the customer's experience compared to a cart page, where they have to sit through the load, end up over there, and then hit Continue Shopping and load another page.
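This isn't from the episode, but when you see a lift like the ones Eric describes, the first question is whether the difference could just be noise. A minimal sketch of a two-proportion z-test for comparing two variants, using made-up conversion counts:

```python
from math import sqrt, erf

def ab_lift_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    relative_lift = (p_b - p_a) / p_a
    return relative_lift, p_value

# Hypothetical numbers: 3.0% vs 3.6% conversion over 10k users each
lift, p = ab_lift_significance(conv_a=300, n_a=10_000,
                               conv_b=360, n_b=10_000)
print(f"lift: {lift:.1%}, p-value: {p:.4f}")
```

Tools like Google Optimize do this kind of math (and more sophisticated Bayesian versions) for you; the point of the sketch is just that sample size and effect size together determine whether a result is trustworthy.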
Tanner Larsson 32:12
And guys, this test, we obviously still run it on every store, but this is why you hear us saying one of our best practices is to always use a cart drawer or an Ajax cart. It's one of those things we typically start off with on any of our partner stores. We still test against it, because sometimes it doesn't win and sometimes there are other configurations we want to use, but obviously a 100% increase on desktop and a massive increase on mobile as well. And what Eric is talking about: the normal process is to push them down the funnel. That's what we were trained to do from a direct response standpoint. But what we're learning more and more from usability is that it's the buyer's journey, and it's got to be optimized for the buyer. Not just optimized to make money, but optimized in a way that doesn't piss off your customers, or the potential customers who might come back later. You're optimizing on multiple levels as you're doing it, so the old school direct response principles don't always hold true in this kind of stuff, especially on ecom stores.
Eric Kwoka 33:18
Yeah, and this could also easily go the opposite way on a single-product store, or anywhere people just buy the one thing, because there you might show them a small upsell and otherwise move them down the funnel, since there isn't much browsing to do. But health supplement stores have a fairly wide selection, and people like to really gather that stuff up, so this really worked for keeping the AOV high in that situation.
Tanner Larsson 33:43
Yeah, absolutely. Awesome. Do you have anything else, or was that the last one?
Eric Kwoka 33:49
One last one.
Tanner Larsson 33:49
All right, give it away.
Eric Kwoka 33:50
So this is one that actually comes up a bit in the Facebook group too, I know you've talked about it there. We tested a Wheelio against a custom-built Klaviyo form. And everybody hates these things, most people on the marketing side hate Wheelio. But we always just say: it works.
Tanner Larsson 34:15
Yeah, like it's cheesy and it can make money, right?
Eric Kwoka 34:16
Tanner Larsson 36:56
And it's even more important in today's environment, with all the iOS stuff where you're not able to target, so you need to grab those emails or push notifications every single way you can, because you've got to build a list separately. So a 90% increase in list building, without a significant or even noticeable difference in revenue, is a win. That's totally worth doing.
Eric Kwoka 37:20
Absolutely, especially if you have good tracking of the lifetime value of those email signups, because then you can much more easily measure it against the cost. In this case, the lifetime value of getting a signup was around 10 pounds, and the cost over six months was going to be something like 5,000 pounds. They were getting twice as many signups; in two weeks that was another 250 signups, which at that lifetime value can be around 3,000 pounds every two weeks. And it's only costing 5,000 over six months, assuming the impact we measured holds. That's an easy trade-off for most people.
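The trade-off Eric is describing is simple arithmetic. A rough sketch using the episode's approximate figures (around 250 extra signups per two weeks, roughly 10 pounds of lifetime value each, about 13 two-week periods in six months, and a cost of about 5,000 pounds):

```python
def signup_tradeoff(extra_signups_per_period, ltv_per_signup,
                    periods, total_cost):
    """Rough expected value of an email-capture change over some
    horizon: extra signups * lifetime value, minus the cost."""
    expected_value = extra_signups_per_period * ltv_per_signup * periods
    return expected_value - total_cost

# Approximate figures from the episode, pounds sterling
net = signup_tradeoff(extra_signups_per_period=250, ltv_per_signup=10,
                      periods=13, total_cost=5_000)
print(f"net expected value over six months: {net:,} pounds")  # 27,500
```

The exact numbers matter less than the shape of the calculation: as long as incremental signups times LTV comfortably exceeds the tool's cost, the test is a win even with no visible change in immediate revenue.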
Tanner Larsson 38:08
Absolutely. And guys, he keeps talking about pounds because this particular store is UK-based, so it's actually in pounds, not dollars. It just depends on what currency the store operates in. In this case he's not talking about weight, he's talking about pounds, like British dollars.
Eric Kwoka 38:28
Yeah, so the main insight there is just, even if we think it’s dumb, and we just hate it.
Tanner Larsson 38:35
And we do, guys, I know you do, too. It's what we hear more than anything when we talk about Wheelio or any of these gamification pop-ups, the spin-to-win stuff or whatever. We think they're cheesy, we think they're ugly, we think they're obnoxious. But they win, and they make us a lot of money. So we kind of have to bite our tongue, since we're in the game of making money, right?
Eric Kwoka 38:56
Tanner Larsson 38:59
Awesome. Any last thoughts on testing you want to share with anybody before we wrap up?
Eric Kwoka 39:06
Just that, if you haven't done any testing, get in there and test something simple. One of the best first tests you can do is on your main landing page, wherever you're sending your traffic: change your headline. Just like with your ads, where changing the main text or the image is the most impactful thing to work on, go to your landing pages, change your headlines, change your banners, and test them against different versions. Even if you're a fairly small store, you're pretty likely to see an impact from that.
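A testing tool like Google Optimize handles variant assignment for you, but the core mechanic behind the headline test Eric describes is just stable bucketing: the same visitor must always see the same version. A minimal sketch with hypothetical headlines and IDs:

```python
import hashlib

def assign_variant(user_id, test_name, variants=("A", "B")):
    """Deterministically bucket a user into a test variant, so the
    same visitor always sees the same headline across visits."""
    digest = hashlib.md5(f"{test_name}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical headline test
headlines = {
    "A": "Build. Grow. Scale.",
    "B": "Double Your Store's Revenue Without More Traffic",
}
variant = assign_variant("visitor-123", "homepage-headline")
print(variant, "->", headlines[variant])
```

Hashing the test name together with the user ID means a visitor can land in different buckets for different concurrent tests, which keeps one experiment from systematically contaminating another.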
Tanner Larsson 39:44
Yep. And for other ideas on testing, guys, you can take any of the ones Eric just talked about here, or go to any of our other podcasts where we talk about results. If you hear us mention a test or an optimization we did and think, oh, I like that, cool, don't just take our word for it: run it as a test yourself. Honestly, that's what most of our Ecom Insiders do. We give them the information, and then they go and test based on it. Sometimes they don't test it, sometimes they just take it, but they're not having to generate the ideas for what to test. We're telling them, hey, here's what worked, here's what didn't. And the thing is, if something doesn't work on the supplement store we tested it on, that doesn't mean it's a losing test. We'll more than likely run that test across every other site in our network, because sites are contextual. Just because it didn't work here doesn't mean it won't work on other sites, and just because it does work on this site doesn't mean it will work on others. So we want to test it across as many different stores as we can to give you guys the best information. But you need to test it yourself as well; there's no way around that.
Eric Kwoka 40:50
Yep, absolutely. As much as we'd love to say that we're always right, and that what we tell you to do to your store is going to make you super rich, we know some of it isn't going to work on every single store out there. So definitely sanity-check us and tell us where we're wrong. If you have the data, we're happy to be wrong, as long as you're learning.
Tanner Larsson 41:13
And absolutely, it's kind of a collective thing. Again, I'm gonna keep talking about why it's so powerful: within our Amplified program, we're taking a test and running it across 20-plus different seven- and eight-figure stores, which is great. But then we take that same data and roll it out to our Ecom Insiders, and there are 500 of them testing it across 500 more stores, different niches, different industries, different everything. So the group has the data of, hey, this optimization works 96% of the time across any store, this one works great on supplements, this one works great on POD but not on supplements, or this one works great with audiences between this age and that age. That's what really makes BGS so powerful: we have that community and the volume of stores to test all this stuff across. So when we say something is a best practice, it's because it wins most of the time. But again, we still test everything, because it's always contextual. Eric could have a POD nurse site and I could have a POD nurse site, and a test that wins on his site could lose on mine. You just have to test it. And with what Eric showed you today using Google Optimize, and by following Eric's course on Google Optimize, it'll be super easy for you to do that. Always make sure that what you're putting on the site is helping, and if it's not, you'll know, so you can take it off and try something else.
Eric Kwoka 42:37
Absolutely. It can get really easy to think, oh, all these apps are gonna make me more money. And you can't ever trust an app's own dashboard, because it's gonna take credit for everything, just like many people see with Facebook: oh no, it wasn't really Facebook that got me that purchase, it was the email I sent. But Facebook's just like, all me.
Tanner Larsson 42:57
Yep. And any app that reports sales, we've seen it, we've seen the backends, and we know how they work. You'll have Klaviyo, your SMS one, your Facebook Messenger, your add-to-cart app, all of them claiming the same sale. That's why testing is so important: to know where a sale actually came from and what's actually making the impact. But guys, I don't want to keep this going any longer. Check out Google Optimize; it's free and it's a great way to start. If you need help with it, check out the course Eric is going to be releasing on the BGS site very soon. If you're an Ecom Insider, you'll have access to it as soon as it's released. To get the show notes for this podcast and hear the test results again, go to BuildGrowScale.com/podcast. Also, if you're not subscribed, you can get all the links to subscribe on your favorite platform there. And with that, we'll see you next episode. Thank you so much for joining us, and I look forward to seeing you guys next week. See ya.
Ecommerce Store Audit
Want us to do an Audit on your e-commerce store and show you how you can make some quick changes that will dramatically increase sales and profits without increasing your traffic?