One of the biggest challenges is testing across multiple platforms. After all, what works on a desktop browser doesn’t necessarily work on a mobile device. Consider that the same user may prefer a green button on a laptop but a blue button on a smartphone screen.

“Digital publishers should learn what they can from A/B web testing and apply it to A/B mobile testing,” notes Slade. “Mobile testing is more complicated, so you probably won’t want to test as many factors on mobile as you do with web. But it’s worthwhile running separate tests on both because web and mobile experiences are so different.”
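Running separate web and mobile tests, as Slade suggests, is often done by keying the variant assignment to both the user and a platform-specific experiment name, so each platform's test stays independent. Here is a minimal sketch (the function and experiment names are hypothetical, not from any particular testing tool):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a user into a variant for one experiment.

    Hashing (user_id, experiment) together means the same user can land in
    different variants of the web and mobile versions of the same test,
    while always seeing a consistent variant within each experiment.
    """
    digest = hashlib.sha256(f"{user_id}:{experiment}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Hypothetical separate experiments for web and mobile:
web_variant = assign_variant("user-42", "checkout-button-web", ["green", "blue"])
mobile_variant = assign_variant("user-42", "checkout-button-mobile", ["green", "blue"])
```

Because the assignment is deterministic, a returning visitor sees the same variant on every visit without any server-side state.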

Remember that mobile apps “represent a more captive audience, which significantly changes behavior and how testing is accomplished,” says Greg Hinkle, CTO and co-founder of Evergage. “[Mobile] users may require time to get used to a new pattern or feature, so tests may need to run longer.”

Web-based apps can typically undergo the same type of testing you’d conduct for a traditional webpage. But if you’re testing a native app, the procedure is much different and implementing changes is more complicated. “The store providing the app will need to approve app changes, and your customers will need to download the app update, whereas web changes occur instantly. Also, you’ll need to find an A/B test provider for apps as opposed to websites,” Slade says.


A/B testing certainly isn’t going away anytime soon. Experts predict that this popular process of experimentation will continue to progress in the coming years, particularly in key areas such as ease of use. “Early testing tools required advanced HTML and JavaScript coding skills to create alternative experiences, but current testing tools offer ‘visual editors’ that allow nontechnical users to make adjustments to a webpage without knowledge of coding,” says Nelson. “In the future, tools might use industry best practices and machine learning to suggest page layout options. This could help generate testing ideas and speed up the time it takes to design alternate experiences for tests.”

Targeting and personalization should also improve. “The leading optimization tools today can automatically adjust which segments of customers receive each test variant,” says Nelson. “While this technology is available today, we can expect its adoption and performance to continue to expand in the coming years.”
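One simple way to picture the segment-level targeting Nelson describes is to track results per segment and serve each segment its best-performing variant. This sketch is an illustration only; the data and names are hypothetical:

```python
def pick_for_segment(segment: str, per_segment_stats: dict) -> str:
    """Return the best-performing variant so far for one customer segment.

    per_segment_stats maps segment -> {variant: (conversions, impressions)}.
    Variants that have never been shown count as a 0.0 conversion rate.
    """
    stats = per_segment_stats[segment]
    return max(stats, key=lambda v: stats[v][0] / stats[v][1] if stats[v][1] else 0.0)

# Hypothetical results: mobile users convert better on blue, desktop on green.
per_segment_stats = {
    "mobile":  {"green": (20, 1000), "blue": (35, 1000)},
    "desktop": {"green": (50, 1000), "blue": (30, 1000)},
}
# pick_for_segment("mobile", per_segment_stats) returns "blue";
# pick_for_segment("desktop", per_segment_stats) returns "green".
```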

Abisambra predicts a forthcoming era of A/B/C or even A/B/C/D testing. “I can see artificial intelligence supplementing how and when experiments are executed to not waste a single minute and to improve the percentage of experiments that yield a lift or improvement,” he says.
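The kind of AI-assisted, no-wasted-minutes experimentation Abisambra describes is commonly implemented today as a multi-armed bandit, which shifts traffic toward winning variants while a test runs. A minimal epsilon-greedy sketch (the figures are invented for illustration):

```python
import random

def epsilon_greedy(stats: dict, epsilon: float = 0.1) -> str:
    """Pick a variant for the next visitor: usually the current best
    (exploit), occasionally a random one (explore).

    stats maps variant name -> (conversions, impressions).
    """
    if random.random() < epsilon:
        return random.choice(list(stats))
    # Otherwise, serve the highest observed conversion rate so far.
    return max(stats, key=lambda v: stats[v][0] / stats[v][1] if stats[v][1] else 0.0)

# Hypothetical running totals for an A/B/C/D test:
stats = {"A": (30, 1000), "B": (42, 1000), "C": (25, 1000), "D": (41, 1000)}
choice = epsilon_greedy(stats, epsilon=0.0)  # epsilon=0 always exploits: "B"
```

Unlike a classic fixed-split A/B test, the allocation adapts as results accumulate, which is what lets more experiments end in a measurable lift rather than a dead heat.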