You can experiment with different ad monetization strategies (for example, adding or removing a network to measure incrementality, testing a new price point, or optimizing your waterfall by region) and identify opportunities that maximize LTV.
Creating an A/B Test
To create an A/B test, first select MAX > Mediation > Manage > Ad Units in your MAX Dashboard. Click the ad unit for which you would like to set up an A/B test. The Edit Ad Unit window appears. From the ⋯ menu in the Default Waterfall tab, select Create AB Test.
You then see a small pop-up dialog that describes the process. Check the “Copy existing ad unit configuration…” checkbox and click Create AB Test to copy your existing waterfall to your new test.
MAX copies your existing waterfall structure into the test configuration, so you only need to make the changes you want to test in your new Test Ad Unit; you do not have to build a complete new waterfall from scratch.
When you create a new A/B test, by default MAX applies your Test configuration to 50% of your users. You can modify this value, for example if you want to run tests longer at a lower volume of traffic. To do so, change the value of Test Group Allocation when you create your A/B test.
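To make the allocation concrete, here is a minimal sketch of how a stable percentage split like the Test Group Allocation could work. This is purely illustrative: MAX performs the allocation server-side, and the hashing scheme below is an assumption, not AppLovin's actual implementation.

```python
import hashlib

def assign_group(user_id: str, test_allocation_pct: float = 50.0) -> str:
    """Deterministically bucket a user into "Test" or "Control".

    Hypothetical illustration only. Hashing the user ID gives each user a
    stable position in [0, 100), so the same user always lands in the same
    group for the life of the test.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = (int(digest, 16) % 10000) / 100.0  # value in [0, 100)
    return "Test" if bucket < test_allocation_pct else "Control"
```

Because the bucketing is deterministic, lowering the allocation (say, to 20%) simply narrows the range of hash values that fall into the test group, which is what lets you run a test longer at a lower volume of traffic.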
You then see a new Control/Test toggle in the Default Waterfall tab. Set it to Test to view your test configuration (which initially matches the waterfall configuration you just copied). Type a new AB Test Name for your test, then populate the waterfall with any line additions or removals that you want to test.
AppLovin recommends that you name your A/B tests with a taxonomy that reflects the start time for the A/B test (for example, “0812_01UTC_FBBiddingAdded”). This makes it easier for you to keep track of changes and to monitor the results.
After you make the changes that distinguish your test waterfall, click Save. The MAX ad unit page updates with your new A/B test.
When you return to your ad unit page, your default view will still be your original control waterfall. Click the Control toggle to switch to Test and compare your two waterfalls.
You can visualize each waterfall setting by selecting Preview Waterfall from the ⋯ menu in the Default Waterfall tab. Then, in the Waterfall Preview dialog, choose the country for which you want to visualize your waterfall. You then see a visual representation of the non-bidder ad networks in your waterfall for that country (and also a list of any bidder networks, which supplement your waterfall).
Your A/B test goes live in production, and appears in your reports, within about 20 minutes after you save it. Monitor the impact of your test on requests, impressions, and revenue for 24–48 hours after the test begins, in MAX > Mediation > Analyze > Advanced Reporting or MAX > Mediation > Analyze > A/B Tests. You can then either promote your test waterfall to all of your users or deprecate it and revert to the original control configuration that you had in place.
Changes in requests, impressions, and revenue show up as changes in REQ/DAU, IMP/DAU, and ARPDAU; such shifts were likely caused by the changes you made in your test waterfall.
The example screenshot above shows a 28% drop in IMP/DAU, a 16% decrease in REQ/DAU, and a 27% drop in ARPDAU. If these results hold after a few more hours (as the sample grows), this test would be a good candidate to deprecate, because of the potential revenue loss.
AppLovin recommends that you promote tests that have been stable for 24 hours, exceed 10,000 impressions per day, and show at least a 2% incremental lift for sustained improvement in your ARPDAU. Conversely, you should deprecate any test that has not resulted in a lift.
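The promote/deprecate rule of thumb above can be expressed as a simple predicate. This helper is a hypothetical sketch of the stated thresholds, not part of any MAX API:

```python
def should_promote(hours_stable: float,
                   daily_impressions: int,
                   incrementality_pct: float) -> bool:
    """Apply the rule-of-thumb promotion criteria:
    stable for 24 hours, over 10,000 impressions per day,
    and more than a 2% incremental ARPDAU lift.
    """
    return (hours_stable >= 24
            and daily_impressions > 10_000
            and incrementality_pct > 2.0)
```

A test failing any one of the three checks (unstable, under-trafficked, or without a lift) is a candidate for deprecation rather than promotion.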
To deprecate or promote a test, go to the Default Waterfall tab of your ad unit page, and set the Control/Test toggle to Test. Select End AB Test from the ⋯ menu. Then, in the End AB Test dialog that appears, click either Deprecate Test or Promote Test. Then click Save. Your changes will be live in production traffic in a few minutes.
MAX provides an A/B test result page for each test that generates over 10,000 impressions in the test group. This page shows results the day after these impressions were generated. These pages are listed in, and linked to from, the A/B Tests page (MAX > Mediation > Analyze > A/B Tests).
You can read this A/B test result page to understand the Network Revenue share-of-voice changes, the ARPDAU lift anticipated with the test, and many other data points that will help you make the right decision to promote or deprecate the test.