What We Learned Running 10,000 A/B Tests Across Native and Display

We ran thousands of creative and placement tests across our platform. Here are the patterns that surprised us and the ones that did not.

Over the past year, campaigns running through Adxe generated over 10,000 A/B test comparisons across native and display placements. We analysed the aggregate data to find patterns. Some confirmed what we expected. Others surprised us.

Native Outperforms Display for Prospecting. Every Time.

This was the most consistent finding. For cold audiences seeing a brand for the first time, native ads delivered 2-3x higher engagement rates than standard display banners. The reason is straightforward: native ads match the look and feel of the content around them. They earn attention rather than interrupting it.

Display still wins for retargeting, where brand recognition already exists. But for top-of-funnel prospecting, it is not even close: native wins decisively.

Headlines Matter More Than Images in Native

We tested hundreds of native creative combinations. The single biggest lever on CTR was the headline, not the image. Swapping headlines while keeping the same image produced CTR swings of 40-60%. Swapping images while keeping the same headline moved the needle by only 10-15%.

If you are only testing one element of your native creative, make it the headline.

Display Creative Fatigue Is Faster Than You Think

The average display banner starts losing performance after just five to seven days. We saw this pattern across industries. CTR drops 25-30% in the first week, then another 15-20% in week two. Brands refreshing creative every two weeks consistently outperformed those running the same banners for a month or more.
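The compound effect of those weekly drops is easy to underestimate. A quick arithmetic sketch, using the midpoints of the ranges above (27.5% in week one, 17.5% in week two) and a purely hypothetical baseline CTR:

```python
# Illustrative only: the baseline CTR is invented; the decay rates are the
# midpoints of the ranges reported above (25-30% week 1, 15-20% week 2).
baseline_ctr = 0.50  # hypothetical starting CTR, in percent

week1_ctr = baseline_ctr * (1 - 0.275)  # after the first-week drop
week2_ctr = week1_ctr * (1 - 0.175)     # after the second-week drop

retained = week2_ctr / baseline_ctr
print(f"Week 1 CTR: {week1_ctr:.3f}%")
print(f"Week 2 CTR: {week2_ctr:.3f}%")
print(f"Share of original CTR left after two weeks: {retained:.0%}")
```

By this rough math, a banner left running for two weeks keeps only about 60% of its launch CTR, which is why the two-week refresh cadence pays off.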

Video Native Beats Static Native by 35%

When we compared static native ads against short-form video native ads (under 15 seconds), video won on engagement rate by an average of 35%. Completion rates on video native averaged 68%, which is strong considering these are in-feed placements.

The Unexpected Finding: Time of Day Barely Matters

We expected to find clear performance patterns by time of day. We did not. Once our models controlled for audience and placement quality, time-of-day effects were negligible for most verticals. The exception was B2B campaigns, which performed 20% better during business hours. For everything else, the impression quality mattered far more than the clock.

The Takeaway

Testing is not optional. The brands that treat creative and placement decisions as hypotheses to validate consistently outperform those running on assumptions. Our platform makes this easy with automated split testing and real-time budget allocation toward winners. But the mindset matters as much as the tooling.
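Treating a creative decision as a hypothesis has a concrete statistical shape. This is not a description of Adxe's internal testing engine; it is a minimal sketch of the mindset using a standard two-proportion z-test, with made-up impression and click counts:

```python
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """z-statistic for comparing the CTRs of variants A and B."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled CTR under the null hypothesis that both variants perform equally.
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# Hypothetical counts: headline A vs headline B over the same image.
z = two_proportion_z(clicks_a=420, imps_a=50_000, clicks_b=350, imps_b=50_000)
print(f"z = {z:.2f}")  # |z| above ~1.96 is significant at the 5% level
```

The point is not this particular formula but the discipline it represents: declare the variants, gather enough impressions, and only then promote a winner.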

The next generation AI-powered DSP for cross-channel campaign optimization.

Stay Updated

Get expert insights on programmatic ads, AI optimization, and industry trends.

© 2026 Adxe Pty Ltd. All rights reserved.

ACN: 684 683 289 | ABN: 29 684 683 289
