Heritage Action for America

How a carousel ad format promoting a survey decreased clickthrough rates

Experiment ID: #11061


Experiment Summary

Ended On: 03/20/2019

Heritage Action sends their Annual Grassroots Survey to the subscribers and donors on their file each year. In 2019, they wanted to test whether promoting the survey on Facebook could help them acquire new subscribers and donors.

Because the survey lends itself well to a carousel ad format, and because Facebook traditionally encourages the use of more advanced ad formats, we wanted to test the impact of a carousel ad that used in-app “swiping” across five screens (a “cover image” plus four “question-specific” images) to promote the survey.

Research Question

Does a carousel ad format increase ad clickthrough rates? And what impact does it have on downstream metrics like email acquisition rate, donor conversion rate, and revenue?

Design

C: Control
T1: Treatment #1

Results

Treatment Name    | Click Rate | Relative Difference | Confidence
C: Control        | 3.7%       |                     |
T1: Treatment #1  | 3.1%       | -16.4%              | 97.8%

This experiment required a sample size of 6,796 in order to be valid. Since the experiment had a total sample size of 18,772 and the level of confidence is above 95%, the experiment results are valid.
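For readers who want to sanity-check these figures, the sketch below (not part of the original report) shows how the relative difference and the level of confidence can be roughly reproduced from the reported click rates. It assumes the 18,772 impressions were split evenly between control and treatment, which the report does not state.

```python
# Minimal sketch: reproduce the reported relative difference and confidence
# from the published click rates, assuming an even split of impressions.
from math import sqrt, erf

n_total = 18_772
n_control = n_treatment = n_total // 2   # assumed even split (not stated in report)
p_control, p_treatment = 0.037, 0.031    # reported click rates

# Relative difference in click rate (treatment vs. control)
relative_diff = (p_treatment - p_control) / p_control
print(f"Relative difference: {relative_diff:+.1%}")  # ~ -16% (the reported -16.4% reflects unrounded rates)

# Two-proportion z-test for the difference in click rates
p_pooled = (p_control * n_control + p_treatment * n_treatment) / (n_control + n_treatment)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / n_control + 1 / n_treatment))
z = (p_control - p_treatment) / se

# Two-sided "level of confidence" = 1 - p-value
confidence = erf(abs(z) / sqrt(2))
print(f"z = {z:.2f}, confidence = {confidence:.1%}")  # ~ 97-98%, close to the reported 97.8%
```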

Flux Metrics Affected

The Flux Metrics analyze the three primary metrics that affect revenue (traffic, conversion rate, and average gift). This experiment produced the following results:

✓ 16.4% decrease in traffic
× 0% increase in conversion rate
× 0% increase in average gift
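As a rough illustration of that decomposition (not from the report), revenue can be modeled as traffic × conversion rate × average gift, so a 16.4% drop in traffic with the other two factors held constant translates into a 16.4% drop in projected revenue. The baseline numbers below are purely hypothetical.

```python
# Minimal sketch of the revenue decomposition behind the Flux Metrics.
def projected_revenue(traffic: float, conversion_rate: float, average_gift: float) -> float:
    """Revenue = traffic x conversion rate x average gift."""
    return traffic * conversion_rate * average_gift

# Hypothetical baseline figures, for illustration only.
baseline = projected_revenue(traffic=10_000, conversion_rate=0.0224, average_gift=50.0)
after = projected_revenue(traffic=10_000 * (1 - 0.164), conversion_rate=0.0224, average_gift=50.0)

print(f"Relative revenue change: {(after - baseline) / baseline:+.1%}")  # -16.4%
```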

Key Learnings

With a 97.8% level of confidence, we saw a 16.4% decrease in ad clickthrough rate.

Downstream, the static (control) ad generated an instant donor conversion rate of 2.24%, while the carousel ad generated no donors or revenue.

Our hypothesis is that the control ad didn’t “give away the goods” of what was contained in the survey, so intrigue about its contents drove a higher clickthrough rate. Then, as respondents worked through the survey, the control experience built up their emotional investment until it moved them to give a gift.

Meanwhile, the treatment (the carousel ad type) exposed what was included in the survey at the ad impression level. This let the audience pre-determine whether or not they wanted to click or take the survey at all, and it likely spent their emotional capital at the impression level, so they couldn’t be tipped to give a gift when they completed the survey (i.e., the emotional release came too early in the process).

Moving forward, we recommend continuing to optimize for the ad clickthrough rate by leveraging intrigue and not using the carousel ad type.


Experiment Documented by Greg Colunga
Greg Colunga is Executive Vice President at NextAfter.

Questions about experiment #11061

If you have any questions about this experiment or would like additional details not discussed above, please feel free to contact him directly.