The Heritage Foundation

How a longer, more rigorous survey affects email acquisition

Experiment ID: #7122

The Heritage Foundation

Founded in 1973, The Heritage Foundation is a research and educational institution—a think tank—whose mission is to formulate and promote conservative public policies based on the principles of free enterprise, limited government, individual freedom, traditional American values, and a strong national defense.

Experiment Summary

Ended On: 10/02/2018

The Heritage Foundation was running its Annual Member Survey, which asked members for their opinions on key policy initiatives and then gave them the opportunity to make a gift. The initial treatment was adapted from the direct mail piece: a very basic survey that simply asked members to check boxes next to the policy initiatives they thought were important. This approach might have worked in direct mail, but in a digital context it struck the Heritage team as underwhelming.

They crafted a second survey that was much longer—asking multiple, nuanced questions for each policy area and leaving room for open-ended responses. They knew that this was risky—common wisdom says that many people don’t know what to write in open-ended responses—but they thought that deeper engagement would result in higher conversion.

Research Question

How will a longer, more rigorous survey affect completion rate?

Design

C: Control
T1: Treatment #1

Results

Treatment Name       Conv. Rate   Relative Difference   Confidence
C: Control           71.4%
T1: Treatment #1     77.2%        +8.2%                 100.0%

This experiment required a sample size of 428 in order to be valid. Since the experiment had a total sample size of 5,686, and the level of confidence is above 95%, the experiment results are valid.
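The confidence figure above can be sanity-checked with a standard two-proportion z-test on the reported conversion rates. This is a sketch, not the tool NextAfter actually used: the write-up does not state how the 5,686 participants were split between arms, so the code below assumes an even 50/50 split (2,843 per arm), which is a hypothetical.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test.

    Returns the z statistic and the two-sided confidence level
    (1 minus the two-sided p-value), using the pooled standard error.
    """
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # erf(|z| / sqrt(2)) equals 2*Phi(|z|) - 1, i.e. 1 - two-sided p-value
    confidence = math.erf(abs(z) / math.sqrt(2))
    return z, confidence

# Reported rates: control 71.4%, treatment 77.2%.
# Per-arm sizes are assumed (even split of the 5,686 total).
z, conf = two_proportion_z(0.714, 2843, 0.772, 2843)
print(round(z, 2), round(conf * 100, 1))
```

Under the even-split assumption the z statistic comes out around 5, which corresponds to a confidence level that rounds to 100.0%, consistent with the table.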

Flux Metrics Affected

The Flux Metrics analyze the three primary metrics that affect revenue (traffic, conversion rate, and average gift). This experiment produced the following results:

- 0% increase in traffic
- 8.2% increase in conversion rate
- 0% increase in average gift

Key Learnings

The longer survey delivered an 8.2% increase in completions, a counterintuitive result given its length and demands on the respondent. It suggests that people may have looked at the control survey and judged it either too lightweight to be worth completing or too repetitive in its mechanics. Some of the treatment's success may lie in its design: it varies the style of question and makes room for open-ended responses.

Interestingly, the treatment survey also produced an 81% increase in donor response, so it was immediately rolled out everywhere.


Experiment Documented by Jeff Giddens
Jeff Giddens is President of NextAfter.

Question about experiment #7122

If you have any questions about this experiment or would like additional details not discussed above, please feel free to contact them directly.