The Fund for American Studies

How an offer-focused welcome series impacts donor conversion rate

Experiment ID: #39294


Experiment Summary

Timeframe: 10/01/2020 - 11/24/2020

When The Fund for American Studies first began building its online program, it started promoting a downloadable PDF resource for prospects to request. The offer was marketed through digital ads that drove people to a landing page with an email signup form to have the resource sent to them. Once they completed the registration form on the landing page, they were directed to an instant donation page.

Initially, they did not have many content offers to provide to prospects, so they launched with a welcome series focused on telling the story of the organization, or "proving the concept" of its impact by sharing story-based appeals featuring high-profile alumni. This series ended with a direct appeal asking recipients to become first-time donors.

Over time, the program grew quickly, and the organization expanded the number of "content offers" available to the market.

They wondered whether moving newly acquired contacts from the org-centered welcome series to a series promoting content offers similar to the one they had downloaded would improve their ability to convert those contacts into first-time donors.

Research Question

We believe that focusing on promoting similar content offers for newly acquired emails in a welcome series will achieve an increase in donor conversion rate.

Design

C: Org-focused welcome series
T1: Offer-focused welcome series

Results

Treatment Name                     Conv. Rate   Relative Difference   Confidence
C: Org-focused welcome series      0.02%
T1: Offer-focused welcome series   0.19%        +920.0%               99.9%

This experiment required a sample size of 1,414 in order to be valid. Since the experiment had a total sample size of 34,096, and the level of confidence is above 95%, the experiment results are valid.
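To illustrate how a confidence level like this can be checked, below is a minimal sketch of a standard two-proportion z-test. The exact per-arm sample sizes and conversion counts are assumptions for illustration only: they are back-computed from the reported 34,096 total, the roughly 89%/11% split described later in this study, and the rounded 0.02% / 0.19% conversion rates, so the resulting statistic will not match the study's reported confidence exactly.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for a difference in conversion rates.

    conv_a/conv_b: number of conversions in each arm
    n_a/n_b:       sample size of each arm
    Returns (z statistic, two-tailed p-value) using the pooled-proportion
    standard error and a normal approximation.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the normal tail via the complementary error function.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Illustrative numbers only (assumed, not from the study's raw data):
# ~89/11 split of 34,096 contacts, conversions implied by 0.02% and 0.19%.
z, p = two_proportion_ztest(conv_a=6, n_a=30345, conv_b=7, n_b=3751)
print(f"z = {z:.2f}, two-tailed p = {p:.2e}")
```

Even with these rough assumed counts, the z statistic lands well beyond the 95% threshold (z ≈ 1.96), which is consistent with the study's conclusion that the result is valid.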

Flux Metrics Affected

The Flux Metrics analyze the three primary metrics that affect revenue (traffic, conversion rate, and average gift). This experiment produced the following results:

  • 0% increase in traffic
  • 920.0% increase in conversion rate
  • 0% increase in average gift

Key Learnings

With a confidence level of 99.9%, we observed that the new offer-focused welcome series improved the donor conversion rate by +920%.

Beyond that, here are some additional metrics that were achieved along with this new approach during the study period:

  • Email Click Rate: +2.8% (LoC: 79.5%)
  • Email Open Rate: +86.1% (LoC: 100%)
  • Unsubscribe Rate: -42.0% (LoC: 100%)
  • Revenue: +1,091.4% (LoC: 99.9%)

It’s also important to note the time frames in which these two experiences ran, as well as the makeup of the audience for this study. Those notes are outlined below:

The “control” welcome series ran for a substantially longer period of time (about a year), while the new “treatment” (offer-focused) welcome series ran for only around 45 days.
  2. The “treatment” experience overlapped the “control” experience for the time that the “treatment” experience was running.
We split the contacts using a pre-checked checkbox on the signup forms: if the box remained checked, the contact went into the “control” experience; if it was unchecked, the contact went into the “treatment” experience.
During the period when the two experiences overlapped, roughly 89% of the people who signed up went into the “control” experience, and only 11% went into the “treatment” experience.
The people who unchecked the box were presumably a less motivated audience than those who left it checked, because they intentionally took a step to say “I don’t want anything else from you” when they signed up. Even so, this much smaller and presumably less motivated audience outperformed the “control” experience, not only during the 45-day period when the two were running simultaneously, but also when the “control” performance data was expanded to include a full year’s worth of results, which is what is included in the study data above.

Why would this new welcome series be so much more effective than the control?

Our hypothesis is that the results improved so dramatically because we focused on what THEY wanted (our audience wanted more of the resources and guides they were interested in reading) instead of what WE wanted (the organization wanted to tell them about ourselves and acquire a new donor).

This is a tried-and-true principle in business: if you take care of your customers and give them value, they will reward you with what you want (their loyalty). It’s no wonder, then, that when we applied that principle to how we welcomed new names onto our file, they rewarded us with the dramatically improved results we were ultimately looking for.


Experiment Documented by Greg Colunga

Greg Colunga is the Executive Vice President at NextAfter

Question about experiment #39294

If you have any questions about this experiment or would like additional details not discussed above, please feel free to contact him directly.