CaringBridge

How adding a lower option to a gift array affects conversion and revenue

Experiment ID: #33477

CaringBridge

CaringBridge offers free personal, protected websites for people to easily share updates and receive support and encouragement from their community during a health journey. Every 7 minutes, a CaringBridge website is created for someone experiencing a health event.

Experiment Summary

Timeframe: 07/10/2020 - 07/29/2020

CaringBridge had long held their gift array at $50 / $100 / $250 / Other. However, throughout their site there were multiple mentions of a $30 gift handle, which powers a CaringBridge site for one month. They wanted to test adding this option to the array. They hypothesized that it would increase conversion but decrease average gift. If that happened, and the tradeoff between the two metrics was acceptable, they would accept a slight decrease in immediate revenue in exchange for more donors.

Research Question

How will adding a lower option to a gift array affect conversion and revenue?

Design

C: Control
T1: $30 Option

Results

Treatment Name     Conv. Rate    Relative Difference    Confidence    Average Gift
C: Control         7.1%          -                      -             -
T1: $30 Option     7.6%          +6.2%                  99.6%         -

This experiment required a sample size of 26,744 in order to be valid. Since the experiment had a total sample size of 117,839 and the level of confidence is above 95%, the experiment results are valid.
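For readers who want to see how a confidence figure like this is reached, the sketch below runs a pooled two-proportion z-test on the reported conversion rates. It is a minimal illustration, not NextAfter's actual methodology; the even 50/50 split of the 117,839 visitors is an assumption, since only the total sample size is published. Under those assumptions it lands in the same ballpark as the reported confidence.

from math import sqrt, erf

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Reported figures; the per-arm sizes are an assumption (roughly even
# split of the 117,839 total visitors).
n_control = n_treatment = 117_839 // 2
p_control, p_treatment = 0.071, 0.076

# Pooled two-proportion z-test (normal approximation).
p_pool = (p_control * n_control + p_treatment * n_treatment) / (n_control + n_treatment)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_control + 1 / n_treatment))
z = (p_treatment - p_control) / se

confidence = norm_cdf(z)  # one-sided level of confidence
print(f"z = {z:.2f}, confidence = {confidence:.2%}")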

Flux Metrics Affected

The Flux Metrics analyze the three primary metrics that affect revenue (traffic, conversion rate, and average gift). This experiment produced the following results:

    0% increase in traffic
    6.2% increase in conversion rate
    0% increase in average gift

Key Learnings

The treatment, with the $30 option, increased gifts by 6.2%. However, it decreased revenue by 7.7%, which left CaringBridge with a difficult decision. Do they take more donors, which gives them more people to eventually retain and keep downstream, or do they keep the higher average gift, which brings more money in the door right away?
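These two figures are linked by the revenue identity from the Flux Metrics section (revenue = traffic x conversion rate x average gift). The quick back-of-the-envelope check below, which assumes flat traffic, shows the drop in average gift implied by a 6.2% conversion lift paired with a 7.7% revenue decline; the exact per-arm averages were not published, so this is an estimate rather than a reported result.

# Revenue = traffic x conversion rate x average gift; traffic is assumed flat.
conversion_lift = 0.062    # +6.2% conversion rate in the treatment
revenue_change = -0.077    # -7.7% revenue in the treatment

# Average-gift change implied by the identity above.
avg_gift_change = (1 + revenue_change) / (1 + conversion_lift) - 1
print(f"Implied average-gift change: {avg_gift_change:.1%}")  # about -13%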

This inspired further data analysis to understand the downstream impact before rolling out the change. Still, solidifying the revenue and conversion impact while mitigating risk (only half of the audience saw the treatment) gave them extremely valuable data to work with.


Experiment Documented by Jeff Giddens
Jeff Giddens is President of NextAfter.

Question about experiment #33477

If you have any questions about this experiment or would like additional details not discussed above, please feel free to contact them directly.