How more links in a digest-style email affect click-through
NextAfter
Experiment Summary
Timeframe: 01/09/2019 - 01/15/2019
In an effort to share more of our content, we had previously run this same experiment to see if we could craft a digest-style email with more links while maintaining our click-through rate. The first test saw no change in clicks whatsoever, but we wanted to run it again to determine whether that result was a fluke or whether there was more going on than the first test showed.
Research Question
Will featuring more links in a digest-style email affect click-through rate?
Design
Results
| Treatment Name | Click Rate | Relative Difference | Confidence |
|---|---|---|---|
| C: Long Value Prop | 8.0% | | |
| T1: Short Value Prop, More Links | 5.6% | -30.9% | 100.0% |
This experiment required a sample size of 787 in order to be valid. Since the experiment had a total sample size of 8,450 and the level of confidence is above 95%, the results are valid.
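As a sanity check, the reported confidence can be reproduced with a standard two-proportion z-test on the observed click rates. This is a minimal sketch, not NextAfter's actual tooling, and the even 50/50 split of the 8,450 recipients is an assumption (the per-variant counts aren't reported):

```python
from math import sqrt, erf

# Assumed even split of the 8,450 recipients (actual split not reported)
n_c, n_t = 4225, 4225
p_c, p_t = 0.080, 0.056  # observed click rates from the results table

# Pooled two-proportion z-test
p_pool = (p_c * n_c + p_t * n_t) / (n_c + n_t)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_c + 1 / n_t))
z = (p_c - p_t) / se

# Two-sided confidence that the observed difference is not chance
confidence = erf(abs(z) / sqrt(2))

# Relative difference vs. control (~-30% with the rounded rates;
# the report's -30.9% comes from the unrounded underlying rates)
relative_diff = (p_t - p_c) / p_c

print(f"z = {z:.2f}, confidence = {confidence:.4%}, relative diff = {relative_diff:.1%}")
```

With these rounded inputs the z-score lands well above the 95% threshold (z = 1.96), which is consistent with the near-100% confidence shown in the table.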
Flux Metrics Affected
The Flux Metrics analyze the three primary metrics that affect revenue (traffic, conversion rate, and average gift). This experiment produced the following results:
30.9% decrease in traffic
× 0% increase in conversion rate
× 0% increase in average gift
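The Flux metrics above chain together multiplicatively (revenue = traffic × conversion rate × average gift), so the overall revenue impact can be sketched from the three relative changes. This is an illustrative calculation of that decomposition, not NextAfter's internal model:

```python
# Flux model: revenue = traffic x conversion rate x average gift.
# Each factor below is this experiment's relative change in that metric.
traffic_change = -0.309    # 30.9% decrease in clicks (traffic)
conversion_change = 0.0    # no measured change
avg_gift_change = 0.0      # no measured change

revenue_change = ((1 + traffic_change)
                  * (1 + conversion_change)
                  * (1 + avg_gift_change)) - 1

print(f"Projected revenue impact: {revenue_change:.1%}")  # -30.9%
```

Because the other two factors are flat, the projected revenue impact equals the traffic change itself.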
Key Learnings
The treatment led to a 30.9% decrease in clicks at a 99%+ level of confidence. This significant shift from our original experiment was puzzling, as the initial experiment showed no significant change in clicks. In fact, the original experiment had a slight, though not significant, increase in clicks over the control.
Reviewing the original experiment revealed one key difference in the formatting of the variants. The original experiment's control used section headers that made it feel more like a digest-style email, even though the copy was written very personally and in a long-form style. This experiment's control did not use section headers, but was one long email, more like something a friend or colleague would write.
My hypothesis is that the addition of headers to the control in the original experiment took away the human factor and nullified the intent of the experiment. This new experiment is a more accurate "humanized email" vs. "digest email" test, and possibly the more reliable result.
A future test could be to run the humanized email with headers against the humanized email without headers to determine if that alone is the significant factor in affecting clicks.
Question about experiment #10437
If you have any questions about this experiment or would like additional details not discussed above, please feel free to contact them directly.