Hank Stoever
part time nerd, part time gnar.

Real World A/B Testing Results

posted over 3 years ago - 2 min read

In this article, I'll explain two A/B tests I've run recently. One of them led to some great results, and the other was inconclusive. Both times, I learned something!

Test 1 - Description vs. Table of Contents

At Uludum, I use the split gem for running A/B tests. I like it because it's easy to use and pretty fast, as it uses Redis for persistence. It's not too simple and not too complex; I can configure it just the right amount. Its API is as simple as:

greeting = ab_test("title page greeting", "hello", "sup?") 

It also provides a handy dashboard for viewing test results.
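To make the flow concrete, here is a toy, in-memory stand-in for what the two helper calls do: `ab_test` assigns a visitor to an alternative (and counts them as a participant), and `finished` records a conversion for whichever alternative that visitor saw. This is an illustrative sketch only, not the split gem's actual code: the real gem persists counts to Redis, tracks visitors via the session rather than an explicit `visitor_id`, and supports weighted alternatives. The `ToySplit` class here is my own invention.

```ruby
require 'digest'

# Toy stand-in for the split gem's helper flow (NOT split's real implementation).
class ToySplit
  def initialize
    # experiment => { alternative => { participants:, conversions: } }
    @stats = Hash.new do |h, experiment|
      h[experiment] = Hash.new do |hh, alt|
        hh[alt] = { participants: 0, conversions: 0 }
      end
    end
    @assignments = {}
  end

  # Assign a visitor to an alternative, counting them as a participant
  # the first time. Hashing the visitor id makes assignment deterministic,
  # so the same visitor always sees the same variant.
  def ab_test(visitor_id, experiment, *alternatives)
    key = [visitor_id, experiment]
    @assignments[key] ||= begin
      index = Digest::MD5.hexdigest("#{visitor_id}:#{experiment}").to_i(16) % alternatives.size
      chosen = alternatives[index]
      @stats[experiment][chosen][:participants] += 1
      chosen
    end
  end

  # Record a conversion for whichever alternative this visitor was assigned.
  def finished(visitor_id, experiment)
    chosen = @assignments[[visitor_id, experiment]]
    @stats[experiment][chosen][:conversions] += 1 if chosen
  end

  attr_reader :stats
end

split = ToySplit.new
greeting = split.ab_test("visitor-1", "title page greeting", "hello", "sup?")
split.finished("visitor-1", "title page greeting")
```

Calling `ab_test` again for the same visitor returns the same alternative without double-counting the participant, which is the property that makes the participant and conversion counts in the dashboard trustworthy.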

Recently, I ran an A/B test for Uludum on course homepages. I was curious whether conversions would do better if the course description was above the 'Table of Contents' or vice versa.

If I had to guess, I would have said the Table of Contents would do better. To me, it seems like a nice, concise representation of the course material.

Of course, guesses like that are exactly why we shouldn't rely on assumptions about what will improve conversions. A/B tests to the rescue! After running the test for some time, here are the results:

test name               | participants | conversions
description first       | 350          | 11
table of contents first | 369          | 3

The difference in conversion rates is huge. Displaying the course description first converted at 3.14% (11/350) vs. 0.81% (3/369), a difference that is statistically significant at better than 95% confidence. I'm not running the test anymore; description wins.
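Results like these are easy to sanity-check with a standard pooled two-proportion z-test in plain Ruby; nothing below is specific to the split gem, and the counts come straight from the table above.

```ruby
# Pooled two-proportion z-test on the Test 1 counts above.
conv_a, n_a = 11, 350   # description first
conv_b, n_b = 3, 369    # table of contents first

p_a = conv_a.fdiv(n_a)
p_b = conv_b.fdiv(n_b)
pooled = (conv_a + conv_b).fdiv(n_a + n_b)
se = Math.sqrt(pooled * (1 - pooled) * (1.0 / n_a + 1.0 / n_b))

z = (p_a - p_b) / se
p_value = Math.erfc(z / Math.sqrt(2)) # two-sided p-value

puts format("z = %.2f, p = %.4f", z, p_value) # => z = 2.26, p = 0.0238
```

A z of 2.26 corresponds to a two-sided p of about 0.024: comfortably significant at the 95% level, which is plenty to call a winner here.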

Test 2 - Language Used in My Arbitrage Course Banner Ad

At the top of my website, I used to display a link to the crowdfunding campaign for my bitcoin arbitrage course. The text is different now, but it used to say this:

I'm crowdfunding a course about creating a bitcoin arbitrage bot. Pre-order the online course for just $10.00.

I wanted to A/B test this, so I tried slightly different wording in the second sentence. I tested the following two variants:

"Pre-order the online course for just $10.00"
"Reserve early access for just $10.00"

I honestly didn't know which one would perform better. Let's just skip to the results:

test name                   | participants | conversions | conversion rate
Pre-order the online course | 3150         | 202         | 6.41%
Reserve early access        | 3037         | 181         | 5.96%

"Pre-order the online course" did slightly better, but not by much. The results are statistically inconclusive, and we can't declare that either variant was truly better.
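A pooled two-proportion z-test on these numbers confirms the result is inconclusive; again, plain Ruby and its standard library are enough to check, with the counts taken from the table above.

```ruby
# Pooled two-proportion z-test on the banner-ad counts above.
conv_a, n_a = 202, 3150   # "Pre-order the online course"
conv_b, n_b = 181, 3037   # "Reserve early access"

p_a = conv_a.fdiv(n_a)
p_b = conv_b.fdiv(n_b)
pooled = (conv_a + conv_b).fdiv(n_a + n_b)
se = Math.sqrt(pooled * (1 - pooled) * (1.0 / n_a + 1.0 / n_b))

z = (p_a - p_b) / se
p_value = Math.erfc(z / Math.sqrt(2)) # two-sided p-value

puts format("z = %.2f, p = %.2f", z, p_value) # => z = 0.74, p = 0.46
```

With p around 0.46, a gap this small between the two variants could easily be chance, which is exactly the inconclusive call made above.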

Lessons learned

A/B test all the things! It's typically harmless as long as you aren't testing very extreme alternatives. You never know when you might strike conversion gold.

I'm running a few new tests right now, like the background color of the banner ad at the top of this page. In a few weeks I'll make a post about the results of that test. If you'd like to get notified when I write that post, just join my no-spam mailing list on the left side of this page.
