
A/B Testing as a beginner

Donald Ng
January 2, 2020

In this post, I will share my experience with and without A/B testing on the Howuku landing page, and how adopting A/B testing improved my landing page conversion rate by 60%.

We can all agree that the landing page of a SaaS is one of its most important assets, arguably more important than the application itself.

A landing page is essentially the storefront of your SaaS, and if you do it correctly, it pays dividends by acting as a self-promoting machine for your audience.

Therefore, it is really important to keep your landing page content appealing to your audience and to sell them on the idea behind your SaaS.

If you’ve ever run a heatmap report on your landing page, you will know that most visitors do not scroll past the headline of a website. That means it is crucial to pick your words carefully in order to give your audience a reason to scroll down and find out more. Otherwise, the headline might just become your last words to those visitors. (Spoiler alert: this is really bad for conversion.)

I believe everyone who runs a website has been through the phase of constantly refining website copy, hoping to find the version that best resonates with the audience's pain points and gives them a reason to stay a little longer and eventually convert into paying customers.

Without a proper A/B testing tool, you need to do everything manually: updating the experiment from time to time, keeping track of the conversion rates while recording the results somewhere else (probably in an Excel sheet), and analyzing the collected results.

But what exactly is an A/B test?

At its core, an A/B test is an experiment that involves two or more versions of a given test subject, with each version presented to a portion of your total audience.

• Control: This is the original copy of your landing page.

• Variation: Since you cannot assume that the control version is optimal, you create new versions that each contain a distinct variation, to find out whether any of them can outperform the original.
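To make that concrete, here is a minimal sketch in TypeScript of how a control and its variations might be modeled and served to shares of traffic. All names here are illustrative, not Howuku's actual API:

```typescript
// Illustrative model of an experiment: a control plus variations,
// each assigned a share of the total audience.
interface Variant {
  name: string;      // e.g. "Control" or "Variant 1"
  headline: string;  // the copy under test
  weight: number;    // share of traffic; weights should sum to 1
}

const experiment: Variant[] = [
  { name: "Control",   headline: "Original headline", weight: 0.5 },
  { name: "Variant 1", headline: "New headline",      weight: 0.5 },
];

// Pick a variant for a new visitor according to the traffic weights.
function pickVariant(variants: Variant[]): Variant {
  let r = Math.random();
  for (const v of variants) {
    if (r < v.weight) return v;
    r -= v.weight;
  }
  return variants[variants.length - 1]; // guard against float rounding
}
```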

[Image: red pill vs. blue pill GIF. Caption: Red pill has successfully converted Neo.]

A/B testing determines which of your test variations performs best, based on each one's conversion rate.

Ideally, you should test one simple component at a time (e.g. the headline, the main CTA button, one piece of marketing copy), as this eliminates the problem of having too many factors at play and being unable to identify the effect of each tested variation.
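As a rough sketch of what "performing the best" means in code: conversion rate is simply conversions divided by visitors, compared across variants. The counts below are made up for illustration:

```typescript
// Tally results per variant and rank by conversion rate.
interface VariantStats {
  name: string;
  visitors: number;
  conversions: number;
}

function conversionRate(s: VariantStats): number {
  return s.visitors === 0 ? 0 : s.conversions / s.visitors;
}

function bestVariant(stats: VariantStats[]): VariantStats {
  return stats.reduce((best, s) =>
    conversionRate(s) > conversionRate(best) ? s : best
  );
}

// Example with made-up counts: 12% beats 5% and 3%.
const stats: VariantStats[] = [
  { name: "Control",   visitors: 1000, conversions: 50 },
  { name: "Variant 1", visitors: 1000, conversions: 120 },
  { name: "Variant 2", visitors: 1000, conversions: 30 },
];
console.log(bestVariant(stats).name); // "Variant 1"
```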

Manual A/B testing

Before I was introduced to A/B testing, I would go around gathering inspiration for marketing content and copy, then test it out on my website to see if it actually worked.

By testing I mean I would keep track of each tested variation's conversion rate in an Excel file. These tests were usually a simple change to the headline of my landing page.

The process usually goes like this…

  • Control: My original headline, after running for a month or so, was seeing a modest conversion rate of 5%. To further optimize my conversion rate, I created a new challenger, Variant 1.
  • Variant 1: I spun up a headline with a value-oriented statement, and after running it for 2 weeks I was getting a conversion rate of 12%, which is pretty good compared with the original copy. Ideally, you retire the underperformer and keep refining the test to achieve higher conversion rates. So I added a new challenger, Variant 2.
  • Variant 2: I quickly came up with a new headline to test and let it run for another 2 weeks. Did it perform any better? NO, it actually performed the worst of all 3, with a mere 3% conversion rate!

My initial thought was that it was not a big deal, since I could always go back to the best performing copy, right?

Plot twist: my website's conversion rate did NOT recover when I reverted to Variant 1!

I quickly came to the realization that more than just a headline change was happening in each time frame; there were other variables I did not account for, such as the quality of traffic, the sample size of conversions, and other unexpected events.

Problems with manual testing

During different periods of my tests, I might have been experimenting with other growth methods like:

  • Experimenting with Google Ads campaigns by changing my targeted audience or ad copy
  • Startup community promotion such as Indiehacker, Betalist, or Product Hunt
  • Receiving traction from SaaS groups, Reddit, and so on
  • Other alternative growth methods, such as experimenting with Bing's free ads credit

I realized those tests were each running in a different time period, hence the final conversion rates varied according to the unexpected events of each particular time.

My hypothesis was that tests should run in parallel so that results come from the very same targeted group of audiences; this way we can eliminate the variables of different events, traffic sources, and visitor quality.

The pain points of A/B testing tools

I was planning to build one for myself because, being a cheapskate, I liked to believe: “I am a developer, why not build one myself, how hard could it be?” Also, it would be a great complement to my current stack at Howuku, making for a complete CRO solution.

So I talked to a few CRO experts to see if they had any pain points with the A/B testing tools on the market where I could provide value.

I came to learn that A/B testing tools are relatively costly. One CRO manager told me: "Cost is a limiting factor. I use Optimizely but I've heard it's quite expensive".

I looked around the internet and found Optimizely to be the gold standard of enterprise-grade A/B testing tools.

My curiosity brought me to the Optimizely website to find out how costly it could be, but apparently they were not very transparent about their prices. So I dug deeper and went to see other people’s reviews, and to my horror, I found it would cost a whopping $50,000 per year as the minimum entry price!

“Wait a minute, I can do that!”, I thought to myself.

So I spent a month working on it and came out with a working MVP to test my hypothesis on the Howuku landing page.

Running an A/B Test

I set up my first test to find out how effective changing a single headline would be for my conversion rate (sign-ups). I ran an A/B/C test with the following setup:

  1. Control: A better way to understand user behaviors and better UX
  2. Variant 1: Create a high performing user experience for website
  3. Variant 2: The easiest way to understand your website visitors

All incoming visitors to my landing page had a pseudo-randomized 33% chance of seeing each of the headlines stated above.
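For the curious, one common way to implement a pseudo-randomized three-way split is to hash a stable visitor ID into one of three equal buckets, so each visitor has a ~33% chance of each headline and keeps seeing the same one on repeat visits. This is only a sketch of the general technique (using FNV-1a as the hash), not how Howuku implements it:

```typescript
const headlines = [
  "A better way to understand user behaviors and better UX", // Control
  "Create a high performing user experience for website",    // Variant 1
  "The easiest way to understand your website visitors",     // Variant 2
];

// FNV-1a: a small, fast string hash, good enough for bucketing.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash;
}

// Each visitor ID lands in one of three buckets with ~33% probability,
// and always in the same bucket for the same ID.
function headlineFor(visitorId: string): string {
  return headlines[fnv1a(visitorId) % headlines.length];
}
```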

After a month of experimenting, Variant 1 was winning the competition very convincingly, and it was really amazing to see such a result.

[Image: the actual result of my landing page, captured from the Howuku platform]

If I were to make an assumption about why the result turned out this way, it would be that Variant 1 is more of a value-oriented statement. Visitors were able to see the end goal or benefit of using my service: creating a website with a really good user experience that converts more visitors into customers.

On the flip side, the value proposition is not clear in Variant 2, and it does not explain why visitors should care. Why should they understand their website visitors, and what are the benefits?

Control sits somewhere in between the two, but the statement itself feels long and not as straightforward as Variant 1; hence it got a mixed result, which is not bad, but not optimal in this case.

Conclusion

The A/B testing result has been a great success for my landing page so far. It was almost magical to see the result play out in a statistically significant fashion.
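For readers who want to check significance themselves, a common approach (not necessarily what Howuku computes internally) is a two-proportion z-test between two variants; the counts below are illustrative:

```typescript
// Two-proportion z-test: is the difference between two conversion
// rates larger than random noise would explain?
function zScore(
  convA: number, visitorsA: number,
  convB: number, visitorsB: number,
): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB)
  );
  return (pA - pB) / se;
}

// |z| > 1.96 corresponds to roughly 95% confidence.
const z = zScore(120, 1000, 50, 1000); // made-up counts
console.log(z.toFixed(2), Math.abs(z) > 1.96); // "5.61" true
```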

Anyway, I shall let the experiment run for perhaps another month to see if I can learn anything new from it.

I would definitely suggest that anyone who has a landing page try out A/B testing and see whether you can squeeze some improvement out of your current conversion rates.

Try adding a new variant of your website and run an experiment to see if a slight variation performs even better.

At Howuku, we have recently launched a free A/B testing tool that comes with a visual editor to help you quickly spin up a new variation of your website.

Howuku Analytics is a business analytics platform that tracks user interaction via on-site feedback, heatmaps, recordings, and landing page A/B testing.

Register a new account with a 14-day free trial on any of our plans today!
