I have two landing page designs with almost identical messaging. One converts at about 14% and the other at about 8%. They look very similar, and quite frankly I would have bet the farm that the loser would outperform the winner; I could not have been more wrong. I have no idea why there is such a huge conversion difference between the two. Additionally, I have never had such high conversion rates before, and I don’t know what I am doing well in the winner.
I could come up with a couple of reasons why variant B is winning, but first I want to check with you what the number of visits and conversions are. Do you have statistical significance of 95% or higher on the test?
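For anyone who wants to check this themselves, a common way to test significance between two conversion rates is a two-proportion z-test. Here is a minimal Python sketch; the visit and conversion counts below are hypothetical placeholders (the thread's real numbers are in the screenshot later), chosen only to match the quoted 8% vs. 14% rates:

```python
from math import erf, sqrt

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test.

    conv_* = number of conversions, n_* = number of visits.
    Returns (z_score, two_sided_p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference in proportions
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 40/500 (8%) vs. 70/500 (14%)
z, p = ab_significance(40, 500, 70, 500)
print(f"z = {z:.2f}, p = {p:.4f}")
# Significant at the 95% level if p < 0.05
```

With those placeholder numbers the difference comes out significant, but the real answer depends entirely on the actual visit and conversion counts, which is why Finge is asking for them.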
I second Finge’s question.
It does make a difference what your traffic levels are and total conversions.
Also, are you tracking through a separate analytics software (ex. Google Analytics) the breakdown between desktop and mobile conversions?
A few quick observations:
It’s not a good idea to “stuff” your page with hidden keywords.
Your hidden keywords are breaking the mobile version of the page(s).
Haha. I did not notice those hidden keywords. That has to be an ancient and obsolete SEO trick.
At a glance I would have agreed with you, the loser looks like it should beat the winner. This is the beauty of testing, the unexpected result!
I am going to also echo Hristian, take those hidden kw’s out of there… Your mobile experience is suffering.
Without knowing total traffic volume, conversion volume, breakdown of conversions, etc., I am going to offer my thoughts.
I think the main reason the winner is “winning” can be attributed to one key difference. Right at the top you say “our process is simple, here are the 3 steps”. On the losing version, the line “our process is simple” is smaller and not connected visually to the 3 steps.
In the winning version, you are using visual hierarchy to tell people what the most important thing is: the process is simple, and you prove it by showing them the 3 steps.
In the losing version, no single element has that same visual emphasis. All of the elements seem to have (at least visually) the same weight.
Great A/B test. Looking forward to seeing some of the raw data.
Plenty of great feedback above. The only thing I’d add is to follow the revenue. Sometimes it’s easy to get caught up in the conversion rates. Hopefully the higher conversion rate translates nicely to your bottom line.
First, thanks for everyone’s thoughts.
Here is a screenshot of the data: http://prntscr.com/9wnc5y
“Also, are you tracking through a separate analytics software (ex. Google Analytics) the breakdown between desktop and mobile conversions?” - I don’t have it set up yet in Analytics but plan to. However, all the traffic is coming from paid search, so there is no SEO traffic mixed in.
One potential reason I have thought of is that at the top of the form on the winner it says “Request NNN Property List” and the other one does not. http://prntscr.com/9wneoj
The keywords were a test to see if they could improve my quality score on PPC. They did improve it, but it is spammy and I agree taking them off is best.