You bring up a good point: page load time has been shown to directly impact conversion rates… who wants to buy from you after waiting 20 seconds for your page to load? Nobody, or at least fewer people than before their browser was put on hold.
Have you tried integrating Google Analytics to keep an eye on page load time?
Of course I have. But I’m referring specifically to A/B tests. When comparing a control to a variant, we should factor in the average page load time for each variant. Or better yet, the percentage of visitors to a particular variant whose page load took over 1 second, over 2 seconds, over 3 seconds, and so on. Otherwise we could conclude that A is better than B because of copy/colors/design, etc., when maybe, for some reason, A just loaded faster for most people and that made the difference.
We assume there shouldn’t be a significant difference between A/B variants in page load timings, but who knows? Internet technology is hardly consistent on that front.
EDIT: Joe, I realize now that maybe you meant track the avg speed for each variant on Google Analytics. Looking into how to do that now. Thanks.
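The per-bucket comparison described above could be sketched roughly like this (the data, bucket boundaries, and function names here are all hypothetical, just to illustrate the idea):

```python
from collections import defaultdict

def load_bucket(seconds):
    """Bucket a page load time into '<1s', '1-2s', '2-3s', or '>3s'."""
    if seconds < 1:
        return "<1s"
    if seconds < 2:
        return "1-2s"
    if seconds < 3:
        return "2-3s"
    return ">3s"

def conversion_by_bucket(visits):
    """visits: list of (variant, load_seconds, converted) tuples.
    Returns {(variant, bucket): conversion_rate} so you can compare
    A vs. B within matching load-time buckets."""
    totals = defaultdict(int)
    conversions = defaultdict(int)
    for variant, seconds, converted in visits:
        key = (variant, load_bucket(seconds))
        totals[key] += 1
        if converted:
            conversions[key] += 1
    return {key: conversions[key] / totals[key] for key in totals}

# Made-up visit log: variant A happened to load faster for most visitors,
# which could inflate its apparent conversion rate if load time is ignored.
visits = [
    ("A", 0.8, True), ("A", 0.9, True), ("A", 1.5, False), ("A", 3.2, False),
    ("B", 2.4, False), ("B", 2.7, True), ("B", 3.5, False), ("B", 0.7, True),
]

for (variant, bucket), rate in sorted(conversion_by_bucket(visits).items()):
    print(f"{variant} {bucket}: {rate:.0%}")
```

If A only beats B in the fast buckets, the "winner" may just be the faster page, not the better copy.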
I see what you are saying now. I think your point brings to light the number of unknown, or really *unmeasured*, variables that impact A/B tests. The same goes for source of traffic, keyword match type, ad copy, degree of message match, location, age, gender, and so on. There are known unknowns, depending on what degree of analytics you have in place and how strictly you treat conversion rate optimization as a science. But to answer your question, I am sure this can be monitored in Google Analytics, and you can track it in a spreadsheet to take into account when judging the validity of your A/B tests.
Yes, you’re right.
EDIT: Though I’d personally much rather have server-side stats than use something like GA.