What does your CRO practice look like?

  • 29 April 2022
  • 6 replies
  • 74 views


Hey folks!

Curious as to what your CRO practice looks like. We’re pushing harder on CRO at the moment and it’s brought us back to thinking about the basics in order to get the foundations right.

Things like:

  • Testing roadmap: how are you collecting test ideas, and how do you select and prioritise them?
  • Learning agenda: how do you structure and present test results and learnings so that your best practices evolve over time instead of repeating past “errors”?
  • Team structure: do you have people focused solely on CRO or do they have multiple responsibilities? Do you have dedicated design/UX/dev capacity or do you work with the product team and fit into their backlog?
  • Testing tools: what tools do you use to implement the tests?
  • Reporting tools: do the testing tools have sufficient reporting capabilities or do you pull data elsewhere to analyse? Do you have standard dashboards that allow you to keep an eye on tests as they run?
  • Time to run a test: do you figure out your expected outcome and how long the test will take, and stick to it? Or do you mostly look for statistical significance? (Rough sketch of the fixed-horizon approach below.)
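For context on that last point, by “stick to it” I mean computing the required sample size up front and committing to a run time before launch. A rough back-of-the-envelope sketch in Python; the baseline rate, target lift, and traffic numbers are all illustrative:

```python
# Back-of-the-envelope sample size per variant for a fixed-horizon A/B test.
# Baseline rate, target lift, and traffic are illustrative assumptions.
from math import ceil

Z_ALPHA = 1.96  # two-sided 95% confidence
Z_BETA = 0.84   # 80% power

def sample_size(p_base: float, rel_lift: float) -> int:
    """Visitors needed per variant to detect a relative lift in conversion rate."""
    p_var = p_base * (1 + rel_lift)
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return ceil((Z_ALPHA + Z_BETA) ** 2 * variance / (p_var - p_base) ** 2)

n = sample_size(p_base=0.05, rel_lift=0.20)  # 5% baseline, +20% relative lift
print(f"~{n} visitors per variant")

daily_visitors_per_variant = 400             # assumed traffic per variant
print(f"~{ceil(n / daily_visitors_per_variant)} days to run")
```

If the page can’t deliver that sample in a reasonable window, the test isn’t worth launching as designed.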

6 replies


Super solid discussion starter, @GregorySR! Thanks for posting. Will be monitoring this thread, as I’m very curious to hear from everyone myself! @Nicholas @julien_level @Ekaterina_Howard @Finge @Zoe_Tattersall tagging you all for visibility, in case you have anything to share!


Great questions! I don’t have all the answers, but will share just a few tips from our side:

  • For a testing calendar/prioritization framework, we actually created our own inside a Google Sheet, but there are many good CRO prioritization frameworks available. The most well known is probably the ICE framework (rough sketch after this list).
  • For testing tools, other than the native tools built into Unbounce, we’re huge fans of Microsoft Clarity (for session recordings and heatmaps) and Splitbee (for quickly visualizing some stats in a cleaner interface than what you’d get in Google Analytics).
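To illustrate the ICE idea (this is not our actual sheet, just a sketch): one common variant scores each test idea 1-10 on Impact, Confidence, and Ease and ranks the backlog by the product of the three. The ideas and ratings below are made up:

```python
# Minimal ICE prioritization sketch: score each test idea 1-10 on
# Impact, Confidence, and Ease, then rank by the product of the three.
# Idea names and ratings are illustrative, not from a real backlog.
from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    impact: int      # expected lift if it wins (1-10)
    confidence: int  # how sure we are it will win (1-10)
    ease: int        # how cheap it is to build and run (1-10)

    @property
    def ice(self) -> int:
        return self.impact * self.confidence * self.ease

backlog = [
    TestIdea("Shorten signup form", impact=7, confidence=6, ease=9),
    TestIdea("Rewrite hero headline", impact=8, confidence=4, ease=8),
    TestIdea("Add social proof above the fold", impact=5, confidence=7, ease=6),
]

for idea in sorted(backlog, key=lambda i: i.ice, reverse=True):
    print(f"{idea.ice:4d}  {idea.name}")
```

A spreadsheet version is just the same three columns plus a product (or average) formula.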

One area I’d really like to improve is in our test reporting, so I’d love to hear what others are doing for this!


Not on your list of questions, but I’ll add it anyway: distinguishing between

  1. Tests to validate assumptions
  2. Tests to optimize for conversions

This may not be relevant in your case, but I think it’s important to document the data behind each test and the level of confidence in it.

(Inspired by “Oh, we’re A/B testing our homepage value prop to see which messaging resonates” convos)


Those are great points, thanks!

@GregorySR

  • Testing roadmap: how are you collecting test ideas, and how do you select and prioritise them?

Strategies are evolving very fast, so it keeps us on our toes: we make a new roadmap every fortnight and keep an eye on the results.

 

  • Team structure: do you have people focused solely on CRO or do they have multiple responsibilities? Do you have dedicated design/UX/dev capacity or do you work with the product team and fit into their backlog?

We have a complete digital marketing team on it: people analyzing CRO and removing errors, a web developer, and I look after optimization myself.

 

  • Testing tools: what tools do you use to implement the tests?
  • Reporting tools: do the testing tools have sufficient reporting capabilities or do you pull data elsewhere to analyse? Do you have standard dashboards that allow you to keep an eye on tests as they run?
  • Time to run a test: do you figure out your expected outcome and how long the test will take, and stick to it? Or do you mostly look for statistical significance?

The built-in tools are great; no third-party tool can compete with them on accuracy. And as mentioned, we review results fortnightly.

I’m a digital designer / CRO expert at a PPC agency, and we usually do testing as part of our landing page optimization process. Here are the steps we follow:

  1. Run our Unbounce LP through the Unbounce Analysis Tool & gather data.
  2. Analyze & gather data from Microsoft Clarity.
  3. Put our suggestions in an LP Optimization template (areas of improvement, design changes, copy changes, other changes, and notes). We usually try to include some sort of testing with two new variants: smaller changes on both the A and B variants, plus one big change on the B variant (e.g. form type, picture change, etc.).
  4. Get changes approved by team working with that specific client.
  5. Make changes and publish.
  6. Standard 3-4 week test before looking at the winning variant (test time varies depending on volume & number of qualified conversions); see the significance check sketched below.
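To make step 6 concrete (an illustration, not our actual tooling): once the test window is up, a simple two-proportion z-test can sanity-check whether the “winning” variant is actually significant. The visitor and conversion counts below are made up:

```python
# Minimal two-proportion z-test sketch for calling a winning variant.
# All visitor and conversion counts are illustrative.
from math import sqrt, erf

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p
    return p_a, p_b, z, p_value

p_a, p_b, z, p = z_test(conv_a=120, n_a=2400, conv_b=151, n_b=2350)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p:.4f}")
print("Significant at 5%" if p < 0.05 else "Not significant yet -- keep running or call it a draw")
```

Low-volume pages need longer windows precisely because this check won’t reach significance on a handful of qualified conversions.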

(This might be changing in the future as we’ve found a lot of good points in this article: https://unbounce.com/landing-pages/landing-page-testing/)
