Always know how you’re going to prove your project was successful before you start working on it.
Create a testing plan
Before you dig in and start building your campaign, it’s worth taking the time to set up a plan for testing and measuring success.
To create your plan, ask yourself:
- What’s your goal? What is it you’re aiming to achieve?
- What’s your hypothesis? How will you achieve it?
- What does success look like? What are your KPIs? What numbers do you need to hit?
Write down the answers. They form the basis of your test plan. I also recommend adding your baseline numbers – your conversion rate (for example) over the past 30 to 90 days.
When you’ve got that worked out, you can start looking at how to measure success.
There are a few steps to testing and proving results. In an ideal world, you’ll cover them all, starting with usability testing.
In the old days (aka, 10 years ago), you’d bring 5-7 people from your target audience in to run through a prototype. They’d complete set tasks and talk through their thought processes so you could capture their first impressions, watch their reactions, and make sure they understood your offerings. You’d have eye trackers connected to the monitor to see where they were looking (in the real old days, these were like torture devices on their heads). And you’d have observers behind a mirrored window taking copious notes.
After the tests, you can go back to your designs, iterate and refine to solve the problems uncovered, and then test again. This process means you solve the majority of the issues with your campaign before you launch, so you know you’re going forward with the best option.
If your project is a redesign of an existing website or user experience, it’s worthwhile running A/B testing before launching to the entire audience. Just because your prototypes performed well in user testing doesn’t mean they’ll perform better than what you have right now.
Like everything else, A/B testing needs a plan. Before you get started, ask:
- How long does the test need to run to achieve statistical significance? How many users are required to go through each page or flow to generate results you can rely on, and, based on standard visitor numbers, how long will it take to get that?
- Are there any other campaigns or promotions running that you need to schedule around?
- Is there an event or holiday (like Valentine’s Day or Christmas) that might skew results?
- Are there certain users who should be excluded from the test? For example, you may want to keep high-value, repeat customers on the old site until the new one proves more valuable.
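To put a number on the first question – how long the test needs to run – you can use a standard two-proportion sample-size calculation. Here’s a minimal sketch in Python (standard library only); the 3% baseline rate, 4% target rate, and daily visitor count are hypothetical placeholders you’d swap for your own baseline numbers:

```python
import math
from statistics import NormalDist


def ab_sample_size(baseline_rate, expected_rate, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect the given uplift with a
    two-sided z-test at significance level alpha and the given power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = (baseline_rate * (1 - baseline_rate)
                + expected_rate * (1 - expected_rate))
    effect = expected_rate - baseline_rate
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)


# Hypothetical numbers: 3% baseline conversion, hoping to lift it to 4%.
per_variant = ab_sample_size(0.03, 0.04)

# Assumed traffic of 2,000 visitors/day, split 50/50 between variants.
daily_visitors = 2000
days = math.ceil(per_variant * 2 / daily_visitors)
```

Notice how sensitive the duration is to the effect size: halving the expected uplift roughly quadruples the visitors required, which is why low-traffic sites often can’t detect small improvements in a reasonable timeframe.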
You can then time and schedule your test and start pushing customers through the new experience. When results come back, you’ll have a direct comparison of the new vs the old flow so you can make an informed and confident decision about how to proceed.
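One common way to push customers through the two experiences is deterministic hash-based bucketing: each user lands in the same variant on every visit without you storing any state. A sketch, assuming a hypothetical experiment name and an exclusion list for the high-value customers mentioned above:

```python
import hashlib


def assign_variant(user_id, experiment="checkout_redesign", exclude=frozenset()):
    """Deterministically bucket a user into 'A' (control) or 'B' (new flow).

    Hashing experiment + user ID keeps assignment stable across visits.
    Excluded users (e.g. high-value repeat customers) always see control.
    "checkout_redesign" is a hypothetical experiment name.
    """
    if user_id in exclude:
        return "A"
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Including the experiment name in the hash means the same user can land in different buckets for different tests, so one long-running experiment doesn’t permanently skew who sees the next one.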
Of course, your work’s not done yet. While A/B testing shows you that your project is working, you still want to come back after launch to make sure your campaign has delivered the results you set out to achieve.
You’ll want to keep a close eye on analytics after launching your project so you can quickly identify if something’s going wrong. As we all know, anything that can go wrong will go wrong, so you need to be available to make fixes or revert to the old site if required.
As well as tracking day-to-day performance, you should check in after 30 and 90 days and compare your site analytics against the baseline you defined back in your test plan. Remember not to compare one day against another; look at trends over time so that spikes from special events, holidays, and promotions are taken into account.
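The comparison itself is simple arithmetic: average the post-launch window and express it relative to the baseline. A minimal sketch, with stand-in daily conversion rates and a hypothetical 3% baseline:

```python
from statistics import mean


def trend_vs_baseline(daily_rates, baseline_rate):
    """Relative change of the windowed average conversion rate vs. the
    pre-launch baseline (0.05 means a 5% lift, -0.05 a 5% drop)."""
    return (mean(daily_rates) - baseline_rate) / baseline_rate


# Stand-in data: 30 days of post-launch conversion rates, 3.0% baseline.
post_launch = [0.031, 0.029, 0.034, 0.032, 0.030, 0.033] * 5
change = trend_vs_baseline(post_launch, 0.030)
```

Averaging over the full 30-day window is what smooths out the single-day swings the paragraph above warns against; a one-day dip inside an otherwise healthy month barely moves the result.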
And you’re done
Or are you?
These days, nothing is ever final. We should always be iterating and improving to keep driving results and growing the business’s profitability.
Before you revisit and refine your projects, though, it’s worth taking the time to host a retrospective to review how everything went. What went well? How could things have been handled better or more efficiently? What did you learn, and what will you repeat? Answering these questions will help you run your next project a little more smoothly and with a little less frustration.