6 Performance-boosting A/B Testing Tips For Your Marketing Assets and Programs

March 16, 2021

It's a fact of life that success takes practice. That's why most marketing teams and organizations understand, at the very least, that they need a solid plan for A/B testing marketing assets, campaigns and tactics.

That said, understanding isn't doing. A given organization may know that they need A/B testing, but if they're like a lot of folks, they hit a wall when trying to move a test plan forward (or restart a stalled one). Most organizations' roadblocks derive from a few broad areas of unease:

  • They don't know where to begin
  • Putting a strategy into practice feels too complicated
  • They don't know what to do with the data gathered
  • Past testing efforts just didn't yield any practical results

These are natural feelings. BUT, if you're running into these same concerns, just remember that an effective A/B testing and optimization strategy has the potential to boost performance by up to 2-3 times (remember that stat when we talk about selling the idea of testing to higher-ups).

How to get started with A/B testing your marketing assets

A/B testing and optimization is critical to being empathetic to your users' needs. However, you can't predict what users will react to, or engage with. You need data and empirical evidence, and a testing and optimization strategy can give you just that. 

Putting these strategies into action will enable you to learn how users interact with your campaigns and assets so you can deliver an ever-more intuitive and seamless experience (i.e. making it easier for them to find information, engage with your assets and ultimately convert into actual customers). 

If you're still on the fence about the utility of A/B testing or lost on how to start, just remember that it needn't be overly complicated. The following are simple standards to help you confidently build A/B testing and optimization into your programs and workflows.

1) It all starts with the plan

Tacking a testing program on after launch just creates friction. So, you should internalize the idea that every launch plan should include some sort of testing schema. When testing's a given, it gives your internal conversations direction and allows you more freedom in ideating concepts and tactics.

Throughout the planning phase of a campaign, document ideas and concepts that seem to bubble up more than others. These will give you a starting point for developing effective hypotheses so you can gather truly actionable insight throughout the test. Speaking of hypotheses... 

2) Hypotheses are your roadmap

Testing should never be done for the sake of testing. Think back to grade school and the scientific method. You need a specific hypothesis to test against. For us marketing and advertising types, that means making an educated, specific guess as to what your audience will respond to, and testing one variable at a time until you reach the best result.

Let's look at an (admittedly, oversimplified) example:

Hypothesis: "I think that my audience will click more on ads with puppies rather than kittens." 

In this example, you probably have plenty of anecdotal evidence that your audience prefers puppies to kittens, but to create more effective ads, you need to know for sure. So, you test the two options (we'll get into that in a moment) to see which gets better results.

Having a specific hypothesis in mind not only gives you critical insight into ways to optimize the experience, it's also an effective way to get doubters to shut up. You can effectively stop internal churn and too-many-cooks syndrome in their tracks by using unassailable data and evidence to prove everyone else is wrong. But, you know, nicely.

3) Only test one component at a time

Once you get started with testing, it's tempting to test everything at once. But, once again, if you think back to grade school, you'll remember that you only test one variable at a time.

Testing multiple variables at once will provide unusable data and insights because you have no way of knowing what's driving the best result. That's why you need a control (the component that doesn't change) and one variable. Let's look again at the puppy/kitten hypothesis:

The Test
  • The Control: Copy, message and basic ad structure (channel, layout etc.) are identical across all ads
  • The Variable: The specific imagery used
  • Test A: Ads with puppy imagery
  • Test B: Ads with kitten imagery

By eliminating as many variables as possible, we will have a leg to stand on when we assert which will be more effective in subsequent ads (the puppy would win, hands-down).
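To make the one-variable discipline concrete, here's a minimal Python sketch of how you might split an audience between the two ad variants. The variant names are hypothetical; the point is that each user is bucketed deterministically and everything except the imagery stays identical:

```python
import hashlib

# Hypothetical variant labels for the puppy-vs-kitten test described above.
VARIANTS = ["A_puppy", "B_kitten"]

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into one of two variants (50/50 split).

    Using a stable hash (not Python's built-in hash(), which is salted
    per-process) ensures the same user always sees the same variant.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]
```

Because assignment depends only on the user ID, a returning visitor never flips between puppies and kittens mid-test, which would muddy the results.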

4) Creative flexibility provides testing flexibility

The goal of testing is to find the right way to do things—and the "right way" is often not the way it's been done before (after all, why are you testing in the first place?).

That means in order to effectively test the variable, brand elements and guidelines, usually considered gospel-truth, need to bend a little. Testing (especially something major like new messaging pillars or potential brand concepts and elements), requires giving your creative and strategy team the flexibility and freedom to try something new. 

That's, however, not carte blanche to go off the creative deep-end.

It's critical that you work closely and collaboratively with your creative and strategy team to discover ways to strike the balance between ownable elements and testing for more effective ways to get results—which is a win-win for everybody.

5) ALWAYS test email subject lines

Up to now, we've offered broader ideas to keep in mind, but this bit of real-world advice is too important to leave out because the potential impact is massive.

Even a simple tweak to subject-line text can increase open rates by upwards of 60% (lifts in clicks and engagement are often just as dramatic).

It's a small, simple addition to your workflow, but it will pay off almost immediately.

6) Don’t rush things

If you only remember one thing from this entire article, remember this: designing a good program takes patience.

Gathering statistically significant testing data takes time. Ideally, most variable tests need to run at least two weeks (or until you get 5k views—whichever comes first) to yield actionable insights.
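Once the views come in, you need a way to decide whether the gap between variants is real or just noise. A standard approach is a two-proportion z-test on the click (or open) rates. Here's a minimal, self-contained Python sketch; the function name and the sample numbers below are our own illustration, not from any particular analytics tool:

```python
import math

def z_test_two_proportions(clicks_a: int, views_a: int,
                           clicks_b: int, views_b: int) -> float:
    """Return the two-sided p-value for a difference in two click rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that both variants perform equally
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
```

For example, 300 clicks on 5,000 views versus 240 clicks on 5,000 views (6% vs. 4.8%) produces a p-value well under the conventional 0.05 threshold, so you could call that difference significant; closer rates would need more traffic, which is exactly why the test has to run its full course.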

This may give you a bit of pause because "time" and "patience" are words the business side of an organization doesn't tend to understand. But if you can demonstrate testing's utility with the aforementioned 2-3x boost in performance (and ultimately, revenue), the business folks will probably get hip to the program rather quickly.

Make A/B testing marketing and advertising a priority

As we mentioned in the first line of this article, success takes practice. It's a process that can only make your programs more effective and you better at your job.

Testing and optimization just take a little planning and dedication to make them part of your normal workflow.

It also takes the right team. Everyone you work with should be bought into a testing and optimization strategy. Our team at Mighty & True definitely is. But if some folks on your team (or, probably, your organization's leadership) need a little more prodding to see the light, just show them the results of some testing we recently did. They'll be believers in no time.