Message testing for busy marketers

How to get started without losing your time (and mind 🧠)

Marketers who don’t test their messaging are wasting budget ❌

But messaging is qualitative, so how do we do it properly?

When I surveyed 62 product marketers in B2B SaaS, this was how they were testing it:

  • 18.4% with sales and pipeline metrics

  • 15.8% with website and digital performance

  • 15.8% with conversion metrics

  • 15.8% with product adoption and revenue impact

data from my messaging survey

The rest was mainly qualitative, and most of these were bundled together.

But then, in the answers, I’ve seen an old enemy: A/B Testing

The A/B Testing Trap

Let me tell you a story:

I was a PMM working on an email campaign sequence with our internal content team.

We’re debating the copy for a subject line. I’m on Teams (yuck 🤢), and the whole thing feels like an empty debate: the people receiving these emails don’t care about this product in the first place, it’s a commodity.

Then, a manager from the content team says: “Let’s just A/B Test it.”

I try not to, but I cringe. More people rally behind the suggestion like it’s the logical choice.

A/B testing is a method, but it’s often used as a crutch to dodge committing, and it only works in specific cases where the sample size is large enough to draw actual statistical insights on performance.

It’s not a bad method, but in this situation, I felt like it was useless.

There are also many components in email that can change the dynamics, like deliverability, open rate, reaching the right person, etc., which makes the effective sample size even smaller once you filter for them.

It also doesn’t give you the context.

Is the email going to a list that’s been sleeping in a database with an unwanted product update, or is it actually part of an onboarding sequence?

My point is: it’s just wasted time if it’s not part of a testing infrastructure.

Here’s how you can build your testing infrastructure, as a busy marketer 👇

The alphabet of your test infrastructure

Ok, so if A/B testing isn’t the way to test properly, what is the alternative?

Well, it’s not really one-size-fits-all, but you need your own test infrastructure.

It will act as a repeatable messaging validation system 🔁

The next best thing after A/B Testing?

Adding more letters 🙃:

I created a nice visual because every product marketer I know loves Canva too.

A) Audience – Ensure you’re testing with the right ICP segment (rented or owned audience) and have enough traffic (e.g. ~300 visits) to achieve practical significance.

B) Budget – Allocate enough funds to get meaningful data (e.g. ~$500-$2,000) and validate uplift objectives within two weeks.

C) Context – Match your message angle (e.g. Urgency, Differentiation, Emotion) with the right offer and channel to avoid skewed results.

D) Destination – Set up landing page tracking (Microsoft Clarity, ICP Filtering) to measure real engagement beyond just clicks.

But where should you start?

Great question, Antonio.

Well, the most important “first step” is making sure stakeholders are on board.

👉 No buy-in, no testing.

If you’re new here, I wrote a newsletter 2 weeks ago on how to dodge messaging by committee.

🔁 Test → Analyze → Adjust → Scale.

And it all starts with a four-part framework:

A) Understand Your Audience

Ok, you’ve got the green light 🟢, now you can start building your infrastructure.

The first part is our audience

Your audience is either rented or owned

Let’s do an example with an early-stage HR-tech startup selling to enterprise; the audience split might look like this:

Rented Audience (Cold Traffic)

Your rented audience is made up of people who don’t know you: mainly prospects who will serve as cold traffic for your infrastructure.

  • 900 cold LinkedIn prospects (via an ICP account list)

  • 10,000 prospects from LinkedIn ads (from lookalike subscribers)

  • 50 outbound emails per day (split across messaging angles)

This gives us more people who don’t necessarily know your product/service, which means it’s important to use the right core angles to catch their attention.

Owned Audience (Warm Traffic)

Your owned audience is made up of people who know you, or are at least somewhat warm prospects. They’re your customers or people who have seen you a couple of times online.

  • 300 product subscribers (email sequence + retargeting)

  • 5,500 LinkedIn connections (organic post reach)

  • 15 existing customers (qualitative interviews and surveys)

Now we have a set of potential tests we can run with this owned audience.

Adding both groups gives you the total audience you can go after for the test.
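If it helps, here’s a minimal sketch of that tally in Python. The numbers come from the example above; spreading the 50 daily outbound emails over a 2-week sprint is my own assumption:

```python
# Tally of the example audiences above. Spreading the 50 daily
# outbound emails over a 2-week sprint is an assumption for this sketch.

rented = {
    "cold_linkedin_prospects": 900,
    "linkedin_ads_prospects": 10_000,
    "outbound_emails": 50 * 14,  # 50/day over a 2-week sprint
}

owned = {
    "product_subscribers": 300,
    "linkedin_connections": 5_500,
    "existing_customers": 15,
}

total_reach = sum(rented.values()) + sum(owned.values())
print(f"Total testable audience for the sprint: {total_reach:,}")
```

Even a rough total like this tells you early whether a 2-week sprint is realistic for your channels.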

B) Split Some Testing Budget

Once the audience is projected, you can go back to your uplift objectives.

And identify which ones can be tied to this experiment.

Your messaging hypothesis that will form your uplift objectives

Now that you have your test audience, you need to understand the budget needed to test your messaging with them.

Everyone should agree that the split gives you enough touchpoints with your ICP.

Now, it’s important to split your budget in what would be more valuable to the business, and what is required to achieve these uplift objectives.

📌 Given uplift objectives:

✅ Boost SQL conversion rates by 15-20% → YES, measurable within 2 weeks.
❌ Shorten sales cycle duration by 20% → NO, takes longer than 2 weeks.
❌ Increase average deal size by at least 25% → NO, deal size impact requires a full sales cycle.
✅ Improve win rate by clearly differentiating from competitors → YES, if linked to sales call feedback within 2 weeks.

🎯 Final 2-Week Test Objectives:

Increase SQL conversion rates by 15-20% (measurable via CRM)
Improve win rate by reducing objections related to differentiation (measurable via sales call analysis).

Then, you need to break down your budget according to the test angles and the channel context, to make sure you have enough data to see a baseline movement in your first data sample.
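Here’s a rough sketch of what that split could look like in Python. The 50/30/20 weights and the $200 per-channel floor are illustrative assumptions, not part of the framework:

```python
# Illustrative split of a $2,000 test budget across three core angles.
# The 50/30/20 weights and the $200 floor are assumptions made for
# this sketch, not a recommendation from the framework itself.

budget = 2_000
weights = {"urgency": 0.5, "differentiation": 0.3, "emotion": 0.2}
floor = 200  # minimum spend to see any baseline movement on a channel

split = {angle: budget * w for angle, w in weights.items()}

for angle, amount in split.items():
    # Flag underfunded angles before the sprint starts, not after.
    assert amount >= floor, f"{angle} is underfunded: ${amount:.0f}"
    print(f"{angle:<16} ${amount:,.0f}")
```

The point of the floor check is simply to catch an angle that can’t possibly produce signal before you spend anything.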

C) Provide The Right Context

There’s always a different context. A cold email to your ICP that is looking to buy will be completely different than a long-time customer answering a survey.

You know what to look for after your sprint

A messaging test isn’t just about words—it’s about where those words live.

📌 The Right Context Means:

  • Offer: What’s being promoted? (e.g., Free resource, demo, feature announcement)

  • Channel: Where will it be tested? (LinkedIn, email, landing page, sales calls)

Uplift objectives usually come from messaging gaps or performance-decline signals. For our HR-tech startup, we’ve identified messaging gaps on three fronts:

→ A lack of urgency in their enterprise deals,

→ A lack of differentiation against the market leader,

→ An inability to evoke emotion in their prospects.

Our core angles will then be urgency, differentiation, and emotions.

Urgency: Test messaging that clearly conveys what’s at stake without immediate action.

Differentiation: Test how clearly customers perceive unique strengths/benefits vs. competitors.

Emotion: Test messaging that emotionally connects deeply to customer pains and desired outcomes.

And to test them, they will be split across these channels:

  • Urgency Messaging → LinkedIn Ads + Cold Email

  • Differentiation Messaging → Retargeted Email + Sales Calls

  • Emotion Messaging → Organic LinkedIn + Customer Interviews

👉 If you mismatch message & channel, you’ll get misleading results.

D) Focus on the Destination

A secret that not a lot of PMMs are leveraging: heatmaps and session recording tools.

They’re the best way to really understand what’s happening when people visit your landing page.

There are 4 key behaviours I’m looking for:

  • Scroll Depth: Measures if users reach critical messaging sections before bouncing.

  • Session Replays: Understand hesitation points (e.g., where do users pause before clicking?).

  • CTA Clicks: Measures engagement with urgency/differentiation messaging.

  • Heatmaps: Shows attention hotspots—where users spend the most time reading.

I analysed my visits when I launched my starter pack, and I made a video about it:

Here’s what I learned:

✔️ Some web page elements are unclickable, causing confusion

✔️ Specific social proof or data in my headlines are boosting conversions

✔️ Small, incremental tweaks can drastically improve messaging performance

And then I’ve dropped the data in my workbook:

Now you might be wondering:

“Ok Gab that’s cool but how does this support your messaging?”

  • If users scroll past the key benefit statement but don’t engage with the CTA, the messaging isn't strong enough or lacks urgency.

  • If users drop off early, the above-the-fold messaging isn’t resonating—test a stronger emotional hook.

  • If users hover over pricing/feature sections, but don’t convert, differentiation may still be unclear.

Ultimately, you need to make sure you’ve filtered these visits down to your ICP, otherwise the data won’t make sense.

It’s also important to have the right tagging in your CRM.
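As a quick illustration of what ICP filtering looks like once you export session data, here’s a minimal Python sketch. The field names and the ICP rule are hypothetical; tools like Microsoft Clarity let you segment similarly in their UI:

```python
# A sketch of ICP filtering on exported session data. The field names
# (company_size, title) and the ICP rule are hypothetical examples.

sessions = [
    {"visitor": "a", "company_size": 2_000, "title": "VP People", "scroll_depth": 0.8},
    {"visitor": "b", "company_size": 12, "title": "Student", "scroll_depth": 0.9},
    {"visitor": "c", "company_size": 5_000, "title": "HR Director", "scroll_depth": 0.4},
]

def is_icp(session: dict) -> bool:
    # Example rule for the HR-tech startup: enterprise-sized HR buyers.
    return session["company_size"] >= 1_000 and (
        "HR" in session["title"] or "People" in session["title"]
    )

icp_sessions = [s for s in sessions if is_icp(s)]
avg_scroll = sum(s["scroll_depth"] for s in icp_sessions) / len(icp_sessions)
print(f"{len(icp_sessions)} ICP sessions, avg scroll depth {avg_scroll:.0%}")
```

Notice how the student’s 90% scroll depth would have inflated the average: that’s exactly the noise ICP filtering removes.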

Can you achieve practical significance?

Before launching your test, ask:

“Do we have enough volume to get meaningful results?”

✅ If YES → Launch the test
❌ If NO → Increase traffic (LinkedIn Ads, email volume, etc.)

📌 Test Expansion Plan Example:

  • First 2 weeks → Test with owned audience (~300 visits)

  • If results are unclear → Scale with LinkedIn Ads ($1,000 for 2,000 impressions)
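That gut check is easy to codify. A minimal sketch, assuming the ~300-visit threshold from above and a hypothetical 1% click rate on paid impressions:

```python
# Pre-launch volume check for practical significance.
# MIN_VISITS is the ~300-visit rule of thumb from above; the 1% click
# rate used for the expansion math is an assumption for this sketch.

MIN_VISITS = 300

def ready_to_launch(expected_visits: int) -> bool:
    """Do we have enough volume to get meaningful results?"""
    return expected_visits >= MIN_VISITS

def impressions_needed(shortfall_visits: int, click_rate: float = 0.01) -> int:
    """How many paid impressions to buy if owned traffic falls short."""
    return round(shortfall_visits / click_rate)

expected = 180  # e.g. projected owned-audience visits over 2 weeks
if ready_to_launch(expected):
    print("✅ Launch the test")
else:
    extra = impressions_needed(MIN_VISITS - expected)
    print(f"❌ Increase traffic first: ~{extra:,} paid impressions needed")
```

Swap in your own channel’s real click rate; the structure of the check stays the same.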

P.S. I follow these exact steps when I work with B2B SaaS startups to test their messaging after a repositioning, a drastic ICP revamp, new messaging, or stress-testing before a launch.

Get Message Testing done right ✔️

Salut la gang 👋

Messaging is the most expensive part of your GTM

Most product marketers are awesome at messaging, but they lack the influence 👎 to make stakeholders adopt their version

Most startups I've worked with were sales-led; I handled the GTM strategy while crafting, testing and aligning messaging for outbound sales teams, across multiple industries, ICPs and markets.

I'm technical enough to show you the right tools and processes, while respecting your time as a busy marketer

👉 TLDR;

What I do:
Audit your messaging → We grade your messaging foundations and identify misalignment, gaps, and optimization opportunities.
Build a custom testing infrastructure → We follow this custom infrastructure process and I implement it with you, focusing on quick wins and performance uplift.
Guide you through a structured sprint → Let data, not opinions, drive messaging discussions with stakeholders.

See ROI in weeks, not months.

🔗 Check out my new offer and book a call if you want a message testing audit.

No commitment, no strings attached. I legit just want to help 🙂

Thank you for reading! This newsletter was a big one.

Next edition, I’ll tell you how to differentiate your messaging without having to wait 2 months for your core roadmap updates

Bye bye 👋