
Is Your Marketing Actually Underperforming (Or Does It Just Need Time)?

A CEO's framework for evaluating marketing performance

Article Highlights

  • The strategic gap between what CEOs expect from marketing and what it can realistically deliver is an information gap, not a competence gap—and it's where good programs get killed prematurely.
  • B2B marketing takes 5-6 months to show meaningful pipeline impact. Months 1-2 build data and test hypotheses. Months 3-4 show improving leading indicators. Months 5-6 reveal true pipeline contribution.
  • Five signals of actual underperformance: flat leading indicators after 90 days, no documented learnings, same mistakes repeated, no hypothesis for improvement, and zero pipeline contribution by month six.
  • Phase-based evaluation prevents micromanaging while maintaining visibility—ask what the team has learned, not just what they've produced.
  • The key question isn't "are we hitting targets?" It's "are we learning and improving?" A program showing progress on leading indicators needs runway. A program with no learnings needs intervention.
  • Month six is the hard deadline. If a demand generation program hasn't contributed any pipeline by month six, continuing isn't patience—it's avoidance.

You're three months into a growth marketing program. Pipeline isn't where you expected. Your agency team keeps talking about "learnings" and "optimization." Meanwhile, you're burning budget and the board wants to know when this investment pays off.

Here's the question you're really asking: Is this not working, or does it just need more time?

It's the right question. It's also the hardest one to answer, because the symptoms of a program that needs patience look almost identical to the symptoms of one that's failing.

This framework will help you tell the difference.

The Strategic Gap Most CEOs Face

There's a gap between what leadership expects from marketing and what marketing can realistically deliver in a given timeframe. This isn't a competence gap. It's an information gap.

Most CEOs understand business fundamentals better than anyone. But the mechanics of how modern growth marketing builds momentum aren't intuitive—and they're not something you'd encounter unless you've run these programs yourself.

When this gap exists, one of two things happens. Promising programs get killed too early because they're judged against unrealistic timelines, or underperforming programs survive too long because there's no framework to evaluate them.

Either outcome costs you money and time. Closing this gap is how you make better decisions about your marketing investment.

How Long B2B Marketing Actually Takes to Work

Let's establish a realistic baseline. These timeframes assume a competent team with adequate budget and proper infrastructure.

Foundation (Months 1-2)

  • What you should see: Data collection, hypothesis testing, baseline metrics
  • What not to expect: Pipeline, lead volume, revenue
  • Red flags: No testing plan, no data infrastructure

Optimization (Months 3-4)

  • What you should see: Improving leading indicators, documented learnings, budget shifts
  • What not to expect: Final pipeline targets, scaled results
  • Red flags: Flat metrics, no learnings, same tactics

Pipeline Impact (Months 5-6)

  • What you should see: Marketing-sourced opportunities, scalable channels identified, sales feedback
  • What not to expect: Fully mature program, hitting final targets
  • Red flags: Zero pipeline, no scalable channel, vague plans

Months 1-2 are about establishing baselines and testing hypotheses. You're learning what channels reach your audience, what messaging gets attention, and what your actual conversion math looks like—not industry benchmarks, but yours. Expecting pipeline in this phase is like expecting a harvest the week after planting.
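
To make "conversion math" concrete, here is a minimal sketch of the kind of funnel arithmetic the team should be building in this phase. Every number below is a hypothetical placeholder, not a benchmark; the point is that even a working funnel produces opportunities weeks after the first dollar is spent, which is why this phase is about measurement rather than pipeline.

```python
# Hypothetical funnel arithmetic. All rates and volumes are illustrative
# assumptions; months 1-2 exist to replace them with your real baselines.

monthly_visitors = 10_000      # assumed traffic once channels are live
visitor_to_lead = 0.02         # assumed 2% of visitors become leads
lead_to_mql = 0.30             # assumed 30% of leads qualify as MQLs
mql_to_opportunity = 0.20      # assumed 20% of MQLs become pipeline
lag_weeks = 6                  # assumed lag from first touch to opportunity

leads = monthly_visitors * visitor_to_lead
mqls = leads * lead_to_mql
opportunities = mqls * mql_to_opportunity

print(f"Leads per month: {leads:.0f}")                 # 200
print(f"MQLs per month: {mqls:.0f}")                   # 60
print(f"Opportunities per month: {opportunities:.0f}") # 12
print(f"Earliest opportunities arrive ~{lag_weeks} weeks after launch")
```

With these placeholder numbers, roughly a dozen opportunities per month arrive only after a six-week lag, which is why judging the program on pipeline in month one tells you nothing about whether the math itself is sound.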

Months 3-4 are where leading indicators should start moving in the right direction. Not pipeline yet—but signals that pipeline is coming. Engagement rates should be improving. The team should be able to tell you what's working and what isn't. Budget should be shifting toward higher-performing channels.

Months 5-6 are when demand generation programs should start contributing to pipeline. Not carrying the number—but contributing meaningfully. If you're not seeing pipeline impact by month six, you have a real problem. Before month six, you might just have a timing problem.

The Five Signals of Actual Underperformance

Patience is warranted. Blind patience isn't. Here's how to tell if your marketing is genuinely underperforming versus simply needing runway.

Signal 1: No Improvement in Leading Indicators

Leading indicators—engagement rates, click-through rates, traffic quality, content consumption—should improve over time even before pipeline shows up.

If these metrics are flat or declining after 60-90 days of optimization, something is wrong: the targeting is off, the messaging isn't resonating, or the team isn't learning from the data.

Question to ask: "Show me how our leading indicators have trended over the past 90 days."

Signal 2: No Learnings to Report

A program that's not yet producing pipeline should still be producing insights. Every campaign teaches you something about your market—what messaging resonates, which segments engage, what channels work.

If your marketing team can't articulate specific learnings, they're not running experiments. They're running activities.

Question to ask: "What have we learned in the past 30 days that changed our approach?"

Signal 3: Same Mistakes on Repeat

Early-stage programs make mistakes. That's expected. What's not acceptable is making the same mistakes repeatedly.

If the same issues keep surfacing—poor targeting, weak creative, misaligned messaging—without correction, you have an execution problem, not a timing problem.

Question to ask: "What mistakes did we make last month that we've now fixed?"

Signal 4: No Hypothesis for Improvement

Marketing teams in the optimization phase should have clear hypotheses about what will improve performance. "We think focusing on this segment will improve conversion because we've seen X signal."

If the answer to "how will this get better?" is vague or nonexistent, there's no strategy—just hope.

Question to ask: "What specifically do you believe will improve our results next month, and why?"

Signal 5: Zero Pipeline Contribution by Month Six

This is the hard deadline. If a demand generation program hasn't contributed to pipeline by month six, the program isn't working.

Not "contributed enough." Contributed at all. Any pipeline. If the answer is zero, you have a failed program.

Question to ask: "How many opportunities in our current pipeline were sourced or influenced by marketing?"

The Framework: How to Evaluate Without Micromanaging

You need enough visibility to make good decisions without undermining your team's ability to execute. Here's a phase-based framework.

Phase 1 Evaluation (Months 1-2)

What to look for:

  • Infrastructure is in place (tracking, attribution, reporting)
  • Data is being collected systematically
  • Multiple hypotheses are being tested
  • Clear baseline metrics are established

What not to worry about:

  • Pipeline contribution
  • Lead volume
  • Revenue impact

Red flags:

  • No clear testing plan
  • Single-channel dependency
  • Inability to show what data is being collected

Phase 2 Evaluation (Months 3-4)

What to look for:

  • Leading indicators trending upward
  • Clear learnings documented
  • Budget shifting based on performance
  • Optimization happening weekly, not monthly

What not to worry about:

  • Hitting final pipeline targets
  • Scaled results

Red flags:

  • Flat or declining leading indicators
  • No documented learnings
  • Same tactics as month one
  • "We need more budget" without data to support it

Phase 3 Evaluation (Months 5-6)

What to look for:

  • Pipeline contribution (even small)
  • Identified scalable channels
  • Sales feedback loop functioning
  • Path to hitting targets is clear

Red flags:

  • Zero pipeline contribution
  • No sales feedback, or negative feedback with no adjustment in response
  • No clear scalable channel
  • "We need more time" without specific hypotheses

Questions That Surface the Truth

These questions cut through marketing jargon and surface real performance data.

On progress: "Walk me through the trend lines on our three most important leading indicators over the past 90 days."

On learning: "What did we believe 90 days ago that we now know is wrong?"

On execution: "What did we try that failed, and what did we do about it?"

On pipeline: "Of the opportunities that closed last quarter, how many did marketing source or influence? How do you know?"

On scalability: "Which channel do you believe we can scale, and what evidence supports that?"

On honesty: "If you had to bet your compensation on this program hitting target, what odds would you give it?"

What to Do With the Answers

Once you've evaluated, you have three options.

Option 1: Stay the Course

If leading indicators are improving, learnings are being applied, and the team has clear hypotheses for continued improvement—give them runway. Check in again in 30-45 days.

Option 2: Adjust the Program

If some elements are working but others aren't, make targeted changes. This might mean shifting budget, narrowing focus, changing channels, or adjusting messaging. Don't blow up what's working to fix what isn't.

Option 3: Stop the Program

If you're past month six with no pipeline contribution, no clear learnings, and no credible hypothesis for improvement—end it. Continuing isn't patience. It's avoidance.

The Conversation to Have Now

If you're reading this because you're uncertain about your marketing performance, here's your next step.

Schedule a 30-minute conversation with your marketing leader. Ask them to prepare answers to three questions:

  1. What have our leading indicators done over the past 90 days?
  2. What have we learned that changed our approach?
  3. What's your hypothesis for how this improves, and what evidence supports it?

The answers will tell you whether you have a timing problem or a performance problem. And once you know that, you can make the right decision for your business.

When You Need Help Closing the Gap

Sometimes the gap between where your marketing is and where it needs to be isn't a timing issue—it's a capability issue. You don't have the team, the expertise, or the bandwidth to execute at the level your growth targets require.

That's a different problem. And it requires a different solution.

At Mighty & True, we work with technology companies to close strategic marketing gaps—whether that means accelerating a program that needs more firepower, diagnosing one that's underperforming, or building the infrastructure to scale.

If you've done the evaluation and the gap is bigger than time will solve, let's talk about what it would take to close it.
