Is your marketing team effective? A 5-question test

In an increasingly complicated martech world, it’s easy to feel overwhelmed. Columnist Justin Dunham offers a handy checklist to help you determine whether you’re running the ultimate marketing machine.

Come back with me to the year 2000, when the Y2K bug was a recent memory, and the phrase “rockstar developer” was still being used without irony. Joel Spolsky, a software engineer who would go on to co-found Trello and Stack Overflow, published a 12-step test to help determine the quality of a software engineering team.

Though The Joel Test is almost 20 years old, it’s still well known among software developers because of its continued relevance and usefulness. It’s a short checklist of simple “yes” or “no” questions that gives you a quick sense of whether a software team follows best practices.

As a marketer, I envy The Joel Test. In some ways, the output of a marketing team is easy to measure: clicks, conversions and revenue, for example. In other ways, marketing is still responsible for things that are a lot harder to measure: branding, press coverage, positioning and messaging, pricing and so on. And marketing is more complicated than ever before, with new channels, techniques and technology popping up every day.

Wouldn’t it be useful to simplify? To have a quick checklist you can run through to see if you, as a marketer, are running an effective operation?

Here are a few ideas for just that. Let’s not get too creative with the name: we’ll call it “The Joel Test for Marketers.”

1. Do you know your customer acquisition cost and customer lifetime value?

How much does it cost you to acquire a customer? And once you do, how much will that customer net you? Your business depends on lifetime value being greater than the cost to acquire. But a lot of marketers don’t have this data.

Partly, that’s because it’s tricky to capture this information.

To measure cost per acquisition, you need reliable lead attribution, at least for first touch, for each channel. You need accurate costing for each channel. And you need a way to pull all this together into one place so you can see what your acquisition cost is and how it’s changing over time. It can’t be too difficult to update, either; the data’s only useful when you can use it to shift your dollars to maximize results.

Estimating the value of a customer isn’t easy, either. You can track revenue per customer, and churn (or repeat purchase) rates, which can give you a sense. But do you have different products or customer types? Do customers vary in their value depending on how they were acquired? How much do these numbers change over time?

Even though this data isn’t easy to get, it’s crucial for any marketer who wants their strategy to fit in with the broader business. You can’t make reasonable investment decisions (whether you’re investing time or money) if you can’t gauge the impact of those decisions.

I think you get close to full credit if you don’t have the data yet, but you have a model and an approach for figuring these numbers out. An Excel (or Google Sheets) spreadsheet with inputs for average customer lifetime and deal size, or purchase size and frequency, or whatever metrics you need to calculate these numbers for your business, is a great start.
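
If spreadsheets aren’t your thing, the same model fits in a few lines of code. Here’s a minimal sketch in Python; the channel names, spend figures, purchase numbers and churn rate are all illustrative placeholders, and the lifetime estimate uses the common one-over-churn approximation rather than anything specific to your business.

  # A minimal CAC/LTV model, mirroring the spreadsheet described above.
  # All inputs are illustrative placeholders; swap in your own channel
  # spend, first-touch attributed customers, purchase size and churn.

  channel_spend = {"paid_search": 12000.0, "content": 4000.0}   # monthly spend per channel ($)
  new_customers = {"paid_search": 60, "content": 25}            # first-touch attributed customers

  avg_purchase = 40.0        # average purchase size ($)
  purchases_per_year = 6     # average purchase frequency
  monthly_churn = 0.03       # share of customers lost each month

  # Rough lifetime value: yearly revenue per customer x expected lifetime in years,
  # where expected lifetime is approximated as 1 / churn.
  avg_lifetime_years = 1 / (monthly_churn * 12)
  ltv = avg_purchase * purchases_per_year * avg_lifetime_years

  for channel, spend in channel_spend.items():
      cac = spend / new_customers[channel]
      print(f"{channel}: CAC ${cac:,.0f}, LTV ${ltv:,.0f}, LTV/CAC {ltv / cac:.1f}x")

Even a toy version like this makes the decision visible: if the LTV-to-CAC ratio for a channel drops below whatever threshold your business needs, that’s the signal to shift dollars elsewhere.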

2. Do you have a clear value proposition for your company and for each product, and does everyone on the marketing team understand it?

I have a tendency to work for companies whose names do not explain what they do. I think that’s pretty common in tech. Nobody blinks today when you say you’re going to “Google it,” but in 1999?

So I get asked a lot, “What is it that your company does, again?” And I often feel tremendous shame when I try to answer this question, because I can’t always answer it intuitively.

  • “Well, it’s a framework for real-time multiplatform narratives.”
  • “It’s agile natural language processing, in the cloud.”
  • “It’s like Uber, but for carrots.”

A great value proposition has to say, briefly, some combination of what you’re selling, the benefit of using it, who it’s for, and why it’s different. It has to use the language of your customer. But mostly it just has to make clear, right upfront, what value you’re offering to a prospect.

Unbounce is doing a great job with this. “Build Landing Pages Fast & Get More Conversions.” Pretty clear why you would want to use them, right? Not too difficult for me to explain at a party. But a lot of companies don’t do this. Or they try, but they themselves don’t know what value they provide, so the value proposition ends up muddled and filled with jargon.

Embedded in a good value proposition is a huge amount of information that helps everybody on the team get on the same page. Take Unbounce’s value proposition. It says:

  • What the company does, of course: It’s a landing page builder.
  • What value they provide to the customer: Landing pages take a long time to build (implied), so Unbounce saves time. Unbounce also delivers more conversions, which means more leads and more revenue.

This context makes everyone much more efficient. If the Unbounce value proposition were “Build Beautiful Landing Pages That Make People Trust You,” our junior graphic designer would focus our design style on ideas of trustworthiness and aesthetic value, instead of efficiency and results. You can imagine those two color palettes and icon sets might look very different.

3. Do you have five metrics that tell you whether your efforts are successful, and do you report on them at least weekly?

Some aspects of marketing are relatively easy to measure, like lead generation. Other parts, like brand equity or market awareness, are very difficult to measure.

But it’s worth setting goals for all of these things and measuring them in some way, even if it’s very informal or not rigorous. Measurement lets you:

  • know whether your efforts are working.
  • document your progress.
  • set context for your team on what’s important.
  • focus on just a few things at a time, which improves efficiency dramatically.

It’s not important which five metrics you use. They could change quarter by quarter — maybe for the moment, you have churn and new customers as separate metrics because you’re working on them separately, but next month, you’ll roll them into a single net retention measure. Or maybe this quarter you’ve committed to helping the product team activate more of their users, so you’re tracking that.

For something really intangible like mindshare or market awareness, look for some kind of proxy, even if you acknowledge the weakness of that proxy. Maybe social engagement? Organic traffic to your site? Press mentions? Saying “we can’t measure this, so we won’t try” deprives your team of context.

Why weekly? I guess it could be a little less (or more!) frequent than that. It just has to be frequent enough so that people can adjust their work.

Why five metrics? It could be three or seven. It needs to be enough metrics to give a rough picture of everything that is going on, but no more than a team can pay attention to and act on.
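
As a sketch of what that weekly habit can look like, here’s a toy snapshot in Python. The metric names and numbers are entirely hypothetical; the point is simply to compare this week against last week for a handful of metrics, however you actually pull them.

  # A toy weekly snapshot: this week's numbers next to last week's,
  # with the week-over-week change so the team can react.
  this_week = {"new_leads": 180, "trials_started": 42, "new_customers": 9,
               "churned_customers": 3, "press_mentions": 2}
  last_week = {"new_leads": 150, "trials_started": 40, "new_customers": 7,
               "churned_customers": 5, "press_mentions": 1}

  for metric, value in this_week.items():
      prev = last_week[metric]
      change = (value - prev) / prev * 100
      print(f"{metric:>18}: {value:>5} ({change:+.0f}% week over week)")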

4. Do you have a process for planning work?

On Monday morning, you can choose to do 1 percent of an overhaul of your website content, or you can finish up a blog post. You can respond to some negative tweets, or you can attend a demo for a software product that has a 5 percent chance of doubling your conversion rate.

There’s the long-term, the short-term, things that are easy and things that are hard. There are things that maybe have a high return, and things that definitely have a low, but non-zero, return.

Unfortunately, we get paid to do what’s truly valuable for the business, and none of these things (ease of execution, time spent, certainty of results) actually tell us that. You need some repeatable process you follow that helps you understand what’s truly valuable, and you need a way to actually ship before the next fire pops up.

The best method I’ve seen for this is experiments. If you have your important metrics set up, pick something smaller that you think will improve one of those metrics.

Or go even deeper: Pick some data point that will help you figure out what to do next. Focus on doing that thing, or collecting that data, for a limited period of time. Look at the results, document what you learned, and then repeat, on a larger scale if you can.

For example, let’s say you know you want to drive more signups for your carrot-delivery service (“Uber for carrots”). Before you commit to a series of blog posts on the history of the carrot, try just one and see how it performs. Before you overhaul your website to focus more on organic carrots, figure out how to determine if that’s what’s really holding you back, and then if it is, roll out the work in stages.

If you don’t like the experimental approach, that’s understandable. At the very least, have a backlog of everything you need to do, and revisit the entire backlog very frequently, to make sure you’re prioritizing the right things. Ideally, track your time so you know how the effort compares with the results.
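
To make either approach concrete, here’s a minimal sketch of an experiment backlog in Python, reusing the carrot-delivery example from above. The scoring scheme (expected impact divided by effort) and the entries themselves are assumptions for illustration, not a prescribed method.

  # A minimal experiment backlog: each idea targets one of your metrics,
  # gets a rough impact guess and an effort estimate, and records a result.
  from dataclasses import dataclass

  @dataclass
  class Experiment:
      name: str
      target_metric: str     # one of the handful of metrics from question 3
      expected_impact: int   # 1 (low) to 5 (high), your best guess
      effort_days: int       # rough estimate of time to ship
      result: str = ""       # filled in after the experiment runs

  backlog = [
      Experiment("One blog post on the history of the carrot", "signups", 2, 1),
      Experiment("Organic-carrot messaging test on homepage", "signups", 4, 5),
      Experiment("Referral credit for existing customers", "new_customers", 3, 3),
  ]

  # Revisit the whole backlog frequently, highest expected impact per day of effort first.
  for exp in sorted(backlog, key=lambda e: e.expected_impact / e.effort_days, reverse=True):
      print(f"{exp.name}: impact {exp.expected_impact}, effort {exp.effort_days}d")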

5. Does Marketing Ops report to a C-level exec?

This might be a controversial one.

Marketing Ops is the operating system for marketing, responsible for the database, lead capture, reporting, analytics and all marketing systems. Marketing Ops serves everyone in the organization, and the reporting it executes is a key input into marketing strategy.

Marketing Ops needs to be thought of in a very specific way to be effective: process-, data- and technology-driven. Because it’s the operating system for all of marketing, its work needs to be available to everyone equally. And there isn’t really another discipline within marketing that Marketing Ops can be added to.

For example, I’ve seen Demand Generation (which is usually run as a campaign-oriented, sales type of function) run Marketing Ops. But the skills required to run demand generation are completely different from the skills required to effectively run Marketing Ops. Maybe Marketing Ops can be in a Business Intelligence group, or something like that. Partial credit for that one. But that’s tough to justify when marketers rely on Ops to send out newsletters and set up landing pages.

What did I miss?

Have any to add? Tweet them to me at @jwyattd. I think there are a few that I might be missing, but I haven’t figured out whether they’re important yet:

  1. Do you use a bug tracker for keeping track of known improvements?
  2. Do you have a campaign reporting template and scheduled reviews of campaigns?
  3. Can you attribute all of your customers to their originating channel and campaign?
  4. Do you fix the fundamentals before starting on new things?
  5. Does everyone on the marketing team understand the personas?
  6. Some test for alignment with sales. (Is there a yes/no to test this?)
  7. Some test for alignment with product. (Is there a yes/no to test this?)

I also talk more about these on my podcast.

Engineers have realized unbelievable productivity increases over the past decade. If we as marketers want to realize these gains ourselves, we need to develop data-driven best practices and stick to them. This checklist is a good start.

___
by Justin Dunham
source: MARTECH TODAY