The Results Report: Marketing’s Most Creative Literary Genre

Of all the documents produced by the marketing industry, the results report may be the one that requires the most creative skill to produce and the least to understand. Not because it’s well-written — most aren’t. Not because it’s visually sophisticated — most look like a spreadsheet had a mild aesthetic crisis. But because the craft involved in constructing a results report that tells the story the author intends, using data that may or may not support that story, is genuinely, uncomfortably impressive.

The results report is where the creative industry’s relationship with measurement reaches its most revealing expression. Marketing exists to create value; measurement exists to prove that value was created. When the campaign works — when the numbers tell a clear story of cause and effect — the results report is simply a narrated version of good data. When the campaign doesn’t work — when the results are ambiguous, disappointing, or simply impossible to connect causally to the campaign activity — the results report becomes something else entirely: a work of interpretive fiction, sophisticated enough to look like analysis and optimistic enough to ensure the relationship survives for another campaign.

The Metric Selection Process

The most important creative decision in any results report is which metrics to feature. This decision is rarely explicit — nobody writes in the brief “select the metrics that make us look best” — but it is almost always operative. The campaign that didn’t drive sales will feature reach, frequency, and brand awareness lift. The campaign that didn’t drive brand awareness will feature click-through rates and engagement. The campaign that didn’t drive engagement will feature sentiment analysis. There is almost always a metric that went in the right direction, and that metric will be in the executive summary.

This is not exclusively bad faith. Sometimes the nominated metric genuinely is the right one for the objective. But the pattern of metric selection across the industry — the consistent tendency for the featured metric to be the one that performs best regardless of whether it was the agreed KPI — reveals something uncomfortable about how marketing accountability actually works.

As we’ve noted throughout the Insurgency Journal, from KPIs that mean nothing to the award-effectiveness paradox, the creative industry has a persistent difficulty connecting the work it makes to the business outcomes it was commissioned to create. The results report is where that difficulty becomes most visible — and most artfully managed.

The Benchmark Shuffle

The second major creative technique in results reporting is benchmark management. A 2.3% click-through rate sounds underwhelming until you learn that “industry average is 1.8%” — at which point it becomes a 28% outperformance of benchmark, which sounds considerably better. The choice of benchmark is as significant as the choice of metric, and it receives proportionally less scrutiny.

Industry averages are blunt instruments. They aggregate performance across campaigns with vastly different objectives, audiences, channels, and creative quality. Comparing a carefully targeted, high-quality creative campaign to the industry average for all digital advertising is a comparison of apples to a very large number of highly varied fruit. But “outperformed industry average by 28%” travels better in an executive presentation than “performed as expected for a campaign of this type and quality.”
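The arithmetic behind these headline figures is simple relative lift, and it is worth seeing how much the answer depends on the benchmark chosen. A minimal sketch, using the numbers from the example above (the 2.5% alternative benchmark is hypothetical, added only to show the swing):

```python
# Relative lift against a benchmark. The headline figure depends as much
# on the choice of benchmark as on the measured rate itself.
def lift_vs_benchmark(rate: float, benchmark: float) -> float:
    """Percentage outperformance (positive) or underperformance (negative)."""
    return (rate - benchmark) / benchmark * 100

ctr = 2.3  # measured click-through rate, %

# Against a broad "industry average" of 1.8%: a +28% story.
print(round(lift_vs_benchmark(ctr, 1.8)))   # 28

# Against a (hypothetical) tighter peer set averaging 2.5%: a -8% story.
print(round(lift_vs_benchmark(ctr, 2.5)))   # -8
```

Same campaign, same 2.3% click-through rate, opposite narratives. That is why the benchmark deserves at least as much scrutiny as the metric.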

The Correlation-Causation Special

The third great technique of creative reporting is the correlation-causation elision. Sales went up 12% in the quarter the campaign ran. Brand awareness scores improved 8 points over the period of the sponsorship. Customer satisfaction metrics rose during the product launch campaign. These are real numbers. Their relationship to the campaign activity ranges from proven to plausible to entirely coincidental, and the results report typically presents them with a confidence that the underlying analysis rarely supports.

Attributing business outcomes to marketing activity is genuinely difficult. External factors — economic conditions, competitor behavior, seasonal patterns, product changes, pricing adjustments — all affect the same metrics that marketing campaigns claim credit for. Isolating the campaign’s contribution requires experimental design, control groups, and statistical rigor that most campaigns have neither the budget nor the organizational patience for. In the absence of that rigor, the results report fills the gap with correlation and calls it causation. And most readers don’t have the statistical background to notice the difference.
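For readers who want to see what the missing rigor looks like, here is a minimal sketch of holdout-based attribution, with entirely hypothetical numbers: a randomized holdout group never sees the campaign, and only the gap between the exposed group and the holdout is attributable to it. Everything else — the economy, the season, the competitor’s stumble — shows up in both groups and cancels out.

```python
import math

def incremental_lift(exposed_conv: int, exposed_n: int,
                     holdout_conv: int, holdout_n: int) -> tuple[float, float]:
    """Return (attributable lift in conversion rate, two-proportion z-score).

    The z-score is a rough check that the gap isn't just noise;
    values above ~2 suggest a real effect.
    """
    p_e = exposed_conv / exposed_n
    p_h = holdout_conv / holdout_n
    # Pooled standard error for the difference of two proportions.
    p = (exposed_conv + holdout_conv) / (exposed_n + holdout_n)
    se = math.sqrt(p * (1 - p) * (1 / exposed_n + 1 / holdout_n))
    return p_e - p_h, (p_e - p_h) / se

# Hypothetical campaign: 50,000 exposed, 50,000 held out.
lift, z = incremental_lift(1200, 50_000, 1050, 50_000)
print(f"attributable lift: {lift:.2%}, z = {z:.1f}")  # attributable lift: 0.30%, z = 3.2
```

Note what the sketch does not let you do: claim the whole 12% quarterly sales rise. Without the holdout, that number is unattributable, which is precisely why the holdout so rarely gets built.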

How to Read a Results Report Honestly

If you’re on the receiving end of a results report, here are the questions that cut through the narrative:

What were the agreed KPIs before the campaign launched, and how did those specific metrics perform? Note which metrics in the report were not agreed in advance, and ask why they’re featured. Ask for the primary agreed KPI explicitly if it’s buried or missing.

What’s the basis of the benchmark comparisons? Whose performance is being compared to what, for which kind of campaign, in which channel, in which time period? Benchmarks that aren’t sourced and contextualized aren’t benchmarks; they’re assertions.

What changed in the market during the campaign period? Any honest results analysis should acknowledge the external factors that affected the metrics, because any analysis that doesn’t is either naive or selective.

If you’re on the producing end and you genuinely care about doing this honestly — which is rarer than it should be — the most useful thing you can do is agree on measurement methodology before the campaign launches. What will we measure, how, against what baseline, with what controls? That conversation is uncomfortable precisely because it limits the post-hoc narrative flexibility. Which is exactly why it’s worth having.

Writing a results report that requires more creativity than the campaign it’s reporting on? Our shop is for people who’d rather do honest work and report it honestly. They exist. We know they do.
