Once a month, the social media manager sends the report. It is a substantial document, often twelve to twenty slides, sometimes a spreadsheet with multiple tabs, occasionally a dashboard link that requires logging in and that nobody outside the social team knows how to navigate. The report contains: follower counts and their month-over-month changes, reach by platform, impressions, engagement rate, top-performing posts, link clicks, video views, story views, and a series of charts that show lines going in various directions with annotations explaining what caused the peaks and valleys.
The leadership team reviews the report. They nod. Someone asks what the reach number means. Someone else asks why Instagram engagement went down but LinkedIn went up. Someone asks if the follower count is good or bad. The social media manager provides answers, all of which are technically correct and none of which result in any change to strategy, budget, or direction. The meeting ends. The report is filed. See you next month.
Why the Report Doesn’t Work
The social media report fails at its primary function — helping people make better decisions — because it is built to answer “what happened” rather than “what should we do differently.” It is a historical document masquerading as actionable intelligence. This is not the social media manager’s fault. They are reporting what they’ve been told to report, which is usually “the standard metrics,” which are usually whatever the platform’s native analytics dashboard shows, because those are the easiest to pull.
The standard metrics are: reach (how many unique accounts saw the content), impressions (how many times the content appeared on screens, including people seeing it multiple times), and engagement rate (total interactions divided by reach or followers, depending on who you ask and which platform you’re looking at). These numbers exist because they are trackable. They are not the same as being useful.
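The ambiguity in the engagement-rate definition is easy to make concrete. A quick sketch, with all numbers purely illustrative, shows how the same post can report two very different rates depending on which denominator a platform (or a report) happens to use:

```python
# Engagement rate has no single definition: some tools divide total
# interactions by reach, others by follower count. Hypothetical post:
interactions = 1_200   # likes + comments + shares + saves (illustrative)
reach = 18_000         # unique accounts that saw the post
followers = 50_000     # the account's total followers

rate_by_reach = interactions / reach          # ~6.7%
rate_by_followers = interactions / followers  # 2.4%

print(f"engagement by reach:     {rate_by_reach:.1%}")
print(f"engagement by followers: {rate_by_followers:.1%}")
```

Same post, same interactions, and the "engagement rate" is either 6.7% or 2.4%. Any report that compares rates across platforms without stating which denominator each one uses is comparing numbers that are not comparable.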
Consider engagement rate. A post with a high engagement rate generated disproportionate interaction relative to how many people saw it. This sounds valuable. What does it tell you about whether the content is achieving the brand’s business objectives? It tells you nothing without knowing what the objective was. High engagement on a brand awareness post might mean something. High engagement on a post designed to drive product discovery might mean people found it funny rather than useful. The metric doesn’t know the difference. The report doesn’t explain it.
The Vanity Metric Problem, Revisited
Vanity metrics are metrics that feel good without necessarily indicating business health. Follower count is the classic example. Having 50,000 followers sounds like a success. Whether it’s a success depends entirely on who those followers are, how they arrived, whether they’re engaged or dormant, whether they’re in the target market, and whether the platform’s algorithm is showing content to them or quietly burying it. None of these variables appear in “50,000 followers, up 3% month-over-month.”
Reach and impressions are increasingly vanity-adjacent. In the era of organic reach decline, high reach numbers often indicate that content was boosted with paid budget rather than that it earned distribution. Impressions can inflate because one piece of content was seen seventeen times by the same person. These are real numbers that describe real things that happened. They’re just not the things most brands actually need to know about.
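The inflation mechanism is simple arithmetic: impressions count every display, reach counts unique viewers, so dividing one by the other gives average frequency. A minimal sketch with illustrative numbers:

```python
# Impressions count every time the content appeared on a screen;
# reach counts unique accounts. Their ratio is average frequency.
# All values are hypothetical:
impressions = 85_000
reach = 5_000

frequency = impressions / reach  # 17.0: each viewer saw it ~17 times
print(f"average views per person: {frequency:.1f}")
```

A headline of "85,000 impressions" sounds like broad distribution; a frequency of 17 says the same small audience saw the content over and over, which is a very different story.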
The irony is that the metrics that would actually be useful — Are people visiting the website after seeing social content? Are they purchasing? Are new customers citing social as a touchpoint? Is brand search increasing in markets where social investment is high? — are harder to track, require integration across tools, and can’t be pulled from the platform dashboard in fifteen minutes before the monthly report deadline.
What a Useful Report Looks Like
A useful social media report starts with the question: what are we trying to accomplish, and did this month’s activity move us toward it? This requires having answered that question before the month began, which requires a strategy, which requires someone to have agreed on what success looks like before content was created. In many organizations, that agreement doesn’t exist, or it exists in a strategy deck from 2022 that hasn’t been reviewed since.
A useful report is shorter than the current one. It focuses on two to four metrics that have a clear connection to stated objectives. It provides context: not just “reach is up” but “reach is up, which is consistent with the increased posting frequency we tested this month, and here’s whether the quality indicators — saves, link clicks, profile visits — moved correspondingly.” It ends with a recommendation, not a summary. “Based on this month’s data, we recommend increasing investment in educational content formats and reducing frequency on platform X.”
That is a report people can act on. It also requires trusting the social media manager to have opinions and make recommendations, which some organizations find uncomfortable because it means the strategy conversation is ongoing rather than concluded. But a social media function that only executes and reports is a function that will never be effective, and the monthly report will remain a ritual of shared confusion.
The KPI Shark collection at No Briefs Club was made for exactly this: for the person who looks at the monthly metrics and knows there’s a better question to ask. Put it on your desk. Point at it during the next reporting meeting. And find your people at No Briefs Club — the ones who track things that actually matter.