case study
When you can’t trace revenue back to its source
Your pipeline isn’t empty. Deals close. Revenue comes in. But when the CEO asks which campaign drove last quarter’s largest deal, the room goes quiet. Or worse, three people give three different answers. You’ve spent $340,000 on marketing this year and the most honest summary of what’s working is a shrug. The data exists across your ad platforms, your CRM, your analytics, your email tool. It just doesn’t connect into anything that could survive a board meeting. A B2B company we worked with had $2.1 million in closed revenue last year and could confidently attribute only $380,000 of it to a specific source. The other $1.72 million? “Somewhere.”
A real engagement
An enterprise client came to us with five teams all defining analytics events independently. Product, lifecycle, paid media, web, and sales ops had each built their own naming conventions for events, impressions, and clicks. Leadership had plenty of reporting, but almost no confidence in what any of it meant because the data models could not reconcile across teams.
We led a full tracking re-architecture across the organization: shared event taxonomy, standardized impression and click instrumentation, QA gates before launch, and a unified identity layer from first touch through CRM opportunity. Each team kept ownership of execution, but measurement shifted to a common contract so cross-channel reporting finally matched reality.
Before the rebuild, it took roughly nine months to assemble trustworthy ROAS visibility after major campaign changes. After rollout, the team reached full-funnel ROAS visibility in under 30 days, and budget planning moved from retrospective guesswork to active monthly optimization.
The pattern
The attribution gap doesn’t form because your team is unsophisticated or because you chose the wrong tools. It forms because the modern marketing stack was never designed as a connected system. It evolved as a collection of point solutions: one tool for ads, one for email, one for web analytics, one for CRM. The average B2B company uses 12 to 16 marketing tools, each built by a different vendor, each tracking users with its own cookies, its own identifiers, its own definition of a “conversion.” Nobody sat down and designed an architecture where a single customer could be recognized across all of these systems. The stack grew tool by tool, each purchase solving an immediate need, and the gaps between them became invisible until someone asked a question that required data to cross the boundary.
Here’s how this plays out concretely. A buyer clicks a LinkedIn ad on Tuesday morning from their work laptop. They browse your site for four minutes and leave. On Thursday, they return via organic search on their phone, read a case study, and subscribe to your newsletter. Two weeks later, they open three emails, click through to your pricing page from the last one, and submit a demo request. That buyer’s journey comprised 8 distinct interactions across 14 days and 2 devices. In LinkedIn’s reporting, this is an ad click with no conversion. In Google Analytics, it’s two separate anonymous sessions followed by a known conversion attributed to email. In your CRM, it’s a new lead with source “website.” Three systems, three stories, zero overlap. The buyer’s actual journey, the one where LinkedIn created the initial awareness, organic search built consideration, and email closed the loop, doesn’t exist in any single report. You’d have to reconstruct it manually, and nobody has time to do that for every deal. One company we worked with had 1,200 closed deals per year. Reconstructing each journey manually would require an estimated 15 minutes per deal: 300 hours of analyst time annually, roughly $45,000 in loaded labor cost, just to understand what already happened.
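The fragmentation above can be sketched in a few lines. This is a minimal illustration, not any specific vendor's data model: the record shapes, cookie IDs, and field names are all hypothetical. It shows why a form submission can stitch one device's sessions to a CRM contact while the other device, and the LinkedIn click with it, stays orphaned.

```python
# Hypothetical session and CRM records: one buyer, two devices, three systems.
analytics = [  # two devices produce two unrelated anonymous cookie IDs
    {"cookie": "ck-laptop", "source": "linkedin", "email": None},
    {"cookie": "ck-phone", "source": "organic", "email": None},
    # the demo-request form finally captures identity on the phone
    {"cookie": "ck-phone", "source": "email", "email": "buyer@example.com"},
]
crm_contact = {"email": "buyer@example.com", "source": "website"}

def stitch(sessions, contact):
    """Attach every session we can to the CRM contact.

    A form submission binds a cookie to an email, and that binding pulls
    the cookie's earlier anonymous sessions into the journey. Cookies
    that were never bound to an identity stay orphaned.
    """
    bound = {s["cookie"] for s in sessions if s["email"] == contact["email"]}
    journey = [s for s in sessions if s["cookie"] in bound]
    orphaned = [s for s in sessions if s["cookie"] not in bound]
    return journey, orphaned

journey, orphaned = stitch(analytics, crm_contact)
# journey: both phone sessions (organic + email) now belong to the contact.
# orphaned: the laptop session, and with it the LinkedIn click, stays
# invisible until a cross-device key (e.g. a login) exists.
```

The orphaned list is exactly the "ad click with no conversion" that LinkedIn reports; no amount of dashboard work downstream can recover a join key that was never captured.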
The organizational response to this confusion is predictable and damaging. Teams default to last-click attribution because it’s the simplest model and the only one that gives clean, unambiguous numbers. CFOs accept it because it produces a tidy spreadsheet. Channel owners accept it because at least someone gets credit. But last-click systematically lies about how B2B buying actually works. Research consistently shows that the average B2B purchase involves 6 to 20 touchpoints over 27 to 90 days. Last-click ignores all but the final one. It favors the channel that catches the customer at the moment of decision and erases the channels that created the conditions for that decision to happen. One of our clients discovered that when they paused their $8,500/month content program because last-click showed it “generated zero conversions,” their paid search cost per acquisition rose 38% within two months, from $185 to $255, because fewer buyers were arriving pre-educated and brand-aware. They were treating the symptom while deepening the disease.
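The distortion is easy to see side by side. Below is a toy comparison, with an invented $50k deal and the journey shape from the earlier example, of last-click against linear attribution, the simplest multi-touch model:

```python
from collections import Counter

def last_click(touches, deal_value):
    """All revenue credit to the final touchpoint."""
    credit = dict.fromkeys(touches, 0.0)
    credit[touches[-1]] = deal_value
    return credit

def linear(touches, deal_value):
    """Equal credit per touchpoint, summed by channel."""
    share = deal_value / len(touches)
    return {ch: n * share for ch, n in Counter(touches).items()}

# Hypothetical deal: LinkedIn created awareness, organic built
# consideration, email closed the loop.
journey = ["linkedin", "organic", "email", "email", "email"]
last_click(journey, 50_000)
# {'linkedin': 0.0, 'organic': 0.0, 'email': 50000}
linear(journey, 50_000)
# {'linkedin': 10000.0, 'organic': 10000.0, 'email': 30000.0}
```

Under last-click, LinkedIn and organic report zero and become candidates for budget cuts; under even the crudest multi-touch split, they carry 40% of the deal. The model choice, not the data, decides which channels look dead.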
3.2x attribution inflation factor: platforms collectively claimed 620 conversions against 195 actual deals
The technical debt here is real but often misunderstood. The problem isn’t that attribution is philosophically hard. It is, but that’s not what’s killing you. The problem is that most organizations lack even the basic plumbing required for attribution to be attempted. In an audit of 23 B2B marketing operations, we found that 78% had inconsistent or missing UTM parameters on more than half their campaign links, 61% had no shared identity layer connecting anonymous sessions to known contacts, and 83% tracked only pageviews, not meaningful engagement events like content depth, video completion, or feature-page return visits. Only 3 of the 23 had a data warehouse that unified ad platform, analytics, and CRM data. Without these foundations, attribution isn’t inaccurate. It’s fiction.
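The UTM gap in particular is mechanically checkable. A sketch of the kind of audit behind the 78% figure, using only Python's standard library (the example URLs are invented):

```python
from urllib.parse import parse_qs, urlparse

# The minimum fields a link needs before its traffic is comparable at all.
REQUIRED = ("utm_source", "utm_medium", "utm_campaign")

def audit(links):
    """Return every link missing at least one required UTM parameter."""
    failures = []
    for url in links:
        params = parse_qs(urlparse(url).query)
        if any(key not in params for key in REQUIRED):
            failures.append(url)
    return failures

links = [
    "https://example.com/demo?utm_source=linkedin&utm_medium=paid&utm_campaign=q1",
    "https://example.com/demo?utm_source=newsletter",  # medium, campaign missing
    "https://example.com/pricing",                     # untagged entirely
]
audit(links)  # the last two links fail the check
```

Running this against an export of every live campaign link takes minutes and gives a hard number for how much traffic is arriving untraceable by construction.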
The fix
Closing this gap requires dropping the fantasy of perfect attribution and building toward useful attribution instead. The goal is not decimal-point precision in credit assignment. The goal is enough visibility to make confident resource allocation decisions, to know, directionally, whether LinkedIn or organic search or email is doing more of the heavy lifting so you can invest accordingly. That bar is dramatically lower than most teams think, and dramatically higher than where most teams currently sit.
The practical starting point is three layers of infrastructure. First, UTM discipline: a documented, enforced taxonomy that every link, every campaign, every ad uses consistently, so that traffic sources can be compared apples-to-apples. One team we worked with implemented a 6-field UTM standard across 340 active campaign links in 10 business days. Within the first month, they identified that a partner channel they’d been ignoring was generating 22% of their qualified pipeline at one-fifth the cost per lead of paid search. Second, identity resolution: connecting anonymous sessions to known contacts using a combination of first-party cookies, form submissions, and CRM matching, so the multi-device, multi-session journey of a single buyer can be reconstructed. This alone typically resolves 40% to 60% of previously “unknown source” leads into traceable journeys. Third, a unified data layer, even something as simple as a well-structured spreadsheet or a lightweight warehouse, that pulls data from your ad platforms, analytics, and CRM into one place where cross-system questions can be asked.
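The payoff of the third layer is that cross-system questions collapse into trivial lookups. A minimal sketch, with hypothetical per-source figures chosen to mirror the partner-channel finding above, of what a unified rollup makes possible once ad spend and CRM qualification share a source key:

```python
# Hypothetical rollup after joining ad-platform spend and CRM-qualified
# lead counts on a shared, UTM-derived source key.
spend = {"paid_search": 9_000, "partner": 2_200}
qualified = {"paid_search": 18, "partner": 22}

def cost_per_qualified_lead(spend, qualified):
    """Cross-system metric: dollars of spend per CRM-qualified lead."""
    return {src: spend[src] / qualified[src] for src in spend}

cost_per_qualified_lead(spend, qualified)
# {'paid_search': 500.0, 'partner': 100.0}
```

With these inputs the partner channel comes in at one-fifth the cost per qualified lead of paid search, the kind of finding that stays invisible while spend lives in one tool and qualification in another.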
None of this requires a six-figure analytics platform or a data science team. The median implementation cost across our last 15 engagements was $14,000 in tooling and 6 weeks of elapsed time. It requires intentionality. The organizations that close the attribution gap do so not by buying better tools but by making a deliberate decision to connect the tools they already have. One company using HubSpot, Google Ads, and LinkedIn Ads built a functional multi-touch model in BigQuery for $320/month in warehouse costs and 4 weeks of setup. Within 90 days, they reallocated $4,200/month from underperforming paid campaigns to organic content that their data now showed was sourcing 3x more qualified pipeline per dollar. The hard part is getting organizational agreement that the current state of measurement isn’t good enough, and that “we can’t prove what works” is not a tolerable permanent condition.
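The reallocation decision itself reduces to one ratio. A sketch with invented monthly figures sized to match the 3x gap described above; the real version would read these numbers from the warehouse rather than literals:

```python
def pipeline_per_dollar(channels):
    """Rank channels by qualified pipeline sourced per dollar of spend."""
    ratios = {name: d["pipeline"] / d["spend"] for name, d in channels.items()}
    return sorted(ratios.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical monthly spend and attributed qualified pipeline.
channels = {
    "paid_search": {"spend": 12_000, "pipeline": 96_000},
    "organic_content": {"spend": 4_000, "pipeline": 96_000},
}
pipeline_per_dollar(channels)
# [('organic_content', 24.0), ('paid_search', 8.0)]
```

Here organic content sources $24 of qualified pipeline per dollar against paid search's $8, and the monthly budget conversation becomes an argument about a number instead of a turf dispute.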