# Why Platform-Reported Conversions Diverge From Pixel Truth

Admaxxer is a DTC analytics platform with built-in Meta + Google ad ops. The pixel fires deterministically on last click; platforms add modelling, view-through, and data-driven credit on top. The short answer: Meta and Google will almost always report more conversions than your pixel truth because they include modelled and view-through conversions, and you should calibrate against Shopify (not either platform) as ground truth.

## TL;DR

- Platforms report modelled + deterministic; pixel reports only deterministic
- Meta's modelled conversions fill the iOS 14.5 gap — real purchases, statistically assigned
- Google's data-driven attribution (DDA) splits credit fractionally across the path
- Deduplication via `event_id` is the only way to keep platform and pixel roughly aligned
- Use Shopify as ground truth; platform numbers are for directional bidding, not P&L

## How platforms model conversions

When iOS ATT opt-out reached 80%+, Meta lost its ability to deterministically tie iPhone purchases back to ad clicks. The fix was *modelled conversions* — a statistical estimate that says "given your past performance, this campaign probably drove X conversions that we can't observe directly." These are real purchases; Meta is just assigning them via a model rather than a pixel fire.

On a typical DTC account post-ATT, 20–40% of Meta's reported conversions are modelled on iOS, and 5–10% on Android (despite Android being less privacy-restricted, Meta still gap-fills for blocked browser events). The implication: Meta's conversion number will always exceed what the pixel directly observes, and the gap grows as ATT opt-out climbs.

Google does something similar but simply calls everything "conversions." The Google pixel (gtag) fires deterministically; Google's ads platform then layers on Enhanced Conversions (server-side user_data), view-through conversions, and data-driven attribution credit.
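The way these layers stack on top of the deterministic count can be sketched in a few lines of Python. Everything here is illustrative: the component names and magnitudes are our assumptions, not fields from any Meta or Google API.

```python
# Illustrative sketch only: names and magnitudes are assumptions,
# not fields from any Meta or Google reporting API.

def platform_reported_conversions(
    deterministic: int,       # conversions the pixel/gtag observed directly
    modelled_share: float,    # gap-filled share, e.g. 0.25 on a heavy-iOS account
    view_through: int,        # impressions credited without a click
    enhanced_recovered: int,  # extra matches recovered via hashed server-side user_data
) -> int:
    """Sketch of how a platform's reported number stacks on pixel truth."""
    modelled = round(deterministic * modelled_share)
    return deterministic + modelled + view_through + enhanced_recovered

# A pixel that directly observed 100 purchases can easily be reported as ~135:
print(platform_reported_conversions(100, 0.25, 6, 4))  # → 135
```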
The result is a reported number that's 10–25% higher than what gtag alone would produce.

## Diagnostic steps

### Step 1: Pull the modelled-vs-deterministic split

In Meta Events Manager → Overview, look at the "Events received" vs "Conversions" lines. "Events received" is what the pixel/CAPI observed; "Conversions" is what Meta assigns after modelling. The gap between them is the modelled portion.

### Step 2: Compare to Shopify paid orders

Shopify's paid orders (not abandoned checkouts, not tests) is the floor of your true conversion count. If Meta claims 200 conversions and Shopify shows 180 paid orders, and you know Meta contributed roughly 60% of orders, the math is: 60% × 180 = 108 true Meta-driven orders, vs 200 claimed — Meta over-reports by ~85%. This gap includes modelled conversions plus view-through plus any double-attribution with Google. See [Meta Shopify purchase mismatch](/guides/meta-shopify-purchase-mismatch) for the full reconciliation.

### Step 3: Check CAPI match rate

The Admaxxer [CAPI match rate tile](/docs/capi-match-rate) shows the share of server events that matched a browser event via `event_id`. A high match rate (>90%) means pixel and CAPI are well-synced; a low match rate means you're double-counting or missing events.

### Step 4: Disable view-through to see deterministic-only

In Meta Ads Manager → Comparing Windows, switch to "1d click" (no view-through). This removes all view-based attribution and gets closer to pixel truth. Expect a 15–30% drop from the default 7d-click / 1d-view numbers.

### Step 5: Use the Admaxxer MMM as a tiebreaker

The [MMM channel contribution](/docs/mmm) module decomposes total revenue into per-channel contribution using OLS with geometric adstock. It's independent of platform self-reporting. If Meta's reported ROAS is 4× but the MMM says Meta's contribution is 2.3×, you have a ~40% over-attribution problem.

## Google's data-driven attribution

Google's DDA replaced last-click as the default in 2023.
Instead of giving 100% credit to the last click, DDA uses machine learning to assign fractional credit across every touchpoint on the path. This sounds like a good thing — it attributes the top-of-funnel generic search that *preceded* the brand search that actually converted. But it has two consequences:

1. **Reported conversions inflate.** A single purchase might receive 0.3 credit on a generic keyword and 0.7 on brand search, summing to 1.0. But DDA's fractional credit isn't always well-calibrated and can sum to 1.1 or more across all channels, inflating the total.
2. **Reconciliation gets harder.** You can no longer say "Meta drove this purchase, not Google." Every touch got a fraction.

The practical fix: use DDA for bidding (it's a better optimisation signal than last-click) but use Shopify UTM-based first-click or last-click for P&L reporting.

## Related signals

Three metrics correlate with platform-pixel divergence:

- **iOS share of traffic** — higher iOS share means more ATT opt-out and more modelled conversions on Meta
- **View-through conversion share** — if view-through is >15% of total, you're leaning on weak signal
- **CAPI match rate** — the [match rate](/docs/capi-match-rate) should be 85%+ or deduplication is broken

The Admaxxer [Claude agent](/docs/ai-agent) can report all three in a single `query_metrics` call and flag when they drift.

## FAQs

**Which number should I trust for my P&L?**
Shopify paid orders and revenue. Platform numbers are bidding signal, not accounting.

**Why does Meta over-report more than Google?**
Meta relies more heavily on modelled conversions because its pixel loses more iOS data to ATT. Google's gtag works in more contexts and has fewer gaps to fill.

**Does CAPI fix the divergence?**
CAPI reduces the gap but doesn't close it. CAPI typically recovers 15–30% of lost iOS conversions as deterministic, but modelling still layers on top.
**How does Admaxxer resolve this?**
Admaxxer's [Tinybird pipes](/docs/tinybird) store deterministic event data from the pixel and CAPI. The blended MER tile uses Shopify revenue as ground truth, so MER is modelling-agnostic. You can still query platform ROAS for comparison.

**What about Enhanced Conversions?**
Google Enhanced Conversions sends hashed user_data server-side to improve match rates. It increases deterministic conversions (good) but doesn't remove DDA modelling.
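As a closing sketch, the Step 2 reconciliation arithmetic can be wrapped into a reusable check. The function name and signature are ours, not part of Admaxxer or any platform API; the order-share input is your own estimate.

```python
def over_report_ratio(
    platform_claimed: int,        # conversions the ad platform reports
    shopify_paid_orders: int,     # ground-truth paid orders from Shopify
    platform_order_share: float,  # your estimate of the platform's true share of orders
) -> float:
    """Hypothetical helper mirroring the Step 2 arithmetic.

    True platform-driven orders are estimated as share x Shopify paid orders;
    the return value is how far the platform's claim exceeds that floor
    (0.85 means an 85% over-report).
    """
    true_orders = platform_order_share * shopify_paid_orders
    return platform_claimed / true_orders - 1.0

# The worked example from Step 2: 200 claimed vs 60% of 180 paid orders.
print(f"{over_report_ratio(200, 180, 0.60):.0%}")  # → 85%
```

Running this weekly against Shopify exports gives you a single drift number to watch, independent of either platform's self-reported attribution.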

