Why Admaxxer and Meta/Google Disagree on Conversions

TL;DR: Admaxxer measures with a deterministic first-party pixel + server-side payment events. Meta, Google, and TikTok measure with their own logged-in graphs, modeled conversions, view-through windows, and cross-device fingerprinting. The two will not match exactly — and they should not. A 5–20% gap in either direction is normal. This guide explains why, and gives you a five-step reconciliation that ends with a defensible answer to "which number is right?"

Why this happens — five real causes

1. Attribution windows are different by default

Each platform counts a conversion as its own based on when an ad touch happened relative to the purchase, and the default windows are not the same.

If you compare a Meta dashboard at 7d-click + 1d-view against an Admaxxer dashboard at last-click 7d, the platforms are measuring different things. Always align windows first before reading the gap.
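To make the mismatch concrete, here is a minimal sketch, illustrative only and not Meta's or Admaxxer's actual logic, of whether a single ad touch would claim a purchase under the two configurations:

```python
from datetime import datetime, timedelta

def counted(touch_time, touch_type, purchase_time):
    """Would this ad touch claim the purchase under each config?
    Returns (meta_default, admaxxer_last_click). Illustrative only."""
    age = purchase_time - touch_time
    # Meta default: clicks within 7 days OR views within 1 day
    meta_default = (touch_type == "click" and age <= timedelta(days=7)) or \
                   (touch_type == "view" and age <= timedelta(days=1))
    # Admaxxer last-click 7d: clicks only, within 7 days
    admaxxer_last_click = touch_type == "click" and age <= timedelta(days=7)
    return meta_default, admaxxer_last_click

purchase = datetime(2026, 4, 22, 10, 0)
view_yesterday = datetime(2026, 4, 21, 18, 0)  # ad viewed, never clicked
print(counted(view_yesterday, "view", purchase))  # (True, False)
```

A view-through from yesterday is a Meta conversion but not an Admaxxer one, with neither side being "wrong": they are answering different questions.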

2. Modeled conversions are not real events

A growing share of every platform's conversion count is modeled: statistical estimates that fill the gap left by lost signals (iOS 14.5+ ATT, browser tracking restrictions, ad blockers). Most modeled conversions correspond to purchases that really happened; the platform just did not directly observe them, so it estimates their number from cohort behavior. Because those estimates cannot be independently verified, they should be read differently from directly observed events.

3. Cross-device attribution we cannot replicate

If a user clicks an Instagram ad on their phone, then converts on their laptop the next day, Meta can stitch those two devices together because both are logged into the same Facebook account. Admaxxer cannot. Our pixel uses first-party cookies + fingerprint hashing, but we have no logged-in social graph. If you see Meta reporting a purchase that has no matching session in Admaxxer, cross-device is a likely cause.

Cross-device journeys show up most often in higher-consideration purchases, where users research on mobile and complete checkout later on desktop.

4. iOS 14.5+ App Tracking Transparency (ATT)

When a user denies the ATT prompt in the Facebook or Instagram app, Meta can no longer pass the device's IDFA back with the conversion event. Meta compensates with Aggregated Event Measurement (AEM) modeling, but that modeling is, by definition, an estimate. The same store, running the same ads in the same week, can report different Meta conversion counts purely because of the share of users who denied the prompt.

5. CAPI / Enhanced Conversions Match Rate

Server-side events are only useful if the platform can match them back to a known user. CAPI Match Rate (Meta) and Enhanced Conversions match rate (Google) measure the percentage of purchase events the platform was able to join with one of its users. A low match rate (below 60%) means the platform is starved of signal and will lean even harder on modeling. Improving match rate, by passing email, phone, fbc, fbp, and a clean external_id, closes the gap between platform-reported and Admaxxer numbers. See CAPI Match Rate in the Metric Glossary for the exact definition and how Admaxxer measures it.
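As a rough proxy for what the platform can match, you can check what share of your server-side events carry at least one joinable identifier. This is a hedged sketch: the real match rate is computed by the platform after hashing and joining, and the exact definition Admaxxer uses lives in the Metric Glossary.

```python
def identifier_coverage(events):
    """Percent of purchase events carrying at least one identifier
    the platform could join on. A proxy, not the platform's number."""
    keys = ("em", "ph", "fbc", "fbp", "external_id")
    if not events:
        return 0.0
    matched = sum(1 for e in events if any(e.get(k) for k in keys))
    return 100.0 * matched / len(events)

events = [
    {"em": "a1b2c3", "fbp": "fb.1.123"},  # matchable: hashed email + fbp
    {"fbc": "fb.1.456.AbC"},              # matchable: click id cookie
    {},                                   # no identifiers -> unmatchable
    {"external_id": "cus_789"},           # matchable: first-party id
]
print(identifier_coverage(events))  # 75.0
```

Events arriving with no identifiers at all are guaranteed misses, so this proxy is a useful upper bound on your achievable match rate.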

Step 1 — Align the time window

Before anything else, make sure both reports cover the same window in the same time zone. Common silent mismatches include the ad account timezone differing from the store (and Admaxxer workspace) timezone, a rolling "last 7 days" compared against a fixed calendar week, and reports pulled hours apart so one side includes a partial extra day.

Pick a single 7-day window in the same timezone. Run both queries again. Often the "discrepancy" disappears here.

Step 2 — Align the attribution window

In Meta Ads Manager, set the comparison window column to 7-day click only (drop the 1-day view). In Admaxxer, set the dashboard window to last-click 7d. Now you are measuring the same thing on both sides.

Numbers should be much closer. The remaining gap is real measurement disagreement — not a configuration mismatch.
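For intuition, last-click attribution reduces to "most recent click inside the window wins." A sketch with hypothetical touch data, not Admaxxer's implementation:

```python
from datetime import datetime, timedelta

def last_click_channel(touches, purchase_time, window_days=7):
    """Channel of the most recent click within the window, else None.
    Views never win under last-click. Illustrative only."""
    cutoff = purchase_time - timedelta(days=window_days)
    clicks = [t for t in touches
              if t["type"] == "click" and cutoff <= t["time"] <= purchase_time]
    if not clicks:
        return None
    return max(clicks, key=lambda t: t["time"])["channel"]

purchase = datetime(2026, 4, 22, 12, 0)
touches = [
    {"channel": "meta",    "type": "click", "time": datetime(2026, 4, 18, 9, 0)},
    {"channel": "klaviyo", "type": "click", "time": datetime(2026, 4, 21, 20, 0)},
    {"channel": "meta",    "type": "view",  "time": datetime(2026, 4, 22, 8, 0)},
]
print(last_click_channel(touches, purchase))  # klaviyo
```

Note that Meta would still claim this purchase even on the aligned setting (there is a Meta click within 7 days), which is exactly how two dashboards can both "count" the same order.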

Step 3 — Inspect CAPI Match Rate

Open Dashboard › Analytics › CAPI Match Rate. As a target, anything below 60% means the platform is starved of signal (see cause 5 above); push it higher by passing email, phone, fbc, fbp, and a clean external_id with every server-side event.

See Metric Glossary › CAPI Match Rate for the calculation. Hyros markets this as their core differentiator — Admaxxer ships the same metric for free in every plan.

Step 4 — Check for "spend without sessions"

If Meta reports spend on a campaign that produced zero sessions in Admaxxer, the pixel is not firing for that traffic. This usually means the snippet is missing from the campaign's landing page, a Content-Security-Policy is blocking it (see CSP errors blocking the pixel), or the ads send traffic to a domain where the pixel was never installed.

For each campaign with a high "spend without sessions" rate, fix the install before reading attribution numbers. A 0% session rate makes the gap to Meta look enormous; a 95% session rate makes it look normal.

Step 5 — Triangulate against Shopify (or your source-of-truth)

The customer's checkout system is the only deterministic ground truth, because every purchase must pass through it. Compare three numbers over the same window: paid orders in Shopify, purchases in Meta Ads Manager (aligned window), and purchases in Admaxxer.

If Shopify says 51, Meta says 47, and Admaxxer says 52, the truth is roughly 51, and now you can read the gaps: Admaxxer's 52 is within noise of the checkout count, while Meta's 47 is only the subset of orders Meta can claim, plus whatever it modeled.

If Admaxxer is materially under Shopify (more than ±5% off), see Sales mismatch troubleshooting. If Meta is reporting purchases that Shopify cannot find at all, Meta is over-modeling — and you should pull back trust in Meta's column for that campaign.
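The gap-reading above is simple arithmetic. A sketch that treats the checkout count as truth and flags each source against the ±5% guidance (the function name is illustrative):

```python
def read_gaps(shopify, meta, admaxxer, tolerance_pct=5.0):
    """Percent deviation of each source from the checkout truth,
    plus whether it falls inside the tolerance band."""
    out = {}
    for name, n in (("meta", meta), ("admaxxer", admaxxer)):
        pct = 100.0 * (n - shopify) / shopify
        out[name] = (round(pct, 1), abs(pct) <= tolerance_pct)
    return out

print(read_gaps(shopify=51, meta=47, admaxxer=52))
# {'meta': (-7.8, False), 'admaxxer': (2.0, True)}
```

Admaxxer lands inside the band; Meta's aligned number would need the window and view-through adjustments from Steps 1–2 before the comparison is fair.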

Worked example

A real reconciliation we ran for a DTC supplement brand, week of 2026-04-21:

| Source | Purchases | Window | Notes |
| --- | --- | --- | --- |
| Meta Ads Manager | 47 | 7d-click + 1d-view | Default. Includes view-throughs and modeled events. |
| Meta Ads Manager (aligned) | 38 | 7d-click only | Stripped view-throughs. Closer to Admaxxer. |
| Admaxxer | 52 | last-click 7d | Includes purchases attributed to non-Meta last-clicks (email, organic). |
| Admaxxer (Meta only) | 40 | last-click 7d, channel=meta | Filtered to last-click attributable to Meta. Now apples-to-apples with row 2. |
| Shopify orders (paid) | 51 | order date in window | Ground truth. All orders, any source. |

Reading the table: the real story is 51 paid orders. ~40 of those had Meta as their last paid touch (Admaxxer + Meta agree at this filter). The other 11 came from email (Klaviyo), organic search, and direct. Meta's headline 47 included 9 view-throughs + modeled conversions that we cannot independently verify — they may be real but we are not counting them.

The brand chose to optimize on the aligned 7d-click number (Meta's 38 vs Admaxxer's 40, nearly identical) because that's the most defensible signal across both platforms. View-through and modeled conversions give Meta's bidder a fuller signal, so we did not turn them off in Ads Manager; we just stopped reporting against the inflated number.

When to trust which source

Use the checkout system (Shopify) as revenue truth, Admaxxer's last-click numbers to compare channels against each other on equal footing, and each platform's own numbers only for in-platform optimization, since their bidders are trained on them. Avoid reporting topline revenue from a platform dashboard.

Diagnostic queries

Run these against your Admaxxer workspace to surface gaps faster:

-- Sessions per Meta campaign vs spend reported by Meta
-- (in the Admaxxer dashboard) Filter: source=facebook, last 7d
-- Compare against Meta Ads Manager, same campaign IDs, same window.

-- Spend-without-sessions rate per campaign
SELECT
  campaign_id,
  SUM(spend) AS total_spend,
  SUM(CASE WHEN session_count = 0 THEN spend ELSE 0 END) AS spend_without_sessions,
  ROUND(100.0 * SUM(CASE WHEN session_count = 0 THEN spend ELSE 0 END) / NULLIF(SUM(spend), 0), 1) AS pct
FROM ads_meta_with_attribution
WHERE date >= today() - 7
GROUP BY campaign_id
HAVING total_spend > 0
ORDER BY pct DESC;

When to escalate

Open a support thread (/support) when the aligned-window gap stays above ~20% (outside the normal 5–20% range), when CAPI Match Rate remains below 60% even though you are passing email, phone, fbc, fbp, and external_id, or when Admaxxer is more than ±5% off Shopify after working through Sales mismatch troubleshooting.

Include screenshots of both reports with timestamps, the windows used, and your CAPI Match Rate. We will help you isolate which of the five causes above is dominant.

See also

Metric Glossary · Sales mismatch (Shopify vs Admaxxer) · Missing orders · CSP errors blocking the pixel · Platform vs pixel conversions · 1-day vs 7-day click attribution