This follows on from our earlier WooCommerce walkthrough.
If you’ve implemented server-side Google Tag Manager with Meta’s Conversions API and everything appears to be “working” — events firing, HTTP 200 responses from Meta, no obvious errors — but Event Match Quality is poor, you’ve likely hit the same wall we did.
This post documents the missing piece: how user data actually flows through sGTM, why it often silently disappears, and how to fix it properly without hacks, custom clients, or undocumented endpoints.
This applies whether your source is WooCommerce, LifterLMS, a custom checkout, or anything else.
The symptom: everything fires, match quality is bad
Our setup looked correct:
- Browser purchase event fires
- Server-side purchase event fires
- `event_id` is consistent for deduplication
- Meta responds with `200 OK`
- Events appear in Events Manager
And yet:
- Match quality stuck at ~4
- Diagnostics warned:
  - “Send email address”
  - “Send first name / last name”
- Meta insisted Advanced Matching was missing
But we were sending email, first name, and last name — hashed, consistently, and visible in GTM preview.
So what gives?
The uncomfortable truth: “seeing it in preview” isn’t enough
In server-side GTM there are three distinct layers involved in getting user data into Meta:
1. Incoming request (usually `/g/collect`)
2. Event data model inside sGTM
3. Meta Conversions API tag template
Most guides stop at layer 1 or 2.
The failure happens at layer 3.
Why user data doesn’t “just work” in sGTM
The Meta Conversions API tag template does not read arbitrary event parameters.
It looks for specific internal keys, many of which are not documented in the UI, for example:
- `x-fb-ud-em`
- `x-fb-ud-fn`
- `x-fb-ud-ln`
- `x-fb-ck-fbp`
- `x-fb-ck-fbc`
If your incoming GA4-style event contains:
- `em`
- `fn`
- `ln`
…those values exist in the event, but Meta’s tag template will happily ignore them unless they’re mapped into the expected internal fields.
No error.
No warning.
Just quietly missing match signals.
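To make the gap concrete, here’s a sketch of the same event before and after mapping (the hash values are placeholders; the `x-fb-*` keys are the ones the template actually reads):

```js
// Illustrative event data shapes inside sGTM (placeholder hash values).

// What arrives from the GA4-style request — user data as plain params:
const incoming = {
  event_name: "purchase",
  em: "b4c9a2…", // SHA-256 hashed email
  fn: "1b4f0e…", // SHA-256 hashed first name
  ln: "9f86d0…", // SHA-256 hashed last name
};

// What the Meta CAPI tag template looks for — the internal x-fb-* keys:
const afterTransformation = {
  event_name: "purchase",
  "x-fb-ud-em": incoming.em,
  "x-fb-ud-fn": incoming.fn,
  "x-fb-ud-ln": incoming.ln,
};
```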
The missing link: sGTM transformations
This is the part that almost no walkthrough explains clearly.
Transformations in server-side GTM allow you to:
- Augment event data
- Rename fields
- Expose values to specific tags
- Remove conflicting data
They run between the incoming request and the tag template.
This is where user data must be bridged.
The correct mental model
Think of it like this:
- Your application emits business truth
- GA4 forwarding transports it
- Transformations translate semantics
- Meta tag consumes Meta-shaped data
sGTM is not schema-agnostic. It’s a router with opinions.
The transformation we actually needed
In our case, the incoming event already contained hashed user data:
- `em`
- `fn`
- `ln`
We did not want Meta to infer or re-hash anything.
We just needed to put those values where Meta expects them.
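For completeness, this is the kind of normalisation and hashing we expect upstream, before the values ever reach sGTM — a minimal Node.js sketch (the function name is ours; Meta expects SHA-256 hex digests of trimmed, lowercased values):

```js
// Minimal Node.js sketch — hash user data the way Meta expects:
// trim, lowercase, then SHA-256 as a lowercase hex digest.
const crypto = require("crypto");

function hashForMeta(value) {
  if (!value) return undefined;
  return crypto
    .createHash("sha256")
    .update(value.trim().toLowerCase())
    .digest("hex");
}

// Example: hashForMeta("  Jane@Example.com ") and
// hashForMeta("jane@example.com") produce the same digest.
```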
**Transformation type:** Augment event

**Matching conditions:** `event_name` equals `jw_purchase` (or whatever purchase event you use)

**Affected tags:** Meta – Purchase (Server)

**Parameters added:**
| Name | Value |
|---|---|
| `x-fb-ud-em` | `{{EDV - em}}` |
| `x-fb-ud-fn` | `{{EDV - fn}}` |
| `x-fb-ud-ln` | `{{EDV - ln}}` |
That’s it.
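(`{{EDV - em}}` is just an Event Data variable reading the `em` key from the event. If you ever need the same lookup in a custom variable template, the sandboxed JavaScript equivalent is a one-liner:)

```js
// Server-side custom variable template — equivalent to an
// Event Data variable reading the key "em".
const getEventData = require("getEventData");
return getEventData("em");
```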
Once this runs, the Meta tag template immediately starts populating:
```json
{
  "user_data": {
    "em": "...",
    "fn": "...",
    "ln": "..."
  }
}
```
And Meta’s diagnostics stop complaining — after reporting catches up.
A subtle but critical gotcha: duplicate sources
We hit one more trap that’s worth calling out.
The Meta template will also try to populate names from:
- `user_data.address.first_name`
- `user_data.address.last_name`
If both exist, it can result in concatenated hashes, which Meta silently discards.
If you’re injecting `x-fb-ud-fn` / `x-fb-ud-ln`, make sure you don’t also feed address-based name fields into the same event.
In our case, excluding those address fields resolved a bizarre double-hash issue that only showed up in the outgoing request body.
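A quick way to spot this class of problem: a single SHA-256 hex digest is exactly 64 characters, so a concatenated double hash jumps to 128. A throwaway check like this (ours, not part of any template) makes the corruption obvious when inspecting the outgoing payload:

```js
// Sanity check: a valid SHA-256 hex digest is exactly 64 hex chars.
// Concatenated double hashes show up as 128-char strings.
const looksLikeSha256 = (v) => /^[0-9a-f]{64}$/.test(v);

console.log(looksLikeSha256("a".repeat(64)));  // true
console.log(looksLikeSha256("a".repeat(128))); // false — double hash
```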
Why this feels “hidden” (and why it kind of is)
This isn’t really hacking — but it does require understanding how the tag template works internally.
The official Meta + sGTM guide focuses on:
- Clients
- Endpoints
- High-level flow
It does not clearly explain:
- Which fields the template actually reads
- How GA4 event parameters map to Meta user data
- That transformations are effectively mandatory for good match quality
The only way to be certain is to:
- Inspect outgoing Graph API payloads
- Read the template source (as we eventually did)
- Correlate that with Meta diagnostics
That gap is what causes most real-world implementations to stall at “it fires, but it’s bad”.
Browser + server events: still both required
One important clarification:
We did not remove browser-side events.
The final architecture still uses:
- Browser events (for coverage, cookies, attribution)
- Server events (for reliability and delivery)
Meta deduplicates automatically using `event_id`.
This is the recommended approach, not a compromise.
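Deduplication only works if both sides send the same ID — on the browser via the pixel’s documented `eventID` option, and on the server as `event_id` in the CAPI payload. A minimal sketch (the order-ID scheme and values are illustrative):

```js
// Browser: Meta pixel purchase event with an explicit event ID.
fbq(
  "track",
  "Purchase",
  { value: 99.0, currency: "GBP" },
  { eventID: "order-1234" }
);

// Server: the CAPI payload for the same purchase carries the same ID,
// so Meta can pair and deduplicate the two events.
const serverEvent = {
  event_name: "Purchase",
  event_id: "order-1234",
  // ...user_data, custom_data, etc.
};
```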
Final checklist for future you
If you’re revisiting this in six months (or handing it to another developer), here’s the short version:
- ✔ Business events emitted by the application
- ✔ Deterministic `event_id`
- ✔ Web GTM forwards via GA4 with Transport URL
- ✔ sGTM Meta CAPI tag configured
- ✔ Transformations mapping user data to `x-fb-ud-*`
- ✔ Outgoing Graph API payload contains `em`, `fn`, `ln`
- ✔ No duplicate name sources
If all of that is true, Meta will optimise correctly — even if Events Manager takes a day or two to admit it.
Closing thoughts
Server-side tracking is more complex than traditional pixels, but the complexity isn’t random — it’s structural.
Once you understand that:
- GTM is a routing layer
- sGTM is a schema translator
- Meta expects very specific shapes
…the system becomes predictable again.
This transformation step was the hardest part of the entire setup, not because it’s conceptually difficult, but because it’s under-documented and easy to miss.
Hopefully this write-up saves someone else from burning a few evenings wondering why “everything is firing” but nothing is improving.
If nothing else, future-me will thank past-me for writing it down.