How to tell your ad account is optimizing for bad data
Short answer
An ad account optimizes for bad data when primary conversions do not reflect business value. Typical symptoms are cheap but unqualified leads, a sudden jump in conversions after a measurement change, strong performance in the platform paired with weak performance in the CRM, or campaigns that learn from clicks, page visits, and duplicate events.
Symptoms of the problem
First signal: CPA drops and conversion volume grows, but the sales team reports worse leads. Second signal: after deploying a new GTM setup or CAPI, conversions double but orders do not. Third signal: PMax or Meta brings in many form submissions, but the CRM shows spam, irrelevant inquiries, or a zero close rate.
Fourth signal: most conversions come from micro-actions. The user viewed contact details, clicked a button, or opened a form, but nothing commercially binding happened. If these actions are primary, the account is optimizing for the wrong signal.
Where to look for the cause
Start with the list of conversion actions. Which are primary? Which are used in campaign goals? Do they have a value? Do they count once or every time? Are they duplicated? Are they imported simultaneously from GA4 and through the Google Ads tag? For Meta, check Pixel/CAPI deduplication and event_id.
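The checks above can be sketched as a small audit script. Everything here is illustrative: the action list, field names (name, primary, counting, source), and the micro-action names are assumptions standing in for a real export from the Google Ads UI or API.

```python
# Hypothetical export of conversion actions; field names are assumptions,
# not a real Google Ads API schema.
actions = [
    {"name": "purchase",     "primary": True, "counting": "every", "source": "ads_tag"},
    {"name": "form_submit",  "primary": True, "counting": "every", "source": "ga4_import"},
    {"name": "form_submit",  "primary": True, "counting": "every", "source": "ads_tag"},
    {"name": "button_click", "primary": True, "counting": "every", "source": "ads_tag"},
]

# Assumed naming for actions with no commercial commitment behind them.
MICRO_ACTIONS = {"button_click", "view_contact", "form_open"}

def audit(actions):
    issues = []
    # Micro-actions should not feed bidding as primary conversions.
    for a in actions:
        if a["primary"] and a["name"] in MICRO_ACTIONS:
            issues.append(f"micro-action '{a['name']}' is primary")
    # The same event imported from both GA4 and the Ads tag double-counts.
    sources_by_name = {}
    for a in actions:
        sources_by_name.setdefault(a["name"], set()).add(a["source"])
    for name, sources in sources_by_name.items():
        if {"ga4_import", "ads_tag"} <= sources:
            issues.append(f"'{name}' imported from both GA4 and the Ads tag")
    # Lead-type actions counted "every time" inflate repeat submissions.
    if any(a["name"] == "form_submit" and a["counting"] == "every" for a in actions):
        issues.append("lead action 'form_submit' counts every occurrence")
    return issues

for issue in audit(actions):
    print(issue)
```

Running this over a real export turns the checklist into something repeatable instead of a one-off manual review.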
The next layer is the CRM. Compare platform conversions with leads in the CRM. Take a sample of 50 conversions and find out how many were qualified. If quality is low, the problem is not only in the campaign. It is in the signal the campaign receives.
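The sampling step can be sketched in a few lines. The data shapes are hypothetical: platform conversions keyed by a lead ID, and a CRM lookup with a qualification status set by the sales team.

```python
import random

# Hypothetical data: 200 platform conversions and the CRM's verdict on each.
platform_conversions = [{"lead_id": i} for i in range(200)]
crm_status = {i: ("qualified" if i % 4 == 0 else "spam") for i in range(200)}

# Take a fixed-seed sample of 50 conversions, as the audit step suggests.
random.seed(1)
sample = random.sample(platform_conversions, 50)

qualified = sum(1 for c in sample if crm_status.get(c["lead_id"]) == "qualified")
print(f"qualified in sample: {qualified} of {len(sample)}")
```

If the qualified share in the sample is far below what the platform's conversion count implies, the bidding signal, not just the campaign, is the problem.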
Bad data is not always a technical error
Sometimes measurement is technically correct, but the business definition is wrong. For example, every form is truly submitted, but half of them are irrelevant inquiries. The solution is then not to fix a tag, but to send only qualified leads into bidding or add values according to form type.
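Sending value by form type can be sketched as a simple lookup. The form names and the values assigned to them are illustrative assumptions, not benchmarks.

```python
# Hypothetical per-form values: high-intent forms carry more bidding value.
FORM_VALUES = {
    "request_quote": 100,    # high intent
    "contact_general": 20,   # mixed quality
    "newsletter": 0,         # no bidding value
}

def conversion_value(form_type, is_qualified):
    # Only qualified leads carry value into bidding; the rest report as 0.
    if not is_qualified:
        return 0
    return FORM_VALUES.get(form_type, 0)

print(conversion_value("request_quote", True))     # -> 100
print(conversion_value("contact_general", False))  # -> 0
```

The point of the design is that an unqualified quote request and a qualified newsletter signup both contribute nothing, so bidding learns from commercial value rather than raw submission counts.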
Other times, the problem is in the offer or landing page. The campaign brings exactly the people the page appeals to, but the offer attracts low quality. The data is then not technically broken, but strategically broken.
How to retrain the account
First, fix measurement. Deduplicate events, remove micro-actions from the primary set, assign values, and start sending qualified leads or offline conversions. Then roll out changes in a controlled way. With Smart Bidding, an abrupt change of goal can cause short-term volatility.
A good approach is to first collect the new high-quality conversion as secondary, verify its volume and quality, then switch it to primary and keep the old weak signals only for reporting. For Meta, the same applies to better events and CAPI deduplication.
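The Pixel/CAPI deduplication mentioned above rests on matching event_name and event_id between the browser and server events. Meta performs this matching itself; the sketch below only illustrates the logic, with made-up event data.

```python
# Hypothetical browser (Pixel) and server (CAPI) events for the same leads.
events = [
    {"event_name": "Lead", "event_id": "abc-1", "source": "pixel"},
    {"event_name": "Lead", "event_id": "abc-1", "source": "capi"},  # duplicate
    {"event_name": "Lead", "event_id": "abc-2", "source": "capi"},
]

def deduplicate(events):
    # Keep the first event per (event_name, event_id) pair, as Meta's
    # dedup does; a pixel/CAPI pair sharing an ID counts once.
    seen, unique = set(), []
    for e in events:
        key = (e["event_name"], e["event_id"])
        if key not in seen:
            seen.add(key)
            unique.append(e)
    return unique

print(len(deduplicate(events)))  # -> 2
```

Without a shared event_id, the same lead fires twice and the account learns from inflated volume.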
How to recognize improvement
After the fix, the number of conversions does not have to grow. Often it drops instead. That is fine if quality, value, and close rate grow. Track cost per qualified lead, lead-to-sale rate, pipeline value, backend-based ROAS, and the difference between platform and CRM reporting.
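The metrics listed above are simple ratios. The numbers below are made up purely to show the arithmetic.

```python
# Illustrative post-fix figures; all values are hypothetical.
spend = 5000.0
platform_conversions = 80
qualified_leads = 40
closed_deals = 10
pipeline_value = 30000.0

cost_per_qualified = spend / qualified_leads
lead_to_sale = closed_deals / qualified_leads
backend_roas = pipeline_value / spend
platform_vs_crm_gap = (platform_conversions - qualified_leads) / platform_conversions

print(f"cost per qualified lead: {cost_per_qualified:.2f}")  # -> 125.00
print(f"lead-to-sale rate: {lead_to_sale:.0%}")              # -> 25%
print(f"backend ROAS: {backend_roas:.1f}")                   # -> 6.0
print(f"platform vs CRM gap: {platform_vs_crm_gap:.0%}")     # -> 50%
```

Here half the platform's conversions never became qualified leads; a shrinking gap after the fix is the improvement signal, even if total conversions fall.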
The best ad account is not the one with the most conversions. It is the one that sends budget into segments with the highest chance of becoming revenue.
Looking for someone who can take this off your plate?
We will audit your advertising data and show whether Google Ads and Meta optimize for business value or for false-positive signals.