What We’ll Discuss Today
Today, we know that simply tracking conversion events is no longer enough. It is essential for platforms to access a sufficient amount of data, but more importantly, that data needs to be of high quality. For this reason, I’ve decided to revisit the topic of the Pixel, but this time, focusing on evaluating data quality.
While the importance of providing quality data to improve campaign performance is well understood, assessing the quality of the traffic we’re sharing with the Pixel is not always straightforward.
Let’s dive into the heart of the matter, but before we begin, let’s take a brief journey back to the early days of tracking systems.
Why Are We Talking About Meta Pixel? (More or Less)
The installation code for today’s Meta Pixel is a JavaScript snippet that, once placed on your website, downloads Meta’s library and sends the data you’ve chosen to share.
However, if we retrieve the installation script for the Pixel from the developer documentation, we can see that the code to place on the page is composed of two parts:
- A script.
- A `<noscript>` tag.
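To make that structure concrete, here is a sketch of what the two-part snippet looks like. The pixel ID `1234567890` is a placeholder; always copy the exact, current code from your own Events Manager:

```html
<!-- Part 1: the script that downloads Meta's library and sends events -->
<script>
  !function(f,b,e,v,n,t,s){if(f.fbq)return;n=f.fbq=function(){n.callMethod?
  n.callMethod.apply(n,arguments):n.queue.push(arguments)};if(!f._fbq)f._fbq=n;
  n.push=n;n.loaded=!0;n.version='2.0';n.queue=[];t=b.createElement(e);t.async=!0;
  t.src=v;s=b.getElementsByTagName(e)[0];s.parentNode.insertBefore(t,s)}(window,
  document,'script','https://connect.facebook.net/en_US/fbevents.js');
  fbq('init', '1234567890');  // placeholder pixel/dataset ID
  fbq('track', 'PageView');   // the first event shared with Meta
</script>
<!-- Part 2: the fallback for browsers without JavaScript -->
<noscript>
  <img height="1" width="1" style="display:none"
       src="https://www.facebook.com/tr?id=1234567890&ev=PageView&noscript=1"/>
</noscript>
```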
The second part of the Meta Pixel tracking code handles tracking for browsers that do not support JavaScript (or have it disabled). The `<noscript>` tag contains a 1-pixel image, a rudimentary tracking method that also helps us understand how the simplest tracking systems work.
Let’s explore it together.
When a user visits our website, the browser downloads all of the page's content in order to display it. Since it downloads everything, it also fetches the image inside Meta's `<noscript>` tag. Meta's servers then receive a request for that image, and the request itself signals that an event has occurred. The request can be enriched with GET parameters, such as the pixel ID (id) and the event name (ev).
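To see how few moving parts there are, here is a small sketch in plain JavaScript (with a placeholder ID) that builds exactly the kind of request the `<noscript>` image triggers:

```js
// The <noscript> image boils down to a single GET request like this one;
// each query parameter carries one piece of tracking information.
const trackingUrl = new URL('https://www.facebook.com/tr');
trackingUrl.searchParams.set('id', '1234567890'); // placeholder pixel/dataset ID
trackingUrl.searchParams.set('ev', 'PageView');   // the event name
trackingUrl.searchParams.set('noscript', '1');    // marks the no-JavaScript fallback
console.log(trackingUrl.toString());
// -> https://www.facebook.com/tr?id=1234567890&ev=PageView&noscript=1
```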
This basic mechanism lets us monitor user actions on our site without any complex infrastructure. Today, however, it is no longer robust enough to address the challenges we face.
From Pixel to Dataset
Now that we understand why many tracking systems are referred to as “Pixels,” we can move forward and highlight a change that Meta has been implementing for some time.
I still remember when the “Dataset” section was added to the settings menu, and many previously created Pixels were no longer visible under the “Pixel” section but under “Dataset.”
This seemingly minor change is actually quite significant, because it signals a shift in what Meta wants us to prioritize.
According to Meta’s documentation:
Datasets allow you to connect and manage event data from various sources, such as your website, mobile app, physical store, or business chat, all in one place.
As you can see, the concept of a Dataset is much broader than that of a Pixel. Meta isn’t just interested in tracking users’ navigation data on your site—they want to follow them throughout their entire lifecycle to understand them better and precisely determine if a conversion is attributable to their advertising system.
In such a complex landscape, data quality becomes increasingly important. I want to teach you how to assess the quality of the events you're sending to Meta.
Verifying Event Numbers
The first thing to evaluate is the number of events being sent to your Dataset. If events aren’t coming through or are inconsistent, there’s no point in proceeding with further evaluation.
Event volumes are a sensitive signal, and the primary tool for analyzing them is Meta's Events Manager. Through this tool, you can monitor your Datasets and understand the flow of data, in terms of both quality and quantity.
If the trend line moves sharply downward without explanation, it’s a sign that something is wrong.
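Events Manager already plots this trend for you, but if you export daily event counts you can automate the check. Here is a minimal sketch with an arbitrary drop threshold; `findSharpDrops` is a hypothetical helper, not a Meta API:

```js
// Flag days where event volume falls well below the trailing average.
function findSharpDrops(dailyCounts, windowSize = 7, dropRatio = 0.5) {
  const alerts = [];
  for (let i = windowSize; i < dailyCounts.length; i++) {
    const window = dailyCounts.slice(i - windowSize, i);
    const avg = window.reduce((a, b) => a + b, 0) / windowSize;
    if (dailyCounts[i] < avg * dropRatio) {
      alerts.push({ day: i, count: dailyCounts[i], trailingAvg: avg });
    }
  }
  return alerts;
}

// Example: a stable week followed by a sudden, unexplained drop.
console.log(findSharpDrops([980, 1010, 995, 1002, 990, 1005, 998, 400]));
```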
Verifying CAPI Quality
Next, we should assess the quality of Conversion API (CAPI) configurations. You can easily verify whether CAPI is active for all events by checking the “Integrations” column in Events Manager.
If events are being sent exclusively via browser or server, they will be noted accordingly. Ideally, events should arrive via both methods, indicating that the browser-side Pixel and the server-side CAPI are each working correctly.
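To give you an idea of what the server side looks like, here is a minimal sketch of a Conversions API call, assuming Node 18+ with its built-in `fetch`. The dataset ID, access token, and order ID are placeholders, and the Graph API version may differ from the one you use:

```js
const DATASET_ID = '1234567890';                   // placeholder
const ACCESS_TOKEN = process.env.META_ACCESS_TOKEN; // placeholder

async function sendServerEvent() {
  const response = await fetch(
    `https://graph.facebook.com/v21.0/${DATASET_ID}/events`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        access_token: ACCESS_TOKEN,
        data: [{
          event_name: 'Purchase',
          event_time: Math.floor(Date.now() / 1000), // Unix time, in seconds
          action_source: 'website',
          event_id: 'order-10001', // used later for deduplication
          user_data: {
            // Meta expects identifiers to be SHA-256 hashed (see below)
            em: ['<sha256-of-normalized-email>'],
          },
          custom_data: { currency: 'EUR', value: 49.9 },
        }],
      }),
    }
  );
  console.log(await response.json());
}

sendServerEvent();
```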
Evaluating Event Coverage and Deduplication
Coverage refers to the proportion of events sent via both browser and server, ensuring redundancy to account for potential errors in either channel. Deduplication, on the other hand, eliminates duplicate events that may arise from sending the same data via both browser and server.
By properly implementing deduplication with the correct event IDs, you can avoid discrepancies in your data.
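In practice, deduplication means tagging the browser event and the server event with the same ID. A sketch, reusing the placeholder order ID from the previous example:

```js
// Browser side: fbq is installed by the base code shown earlier.
// The fourth argument carries the explicit event ID.
fbq('track', 'Purchase', { currency: 'EUR', value: 49.9 },
    { eventID: 'order-10001' });

// Server side (CAPI payload fragment): the same name and ID let Meta
// recognize the two submissions as one event and discard the duplicate.
const serverEvent = {
  event_name: 'Purchase',  // must match the browser event name
  event_id: 'order-10001', // must match the browser eventID
  // ...remaining fields as in the previous sketch
};
```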
Evaluating Event Match Quality
Lastly, Meta assigns each event a score, known as Event Match Quality, based on how well it can match the user data you send to its own records. The more accurate and complete that data, the higher the score.
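The main lever you control is the `user_data` you send, which Meta expects in normalized, SHA-256-hashed form. A minimal sketch in Node (the email address is invented):

```js
const { createHash } = require('node:crypto');

// Normalize and hash an identifier the way Meta expects:
// trim whitespace, lowercase, then SHA-256 in hex.
function hashIdentifier(value) {
  const normalized = value.trim().toLowerCase();
  return createHash('sha256').update(normalized).digest('hex');
}

// The resulting hash is what would go into user_data.em in a CAPI payload.
console.log(hashIdentifier('  Jane.Doe@Example.com '));
```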
Conclusion
That’s it for today’s post. While some points may have gotten technical, these are the critical aspects to consider when evaluating your Meta Pixel setup. Thanks for reading, and stay tuned for the next post!