Facebook advertising has varying degrees of success across industries, but I think the general consensus is that it is one of the most powerful demographic marketing platforms available to today’s online marketer. Where else can you accurately target men who are 25 to 35, married, and love Philadelphia sports teams? (Disclaimer: don’t target that segment because I won’t click on your ad)
One of the primary obstacles Facebook advertising presents is attribution to the final conversion. In most situations, Facebook ads are one of the first touch points in the conversion funnel and thus receive no credit in the last-click attribution world that most companies live in. Regardless, any Facebook ad traffic being driven to a website (not a Facebook page) should be tracked appropriately within your web analytics in order to determine its onsite performance and how it interacts with other online channels. Because most websites use Google Analytics as their primary web analytics tool, this requires the Google Analytics URL Builder or a handy-dandy Excel version like the one we’ve created for our own in-house use.
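As a rough sketch of what that tagging produces, a URL-builder-style helper might look like the following. The base URL, campaign name, and ad-variation labels are hypothetical placeholders, not values from any actual campaign:

```python
from urllib.parse import urlencode

def build_tagged_url(base_url, source, medium, campaign, content=None):
    """Append Google Analytics UTM parameters to a landing-page URL."""
    params = {
        "utm_source": source,      # the referrer, e.g. facebook
        "utm_medium": medium,      # the channel, e.g. cpc
        "utm_campaign": campaign,  # the campaign name
    }
    if content:
        params["utm_content"] = content  # distinguishes ad variations
    separator = "&" if "?" in base_url else "?"
    return base_url + separator + urlencode(params)

# Hypothetical example destination URL for one ad variation:
print(build_tagged_url("http://www.example.com/landing",
                       "facebook", "cpc", "nyc-tri-state", "ad-variation-1"))
```

Each ad variation gets its own tagged destination URL, which is what lets Google Analytics break visits out by campaign and ad later on.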
It’s because of this extensive tracking of Facebook ads that we recently uncovered something very interesting within Google Analytics while reviewing the initial launch and ongoing optimization of some of our clients’ accounts.
I recently launched a highly geographically targeted Facebook advertising initiative for one of our clients. During the initial research and setup phase, I built approximately 10 starter campaigns from scratch, including target demographics and multiple ad variations within each campaign. Each of these ads was tagged with the appropriate Google Analytics tracking parameters, so when I checked performance the day after launching the first set of campaigns, I was perplexed by what I found, as you’ll see below.
Having run Google AdWords and other PPC campaigns over the years, I know clicks and visits are hardly ever equal; however, the significant discrepancy here raised a major red flag and made me dig deeper into what was truly occurring.
My first step in the analysis was adding a secondary dimension in order to view visits by Facebook campaign. I discovered that campaigns I hadn’t even launched yet had received visits, triggering the great ‘Huh?’ moment. Digging deeper, I next applied a secondary dimension of city to identify where these visits were geographically coming from. This is what I found:
As mentioned previously, this is a highly geographically targeted initiative focused on the tri-state area around New York City, so of course seeing San Francisco on this list triggered an even greater ‘Huh?’ moment. Furthermore, a significant number of visits came from the magical city of (not set), which does happen from time to time for other online channels, but not to this extent for a brand-new initiative.
The First Hypothesis
The proverbial light bulb went off immediately upon seeing San Francisco on the list of cities that had produced traffic: Facebook’s headquarters is located in Menlo Park, CA, which is only 30 miles south of San Francisco. Thus my first hypothesis for what had triggered this significant number of Facebook ad visits was born:
Facebook is testing all of my destination URLs to see if they actually work
Naturally, I needed to confirm this by looking at another of our clients running a highly geographically targeted Facebook advertising initiative. By comparing the date on which Facebook ads first started producing visits in the client’s Google Analytics against Facebook’s own click data, I found my confirmation.
Google Analytics showed 9 visits on June 5th coming from Forest City, NC, while the campaigns didn’t actually launch within Facebook until June 7th. After Googling ‘forest city nc facebook’, I found that Facebook had recently opened a datacenter in that location, confirming that Facebook does indeed test ad destination URLs for new launches. But what about ongoing optimization and ad testing that results in new ads being launched? Does Facebook check every single time?
In order to find out, I created a quick Google Analytics Advanced Segment including only those cities I had found to be directly correlated to Facebook datacenter traffic during my analysis. This is what I found.
Comparing this with the account manager’s notes on when new ad tests were launched, we were able to confirm that the visitor spikes, especially the one seen in October, accurately matched up with the start of those tests.
How This Impacts the Online Marketer
At this point you may be saying, ‘Very interesting, but why do I care?’ Well, if you look closely at the previous screenshots, you’ll notice that all of this Facebook datacenter traffic has nearly 100% bounce rates and zero seconds of time on site. When reviewing the performance of Facebook advertising campaigns and determining the next optimization steps, this can significantly skew the stats associated with each campaign and ad. You may end up suspending an ad, a campaign, or an entire marketing channel that was in fact performing very well but had dirty data because Facebook was testing the ads.
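To make the skew concrete, here is a quick illustration with entirely made-up visit counts (not numbers from any client account), assuming the datacenter test hits bounce 100% of the time as described above:

```python
# Hypothetical numbers illustrating how datacenter test visits skew metrics.
real_visits, real_bounces = 200, 80    # genuine ad clicks: a 40% bounce rate
test_visits, test_bounces = 100, 100   # Facebook URL-check hits: 100% bounce

# What Google Analytics reports if the test traffic isn't filtered out:
blended_bounce_rate = (real_bounces + test_bounces) / (real_visits + test_visits)

# What the campaign is actually doing with real prospects:
clean_bounce_rate = real_bounces / real_visits

print(f"Blended: {blended_bounce_rate:.0%}, Clean: {clean_bounce_rate:.0%}")
# Blended: 60%, Clean: 40%
```

A campaign that is actually performing at a 40% bounce rate can look like a 60% bounce rate, which is exactly the kind of gap that gets a healthy ad suspended.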
It’s important to point out that these examples were fairly small-scale campaign launches. But think about how Facebook testing destination URLs every time a new ad is created impacts larger businesses such as Samsung, which just reported a $10 million Facebook ad buy for its Galaxy S3. That is a lot of Facebook-generated visits being produced on an extremely frequent basis, since I would assume Samsung had a significant amount of ad testing occurring.
So how do you clean up your Facebook advertising data within Google Analytics?
- Follow the same basic analysis that I did and determine which cities the Facebook datacenter traffic is coming from. Note that this list will probably grow over time: the first instance for our second client example was only Forest City, NC, but as time progressed I also found San Francisco visits in the data.
- Create two appropriately labeled Google Analytics Advanced Segments:
- One that excludes all of these Facebook datacenter visits that you can name Facebook Ads – Clean.
- One that includes only those cities where Facebook datacenter visits have been generated, which can be named Facebook Ads – FB Datacenters. The reason I suggest creating an ‘include’ segment is so you can confirm that whenever changes to ads are made, Facebook datacenters test the destination URLs. Facebook’s process may change in the future, so it’s always a good idea to check in on it every so often via this advanced segment.
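The include/exclude logic behind those two segments can be sketched against exported visit data. Everything below is hypothetical: the record fields, the visit rows, and the city list (which should come from your own analysis, not this example):

```python
# Hypothetical export of Facebook ad visits, one record per visit.
visits = [
    {"city": "New York",      "bounced": False},
    {"city": "Forest City",   "bounced": True},   # known Facebook datacenter
    {"city": "San Francisco", "bounced": True},   # near Facebook HQ
    {"city": "Newark",        "bounced": False},
]

# Cities your own analysis has tied to Facebook datacenter traffic.
DATACENTER_CITIES = {"Forest City", "San Francisco", "(not set)"}

# "Facebook Ads - Clean": exclude datacenter cities.
clean = [v for v in visits if v["city"] not in DATACENTER_CITIES]

# "Facebook Ads - FB Datacenters": include only datacenter cities.
datacenter_only = [v for v in visits if v["city"] in DATACENTER_CITIES]

print(len(clean), "clean visits;", len(datacenter_only), "datacenter visits")
```

The first filter is what you report performance from; the second is your ongoing check that the URL-testing behavior is still happening.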
It is important to point out that this method will likely not work for initiatives advertised across the entire country. Hopefully, though, you’ve split up your Facebook campaigns to match messaging with geographic regions, allowing a variety of advanced segments to still work.
I will note that I attempted to identify Facebook datacenter traffic beyond just city. I looked at the visitor’s network domain, but ran into some trouble since some domains do not resolve accurately within Google Analytics, resulting in unknown.unknown, which is also reported for some of Facebook’s datacenters. I did, however, identify facebook.com as a visitor domain, so that can easily be included/excluded within an advanced segment.
You should always track non-AdWords advertising performance appropriately within Google Analytics using the URL Builder. That’s what allows you not only to evaluate and optimize appropriately to continuously improve performance, but also to discover instances such as this that result in dirty data and thus skew your analysis. When you find questionable data, always raise the red flag and start digging to find the answer. What you find may be nothing at all, but in certain situations you will find something that can increase your ROI or conversion rates significantly.
Lastly, I’m interested to hear the thoughts and insights on all of this from Marty Weintraub, CEO of aimClear, who deals with mammoth Facebook campaigns and who visited our office a few months ago. Looking forward to potential blog comment Marty!