By Jordan Con

Here’s a fun fact: Seattle sells more sunglasses per capita than any other major city in the nation, according to the Philanthropic Educational Organization Record. (Note: I have yet to see another study that confirms or disproves this fact.)

As a Seattleite myself, it’s surprising and funny to see this because we have such a strong reputation for being a gray, rain-soaked city. So much so that if you just look at the big, overarching trends (yes, we do have a lot of rainy days), you’d miss the fact that we have beautiful, sunny summer days, which indeed call for sunglasses.

The same thing happens with marketing data. Marketers can miss important customer insights because they only look at the dominant trends.

For example, when B2B marketers think about the various marketing channels, we think of a channel like Social as a top-of-the-funnel (TOFU) channel -- it’s useful for awareness and discovery, and less so for lead nurturing and opportunity conversion. Then, when we look at attribution data on a channel level, our belief is confirmed.

In the chart below of our marketing attribution data, Social (a large portion of Paid Media) looks like an effective TOFU channel, but is less influential further down the funnel, as seen by the downward slope in influence from Lead to Opp to Revenue.



However, as we learned from our device data dive last week, desktop dominates much of our customer journey data, while mobile plays a relatively small role. As such, when we don’t break our data out by device, the desktop data will obscure some interesting insights about how our audience engages with our marketing efforts on other devices.

When we layer on device data, we see that the previous observation that Social is only a TOFU channel isn’t 100% true -- there’s additional nuance that we must understand.

Difference in Funnel Stage Influence by Device

Below is a chart that shows the difference in touchpoint contribution for each channel split out by device. A positive number (in green) suggests that the marketing channel is more influential at the opportunity stage (lower in the funnel) than at the lead stage (higher in the funnel).



Here, we see that when looking at mobile only, Social is actually more effective deeper in the funnel (an increase of 3.39% on mobile compared to a decrease of 3.80% on desktop). Because desktop makes up over 90% of our volume, the overall channel trend is dominated by the desktop trend. When we don’t break this type of data out by device, the increase in influence on mobile is greatly overshadowed by the decrease in influence on desktop, and often goes unnoticed.
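If you want to compute this metric yourself, here’s a minimal Python sketch of the stage-influence delta described above. The helper name and the touchpoint shares are hypothetical placeholders; only the two Social deltas (+3.39 on mobile, -3.80 on desktop) come from our data.

```python
def stage_influence_delta(lead_share, opp_share):
    """Percentage-point difference in a channel's influence between funnel
    stages: positive means more influential at the opportunity stage."""
    return opp_share - lead_share

# Hypothetical shares (% of each stage's touchpoints attributed to Social),
# chosen so the deltas match the Social numbers discussed above.
social = {
    "desktop": {"lead": 20.0, "opp": 16.2},   # delta = -3.80
    "mobile":  {"lead": 10.0, "opp": 13.39},  # delta = +3.39
}

for device, shares in social.items():
    delta = stage_influence_delta(shares["lead"], shares["opp"])
    print(f"{device}: {delta:+.2f} pts")
```

Blending the two devices by volume (90%+ desktop) is exactly why the combined number hides the mobile uptick.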

After seeing this data, the more accurate analysis is that Social is typically a top-of-the-funnel channel; however, when it comes to mobile, it is also an effective channel deeper in the funnel.

This type of analysis can also give us more insight into the degree to which a channel is effective on each device. Organic Search is the clearest example here. On desktop, Organic Search is influential in 16.48% more of our opportunities than leads. It is a highly effective channel deep in the funnel. But when it comes to mobile or tablet devices, the degree to which it is a more effective channel deep in the funnel is much less (6.69% and 3.27%, respectively).

Campaign Conversion Rate by Device



When analyzing paid media campaign performance, marketers often use the general conversion rate to measure whether a campaign is effective. In this instance, we’re looking at seven LinkedIn campaigns and measuring their downstream effectiveness in terms of what percentage of leads from each campaign converted into an opportunity.

In the first column, we have the general conversion rate for all leads from each campaign. This is generally what we look at to gauge campaign performance. Then, we break it down by device.

In the next column, we have the conversion rate from lead-to-opp for people who engaged with the campaign on a desktop. Notice that there is a strong correlation between TOTAL and Desktop. In fact, the correlation coefficient is a strong 0.89 (1.0 is perfect direct correlation, -1.0 is perfect inverse correlation).

In the third column, we have the campaign conversion rate for people who engaged on mobile. Unlike desktop, the overall conversion rate and the mobile conversion rate have a very weak correlation of 0.18.
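For readers who want to reproduce the correlation check, here’s a sketch of the Pearson calculation in Python. The conversion rates below are invented placeholders, not our actual campaign data; only the 0.89 and 0.18 coefficients in the text come from our analysis.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical lead-to-opp conversion rates (%) for seven campaigns.
total   = [14.1, 14.1, 11.0,  9.5, 16.0,  7.2, 12.3]
desktop = [13.2, 14.5, 11.5, 10.0, 15.5,  7.0, 12.8]
mobile  = [25.0,  3.0,  7.9,  0.0, 14.0,  9.0,  6.5]

print(round(pearson(total, desktop), 2))  # high when the series track closely
print(round(pearson(total, mobile), 2))   # much lower when mobile diverges
```

The same function works on any two columns of the table, so you can run it per device or per channel.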

This shows that, for at least these LinkedIn campaigns, the overall performance of the campaign has little bearing on the campaign’s performance on mobile. For example, Campaign 2 and Campaign 4 have identical conversion rates, turning 14.1% of leads into opportunities. However, when we look a bit deeper, we see that Campaign 2 is far more effective on mobile than desktop, converting leads into opportunities at about twice the rate. Conversely, Campaign 4 is performing terribly on mobile, producing zero opportunities, while on desktop it is pretty average.
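The Campaign 2 vs. Campaign 4 comparison can be sketched with hypothetical lead and opportunity counts. The counts below are invented for illustration; only the overall 14.1% rate, the roughly 2x mobile-over-desktop gap for Campaign 2, and Campaign 4’s zero mobile opportunities come from our data.

```python
# Hypothetical (leads, opportunities) per device; not the actual campaign data.
campaigns = {
    "Campaign 2": {"desktop": (500, 64), "mobile": (60, 15)},
    "Campaign 4": {"desktop": (530, 83), "mobile": (58, 0)},
}

def conversion_rate(leads, opps):
    """Lead-to-opportunity conversion rate as a percentage."""
    return 100.0 * opps / leads if leads else 0.0

for name, devices in campaigns.items():
    leads = sum(l for l, _ in devices.values())
    opps = sum(o for _, o in devices.values())
    by_device = {d: round(conversion_rate(*c), 1) for d, c in devices.items()}
    print(f"{name}: total {conversion_rate(leads, opps):.1f}%, {by_device}")
```

Both campaigns land on the same blended rate even though their device-level performance is nearly opposite, which is the whole point of splitting the data out.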

At this moment, this insight is admittedly not tremendously actionable because LinkedIn doesn’t allow advertisers to modify their bid based on device. However, things change (like how LinkedIn now allows advertisers to bulk upload accounts for ad targeting), and when they do, we know that we have the data to optimize.

Why do some campaigns perform better on mobile?

We then looked at the content and offers for each of the campaigns, seeking an explanation for why some campaigns performed better on certain devices. Since it can't be because of mobile bid modifications, what could it be? 

We found that Campaign 2 promoted ungated blog content, which is simple to consume on a mobile device. For the campaign to achieve its high conversion rate, however, readers of the blog content still needed to go on to fill out a form during the same session. So, perhaps, the blog content was able to build enough initial trust and engagement to convince readers to fill out a form.

In contrast, a campaign that promotes gated content, like an ebook, asks the reader to fill out a form right off the bat. As seen in Campaign 3 and 4’s mobile conversion rates (7.9% and 0%, respectively), this may not be a particularly effective mobile reader experience.

As a marketer, I understand that sometimes it’s hard to find new insights that impact your marketing. Because you use the same reports week in and week out, you can get in a bit of an insight rut. Sometimes you just have to dig a bit deeper -- and spend a few months in Seattle -- to find the insight sunshine that you were looking for.