A Closer Look at Desktop vs Mobile Engagement

Background

A few weeks ago, our support team received an email from an Advanced Queries-savvy customer at the Poughkeepsie Journal who wanted to examine how user engagement with their top articles differed between desktop computers and mobile devices. He wanted to better understand any differences in behavior so he and his team could be more thoughtful about their content and promotion strategy.

In his initial email, he had already done quite a bit of work: he'd created a spreadsheet using Advanced Queries' ‘most popular articles’ format, grouped the results by device type, run them through an Excel spreadsheet to find the top 140 or so matches (since not every story ranked in the top 200 on both platforms), and even created a new field called ‘spread’, which measures the difference between a story's desktop rank and its mobile rank. But he needed our help to go deeper and see if we could pull out any meaningful insights.
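For readers who'd rather script this than spreadsheet it, here's a minimal sketch of the same setup in pandas. The file names and the "title"/"rank" columns are hypothetical stand-ins, not the actual Advanced Queries export format:

```python
# A minimal sketch of the customer's spreadsheet work in pandas.
# The file names and "title"/"rank" columns are hypothetical stand-ins
# for whatever the Advanced Queries export actually contains.
import pandas as pd

desktop = pd.read_csv("desktop_top200.csv")  # top 200 stories by desktop engagement
mobile = pd.read_csv("mobile_top200.csv")    # top 200 stories by mobile engagement

# An inner join keeps only the stories that ranked in the top 200 on
# BOTH platforms (roughly 140 matches in the customer's data).
ranks = desktop.merge(mobile, on="title", suffixes=("_desktop", "_mobile"))

# "spread": how far apart a story's desktop and mobile ranks sit.
ranks["spread"] = ranks["rank_desktop"] - ranks["rank_mobile"]

print(ranks.sort_values("spread", ascending=False).head())
```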

As the biggest Advanced Queries nerd on the team, I jumped at the chance to tackle this problem.

What Engagement Means on Different Devices

Before getting into the numbers, let's recall exactly what Chartbeat measures: a user's engagement with your page. Chartbeat listens for scroll events, keystrokes, and mouse movements to determine when a user is engaged. When none of that is happening (for example, when a user switches to a background tab on a desktop device), Chartbeat stops attributing engagement.

Readers on desktop devices who leave background tabs open and never return can ultimately dilute the average engaged time number: while they're still being counted as concurrents, they contribute no engaged time. On mobile, however, it's a different story: once a user moves to a new app, opens a new tab, or closes the phone, the session times out.
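To see why those idle tabs matter for the averages, here's a toy example with made-up numbers:

```python
# A toy illustration with made-up numbers: idle desktop sessions count
# as concurrents but contribute ~0 seconds of engagement, dragging the
# average down.
engaged_seconds = [40, 55, 35, 50]  # four active readers
idle_tabs = [0, 0, 0, 0]            # four abandoned background tabs

active_avg = sum(engaged_seconds) / len(engaged_seconds)
all_sessions = engaged_seconds + idle_tabs
diluted_avg = sum(all_sessions) / len(all_sessions)

print(active_avg)   # 45.0 seconds among readers who actually engaged
print(diluted_avg)  # 22.5 seconds once idle tabs are averaged in
```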

Phase One: How Does Engagement Compare?

Our first question was which device type usually sees higher average engagement, so the first step was to compare engagement ranks between devices.

[Figure: scatterplot of each story's mobile engagement rank vs. its desktop engagement rank]

The key metric we're using here is a given story's rank on the most engaging articles list for each device type; the larger the rank number, the less engagement the story received. In the graph, each dot represents a single story, and its placement on the axes represents its rankings for each device type.

What we see loosely follows a 1:1 relationship: as a story's rank on mobile increases, its rank on desktop increases at about the same rate. That makes sense: you'd generally expect a highly engaging mobile story to also be engaging for desktop readers.
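If you want to quantify that relationship rather than eyeball it, a rank correlation is the natural tool, since we're comparing rankings rather than raw engaged times. Here's a sketch reusing the hypothetical `ranks` DataFrame from the earlier snippet:

```python
# A sketch of the rank comparison, reusing the hypothetical `ranks`
# DataFrame from the earlier snippet. Spearman correlation fits here
# because we're comparing rankings, not raw engaged times.
import matplotlib.pyplot as plt
from scipy.stats import spearmanr

rho, p_value = spearmanr(ranks["rank_desktop"], ranks["rank_mobile"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

plt.scatter(ranks["rank_desktop"], ranks["rank_mobile"], alpha=0.6)
plt.plot([1, 200], [1, 200], linestyle="--")  # 1:1 reference line
plt.xlabel("Desktop engagement rank")
plt.ylabel("Mobile engagement rank")
plt.show()
```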

However, it's also pretty clear that more stories sit above that trend line than below it, showing that more stories' mobile versions ranked better than their desktop counterparts.

Because of those technical differences, mobile generally sees fewer of the deflating sessions that drag down average engaged time, so it's not surprising to see higher average engaged times on mobile. Nonetheless, when you compare the desktop and mobile average engagement ranks, there's still a clear correlation.

But we wanted to go deeper. What happens in between these ranks is where we can learn how substantial that variation really is.

Phase Two: Looking Closer

By measuring how strongly user engagement varies between a story's desktop and mobile versions, we can examine each story closely and see whether a pattern emerges among the stories with large engagement variations.

To explore this, I looked at the distribution of the differences between average mobile and desktop engaged time for each story:

[Figure: histogram of the per-story differences between mobile and desktop average engaged time]

I found that across the top 139 stories, mobile readers spent an average of 6 more seconds engaging with a story than desktop visitors, further evidence of higher average engaged times on mobile.

In fact, 15 stories (about 10% of the total) had mobile average engaged times at least 25 seconds greater than their desktop counterparts. Digging deeper, I found that each of these stories has a highly promoted video at the top of the page.
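For the curious, this whole phase of the analysis is only a few lines of pandas, assuming the exports also carry per-story average engaged time in seconds (the engaged-time column names below are hypothetical placeholders):

```python
# A sketch of the phase-two analysis, assuming the exports also carry
# per-story average engaged time in seconds. The column names
# "avg_engaged_desktop" and "avg_engaged_mobile" are hypothetical.
import matplotlib.pyplot as plt

# Per-story delta: positive means mobile readers engaged longer.
ranks["delta"] = ranks["avg_engaged_mobile"] - ranks["avg_engaged_desktop"]

print(ranks["delta"].mean())            # ~6 seconds in the Journal's data

# Stories where mobile out-engaged desktop by at least 25 seconds.
big_gaps = ranks[ranks["delta"] >= 25]
print(len(big_gaps))                    # 15 stories (~10%) in this case

plt.hist(ranks["delta"], bins=20)
plt.xlabel("Mobile minus desktop average engaged time (seconds)")
plt.ylabel("Number of stories")
plt.show()
```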

Remember, Chartbeat Publishing engaged time only tracks engagement with the page, not the video player, so we know the higher engaged times aren't coming from users watching the videos. It appears that on the Poughkeepsie Journal, mobile readers really are just spending more time engaging with the longer-form content that follows the video. This could be because users are conscious of running up a huge data bill, or because they'd prefer to digest the narrative quickly and quietly.

Either way, there's a clear trend here: the five pieces with the largest discrepancies between mobile and desktop have captivating video, whereas the bottom five stories with negative deltas don't.

What does this mean for the Journal? I think it means they should trust their gut. It's a great reminder that producing multimedia content shouldn't come at the expense of written content: mobile readers will thank you for including both (and probably watch that video when they get home)!

