On March 31, the Media Rating Council (MRC) announced it was lifting its advisory on viewable impressions for display advertising, bringing the industry one step closer to transacting on viewability for the first time. The point at which publishers are asked to deliver highly viewable campaigns is rapidly approaching. If you haven’t started to develop a strategy to maximize the viewability of your ads, I’d wager that in the next three months, you will.
There are many tactics that can be applied to improve your ads' viewability: ensuring fast ad loads, lazy-loading advertisements, and redesigning a website to feature always-in-view units.
One issue has gotten surprisingly little discussion, though: Ads are much more viewable on pages that people actually want to read. Take a look at the following figure, which was computed from a sample of one billion ad impressions during May 2014.
We see there’s a strong relationship between what fraction of ads are seen and how long a person spends reading the page: as Engaged Time increases from 15 seconds to one minute, viewability goes up by over half, from 37% to 57%. Visitors who read for more than 75 seconds see more than 60% of advertisements.
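For the curious, here's a rough sketch of how a statistic like this can be computed. This is purely illustrative Python; the record format (an `engaged_time` in seconds plus a `viewable` flag per impression) is an assumption made for the example, not our actual pipeline.

```python
from collections import defaultdict

def viewability_by_engagement(impressions, bucket_seconds=15):
    """Group ad impressions into Engaged Time buckets and compute
    the fraction of impressions that were viewable in each bucket."""
    seen = defaultdict(int)    # bucket -> viewable impressions
    total = defaultdict(int)   # bucket -> all impressions
    for imp in impressions:
        bucket = imp["engaged_time"] // bucket_seconds * bucket_seconds
        total[bucket] += 1
        if imp["viewable"]:
            seen[bucket] += 1
    return {b: seen[b] / total[b] for b in total}

# Hypothetical sample records, just to show the shape of the output
sample = [
    {"engaged_time": 10, "viewable": False},
    {"engaged_time": 12, "viewable": True},
    {"engaged_time": 70, "viewable": True},
    {"engaged_time": 75, "viewable": True},
]
rates = viewability_by_engagement(sample)  # e.g. {0: 0.5, 60: 1.0, 75: 1.0}
```

At scale you'd run the same aggregation over a full impression log, but the bucketing logic is no more complicated than this.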
This isn’t too surprising. Of course, people who read pages more deeply see more of the ads on the page, but it’s still worth taking note. We’ve argued for years that articles with higher average Engaged Time should be promoted because they represent the articles your audience is most interested in, and—in the days where viewability is more critical than ever—promoting your most deeply read articles makes good business sense, too.
Want to know more about traffic sources and how they can help you understand your audience's behavior? Download our guide.
Over the past year, we’ve published extensive research on how to use data to understand and build your audience — everything from the effects of Engaged Time to scrolling behaviors to the traffic sources driving visits to the sites in our network. All of the data in those pieces comes from a set of customers who allow us to use their data in anonymous, aggregated form. Looking at statistics aggregated across a wide swath of sites is interesting because it lets us identify network-wide facts.
But, subtle patterns often get averaged out, so it’s hard to tell a nuanced story using aggregated data. Today, in partnership with New York Magazine and Rick Edmonds and Sam Kirkland of Poynter, we’re excited to present something different: a deep look into the data for one site, New York Magazine’s Vulture.com, about what factors drive visitor loyalty. (A quick note: This data is presented with the consent of New York Magazine and Vulture.com. Chartbeat never shares customer-specific data.)
If you’re going to read one piece, I’d highly encourage you to click over and read the Poynter team's piece, which contains much of the data given below, as well as extensive feedback from the Vulture team. But, we also wanted to present our own take on the data, which you’ll find below. Our goal is less to provide answers than to get you thinking about what questions you might ask of your own site.
How We Define “Loyalty” and Why It's Important to Measure
Before we can look at how visitors become loyal to a site, the first thing to do is define loyalty. Informally, by “loyal” we mean something like “a person who is highly likely to continue to return to the site across time.” For instance, a person might be loyal to the site of their daily newspaper. One way of getting toward a specific definition using the data is by asking how many times a person must visit before we’re nearly certain they’ll continue to return. In the figure below, we plot the probability that a person will return to Vulture.com, given the number of times they’ve already been to the site.
There are perhaps three things worth noting on this plot:
Visitors who have come once so far in a month are just over 20% likely to return.
That rate of return climbs rapidly until we reach visitors who have visited five or six times. Once a person has come five or six times in a month, we can be highly confident that they’ll continue to return.
The downward slope on the right side of the graph is a windowing effect of looking at one month of data: people are unlikely to come every single day in a month, so once a visitor has come more than about 22 times, their probability of returning again begins to decrease.
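To make the statistic behind this plot concrete, here's a minimal Python sketch of how the return probability can be estimated from visit counts. The input format is a hypothetical simplification for the example; the real computation runs over much larger logs.

```python
from collections import Counter

def return_probability(visit_counts):
    """Given each visitor's total number of visits in the month,
    estimate P(return | already visited k times): of all visitors
    who reached k visits, what fraction made a (k+1)th visit?"""
    totals = Counter(visit_counts)  # visits-in-month -> number of visitors
    max_visits = max(totals)
    probs = {}
    reached_k = sum(totals.values())  # visitors with at least k visits
    for k in range(1, max_visits):
        reached_next = reached_k - totals[k]
        probs[k] = reached_next / reached_k
        reached_k = reached_next
    return probs

# Hypothetical month: 10 visitors came once, 2 came twice, 1 came three times
counts = [1] * 10 + [2] * 2 + [3]
p = return_probability(counts)  # p[1] is low, p[2] much higher
```

Plotting `probs` against `k` for your own site gives you the same curve we show for Vulture, and the point where it flattens out is a reasonable working definition of a "loyal" visitor.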
The Relationship Between Time of Day and Return Rate
After asking if visitors returned to the site, the next question was when visitors returned. One of the most striking data points we found was that visitors are far more likely to return at the same time of day as that of their initial visit — those who first visit the site today at noon are most likely to come back to the site tomorrow at noon, and so on. While that pattern is significant throughout the day, for Vulture it’s substantially stronger for visitors who come in the afternoon and evening, as demonstrated in the figure below.
In this figure, we’re comparing two sets of visitors: those who first arrive on a Wednesday between 10:00 a.m. and 10:59 a.m. and those who arrive on the same day, but between 6:00 p.m. and 6:59 p.m. The red lines show what hours of the day the 10 a.m. visitors return to the site throughout the rest of the month, and the blue lines represent the same statistics for the 6 p.m. visitors. For both audiences, the vast majority of time spent on other days of the week is at the same time of day — for instance, the 10 a.m. audience is most likely to return on Tuesday, Wednesday, or Thursday at about 10 a.m. What’s striking, though, is that the 6 p.m. audience spends dramatically more time on site throughout the week when compared to the 10 a.m. crowd. It’s worth noting that, though we’re showing traffic from Wednesday morning and evening, the basic pattern holds for those who arrive at other hours on other days.
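If you want to compute a breakdown like this for your own site, here's an illustrative Python sketch. The `(visitor_id, day, hour)` tuples are a stand-in for whatever your logs actually contain.

```python
from collections import defaultdict

def return_hour_histogram(visits):
    """For each visitor, find the hour of their first visit, then
    tally the hours at which they return on later days.
    `visits` is a list of (visitor_id, day, hour) tuples."""
    first = {}  # visitor -> (day, hour) of earliest visit
    for vid, day, hour in sorted(visits, key=lambda v: (v[1], v[2])):
        first.setdefault(vid, (day, hour))
    # hist[first_hour][return_hour] = number of return visits
    hist = defaultdict(lambda: defaultdict(int))
    for vid, day, hour in visits:
        fday, fhour = first[vid]
        if day > fday:  # only count visits on later days as returns
            hist[fhour][hour] += 1
    return hist

# Hypothetical log: visitor "a" first came at 10:00 on day 1,
# then returned at 10:00 on days 2 and 3, and once at 18:00
log = [("a", 1, 10), ("a", 2, 10), ("a", 3, 10), ("a", 3, 18)]
h = return_hour_histogram(log)  # h[10][10] == 2, h[10][18] == 1
```

Comparing the rows of `hist` for, say, 10 a.m. first-timers versus 6 p.m. first-timers is exactly the comparison shown in the figure.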
One theory might be that this variation is caused by a difference in the topics consumed — perhaps, for instance, readers are engaging with Vulture's TV coverage during the afternoon and evening. Interestingly, we saw no evidence that this is the case: the breakdown of traffic by topic is roughly constant throughout the day. On the other hand, this variation in return times lines up extraordinarily well with device usage. In the early daytime, when traffic is less likely to return, upwards of 40% of traffic is mobile. In the evening, when traffic is much more predictable and more likely to return, mobile falls to only 22% of overall traffic.
This data raises more questions than it answers: What can be done to get the morning audience to come back more frequently? How can editors take advantage of the daily patterns of their evening readers? Answering those questions is out of the scope of this article, but the upshot here is that there is a hugely interesting opportunity in understanding behavior as it relates to time of day.
Improving Return Rates of New Visitors
Obviously, one key challenge for any publication is in getting new, incidental visitors to move down the funnel toward loyalty. We saw three factors that exhibited significant influence over a new visitor’s probability of returning: how they arrived at the site, the type of content they landed on, and how much time they spent reading.
Vulture’s top referrers are similar to what we see across the internet, as are their relative rates of return. Unsurprisingly, new visitors coming from its sister site nymag.com are most likely to return (22%), followed by those from Twitter (16%) and Buzzfeed (10%). Perhaps surprisingly, the length of an article proved to be a strong predictor of likelihood to return, as shown below.
Stepping through this graph from left to right:
Visitors who land on the shortest articles are extremely unlikely to return, but their probability of return rapidly increases from there.
Those who view the Vulture homepage, forming the first peak at about 3,900 pixels, are substantially more likely to return than those who view average-length articles, which are 4,000-4,500 pixels high.
However, those who visit longer articles — this article, for example — are substantially more likely to return.
Visitors who spend substantial time reading on the first page they land on are also much more likely to return to the site. Overall, this confirmed an editorial hunch the Vulture team had, that they were better off moving away from extremely short pieces of content.
But that’s the Vulture team specifically; shorter posts may work best for your site. We dove into this study with Vulture.com precisely because every site is different: the content is different, the people visiting are different, the goals and metrics are different. I hope you and your team will see this data as a starting point for everything you can be looking at and acting on. There's a lot more richness to your site's data than purely traffic numbers. If you need help getting started and knowing what to look for — Chartbeat or not — just send me an email at firstname.lastname@example.org.
Current estimates are that nearly 100 million viewers tuned in to watch Seattle’s 43-8 win against Denver last night. Of course, there’ll be many reports that dissect the ways we watched the game, but for us, one particular area of interest is the prevalence of multi-device viewing. The concept of the “second screen”—people consuming media on multiple devices simultaneously—gets a lot of discussion these days, and sports sites are perhaps the best study in second screens. Sports fans still consume the vast majority of games on TVs but, while watching, they might also scan stats, highlights, and commentary on their phones, tablets, and computers.
That’s why I found myself flipping back and forth last night between a livestream of the game, my Chartbeat Publishing Dashboard, and an Emacs window, trying to figure out how online traffic varied throughout the night. Whereas on a typical night it’s hard to correlate real-world events with online behavior, last night’s game was different. Whether you were watching online or on television, the commercials and game events happened at the exact same moment, which gave us the opportunity to watch second-by-second shifts in web traffic.
One of the most interesting observations was how much online traffic fluctuated before and after commercial breaks. Across sports sites, we saw upticks of 5% to 15% in traffic just as the game went to a commercial break, and that traffic drained off just as quickly when the game resumed play. That trend was present across every commercial break during the game. Perhaps unsurprisingly, the vast majority of those upticks were on mobile devices.
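Measuring an uptick like this is straightforward if you have per-second concurrent visitor counts. Here's an illustrative Python sketch with made-up numbers, not the actual figures from the game.

```python
def break_uptick(concurrents, break_start, window=30):
    """Compare average concurrent visitors in the `window` seconds
    before a commercial break with the same window after it starts.
    `concurrents` is a list of per-second concurrent counts."""
    before = concurrents[break_start - window:break_start]
    during = concurrents[break_start:break_start + window]
    avg_before = sum(before) / len(before)
    avg_during = sum(during) / len(during)
    return (avg_during - avg_before) / avg_before * 100  # percent change

# Hypothetical series: traffic steps from 1,000 up to 1,100
# concurrents when the break starts at second 60
series = [1000] * 60 + [1100] * 60
pct = break_uptick(series, break_start=60, window=30)  # 10.0 (% uptick)
```

Segmenting `concurrents` by device before running this comparison is how you'd see that the uptick is overwhelmingly mobile.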
After watching that trend for the first half, I expected a similar increase in traffic during halftime. But, interestingly, halftime elicited exactly the opposite response; sports traffic dropped by 15% to 50% during the break, and the majority of that drop was on mobile.
Because it’s so difficult to know for certain that the same person is using multiple devices, most analyses of second-screen behavior have measured device usage via surveys. In this case, though, because we saw behavior that was so tightly coupled to events taking place on TV screens, we can start to get a sense of the scale of multi-device usage across the web. And, with patterns in usage as strong as we saw, it’s clear that a large portion of people tuning in were actively engaged on second screens in response to game events.
For the final installment of our series on Understanding Your Traffic Sources, I wanted to go over some best practices for managing referral traffic and identify a few places where you can use Chartbeat data to support your decision-making.
But first, let's sum up the data that we've seen over the past few weeks. The graphic below shows what sort of browsing behaviors are indicative of visitors coming back to your site, based on many sites' most common traffic sources.
At one extreme, we have visitors who come directly to your site's homepage and are highly likely to return. At the other, those who come via Google News are unlikely to return, regardless of how they read. In the middle, though, we have an interesting split:
Visitors who come from Facebook are likely to read most of the article they land on, but those who click to a second article are much more likely to return
For visitors from Twitter and Google search, on the other hand, consuming the entire article they land on is the best indicator of a likelihood of returning
Traffic from other, smaller sources tends to behave much like Google News or Twitter traffic in this graphic. Now that we have a sense of how different kinds of referral traffic behave, I’ll dive right into what actions you can take with this data.
Where, and how, to concentrate your efforts
One of the starkest data points we've come across is how much more likely a person is to return to a site via the referrer they come from versus all other referrers combined. Those who come from Facebook are likely to return only via Facebook, those who come from Google News are likely to return only via Google News, and so on. In that sense, the most important thing you can do to grow audience from a given referrer is maintain a steady stream of links from that referrer.
Given that, you should ask two questions. First, what sources should we concentrate on building traffic from? Second, what can we do to build that traffic?
The best way to decide the former, if you're a Chartbeat Publishing client, is to take a look at the "return rate" and "return direct rate" columns of your Weekly Perspectives. Those columns express, in essence, the value of links from different referrers — those with higher return rates send traffic that's more likely to return to your site.
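For those without access to those reports, an equivalent return rate is easy to compute from raw logs. Here's an illustrative Python sketch; the input structures are assumptions made for the example, not our reporting format.

```python
from collections import defaultdict

def return_rate_by_referrer(first_visits, returned):
    """Fraction of each referrer's new visitors who came back.
    `first_visits` is a list of (visitor_id, referrer) pairs for each
    new visitor's landing session; `returned` is the set of visitor
    ids seen again later in the period."""
    total = defaultdict(int)
    came_back = defaultdict(int)
    for vid, ref in first_visits:
        total[ref] += 1
        if vid in returned:
            came_back[ref] += 1
    return {ref: came_back[ref] / total[ref] for ref in total}

# Hypothetical data: two Facebook landings, one Google News landing,
# and only visitor "a" ever came back
firsts = [("a", "facebook"), ("b", "facebook"), ("c", "google_news")]
rates = return_rate_by_referrer(firsts, returned={"a"})
```

Ranking referrers by this rate, rather than by raw visit volume, is the comparison that tells you which sources are building a repeat audience rather than one-off reach.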
If you don't have access to Chartbeat Publishing, the general trend we've seen is that, unsurprisingly, visitors from social sources have the highest likelihood of returning, while sources like Google News, Reddit, and Outbrain are likely to increase your site's reach by sending new visitors but are unlikely to meaningfully help you grow your audience in a self-sustaining way.

The second question, of course, is much harder to answer in broad terms. Taking each traffic source one by one, though:
Twitter: One thing we've seen many times is that people don't promote posts nearly as often on Twitter as they should. Most sites see the majority of their Twitter traffic coming from their own tweets, and the lifetime of a tweet is incredibly short. Tweeting headlines is rarely the right choice.
Facebook: Facebook traffic typically comes from organic sharing, which means it's harder to predict and control. One thing you can control is Facebook's preview text, and it's hugely important. If you don't know what text is showing up on Facebook's previews, you need to figure it out.
In-network sites: If your site is part of a network, working to maintain links from your sister sites is critical. It’s not uncommon to see return rates over 50% (about twice as high as for typical referrers) for in-network traffic, which is a function both of similarity of audience and of the regularity of links. Fostering these types of link partnerships is one of the best ways to sustainably build audience.
Google: First off, it’s critical to separate “branded” search (searches for your domain name or URL) from truly organic search and Google News. Branded search should be thought of as akin to direct traffic. Optimization for organic search is a huge topic unto itself and probably beyond the scope of this post.
A caveat for paywall sites
One place where sites often miss out is with paywalls that are porous for traffic from external referrers, only presenting a prompt to subscribe on later pageviews. Under that scheme, a visitor who always comes from Twitter and only reads the article she lands on will never even be asked to subscribe. We've seen some publishers move toward differentiated paywalls for exactly this reason: traffic from some referrers is immediately asked to log in, while visitors from other referrers are allowed to read an article or two for free.
If that fine-grained control isn't in the cards, your goal should always be to get visitors to read through to a second article. Looking at "subsequent time" in your Weekly Perspectives should give you some idea of which referrers send visitors who are likely to click to a second page, so concentrating on getting traffic from those referrers makes sense. And understanding where people leave each article will give you a clue about where you should be placing link suggestions. Great related links at the top of an article aren't in view for visitors who read the whole page, and great links at the bottom of an article don't matter to those who never scroll down to see them.
We've hardly scratched the surface of what can be said about traffic sources. Much of the most exciting data is easiest to find under the hood of your dashboard – the data that's specific to your site, not the internet as a whole. We're working on putting out several case studies that look in detail at traffic for a few sites, which we'll be sure to let you know about here once they go live.
In the meantime, thanks for reading, and if I can leave you with one message it's this: experiment!
What we've presented over the past five articles are broad statistics about traffic across the internet, but we regularly see sites that wildly depart from the average. If you see a return rate of 10% from a given referrer, take that as a challenge and try getting traffic to a different set of links from that referrer and see if you can push next week's rate to 11%.
Let me know your questions or what you're seeing in your data in the comments here or by tweeting at @joshuadschwartz; I'll be sure to come back to your site if you get in touch.