The Role of Research in Building the New Chartbeat Publishing, Part 2

A quick recap of Part 1 of this post…
  • We knew Chartbeat Publishing was strained by UX debt
  • We were psyched to introduce some major new functionality into the product
  • We knew that we had to consolidate all our research and fill in knowledge gaps before the full design and dev process
  • We were working out of a glorified bomb shelter next to a demolition site, an atmosphere of adversity which likely hastened our eagerness to redesign

The research effort culminated in the construction of a massive affinity diagram or mental model, which neatly organized all of the chaos of a newsroom into a taxonomy of actions and behaviours. The top level of the mental model consisted of four main categories:

  1. Developing content – actions associated with actually creating content
  2. Assessing content – actions associated with measuring traffic to content
  3. Assessing audience – actions associated with measuring identity and quality of traffic
  4. Developing audience – actions associated with systematically building audience

Taking a look at the mental model, the “developing content” and “assessing content” categories were fairly concise. We had a pretty thorough understanding of the workflows, processes, and product opportunities. But for “assessing audience” and “developing audience,” things were a little murkier. There were a myriad of complex activities that seemed disorganized and in need of rationalization — we had unearthed all kinds of social media tricks and hacks, experiments in link partnerships, attempts to infiltrate Reddit, newsletter optimizations, Outbrain gambits, and a whole slew of other tactics.

And the survey data backed up our feeling that there were more people working on audience development and using Chartbeat than we had originally thought.

We reached two conclusions:

  1. We needed to sit down again with the publishers doing the most progressive work in the area of audience development and try to figure out what we’d missed, if anything, the first time.
  2. And, in parallel, we needed to prototype some ideas that came out of our own hypotheses about how to measure audience quality in a simple way.

We kept the interviews going, delving deeper into the practice of audience development. Organizations with a paywall, or those considering one, were already doing sophisticated analysis of audience data, propensity modeling, and segmentation of ‘quality’ traffic. At other organizations, the role resided with an individual, perhaps a social media manager or audience development manager taking a more ad hoc approach, but the mandate was consistent: find an audience that will visit our site regularly.

In parallel, we were building the first Audience Perspectives prototype for a very limited number of customers. And by “building,” I mean we all jumped on the back of our lead data scientist, who carried us across the goal line after an intense six-week sprint cycle. At the end of that cycle, we had our first Weekly Audience Perspective, which we then showed around to people. The reaction was fantastic. We heard things like “this is exactly what we’re trying to do” and “we’re hiring a team of people to generate insights like this.”

The product strategy crystallized. Customers at all levels of audience development proficiency validated our ideas around propensity modeling, visitor return rate, and editorial planning focused on building a core audience of dedicated readers. To tell the story of what was happening in the industry, we boiled down our research to a couple of succinct artifacts:

1. Organizational Personas

This is a pretty unconventional persona… maybe it isn’t one at all. But it efficiently summarizes what our research found: there’s a continuum, starting with the fundamental and common mandate of creating quality content; it moves to the intermediate stages of measuring content performance; it continues on to understanding the identity and quality of an audience; and then it concludes with systematically developing that audience. Each category is phrased as if the organization were a person with deliberately oversimplified goals:

[Image: org-persona-continuum]

For many publishers, the Develop Content and Assess Content columns captured the extent of their activities and implied their analytics needs. But other organizations were moving to the next level. For them, it was about understanding and developing an audience.

For each of these categories there were fuller descriptions and definitions. Here is a zoomed-in view of those extended definitions for the Assess and Develop Audience categories.

[Image: org-personas-zoom]

For Chartbeat Publishing to succeed, it had to serve the needs of people all across the spectrum, from those with very basic questions about content performance all the way to those who want to acquire and retain audiences in more sophisticated ways.

2. Individual Personas

In addition to understanding organizational goals, we needed to understand specific roles within those organizations and how responsibilities are typically split up. For this, we came up with four personas that overlap to a greater or lesser degree, depending on the organization. At some places, one person may perform three or four of these roles. At other organizations, the opposite is true and there’s additional fidelity within each of the four areas. But for the purposes of our project, this level of specificity felt right.

[Image: role-personas]

Product Direction

How did all of this research finally drive the direction of the dashboard design? Audience development was to be the focus of the new dashboard, but not at the expense of the simple, high-level metrics about content. The dashboard also couldn’t attempt to do too much simultaneously. The Heads Up Display, for example, was a better homepage performance tool than the dashboard could ever be, so the best thing for us to do for a homepage editor in the dashboard was to clearly point the way to the Heads Up Display.

For content creators, producers, and audience development personnel, the dashboard was designed to support their own individual activities and use cases. For example:

For Content Creators:

I want to know how my content is doing. By clicking or pivoting on a story, I can see the identity and quality of that audience: are they new, returning, or loyal? From what traffic sources? Are they engaged or not engaged? Is my article one that’s bringing in visitors who are likely to come back, or are these visitors just tourists?
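To make that pivot concrete, here’s a minimal sketch in Python. The visit data, segment names, and loyalty thresholds are all invented for illustration; they aren’t Chartbeat’s actual definitions or code.

```python
# Hypothetical example: bucket one story's readers into new / returning / loyal
# segments based on how often we've seen each visitor recently.
# (Data and thresholds are made up for illustration.)

visit_counts = {            # visitor_id -> visits to the site in the past two weeks
    "v1": 1, "v2": 3, "v3": 9, "v4": 1, "v5": 12,
}
story_visitors = ["v1", "v2", "v3", "v5"]   # visitors who read this article

def segment(visits):
    """Assign a loyalty segment from a raw visit count (illustrative cutoffs)."""
    if visits <= 1:
        return "new"
    if visits < 8:
        return "returning"
    return "loyal"

breakdown = {"new": 0, "returning": 0, "loyal": 0}
for visitor in story_visitors:
    breakdown[segment(visit_counts[visitor])] += 1

print(breakdown)  # {'new': 1, 'returning': 1, 'loyal': 2}
```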

For Audience Developers:

Starting from the traffic sources and referrer module, I can click to find out what content they are visiting and whether they are quality traffic or not, i.e. engaging longer and recirculating. Is my Facebook campaign bringing people in? More importantly, what are they doing once they’re on the site? How much are they consuming? Am I maxing out my best referrers, the ones with the highest return rates? What are the tweets that are driving the most traffic to my site?
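The “return rate” idea can be sketched in a few lines as well. The data and the week-over-week definition below are illustrative assumptions, not the actual Chartbeat implementation: of the visitors a referrer sent last week, what share came back this week?

```python
# Hypothetical example: per-referrer return rate, defined here as the share of
# last week's visitors from that referrer who came back this week.
from collections import defaultdict

# (visitor_id, referrer, week) tuples standing in for a visit log
visits = [
    ("v1", "facebook.com", 1), ("v2", "facebook.com", 1), ("v3", "news.google.com", 1),
    ("v1", "facebook.com", 2), ("v3", "news.google.com", 2), ("v4", "twitter.com", 2),
]

last_week = defaultdict(set)    # referrer -> visitors it sent in week 1
this_week = set()               # visitors seen in week 2, from any referrer

for visitor, referrer, week in visits:
    if week == 1:
        last_week[referrer].add(visitor)
    else:
        this_week.add(visitor)

for referrer, visitors in last_week.items():
    returned = len(visitors & this_week)
    print(f"{referrer}: {returned}/{len(visitors)} returned ({returned / len(visitors):.0%})")
```

In this framing, a referrer with a high return rate is a strong candidate for more investment, while one that drives spikes of visitors who never come back looks more like the “tourist” traffic described above.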

For Producers:

I want to ensure that all side-door traffic is handled as a priority. Which articles have great traffic volumes but low recirculation, so that I can make sure those pages are juiced up with related links? Where are the highly engaged new visitors on my site, so that I might show them a call to action to follow us on Twitter or sign up for our newsletter? What are the top pages without video content, so I can make sure we’ve got our video in front of as many people as possible?
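The first of those producer questions could be answered with a query along these lines. The page stats and cutoffs are hypothetical; this is a sketch of the idea, not the dashboard’s real logic.

```python
# Hypothetical example: find pages with plenty of traffic but low recirculation,
# i.e. the pages that most need related links. Data and thresholds are made up.

pages = [
    {"path": "/politics/budget-vote", "visitors": 4200, "recirculation": 0.04},
    {"path": "/sports/derby-recap",   "visitors": 3100, "recirculation": 0.22},
    {"path": "/tech/gadget-review",   "visitors": 2800, "recirculation": 0.06},
]

MIN_VISITORS = 1000        # "great traffic volumes" -- illustrative cutoff
MAX_RECIRCULATION = 0.10   # "low recirculation" -- illustrative cutoff

needs_related_links = sorted(
    (p for p in pages
     if p["visitors"] >= MIN_VISITORS and p["recirculation"] <= MAX_RECIRCULATION),
    key=lambda p: p["visitors"],
    reverse=True,
)

for p in needs_related_links:
    print(f"{p['path']}: {p['visitors']} visitors, {p['recirculation']:.0%} recirculation")
```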

For anyone who opens up the new Chartbeat Publishing dashboard, we want to answer your questions about the things you care about: your content, your social media campaign, your video or mobile site. And we want to introduce the core concepts of audience development: paying attention to the right lead indicators (Engaged Time and Recirculation), the difference between a high-converting and a low-converting traffic source, and content that doesn’t just get clicks but actually contributes to the development of your returning and loyal audience.

Wrapping Up

The research gave us all the confidence we needed to push ahead with the actual design process. For a product to succeed, it needs to fill a need and be simple and easy to use. The first part of that equation was solved: the need was clear and well defined.

Next week Tom Germeau will delve into the interaction and design process and the realization of the dashboard.
