Author Archive

The Role of Research in Building the New Chartbeat Publishing, Part 2

December 6th, 2013 by Adam

A quick recap of Part I of this post…
  • We knew Chartbeat Publishing was strained by UX debt
  • We were psyched to introduce some major new functionality into the product
  • We knew that we had to consolidate all our research and fill in knowledge gaps before the full design and dev process
  • We were working out of a glorified bomb shelter next to a demolition site, an atmosphere of adversity that likely fueled our eagerness to redesign

The research effort culminated in the construction of a massive affinity diagram or mental model, which neatly organized all of the chaos of a newsroom into a taxonomy of actions and behaviors. The top level of the mental model consisted of four main categories:

  1. Developing content - actions associated with actually creating content
  2. Assessing content - actions associated with measuring traffic to content
  3. Assessing audience - actions associated with measuring identity and quality of traffic
  4. Developing audience - actions associated with systematically building audience

In the mental model, the “developing content” and “assessing content” categories were fairly concise. We had a pretty thorough understanding of the workflows, processes, and product opportunities. But for “assessing audience” and “developing audience,” things were a little murkier. There were myriad complex activities that seemed disorganized and in need of rationalization — we had unearthed all kinds of social media tricks and hacks, experiments in link partnerships, attempts to infiltrate Reddit, newsletter optimizations, Outbrain gambits, and a whole slew of other tactics.

And the survey data backed up our feeling that there were more people working on audience development and using Chartbeat than we had originally thought.

We reached two conclusions:

  1. We needed to sit down again with the publishers doing the most progressive work in the area of audience development and try to figure out what we’d missed, if anything, the first time.
  2. And, in parallel, we needed to prototype some ideas that came out of our own hypotheses about how to measure audience quality in a simple way.

The Role of Research in Building the New Chartbeat Publishing, Part 1

December 5th, 2013 by Adam

Last fall, the Chartbeat product team was hunkered down in an office space that could've made an excellent interrogation room. We temporarily obtained this 500 square-foot room to augment our main office, a sardine can of developers, designers, analysts, scientists, and a growing sales and marketing team.

It was… austere: four brick walls and a cement floor. There was a glass-topped table in the middle, a whiteboard, and a phone. Two windows separated us from the round-the-clock demolition going on in the adjacent lot, and you almost always had to shout to be heard. We even called it the murder room.

We were examining the roadmap for the Chartbeat Publishing dashboard. There was a lot on the table—all kinds of ideas for functionality that we wanted to add to a product that was starting to look like it had too much going on. There was no way we were fitting it all into the current UI. The bulldozers were looking like a good idea. It was time to start from scratch.

But in reality, prep work for a top-to-bottom overhaul was already well underway. We had initiated a massive effort at capturing the state of the newsroom and the publishing industry, and were already thinking about how to align Chartbeat's services with those conclusions.

Our Research Effort

Research is an ongoing practice at Chartbeat. We’re constantly talking to our clients, figuring out how they work and why they do what they do – even sketching out ideas together and evaluating concepts. Nevertheless, heading into the project, we wanted to consolidate all of our meeting notes and interviews, and confidently answer the following questions…

  • Who's using our product?

  • What are their motivations, needs, and philosophies?

  • Where's the industry going?

  • What will the newsroom look like in a year or two?

  • How will editorial roles evolve?

If we started by simply answering these questions, we knew good things would happen.

Gathering Information

To approach our research challenge systematically, we used ethnographic methodologies:

Interviews and Field Studies

If we weren't on the phone firing away at our customers with non-leading questions, Mona Chaudhuri and I were hitting our clients’ offices on a semi-daily basis throughout most of 2012—picking brains, hearing war stories, watching them work, and bouncing ideas around.

Copious interview notes came from these many, many meetings at places like The Blaze, NBC News, the Wall Street Journal, The New York Times, CNN Money, Fast Company, Slate, Financial Times, and dozens more. If you had an office in New York, one of us was knocking on your door. And if your offices were outside of New York, we were there too: Washington, London, Toronto, Berlin, San Francisco.

Diary Studies

We asked a diverse group of Chartbeat customers to keep journals of their day-to-day activities. The journals were written over the course of three weeks on Posterous (R.I.P.) blogs. Some participants were given iPads to make entering notes and ideas easier. The iPads were great as a lightweight field tool, but even more so as an incentive to keep participants motivated.

We had some very prolific contributors, for example Jonathan Tesser, then at New York Magazine. Reading what his day was like in his own words was a fantastic window into newsroom issues. The ups and downs were so much more tangible—you could really feel the personal challenges in a way that other research methods just couldn't uncover.

Surveys

To get a quantitative perspective on newsroom ethnography, we conducted a survey, which asked people about their role, their three day-to-day responsibilities, and the three long-term objectives they were evaluated on.

Processing the Data

We dug into the survey data and immediately surfaced some interesting findings. For example, 64% of respondents reported themselves as some type of “content creator,” and 36% identified their role as being at least partially on the business side. In our fieldwork we were still talking primarily to editors and writers, so it was something of a surprise to see that one in three people had some involvement in other aspects of the business, too.

We took the diaries and interview notes and boiled them down, then reduced them, and then reduced them some more into a mental model diagram (shout out to Indi Young and her fantastic book on the subject). The mental model represented everything we knew about newsroom behavior—it contained every discrete action or behavior taken by people on the front lines of a newsroom. There were a lot, and they were extremely varied. For example:

  • “curate third-party content on Tumblr”

  • “harass writers to meet their deadlines”

  • “look for dead spots on the homepage”

above: a couple branches of the mental model

 

above: a grouping of activities within the branch “Understanding referrer sources”

We consolidated the individual actions—several hundred—into larger groups. For example, “curate third-party content on Tumblr” was put into a group called “build off-site brand presence.” And finally, all the groups were assembled into four high-level categories:
  1. Developing content

  2. Assessing content

  3. Assessing audience

  4. Developing audience

Everything that we observed and captured fit into one of those four categories. That gave us a way to maintain a broad perspective on the publishing business as a whole, with the means to narrow our focus down to specific workflows and actions through a highly organized affinity diagram.

At this point our ‘forensics’ work was done. Well, it’s never done, but we’d just completed a very thorough and immersive look at newsroom culture, workflow, and the state of the publishing industry.

The output of this work - the mental model - gave us something to measure our product against as well. What actions were we supporting and not supporting? We brainstormed all the realistic and totally unrealistic things we could do to help our customers across the many facets of their work.

Tomorrow, in part two of this post, we’ll focus on some specific findings from the research and how we used them to roadmap the next incarnation of Chartbeat Publishing.

Get Your End-of-Day Insights: Introducing Chartbeat Daily Perspectives

April 8th, 2013 by Adam

We figure you might not have your eyes on your Chartbeat Dashboard all day. We know that you're busy.

But even if you're paying close attention, there is some information that can't be fully realized or understood until the dust settles - once the day is over, all the pings have been counted and the final results are in.

Our motivation for creating Chartbeat Perspectives was simple: For a long time, Chartbeat customers have been asking us for an end-of-day summary that measured the totality of what happened on their site the previous day. Basically, you guys requested some more context around your site’s data, and we heard you loud and clear.

But we wanted to take things a step further - in addition to providing a general overview of your must-know information (traffic volumes, top stories and sections, top referrers), we wanted to uncover the more significant insights that help newsrooms really piece together what happened. The insights that contribute to a larger understanding of the value of your content as it pertains to the quality of your audiences' experiences.

Starting with the basics, the Daily Perspective surfaces simple but slightly elusive information, for example:

  • Who was your biggest referrer?
  • What was your top social page? Search page?

  • Which page attracted the most new visitors?

It also gets at more deeply hidden information:

  • How many stories were published site wide, or in a given section?

  • Was that number up or down and does it explain traffic volumes?

And finally, the traffic anomalies worth knowing about - for example, if search or social traffic was substantially up or down. We'll also tell you if you succeeded in getting your sideways traffic to stick around and read more.

Where do you find these insights? Let's take a quick tour....

Top Stories by Audience Engagement

The Daily Perspective starts with a list of your top stories, as measured by total engaged minutes.

"What the blazes is an engaged minute?" you ask.

An engaged minute is a minute a visitor spends actually reading, watching, or writing – a minute invested in your content. When a visitor goes 'idle' (no keyboard or mouse events in the active tab), Chartbeat stops counting Engaged Time.

Engaged Time is important because it tells us something not only about the volume of traffic, but also the quality of the content - did people fully consume the piece or did they chase a fun-sounding headline only to leave once the story fell short of their expectations?
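The idle-detection idea above can be sketched in a few lines. This is an illustrative toy, not Chartbeat's actual implementation; the five-second cutoff and the idea of feeding in raw activity timestamps are assumptions made for the example:

```python
# Toy sketch of engaged-time counting. The IDLE_CUTOFF value is an
# assumption for illustration, not Chartbeat's real threshold.
IDLE_CUTOFF = 5.0  # seconds of silence before we assume the visitor went idle

def engaged_seconds(activity_times):
    """Sum the gaps between consecutive activity events (clicks, key
    presses, scrolls), capping each gap at IDLE_CUTOFF so that idle
    stretches don't count as engagement."""
    total = 0.0
    times = sorted(activity_times)
    for prev, curr in zip(times, times[1:]):
        total += min(curr - prev, IDLE_CUTOFF)
    return total

# A visitor active at t=0, 1, 2, idle until t=60, then active once more:
# gaps are 1, 1, 58 (capped to 5), and 1 -> 8.0 engaged seconds.
print(engaged_seconds([0, 1, 2, 60, 61]))
```

The real measurement happens client-side by watching keyboard and mouse events in the active tab; the point of the sketch is simply that idle stretches are excluded from the tally.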

That's Engaged Time. Now back to the tour. Here's the top stories view.

top stories

For each of the top stories we show you a graph of concurrent visitors along with the peak number, as well as the Average Engaged Time on the story (the average amount of Engaged Time a visitor spent on it).

You'll be surprised to find that often the story with the most concurrent visitors didn't have the most Total Engaged Time. In fact, there will be stories in this list that only briefly broke the top ten in your dashboard – or maybe didn’t reach the top ten at all. Or the story was big in the evening or early morning when you weren't at your dashboard (we'll see an example of that later).
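The arithmetic behind that surprise is simple: Total Engaged Time rewards depth as well as volume, while concurrents reward volume alone. A toy comparison, with all visit numbers invented purely for illustration:

```python
# Hypothetical engaged seconds contributed by each visitor to two stories.
# These numbers are invented for illustration only.
visits = {
    "story-a": [30, 40, 20, 10],   # four visitors, shallow reads
    "story-b": [180, 200, 220],    # three visitors, deep reads
}

for story, seconds in visits.items():
    total = sum(seconds)            # Total Engaged Time, in seconds
    average = total / len(seconds)  # Average Engaged Time per visit
    print(story, total, round(average, 1))

# story-a draws more visitors but totals only 100 engaged seconds;
# story-b totals 600, so it tops a ranking by Total Engaged Time
# despite the smaller audience.
```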

Click on a story to get additional insights:

additional insights

The top referrer is listed for every story. Other vital takeaways are also listed, if there are any. In the above instance it’s noted that this story was successful in driving visitors further into the site.

Here's an article that was popular (lots of concurrents) and had great Average Engaged Time (over 3 minutes). From what we know about Engaged Time and audience growth and what others have likewise figured out, these exceptional pieces of content are what really build a loyal audience.

loyal audience

Top Sections by Audience Engagement

The sections summary is organized similarly to the top stories. The ten sections with the most Engaged Time are listed down the main column and compared to the same day from the week before, with the most important takeaways displayed in the left column.

crushing sections

It's easy to see which sections totally crushed it. In this case Environment obviously had a huge day, and Local was off-pace.

And the takeaways on the left provide those below-the-surface insights. In this case it mentions that Politics had a substantial amount of traffic from links.

To dig deeper into what happened within that section, use the section filter to generate a report exclusively for that section - complete with all the top pages and top referrers for those pages.

top pages

A peek at the Daily Perspective for Politics will help us understand why it did so well that day. What were the big stories? Big referrers? How much content went out?

Summary

Every report, including the ones for individual sections, also has a third part - the summary. What does this summary tell us about Monday, March 25th on Gawker.com?

27 percent

Total Engaged Time was down 27% compared to the previous Monday. Meanwhile the total number of stories published was way up - double what it was the previous week. And if you look at the peak concurrents you can see that in general there just wasn't as much traffic. But then there's that big spike at the end of the day on the previous Monday (denoted by the grey line). What was that?

To look at a Daily Perspective for any day in the past, just use the date selector control.

calendar

And here's the answer - at least regarding that late night spike.

spike

Coming soon: more access to historical data

Look forward to much, much more in terms of access to your historical data.

Very shortly you'll see a Weekly Perspective that we hope will really rock your world. The Weekly Perspectives aim straight at understanding your audience - where they come from and their propensity to return. Not all visitors are a good fit for what you do - some come, consume, and move on. Others stick around, read more, and drop subtle hints that they'll be back. The Weekly Perspectives will help you identify these visitors so you can concentrate on acquiring and retaining more of them – we want to help you figure out what's bringing your audience in today and what will keep them coming back tomorrow and long after that.

Your feedback is very welcome regarding our Daily Perspectives – so let us know what you think and how we can help out more with your audience loyalty goals.

Measuring Content’s Value and the Futility of Counting Clicks

February 28th, 2013 by Adam

We've known for a long time that page views are a pain in the ass and tell you far from the whole story. So it's encouraging to find more and more comrades out there who share the sentiment and, like us, are working to get our industry out of the page view game. Some of these like-minded thinkers were at the Crossmedia TO conference last week in Toronto. A few of those presenters called out the futility of counting clicks:
 


"Ipsa Desai from @Youtube, "true currency is not how many total youtube views, but how much time spent watching and if shared." 
 


"Measure of success on @YouTube isn't necessarily view count, says @coreyvidal. Track + value audience engagement + your influence #cmto13"

We've made this point before, but page views only measure the effectiveness of your headline or your thumbnail image. That's great if you're trying to get people to click. But aren't we, as content creators, more concerned with measuring the value of the content itself?


"Difference is that with Chartbeat you know if people are reading the article, not just landing on the page #CMTO13"

And once you're able to measure your visitors' interest in your content, you can start building that audience by focusing on those people - who they are, where they come from, when, and so on.


"@savvymomdotca: Creating a community “is not rocket science.” Ability to match right content with right audience is critical."
 
Whether it's creating a community or building an audience, the ability to match the right content with the right audience is what keeps readers coming back. Once you figure out how to do that, you're able to, as Paul Kotonis puts it, reach your audience in a regular, repeatable, predictable way.

Understanding how to build audience loyalty is a huge focus for Chartbeat in 2013. We're working hard to identify who your quality audience is - that is, the people who engage with your content and show a propensity to return - and introduce you to them.

At the end of my presentation, CrossmediaTO host Bradley Cooper, errr, Gavin McGarry asked me how newsrooms reacted to Chartbeat when it first came out two years ago and how that's changed over time. Our history mimics the changes in the publishing industry as a whole, I think. In 2011, editors worried that real-time analytics were only good for content farms or news sites chasing quick clicks and views of any kind, not for scholarly publishers or anyone concerned with creating quality content. In 2013, many of those publishers are at the forefront of data-driven audience development: how to match the right quality content with the right quality audience, how to know whether a visitor is just a piranha or likely to come back (maybe even subscribe), and how to optimize acquisition and retention.

Relevance and engagement have always been vital aspects of journalism. But in the past that was seen purely in a content-centric way (great content speaks for itself). Now it's widely understood as the intersection of content AND audience. That's been our focus: encouraging newsrooms big and small to use Engaged Time as a measure of the depth and quality of content AND as a direct signal of whether a reader will return to your site for tomorrow's quality piece.