
Methodology Notes: How we built and ranked our list of online news publications.

Visit “Six criteria for new news” to view data rankings

Our survey seeks to highlight some of the emerging local online news publications. That means that we have avoided most established metropolitan news organizations’ online publications. In no way does this mean that they are not of major importance to local news coverage (or that they are not leaders in innovation in their own right).

In fact, if we had included them in our rankings they would easily have filled the top slots with their large audiences and professional operations. Given the time and resources available to us and our goal of learning more about the potential for online publications to stimulate better local news coverage, we decided our survey should meet the following objectives:

1. To foreground new and innovative online news publications in Chicago, we wanted to focus on the small but growing host of blogs, news sites, and other Web 2.0 ventures that provide opportunities for new voices and new ways of covering communities to flourish.

2. To get a sense of who they are, how they envision their audience, what kind of content they provide, and how they are organized—for example, whether they consider their endeavor a hobby, a business, or something in between.

3. To be as inclusive as possible in cataloguing such sites.

4. Given our interest in local news, to prioritize community newspapers, blogs and other sites from the neighborhoods, and original voices on arts and culture. We also decided to include some content from traditional newspapers, both for comparison purposes and because we thought it was exemplary.

These goals were informed by assumptions we made about best practices online. In addition to accurate, entertaining and informative content (of course), we focused on three best practices:

Influence: How widely known is the publication? As previously stated, we did not consider audience size a major determinant; size is only one of many ways to measure influence. Instead, we used various measures of engagement. Some came from publication owners, who reported, on the honor system, the number of subscribers via online syndication (RSS and e-mail) and the time visitors spent on their Web sites. We also used third-party indicators to measure influence.

Technical proficiency: While we are not believers in using every Web 2.0 bell and whistle, we valued interesting and innovative uses of these tools—and wanted to gauge which Web 2.0 tools are most popular.

Transparency: One of the biggest challenges we faced was the novelty of practically everything connected to online news publications—including expectations and culture around sharing information such as traffic data. While we recognize some legitimate concerns about maintaining proprietary data, we also believe some basic traffic information is essential to help everyone understand the emerging online news marketplace. The relatively small audiences of many of the sites we were tracking made this even more important, since we found third-party sites that track traffic and other indicators to be less reliable for smaller sites. A second important dimension to transparency that we valued was the ability to look at a site and understand who had created it and how to contact them. In fact, many smaller sites lacked a clear profile, contact person, or feedback method.

Those were our goals and assumptions when we first sent out the surveys. In the end, 93 individuals filled them out.1 After eliminating duplicates (for example, when two people from the same organization responded) and removing sites that did not originate from or focus on Chicago (such as In These Times, New America Media and the Champaign/Urbana Independent Media Center—excellent sites that we ultimately decided were beyond the scope of the survey), we narrowed this down to a list of 84 sites. It’s important to note that we are far from considering these the best or the only online news publications in Chicagoland: we restricted ourselves to ranking those we were able to survey within the time available to collect our data. In fact, some significant sites, such as Huffington Post Chicago, did not respond and so are not included.

We would have liked to collect data on even more publications (in fact we compiled a list of more than 200; some were major media sites that we decided not to include, and others were approached but did not respond to our survey). We hope to continue this process, gathering and cataloging new publications that emerged as we were completing our work, as well as others we learned of too late to include. Similarly, space limitations keep us from displaying everything we learned in the printed report. Sixty of the sites we surveyed are listed here; the rest of the 84, as well as the larger list of sites we identified but did not survey, will be shared online at www.communitymediaworkshop.org/newnews.

Gathering data

In advance of the Chicago Journalism Town Hall on February 22, we prepared a three-page list of noteworthy online sources of local news.2 This ranged from the Active Transportation Alliance to the Zoo Plane, as well as online news publications from other organizations, individuals and reporters at established newspapers and magazines. From there, we drafted a fresh list of Web sites to survey for a closer look at the makeup of emerging online news in Chicago.

We invited respondents to recommend new Web sites to us, and to pass along the survey to people at other relevant, local online publications. As we learned of relevant new online news publications we contacted the people behind them with invitations to complete our survey.

Nearly a dozen blogs provided no clue about the identity of the people behind them or how to contact them. We got creative in our investigations, attempting to send a survey request by commenting on their stories. (We even followed some on Twitter in a mostly unsuccessful attempt to send a direct message.)

As we collected responses over several weeks, we followed up with non-respondents by telephone and sent them additional reminders via e-mail. With the help of Ken Davis, organizer of the Chicago Journalism Town Hall event, we were able to do one final round of gathering online news publications: we surveyed Town Hall attendees, asking them for feedback on our list so far and what we had missed, and shared a link to the survey.

In exchange for completing a survey, we offered each respondent the choice of a $10 gift card from either Intelligentsia or Caribou Coffee. Many respondents waived the gift card and opted to consider their response a donation to the Workshop.

The survey was completed online via Survey Monkey. We sent out the first invitation Monday, April 22, and received the final response Saturday, May 23. Of the 93 respondents, 26 responded to our direct invitation; 56 responded to an e-mail link to the survey that they received either from Community Media Workshop or a colleague; and 11 responded to a different link and were re-entered into our final Survey Monkey survey by hand. The latter group included a number of testers who graciously agreed to help us debug the final draft of the survey.

Confidentiality


Among the other issues we had to deal with was confidentiality and the sharing of data. A small group of respondents declined to share their data at all. In some cases we followed up and asked them to reconsider; in others we evaluated them as best we could using the data available to us. A larger number were willing to share their data with us but not with the world.

Respondents who did not feel comfortable sharing their data widely indicated that this was primarily for business reasons. This also presented a challenge when it came time to analyze the data, which will be discussed in the next section.

Analyzing the data

We aimed to combine industry-standard tools for measuring Web site popularity with our own, more subjective assessment of each site’s local relevance and quality. To that end, we created an algorithm loosely based on that of Todd Andrlik’s Power 150 ranking of marketing blogs.3 Our rankings took into account six criteria for each publication, described below.

We translated each of these six results into a 5-point scale; Web sites with the greatest final scores floated to the top of the rankings. Of the six dimensions on which we rated publications, three were based on information survey respondents provided to us, two were derived from third-party Web sites (Google and Alexa.com), and one was a subjective assessment by the Workshop. Descriptions of each dimension follow:

1. Self-reported subscribers via RSS and e-mail: One measure of engagement we examined was the number of subscribers—people who had requested information from the sites via RSS or e-mail. Because relatively few sites had strong RSS data, and because this technology appears to remain relatively opaque to many, we opted to supplement RSS figures with e-mail subscriber counts.

2. Self-reported unique monthly visitors: We asked sites to report the number of visitors they had received in March 2009; we also asked them what tool they were using to measure such information. Google Analytics and various proprietary tools were the most common responses.

3. Self-reported time on site: The same analytics programs that provided unique monthly visitor data were able to provide average time on site data.

4. Google PageRank: This indicates, among other things, the popularity of a Web site according to the number and importance of other sites that link to it.4

5. Alexa traffic rank: Alexa helped us determine where a site fit relative to all other U.S. Internet sites. While there were some challenges with using it—notably its inability to separately track subdomains and sites published under a larger domain’s path (Web addresses with additional forward slashes in them, for example)—we determined that it was the most accurate of the tools we auditioned for this task, which included Compete.com and Quantcast.com.5

6. Community Media Workshop Score: The Workshop Score reflects each Web site’s unique local relevance, originality, quality of content, frequency of updates, openness to story pitches, design and use of social media tools. The Workshop Score averages rankings from individual staff members of the Workshop.

Our algorithm relies heavily upon figures provided by each Web site. We decided to abide by an honor code, trusting survey respondents to provide figures that were accurate to the best of their knowledge. The algorithm also penalized publications that declined to provide figures, whether out of confidentiality or other concerns. Specifically, publications that did not provide responses for the first three dimensions of the assessment received a 1 on those dimensions. We informed respondents of this when they took the survey: “Please note that the data will be used as part of an algorithm we’ve developed to help clarify the new, local news ecology,” read a note immediately preceding the questions about number of visitors and time spent on site. “If we don’t have traffic data from you, your Web site may rank lower than other comparable sites on the final list.”
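To make the mechanics concrete, here is a minimal sketch of the kind of scoring described above. It is not the Workshop’s actual code: the field names, the cutoffs used to map raw figures onto the 1-to-5 scale, the handling of the Alexa rank, and the assumption that the final score is an unweighted sum are all illustrative. Only the general structure comes from this report: six dimensions, a score of 1 for missing self-reported data, a Workshop Score averaged across staff ratings, and a final total that determines the ranking.

```python
# Illustrative sketch only: thresholds and field names are hypothetical,
# not the Community Media Workshop's actual values.
from dataclasses import dataclass
from statistics import mean
from typing import List, Optional


def to_five_point(value: Optional[float], thresholds: List[float]) -> int:
    """Map a raw metric onto a 1-5 scale; missing data defaults to 1."""
    if value is None:
        return 1  # the penalty the report describes for unanswered questions
    score = 1
    for cutoff in thresholds:
        if value >= cutoff:
            score += 1
    return min(score, 5)


@dataclass
class Publication:
    name: str
    subscribers: Optional[int]        # self-reported RSS + e-mail subscribers
    unique_visitors: Optional[int]    # self-reported unique visitors, March 2009
    minutes_on_site: Optional[float]  # self-reported average time on site
    pagerank: Optional[int]           # Google PageRank (0-10)
    alexa_rank: Optional[int]         # Alexa traffic rank (lower = more traffic)
    staff_ratings: List[int]          # individual Workshop staff ratings, 1-5


def composite_score(pub: Publication) -> float:
    """Combine the six dimensions; higher totals rank higher."""
    # Hypothetical cutoffs for the 2/3/4/5 bands on each dimension.
    subscribers = to_five_point(pub.subscribers, [50, 250, 1_000, 5_000])
    visitors = to_five_point(pub.unique_visitors, [1_000, 5_000, 25_000, 100_000])
    time_spent = to_five_point(pub.minutes_on_site, [1, 2, 4, 8])
    pagerank = to_five_point(pub.pagerank, [2, 4, 6, 8])
    # Alexa ranks run "backwards" (rank 1 is the busiest site), so negate
    # the rank before applying ascending thresholds.
    alexa = to_five_point(
        -pub.alexa_rank if pub.alexa_rank is not None else None,
        [-5_000_000, -1_000_000, -250_000, -50_000],
    )
    workshop = mean(pub.staff_ratings)  # the Workshop Score averages staff ratings
    return subscribers + visitors + time_spent + pagerank + alexa + workshop
```

Sorting a list of such records by composite_score, highest first, would then reproduce the general shape of the rankings, under the assumptions noted above.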

We were generally dissatisfied with the third-party tools available for measuring influence and engagement over the Internet. Still, these standards are the best we have, and probably as imperfect as traditional means of measuring the circulation of print publications. We also had to abandon early attempts at measuring RSS and e-mail subscribers with third-party tools, whose results were difficult to look up and seemingly unreliable. And although we aimed initially to include Technorati Authority in the algorithm, that tool was largely inoperable during the crucial weeks of survey collection.

1 Unfortunately, the survey is too long to fit in the printed report. A copy can be viewed online at bit.ly/PTSDO.

2 See chijournalismtownhall.com. The original three-page list is downloadable at communitymediaworkshop.org/download/onlineNewsList.pdf.

3 For Todd Andrlik’s methodology, see adage.com/power150/about.

4 See www.google.com/corporate/tech.html

5 See alexa.com
