Guest Post, the second in a four-part series, by KC Lynch:

Part 2: Impact Measurements

Collection development librarians will tell you that while faculty members don't look at journals' Impact Factors before publishing, many graduate students do. That means the sooner TWC starts generating impact data, the better.

Measuring the impact of scholarly journals is a tricky, controversial business. Most of the established methods are highly flawed, simply because counting citations is not the most reliable way to track a journal’s impact, and the reporting of citation statistics even less so.

That being said, the gold standard is still the Journal Citation Report (JCR), published by Thomson Reuters. The JCR provides an Impact Factor (IF) for each of its titles, which measures the frequency with which an average article has been cited in a particular year. The IF is calculated by dividing the number of current-year citations to items published in that journal during the previous two years by the number of citable items the journal published in those same two years.
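To make the arithmetic concrete, here is a minimal sketch with invented numbers (not real citation data for any journal):

```python
# Illustrative two-year Impact Factor calculation.
# The figures below are hypothetical, not real citation data.

def impact_factor(citations_to_prior_two_years, items_published_prior_two_years):
    """Citations received this year to items from the previous two years,
    divided by the number of citable items published in those two years."""
    return citations_to_prior_two_years / items_published_prior_two_years

# Example: 150 current-year citations to the 60 articles a journal
# published over the previous two years gives an IF of 2.5.
print(impact_factor(150, 60))  # 2.5
```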

The first goal of the JCR was to track the impact of scientific journals, and it tracks an impressive 7,300+ titles. The social sciences JCR is newer and consequently smaller, with only 2,200+ titles. Getting a new journal listed in either report is highly competitive: Thomson Reuters reviews over 2,000 new titles each year and accepts 10-12% of them for inclusion. Evaluation criteria include diversity of authorship and citation data, which are exactly the qualities TWC hopes to strengthen by being listed in the report.

For journals like TWC, the JCR is not a perfect match. But is it still worth pursuing? Absolutely. For newer journals, the JCR is an investment in the long term.

In the short term, TWC might consider following in the footsteps of PLoSONE, the online, open-access journal by the Public Library of Science. PLoSONE uses a system called Article-Level Metrics (ALM), which measures the impact of individual articles rather than evaluating the journal as a whole:

“Article-level metrics place relevant data on each article to help users determine the value of that article to them and to the scientific community in general. Importantly, they provide additional and regularly updated context to the article, which currently includes data on citations, online usage, social bookmarks, comments, notes, blog posts about the article, and ratings of the article.”

To those of us who work in website development, this sounds a lot like a cousin of Google Analytics, which, believe me, is a compliment. These are the tools we should be using to measure the impact of online journals. Where the print model measured readership through circulation and subscription data, for example, the online journal should measure the same thing through usage data (page views, downloads, unique users, etc).
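As a rough illustration of what that looks like in practice, here is a minimal sketch (with invented log records and field names, not TWC or PLoS data) of rolling raw access events up into per-article usage counts:

```python
# Minimal sketch: aggregating access-log events into per-article usage
# metrics (page views, downloads, unique users). The events and field
# names here are invented for illustration.
from collections import defaultdict

events = [
    # (article_id, user_id, action)
    ("twc-0201", "u1", "view"),
    ("twc-0201", "u2", "view"),
    ("twc-0201", "u1", "download"),
    ("twc-0305", "u3", "view"),
]

metrics = defaultdict(lambda: {"views": 0, "downloads": 0, "users": set()})
for article, user, action in events:
    m = metrics[article]
    m["users"].add(user)
    if action == "view":
        m["views"] += 1
    elif action == "download":
        m["downloads"] += 1

for article, m in metrics.items():
    print(article, m["views"], "views,", m["downloads"], "downloads,",
          len(m["users"]), "unique users")
```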

At PLoSONE, ALM data is having a positive effect on author satisfaction. In their 2010 report, after just one year of using the system, ALM emerged as a significant factor in authors' decisions to publish with PLoSONE, and 32 of 101 survey respondents reported finding the data useful in some way. And whether or not they have ALM to thank, in the same year the share of authors considering PLoSONE as a first-choice journal rose from 23% to 37%.

On July 27, PLoS launched the Article-Level Metrics API to give researchers programmatic access to these statistics.
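For readers curious about what that access might look like, here is a rough sketch of fetching metrics for a single article over HTTP. The endpoint URL, query parameter, and response fields below are placeholders for illustration only, not the documented PLoS interface; consult PLoS's own API documentation for the real details.

```python
# Rough sketch of querying an article-level-metrics web service for one DOI.
# NOTE: the endpoint, parameter, and response shape are placeholders,
# not the documented PLoS ALM API.
import json
from urllib.parse import quote
from urllib.request import urlopen

DOI = "10.1371/journal.pone.0000000"  # placeholder DOI
url = "https://alm.example.org/api/articles?doi=" + quote(DOI, safe="")

with urlopen(url) as response:
    data = json.loads(response.read().decode("utf-8"))

# Hypothetical response shape: a dict mapping metric names to counts.
for metric, count in data.get("metrics", {}).items():
    print(metric, count)
```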

So, there are two ways for TWC to get impact measurement data: the old-school method of the JCR and the new technology of ALM. Both are worth doing, one for longevity and the other for immediacy. And, who knows, ALM may become the new JCR for the open-access digital world.
