The Impact of Impact Factors
At last month's Council of Communication Associations (CCA) meeting we discussed a long-standing and pressing issue: the Impact Factor (IF). This is not new ground for the CCA; it is an ongoing topic and part of the work the CCA does.
Led by ICA Fellow Linda Putnam, the chair of the CCA's ISI Thomson Reuters committee, the CCA routinely endorses journals in Communication during the review process for acceptance into the Journal Citation Reports. Now we are at a crossroads: when we endorse journals for inclusion, we are endorsing a flawed system that is not, and never was, a true measure of a journal's impact, only a metric that counts citations.
At the forefront of this debate is an education issue. Whether we like it or not, IF is the measure of record, and its reach goes well beyond journal impact: it is used to evaluate scholarly output for tenure and to sell journals to libraries. So it is important to understand it both historically and practically. For this, I highly recommend the excellent paper that Trevor Parry-Giles at NCA filed for the CCA. It can be found here. The paper gives a concise explanation and some action items for the CCA.
It's ICA's position that this metric is helpful but flawed for evaluating what we truly want to know about our journals. I liken this to baseball statistics (apologies for the American-centric analogy). One measure of offensive value in baseball is Runs Batted In (RBI), credited when a batter's offense drives a runner home to score. RBI accurately counts the number of runs a player drives in, but it isn't a true measure of the player's value: there are too many variables in play (you need runners on base in the first place). The same goes for IF: it counts citations within a two-year window, but it doesn't actually demonstrate value.
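To make the mechanics concrete, the standard two-year calculation is just a ratio; the numbers below are hypothetical and purely for illustration:

IF(2015) = (citations in 2015 to items a journal published in 2013 and 2014) / (citable items the journal published in 2013 and 2014)

A journal whose 2013 and 2014 output drew 300 citations in 2015 across 150 citable items would post a 2015 IF of 2.0. Nothing in that ratio tells us who cited the work, why, or to what scholarly effect.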
So, where do we go from here? Finding alternative metrics is a start, but many of those are flawed too. Some have suggested using usage statistics as the main determinant, but usage can be gamed very easily. Perhaps Eigenfactor, or an alternate measure within the Journal Citation Reports, like Cited Half-Life (how long a journal's articles continue to be cited) or the Immediacy Index (how quickly a journal's articles are cited in their year of publication), would be a step in the right direction. Or a combination of metrics weighted with a more human aspect, like the information from JournalReviewer.org, run by our very own Malte Elson (U of Muenster) and James Ivory (Virginia Tech). Regardless, our discipline is right to take a deeper look at this issue and to respond to any developments with supporting statements.
Early next year the CCA aims to make a more definitive move; until then, we keep searching for something better.