The Scientific Literature

Our Friend the Impact Factor

I had an email asking if scientists in industry care about journal impact factors. It’s an interesting question, but it needs to be answered in parts. Unless you deal with academic publishing, the phrase probably doesn’t mean much. “Impact factors” are an attempt to quantify what everyone knows empirically: some journals are more prestigious than others. You know how we science types love to quantify stuff.
The whole business comes from the folks at ISI (now owned by Thomson.) They had been publishing the Citation Index for years, which was (and is) a way to find out who had referenced a given paper in the scientific literature after it was published. This can be useful if you want to see if anyone’s followed up or commented on an interesting paper (or if you just want to see if anyone’s cited your own work.)
And as ISI realized early on, it could also furnish some interesting rankings of “most-cited” papers in a given field. (Here’s a recent set of lists from Chemical Abstracts, the big dog in the chemical information world, who got into the citation-counting business themselves.) You could figure out who the most cited authors are, too, although that ISI link won’t rank-order them for you. (You have to pay for that data! Or you can look here.) There are lists of the most highly cited institutions, too, naturally.
About ten years ago, they introduced the Impact Factor to do the same thing for scientific journals. That’s the number of citations generated by a journal (usually over a multiyear period) divided by the number of papers it published in that time: the average number of cites per paper, in other words.
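The arithmetic behind that definition is simple enough to sketch. The figures below are invented purely for illustration (no real journal is implied); the real calculation uses citations received in a given year to items the journal published in the preceding window.

```python
# Minimal sketch of the impact-factor arithmetic: total citations to a
# journal's recent papers divided by the number of papers it published
# in that same window. All numbers here are hypothetical.

citations_to_recent_papers = 12_480   # made-up citation count for the window
papers_published = 4_160              # made-up paper count for the same window

impact_factor = citations_to_recent_papers / papers_published

# Journals quote this to three decimal places, so:
print(f"{impact_factor:.3f}")
```

With these invented numbers the journal would advertise itself as a 3.000 — which, as the list below shows, would put it comfortably above most of the field.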
The publishing community – initially rather worried and sceptical, if my memory serves – has gone completely crazy over the whole idea. Now journals advertise themselves by their impact factors. “Publish here! We’re a good journal, really! We have proof!” If you’d like to know what a particular journal’s rating is, they’ll probably shout it out if it’s any good at all. A failure to mention the number, down to three decimal places, is an act that speaks for itself.
Want the whole list? It can be rather hard to come by, unless you’re a paying customer of ISI’s, but here’s a place to start. Those aren’t the latest figures, but they’ll do. You’ll notice that at the top are a bunch of review journals, who publish comparatively few papers but get cited out the wazoo. Among the original-research journals in the top ranks are big kahunas like Cell, Nature, the New England Journal of Medicine, Science, and such. But I find a perverse fascination in browsing the low end of the scale. The Ethiopian Medical Journal? Fertiliser Research? Bovine Practice? Annals of Saudi Medicine? Surely some of these trench-dwellers ceased publication and vanished from sight during the rating period. The list of “0.000” impact factors is particularly alarming, although most of the journals listed are there by some sort of statistical artifact or (I presume) don’t really exist. But if no one reads them, how do we know if they’re real or not?
In the next installment, we’ll look at some problems with the whole idea – there are some – and I’ll tell you if we industrial types give a hoot about it or not. . .

6 comments on “Our Friend the Impact Factor”

  1. jsinger says:

    …or (I presume) don’t really exist. But if no one reads them, how do we know if they’re real or not?

    I once found a paper mentioned in a reference list in a journal that I just could not obtain anywhere. Not in Medline, not through any academic library exchange, not through a commercial service.

    I went over to Harvard Med School and was informed (with the humility characteristic of that institution): “Sir, if Countway doesn’t have that journal, it doesn’t exist.”

  2. Derek Lowe says:

    Or, as it was put in the context of Balliol College, Oxford (about Benjamin Jowett):
    “First come I; my name is Jowett.
    There’s no knowledge but I know it.
    I am the master of this college,
    What I don’t know isn’t Knowledge.”

  3. Rich Y says:

    Looking through that list of 0.000s, I’m sure some of them are statistical artifacts.
    Now, I know for a fact that the IEEE Journal of Selected Topics in Quantum Electronics is a reasonably well cited journal. It’s not exactly Nature, but it is one of the more important journals in the subcommunity of people who study and build lasers. I’d expect it wouldn’t even be near the bottom of the IEEE journals.

  4. Gerald Bothe says:

    The 0.000 impact factors are indeed a methodological artifact. Only the journals listed in the Science Citation Index are used to calculate the factor. Thus, if the Paraguayan Journal of Entomology is never cited in Science etc., it has an impact factor of zero, no matter how often it is cited in Dipterologist’s Digest (assuming that Dipterologist’s Digest is not included in SCI). They hid this information deep down in the definition of the Impact Factor; look for the difference between citING and citED-only journals…at least that’s how I interpret that section.

  5. biohombre says:

    It may be useful to also recall that it is in the best interests of academic researchers to slightly manipulate citation indices. To illustrate what I mean by that comment, consider what happened to me, and to others who have published “a first” in the same year as another lab. I was in a “low citation index” lab as a graduate student; the other author was in a “high citation” lab. If you trace the citations in the literature, you will find that our publication is almost never cited. This, of course, is in the best interest of the other author, who will cite themselves. It is also in the best interest of others citing this literature. Why? The other author has considerable influence in NIH study sections…
    But having worked a number of years in industry, I also know that some good science that is critical in problem-solving (such as- what is REAL from that academic publication/patent) will never be published. (Heaven forbid, it may question the validity of that patent your company just licensed!)
    Oops, maybe I should have held these comments until Derek’s next installment?

  6. Interesting…I never knew that IF was intended to represent the average number of citations per paper (though it makes perfect sense).
    I might be anticipating your next post a bit, but it seems to me that since some papers get cited hundreds (even thousands) of times, and an impact factor of >10 puts you in the top 1%, the “typical” paper in what is considered a high-impact journal might not be cited all that much more than a paper in a journal with an impact factor of, say, 3.
