The ways to mess around with the peer-review process are legion, but these schemes are getting a bit easier to catch. That’s what I take away from this paper, from two bibliometrics researchers at Elsevier who set up a system to do just that. One hears tales of reviewers who will look more favorably on your manuscript if you cite some of their own work (although I can say that I haven’t run into this behavior myself, fortunately). And now these folks can be tracked down by the signatures they leave in the citation record.
That paper will also take you to some examples of other behavior such as “citation stacking” (where a journal strongly encourages authors to cite other papers that have been published in the journal, so as to raise the title’s own database ranking) and “citation cartels”, a more elaborate version of the same idea, where several titles collaborate to cite each other and artificially float everyone’s boats. But the new work they describe comes from the study of 69,000 individual reviewers for Elsevier journals (over the period 2015-2017), and it uncovered some interesting patterns.
That’s especially apparent when you are able (as these authors are) to compare the first-submitted versions of papers coming into Elsevier journals with their final form – that is, looking directly at what are probably changes requested by reviewers. This is still a nontrivial effort; the paper goes into detail on how they worked through the various databases to make sure that they were talking about the same manuscripts and authors all the way through. It turns out that there were about 54,000 reviewers who met the cutoffs: having published 5 or more papers themselves, having reviewed 5 or more publications, and having at least one of those reviewed publications cite them.
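The eligibility cutoffs are simple to state, and a minimal sketch makes the filtering concrete. This is an illustration only – the field names and data layout below are hypothetical, not the authors’ actual pipeline:

```python
# Hedged sketch of the eligibility filter described above: keep reviewers
# with >= 5 published papers, >= 5 reviewed manuscripts, and at least one
# reviewed manuscript that cites them. All field names are illustrative.

def eligible_reviewers(reviewers):
    """reviewers: list of dicts with keys 'name', 'papers_published',
    and 'reviews' (a list of dicts with a 'cites_reviewer' bool)."""
    kept = []
    for r in reviewers:
        if r["papers_published"] < 5:
            continue  # fewer than 5 of their own papers
        if len(r["reviews"]) < 5:
            continue  # fewer than 5 reviewed manuscripts
        if not any(rev["cites_reviewer"] for rev in r["reviews"]):
            continue  # none of the reviewed manuscripts cite them
        kept.append(r["name"])
    return kept
```

Applied to the roughly 69,000 reviewers in the study, a filter like this is what narrows the pool to the ~54,000 analyzed.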
Now, citing a reviewer is not prima facie suspicious. After all, you’re supposed to send submitted papers out to people who know the field and are working in it, and they may very well have published papers themselves that are worth citing in such new manuscripts. But there are limits. When these authors examined how often that happens, per reviewer, there was a weird and obvious outlier set of about 1600 reviewers for whom every single paper they ever reviewed cited them in turn. A histogram shows a peak where this reviewer-citation situation happens about one out of every five times for a given reviewer, and the number of instances gradually decreases from there. By the time you get up to (say) “95% of the time the reviewer was cited in the manuscripts they reviewed”, there are almost no instances at all. And then suddenly there’s a big lump of such cases where it happens a flat 100% of the time. It doesn’t look good.
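The histogram the authors describe is just a per-reviewer citation rate, bucketed. A small sketch of that computation (with made-up data, and with the flagging of the 100% lump done in the most naive way, purely for illustration):

```python
# Sketch of the per-reviewer citation-rate histogram described above.
# For each reviewer, compute the fraction of manuscripts they reviewed
# that cite them; the paper reports most mass near ~20%, a tail that
# thins out toward 95%, and then a suspicious spike at exactly 100%.

from collections import Counter

def citation_rate(reviews):
    """reviews: list of bools, True if that reviewed manuscript cites the reviewer."""
    return sum(reviews) / len(reviews)

def rate_histogram(reviewer_reviews, bins=10):
    """Bucket reviewers by citation rate into `bins` equal-width bins;
    a rate of exactly 1.0 lands in the top bin."""
    hist = Counter()
    for reviews in reviewer_reviews:
        b = min(int(citation_rate(reviews) * bins), bins - 1)
        hist[b] += 1
    return hist

def flag_always_cited(reviewer_reviews):
    """Indices of reviewers cited by every single manuscript they reviewed."""
    return [i for i, reviews in enumerate(reviewer_reviews) if all(reviews)]
```

The point of the plot is that a smoothly decaying distribution should have essentially nothing at 100%; a discrete lump there (the ~1,600 reviewers) is the signature of deliberate behavior rather than chance.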
And it isn’t. Comparing the first-submitted versions of the papers under consideration with the final ones (in a smaller subset of the data), there were hundreds of cases where the majority of those reviewer citations were not even present in the original manuscript. The authors estimate that this happens in about 0.8% of the total manuscripts submitted – and you can take that several ways. You can be glad that it’s as low as that, but remember, this net was set to catch only the most persistent and blatant forms of this behavior (stuff like this). Lower-level citation extortion wouldn’t have been flagged by these cutoffs at all. And as the authors note, this sort of thing may not even be noticed by authors for what it is, since the reviewer reports are anonymized. And even if they do see what’s going on, they may regard it as a minor payment for a favorable review. As the paper says, “In one case where a reviewer had submitted 120 reviewer reports including requests to add multiple irrelevant citations in each report, only four authors refused to add the citations during revision of their paper.”
So I would take this estimate (the first of its kind) as a likely lower bound. If it leads to the identification of the biggest abusers of the system (and subsequent banning of them from reviewing papers), well, good. But there’s probably a lot more of this sort of thing being done more quietly...