Science magazine and writer John Bohannon have done us all a favor. There’s a long article out in the latest issue that details how he wrote up a terrible, ridiculous scientific manuscript, attached a made-up name to it under the aegis of a nonexistent institution, and sent this farrago off to over three hundred open-access journals. The result?
On 4 July, good news arrived in the inbox of Ocorrafoo Cobange, a biologist at the Wassee Institute of Medicine in Asmara. It was the official letter of acceptance for a paper he had submitted 2 months earlier to the Journal of Natural Pharmaceuticals, describing the anticancer properties of a chemical that Cobange had extracted from a lichen.
In fact, it should have been promptly rejected. Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s shortcomings immediately. Its experiments are so hopelessly flawed that the results are meaningless.
I know because I wrote the paper. Ocorrafoo Cobange does not exist, nor does the Wassee Institute of Medicine. Over the past 10 months, I have submitted 304 versions of the wonder drug paper to open-access journals. More than half of the journals accepted the paper, failing to notice its fatal flaws. Beyond that headline result, the data from this sting operation reveal the contours of an emerging Wild West in academic publishing.
Well, sure, you’re saying. Given the sorts of lowlife publishers out there, of course they took it, as long as the check cleared. But it’s even worse than it appears:
Acceptance was the norm, not the exception. The paper was accepted by journals hosted by industry titans Sage and Elsevier. The paper was accepted by journals published by prestigious academic institutions such as Kobe University in Japan. It was accepted by scholarly society journals. It was even accepted by journals for which the paper’s topic was utterly inappropriate, such as the Journal of Experimental & Clinical Assisted Reproduction.
Here’s all the documentation, and it documents a sorry state indeed. You’ll note from the world map in that link that India glows like a fireplace in this business. Nigeria has a prominence that it does not attain in the legitimate science publishing world, and there are exotic destinations like Oman and the Seychelles to be had as well. The editors of these “journals” tend to be people you’ve never heard of from universities that you didn’t even know existed. And the editorial boards and lists of reviewers have plenty of those folks, mixed in with people who reviewed one paper before they knew better, and with people who didn’t realize that their names were on the mastheads at all.
Bohannon didn’t actually submit the exact same manuscript to all 304. He generated mix-and-match versions using an underlying template, giving him variations of the same crap and taking great care that the resulting papers should be obviously flawed:
The papers describe a simple test of whether cancer cells grow more slowly in a test tube when treated with increasing concentrations of a molecule. In a second experiment, the cells were also treated with increasing doses of radiation to simulate cancer radiotherapy. The data are the same across papers, and so are the conclusions: The molecule is a powerful inhibitor of cancer cell growth, and it increases the sensitivity of cancer cells to radiotherapy.
There are numerous red flags in the papers, with the most obvious in the first data plot. The graph’s caption claims that it shows a “dose-dependent” effect on cell growth—the paper’s linchpin result—but the data clearly show the opposite. The molecule is tested across a staggering five orders of magnitude of concentrations, all the way down to picomolar levels. And yet, the effect on the cells is modest and identical at every concentration.
One glance at the paper’s Materials & Methods section reveals the obvious explanation for this outlandish result. The molecule was dissolved in a buffer containing an unusually large amount of ethanol. The control group of cells should have been treated with the same buffer, but they were not. Thus, the molecule’s observed “effect” on cell growth is nothing more than the well-known cytotoxic effect of alcohol.
The second experiment is more outrageous. The control cells were not exposed to any radiation at all. So the observed “interactive effect” is nothing more than the standard inhibition of cell growth by radiation. Indeed, it would be impossible to conclude anything from this experiment.
I like this – the paper looks superficially presentable, but if you actually read it, then it’s worthless. And yes, I realize that I’ve described a reasonable fraction of the ones that actually get published, but this is a more egregious example. I hope. The protocol was that Bohannon submitted the paper, and if it was rejected outright, that was that. If any reply came back addressing the paper’s flaws in any way, he had a version ready to send back with more stuff in it, but without fixing any of the underlying problems. And if the paper was accepted at any point in the process, he sent the journal a note saying that a serious flaw had been discovered in the work and the manuscript had to be withdrawn.
157 journals accepted the paper, and 98 rejected it. He’d submitted it to a further 49 journals from his original list, but at least 29 of those appeared to be out of the business entirely, and the other 20 still had the paper “under review”. So of those 255 decisions, 149 of them looked as if they’d occurred with little or no review. For a rejection, that’s not so bad – this is a perfect example of a manuscript that should not even be sent out for review. But the acceptances, well...
The other 106 editorial decisions, the ones made with at least some review, are problematic too: 70% of them were acceptances. Even in the few cases (36 out of 304) where the paper was reviewed closely enough that the reviewers realized something was wrong with it (as they should have), 16 journals accepted it anyway.
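Just to keep the bookkeeping straight, here is a quick sketch of how the counts reported above fit together (using only the numbers in this post; the variable names are mine):

```python
# Tallies from the sting, as reported above.
accepted = 157        # journals that accepted the paper
rejected = 98         # journals that rejected it
no_decision = 49      # remaining journals from the original list of 304
defunct = 29          # of those, apparently out of business
under_review = 20     # of those, still sitting on the paper

decisions = accepted + rejected
print(decisions)                  # 255 final decisions
print(decisions + no_decision)    # 304, the full submission list
print(defunct + under_review)     # 49, accounting for the non-deciders

no_review = 149       # decisions that showed little or no sign of review
some_review = decisions - no_review
print(some_review)                # 106 decisions made with some review
```

In other words, the 255 up-or-down decisions split into 149 that apparently involved no real review and 106 that did, and it's that second group where the 70% acceptance rate stings.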
The Elsevier journal that took this heap of junk was Drug Invention Today, in case you’re wondering. I’ve never heard of it, and now I know why. The Sage journal was the Journal of International Medical Research, so you can strike that one off your list, too, assuming that the name wasn’t enough all by itself. Another big open-access publisher, Hindawi (they advertise on the back cover of Chemistry World in the UK), rejected the paper from two of its journals, much to their relief. Jeffrey Beall’s list of predatory publishers came out as pretty accurate, as well it might.
The problems with all this are obvious. These people are ripping off their authors for whatever publication fees they can scam, and some of these authors are not in a position to afford the treatment they’re getting. No doubt some subset of the people who send manuscripts to these places are cynically padding their publication lists. The “editors” of these things get to reap a little unearned prestige for their “efforts” as well, so the whole enterprise just adds to the number of self-inflated jackasses with padded reputations, and the world is infested with too many of those people already. But I’m sure that there’s another subset of authors who don’t realize that they’re submitting their results into a compost pile, and being asked to pay for the privilege. The first group is contemptible; the second is sad. None of it’s good.