

The Story Behind the Story Behind the Story

I enjoyed this article at FiveThirtyEight, because I’ve had similar thoughts over the years myself. There are layers of knowledge about many topics, and it can be hard to be sure what layer you’ve made it to, and whether there’s another one underneath you yet. The first example in the piece is the spinach-has-a-lot-of-iron idea, which many people have heard of. A subset of those people will tell you that this got started many decades ago because of the “Popeye” comic strip, and that it’s actually a myth, that spinach doesn’t have an unusual amount of iron as opposed to other greens and vegetables. And a subset of those people have heard the story about how the whole error was based on a missed decimal point with some earlier German data about iron content, which propagated for years until someone noticed.

But as it turns out, that last story is a legend, too, which has also been handed along uncritically. (As an aside, I was sure that a fair amount of the iron in spinach wasn’t very bioavailable because of its oxalic acid content, but it turns out that this is not necessarily the case, either). It’s easy to take the first story you hear as the explanation, but in some ways, it’s even easier to take the second one and figure you’re done, especially if you’re used to seeing first explanations not quite pan out.

An example of this from medicinal chemistry would be thalidomide. Pretty much everyone has heard of the terrible effects it had in pregnant women. Many chemists and biologists are sure that these were due to one enantiomer of the compound, and that dosing the other might have prevented the catastrophe. But that’s not true, either: thalidomide’s chiral center racemizes quickly in vivo, so it really doesn’t matter if you dose a single enantiomer. And the teratogenic effects vary so widely from species to species that the various stories about how they would have been found/were certain to have been missed, etc., are generally confused as well.

This explanation seems quite plausible:

Complicated and ironic tales of poor citation “help draw attention to a deadly serious, but somewhat boring topic,” Rekdal told me. They’re grabby, and they’re entertaining. But I suspect they’re more than merely that: Perhaps the ironies themselves can help explain the propagation of the errors.

It seems plausible to me, at least, that the tellers of these tales are getting blinkered by their own feelings of superiority — that the mere act of busting myths makes them more susceptible to spreading them. It lowers their defenses, in the same way that the act of remembering sometimes seems to make us more likely to forget. Could it be that the more credulous we become, the more convinced we are of our own debunker bona fides? Does skepticism self-destruct?

It can indeed. We humans have a narrative bias: an explanation that makes a good story is going to get elevated over ones that don’t. And if you’re used to the initial explanation for something always being a matter of possible doubt – that is, if you’re a working scientist – then you have to watch out for this constantly. I know that I’m personally more attracted to long-shot contrarian explanations than I strictly should be, because of this very effect.

Where do you call a halt, though? The article notes, disturbingly, that conspiracy theorists and people like anti-vaccine activists are also skeptical of prima facie explanations and believe that they have deeper truths that are unrealized by the masses. I’m looking for reasons to differentiate my own ways of thinking from theirs, for my own psychological comfort, and one that I can adduce is that in my work I’m often forced to give up on things that I believed (or hoped) to be true, after seeing the evidence against them. A real conspiracy theorist isn’t going to let that sort of thing slow them down much – that contrary evidence is just what they want you to believe, while the real evidence always points back to the real explanation. Which is the pet theory that the person was actually convinced of at the start, most of the time.

Not to say that scientists can’t walk off into that pit, either – happens all the time, on small issues and on big ones. The lines are sometimes finer than we’d like to admit. A hypothesis might need some real dedication and effort to be properly tested, but when does that cross over into a too-attached refusal to accept that it’s been tested and failed? No lights flash and no buzzer sounds. It’s like figuring out when to kill a drug project, because there’s always something else to try, and there’s always a chance that it might still work out in the end. Many a huge drug success has looked like a failure at one point or another, so how can you stop work on something just because it looks like it’s going to fail? On the other hand, what better reason is there, anyway?

Uncritical acceptance of whatever comes along is clearly a mistake, but total reflex skepticism of everything is a mistake, too, as well as being so hard to live by that you run into inconsistencies immediately. Like Samuel Johnson kicking his rock, you have to have some assumptions in there somewhere. When I open up a bottle of reagent, I generally work as if I believe it to be what’s on the label, and set up my reaction accordingly. There are gradations and shades, though – if it’s an unusual reagent from a supplier that I don’t know, or have had reason to doubt in the past, then I’ll probably look into the stuff a bit more. But if it’s a common chemical from a reputable company, then no, I’m not going to take an NMR to make sure that this 4-liter jug of acetone really is acetone before I use it to rinse out a flask. (We will leave aside, for now, the time in grad school that I found someone else’s rubber pipet bulb – now swollen to the size of a small trout – floating in the can of acetone I was so using, which accounted neatly for the fact that my newly rinsed glassware was still sticky).

The FiveThirtyEight article does not end on a comforting note. The person profiled in it, Mike Sutton, is now working on the story that natural selection was actually hit upon decades before Darwin by someone else, Patrick Matthew. Darwin himself acknowledged that much, but the question (which I wasn’t aware of before reading this) is whether he cribbed it from Matthew in the first place. Sutton now thinks he did, but many other Darwin scholars think the evidence is nowhere near strong enough. This, to me, illustrates a problem in between the two fallacies just mentioned, though: the situations where there just isn’t enough to say for sure. Deciding that there never is enough evidence to say that anything’s for sure is another way of describing total skepticism, and deciding that any old evidence at all is plenty is uncritical acceptance. But these are easier to differentiate when the argument has a chance of being won, like being able to decide that yeah, that is reasonably pure acetone in that can, or alternatively that there is in fact a well-soaked pipet bulb in there, too. When you’re talking about the historical record, things get fuzzier quickly, which is why there are still Holocaust deniers out there, despite this being one of the better-attested historical events of the twentieth century. George Orwell started off a newspaper column by talking about how Sir Walter Raleigh at one point abandoned his plan to write a history of the world, because while imprisoned in the Tower of London he was unable to even figure out what had caused a disturbance just within his earshot. (Of course, I’m not even sure if this story is accurate – Orwell himself couldn’t resist adding that if it wasn’t true, it should be).

Which takes us back, finally, to the judgement-call aspect of the whole thing. As mentioned above, the wilder conspiracy theories trip themselves up by what evidence they’re willing to accept: for example, if you’re ready to call the Holocaust a fake, how do you know that World War II happened at all? In the end, we have to start talking about “the weight of the evidence”, and decide for ourselves how weighty it needs to be, what scales we’re using to measure it, and how much we trust the scales. Historians have it far worse in that department than scientists do, since we have actual scales to work with, but that doesn’t mean we escape the problem completely.

26 comments on “The Story Behind the Story Behind the Story”

  1. Anon says:

    Perhaps Lane should read this. Please.

    1. I think it best you actually read my published research. So many prefer to guess what it is about from the little snippet (clue) another provides and then jump in feet first with their conclusions.

Might I suggest – to prevent the spread of further fallacies and myths – you Google “On knowledge contamination” – Be sure to enclose the term in double inverted commas (speech quotes). Interestingly, as proof of concept – that’s exactly all it took – back in 2013 to make a significant new discovery. You see, I’m not researching that someone got there first – that’s well known. What I have in actual fact finished researching is who really did read the work of the person who got there first. I found the hidden books in the library that disconfirm the premise upon which the old paradigm of independent discovery of Matthew’s prior published ideas is built. That now debunked premise is that no naturalist known to Darwin/Wallace or their known influencers read Matthew’s original ideas before 1858.

  2. Peter Kenny says:

    The views that hydrogen bond donors are somehow ‘bad’ and that enthalpy of binding is the result of hydrogen bond formation may fit into this category. The problem often starts with folklore that sounds sort of convincing (perhaps because it has been expressed by ‘experts’). I’ve linked a post on Voodoo Thermodynamics as the URL for this comment.

  3. Glen says:

Plausibility is an emotional response, not an intellectual process. We find things plausible because it is emotionally satisfying to do so.

    The challenge for an adult is to follow a methodology which allows you to recognize and reduce your level of emotional response in evaluating a situation or evidence. The idea of Scientific Method has that as its foundation.

I don’t quite believe that correct methodology will always lead to individual truth, but it does let you look back and identify your errors. Proper method should let you quickly find errors in the writing and ideas of others – an easier process, since you have much less emotional involvement.

  4. anchor says:

    @ Anon-Lane as in Lane Simonian? Please clarify.

    1. Lane says:

      The first is a rational money should work. Second is the inertial resistance of corporations can not be defeated. Last one is that remains to be seen.

  5. simpl says:

    The point on evolution reveals a related topic, that assigning priority is often several layers deep, and the clearer the truth is, the messier it all looks. I rationalise from what I know, rather than what the world knows.

  6. Lane says:

    History repeats itself. A check from which a king cannot escape.

  7. Anonymous Researcher snaw says:

    As I read that headline, in my head I heard the late Paul Harvey on his radio show saying “the rest [pause] of the story” with his distinctive delivery.

    1. cancer_man says:

      This is a coincidence. I thought of Paul Harvey as well. But I also thought about him when reading the post on Elysium and comments. Check out the paper that was just published in Science. In a year or two we will know “the rest [pause] of the story.” (maybe)

  8. Stephen says:

    “tellers of these tales are getting blinkered by their own feelings of superiority ”

Now that I know the real story about the spinach, I feel superior to all those who don’t, and will now bore all my friends with the correct account.

    1. Lane says:

      Do you know something else? There are three players at least 🙂

  9. Lane Simonian says:

    Historians like scientists are supposed to be unbiased or at least they want to think that they are unbiased. Interlopers are not particularly well-liked in either field because they are seen to lack the type of critical thinking necessary to find the truth.

    It becomes uncomfortable to begin to think of perhaps a moral equivalence–there has to be a difference between a misguided non-scientific skeptic and a properly guided scientific skeptic. One of my favorite definitions of a scientific skeptic is a person who believes what he or she does not want to doubt and doubts what he or she does not want to believe. Or as another critic of scientific skeptics put it: “Whenever you find a controversy, you will find that the data is not adequate to definitively confirm either side. Of course, people who are devoted to one side will perceive anyone on the other side to be an idiot.”

That said, there are varying strengths of evidence in both history and science. The Holocaust of the Jews, the mass murder of other groups such as the Gypsies (Roma), and the mass death of Slavs in Nazi slave labor camps are supported by a wealth of photographic, eyewitness, testimonial, and even chemical evidence. The genocide of the Armenians is supported by a lesser, but nevertheless substantial, body of photographic, eyewitness, and testimonial evidence. It is thus easier for some people to deny the genocide that occurred during World War I than it is to deny the genocides that occurred during World War II. A lesser degree of evidence, though, is sufficient to convince many people that the Armenian genocide occurred (as probably should be the case). But those who hold to a different “truth” will not deem any evidence of either genocide sufficient.

The methods of science and history may be different – the first focuses more on hypothesis and experimentation, the latter on collecting as much information as possible to try to tell an accurate story. Science may require (and often has required) a change in paradigms as new evidence has presented itself, just as historians have had to revise history as new sources are discovered or rediscovered. We try to get to truth by different routes, but the goal is not really different.

  10. SteveM says:

Both conspiracy theorists with alternative views of history and stubborn scientists who pathologically attach themselves to a theory that becomes increasingly tenuous with additional information operate from a totally different perspective from those that evaluate them. Both operate under the assumption that a step-function of new information has yet to be uncovered. I.e., transformational insight is out there just waiting to be revealed. For the conspiracy theorists, that information is the “smoking gun” document. For the perseverating scientist, it is the singular experiment that will show his theory to be undoubtedly true.

    For outside observers though, what convinces them is an accumulation of reputable information. They eventually become convinced that there is no step-function of increased information out there. In that context of radically different understandings of the decision environment, the two never shall meet. The conspiracy theorists are labeled eccentric cranks and the stubborn scientists are fired.

I did Ag Chem long ago and far away, and I witnessed 3 chemists getting fired simply because they were too stubborn to move on to something else when their pet theories did not pan out over time. What was amazing is that their group managers explicitly told them to move on to something else several times, but they refused. From their perspective, the refusal made sense because the transformative insight could come from the next compound. I.e., potential success was only a week away. That logic as an obsession could be extended forever.

Funny though, I supported the Department of Energy program for Energy Efficiency and Renewable Energy (EERE). There were National Lab scientists who worked on pet theories for years with marginal results. However, come funding time every year, they promised that their epiphany was just around the corner. And then they invariably got funded again. Whatever level of pathological perseverance is tolerated in the commercial space, increase that by an order of magnitude in the government-supported labs.

    1. Lane says:

The Story Behind the Story Behind the Story. When your ego exceeds your, the upshot (a failure of several clinical trials) may be your downfall (checkmate). Now you know the rest of the story.

  11. Jonathan says:

I recall certain “truths” passed on from generation to generation in the laboratory context: pyridine makes men infertile, ethidium bromide is extraordinarily carcinogenic. Now I do not know exactly to what extent this is true…it is simply unquestioned. But my favourites: aspirin is a covalent binder and acts as an acetylation agent in vivo, and my absolute favourite…don’t use too much methanol in silica columns because it dissolves the silica!

I often get my students to disprove the last one themselves. It arises in Uni departments, I suspect, because years ago prep TLC silica (with binder) was reused in columns.

    1. Derek Lowe says:

I remember hearing the pyridine one in grad school, too. I have seen residue from silica gel columns come off with MeOH, but I’m not sure what it is. But I was under the strong impression that aspirin acetylates Ser530 in Cox2 – has that been disproven? Here’s a recent reference:

  12. Gretchen says:

    When I was in grad school, we had bottles of double-distilled water that was supposed to be so pure. But when I used that “pure” water to make coffee, I could taste plastic. Obviously, something was leaching out from the plastic tubing we used to get the water. But no one believed me because they believed that the water was so pure.

    1. MTK says:

      Hold on here, Gretchen.

      How do you know that you weren’t just thinking you tasted plastic when in fact there was none?

      Isn’t that the whole point of this post? That we all believe what we want to believe and that we’re all open to bias?

    2. Isidore says:

      Actually the absence of salts may make water a better solvent than regular water for trace amounts of plasticizers from the container.

    3. Li Zhi says:

A moment’s thought about what it means to be “distilled” might provide you with a better understanding. Distillation ‘filters’ out those compounds which are ‘non-volatile’ (details matter here – vacuum, temperature, time, atmosphere, gas/vapor flow…). It is astoundingly difficult to create and store ultrapure water. Double distillation is, of course, less effective than, say, ultrafiltration plus distillation. And then we should discuss the dissolved gases present… Anyway, in today’s modern lab, plastic is ubiquitous – when purity isn’t much of a concern. When it is, glass is the standard (last I heard).

  13. Insilicoconsulting says:

    Go Bayesian!

  14. Anon says:

    Seems that in many cases, the deeper we probe, the more complex things get. We seem to reach an equilibrium, where the noise takes over from any signal, and basically, we don’t have a clue.

  15. Li Zhi says:

I’ve a couple of problems with the framing of these subjects. First is the confounding of observation with experiment. I can “prove” (to a rational observer) that sodium metal will react with chlorine gas and form sodium chloride. It is replicable. I can NOT prove (to all rational observers) that Booth shot Lincoln. There’s no “do-over” in History. Mixing up historical “fact” with scientific fact is a grade-school logical error. Matthew not Darwin, Matthew and Darwin, Darwin not Matthew – what possible difference does it make for the (scientific) facts? Isn’t this a matter of ego? As for ‘layers’: I claim there is a base layer, which I call the observational evidence (which may change with time and is for all practical purposes infinite), from which we educe the “relevant” evidence, from which we infer the narrative. What is (or “should” be) in play is the amount of base we include in making our narrative. This is only relevant if we create a narrative before we have “sufficiently” explored the base. This is also known by several other names, and is a well established (if we believe the studies) human trait: we believe we have more expertise than we actually do, except when we are either neophytes or experts (and maybe even then). (Although this ignores the “tentative opinion held as a ‘place-holder’ pending more adequate information” opinions, which most of us analytical types have in (over-)abundance.)

    1. Li Zhi says:

Anyway, given the enormously flawed information storage devices we all use every day, and their dynamic and mutable nature, using testimony as evidence is problematical. It requires us to believe that the testifier is a “sufficiently accurate” recording, storage, and play-back device. All 3 of which traits are well established (replicable science!) to often be flawed with regard to the human brain.
