

Roger Boisjoly and the Management Hat

I’d like to take a few minutes to remember someone that everyone in R&D should spare a thought for: Roger Boisjoly. If you don’t know the name, you’ll still likely know something about his story: he was one of the Morton Thiokol engineers who tried, unsuccessfully, to stop the Challenger space shuttle launch in 1986.
Here’s more on him from NPR (and from one of their reporters who helped break the inside story of that launch at the time). Boisjoly had realized that cold weather was degrading the O-ring seals on the solid rocket boosters, and as he told NPR, when he blew the whistle:

“We all knew if the seals failed the shuttle would blow up.”
Armed with the data that described that possibility, Boisjoly and his colleagues argued persistently and vigorously for hours. At first, Thiokol managers agreed with them and formally recommended a launch delay. But NASA officials on a conference call challenged that recommendation.
“I am appalled,” said NASA’s George Hardy, according to Boisjoly and our other source in the room. “I am appalled by your recommendation.”
Another shuttle program manager, Lawrence Mulloy, didn’t hide his disdain. “My God, Thiokol,” he said. “When do you want me to launch — next April?”

When Thiokol’s managers, under pressure from NASA, overruled their own engineers, it was with a line that no one who works with data on the front lines of a project should ever forget: “Take off your engineering hat,” they told Boisjoly and the others, “and put your management hat on.” Well, the people behind that decision managed their way to seven deaths and a spectacular setback for the US space program. As Richard Feynman said in his famous Appendix F to the Rogers Commission report, “For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”
Not even with our latest management techniques can nature be fooled, no matter how much Six Sigma, 5S, and what-have-you gets deployed. Nothing else works, either. Nature does not care where you went to school, what it says on your business cards, how glossy your presentation is, or how expensive your shirt is. That’s one of the things I like most about it, and I think that any scientist should know what I’m talking about when I say that. The real world is the real world, and the data are the data.
But it’s up to us to draw the conclusions from those numbers, and to get those conclusions across to everyone else. It may well be true, as Ed Tufte has maintained, that one of the tragedies of the Challenger launch was that the engineers involved weren’t able to present their conclusions clearly enough. Update: see this account by Boisjoly himself on this point. It might not have been enough in the end; there seem to have been some people who were determined to launch the shuttle and determined not to hear anything that would interfere with that goal. We shouldn’t forget this aspect of the story, though: it’s incumbent on us to get our conclusions across as well as we can.
Well, then, what about Nature not caring how slick our slide presentations are? That, to me, is the difference between “slick” and “effective”. The former tries to gloss over things; the latter gets them across. If the effort you’re putting into your presentation goes into keeping certain questions from being asked, then it’s veered over to the slick side of the path. To get all Aristotelian about it, the means of persuasion should be heavy on the logos, the argument itself, and you should do the best job you can on that. Steer clear of the pathos, the appeal to the emotions, and you should already be in a position to have the ethos (the trustworthiness of the speaker’s character) working for you, without having to make it a key part of your case.
But today, spend a moment to remember Roger Boisjoly, and everyone who’s ever been in his position. And look out, be very careful indeed, if anyone ever asks you to put your management hat on.

30 comments on “Roger Boisjoly and the Management Hat”

  1. John says:

    Feynman elsewhere recounts the story of being a young physicist forced to argue for inconvenient nuclear safety precautions against the wishes of generals in a time of war. The formula he used, supplied by one of his mentors, was something like “Los Alamos cannot accept responsibility for the safety of Oak Ridge if you do not adopt these recommendations.”
    His description of the scene is quite funny, and I’ve used similar wording to great success.

  2. There’s one more thing Feynman wrote in Appendix F which is worth remembering – the woeful distortion of reality by the managers compared to its accurate perception by the technical people:
    “It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life. The estimates range from roughly 1 in 100 to 1 in 100,000. The higher figures come from the working engineers, and the very low figures from management. What are the causes and consequences of this lack of agreement? Since 1 part in 100,000 would imply that one could put a Shuttle up each day for 300 years expecting to lose only one, we could properly ask ‘What is the cause of management’s fantastic faith in the machinery?’”
    We are still reeling from management’s fantastic faith in…well, pretty much everything.

  3. GC says:

    Management doesn’t always screw up, in the ultra-rare case that they’re clueful. For example, Rickover knew the Navy’s nuclear program would be Fucked with a capital “F” if there was a huge accident, and he also knew that with nuclear, there are usually only huge accidents. So he ruled with an iron fist, insisted on a culture of safety, everyone hated him, and the US Navy’s never had any Soviet-style nuclear accident.
    Pretty much the exception that proves the rule, however.

  4. johnnyboy says:

    I understand what you mean when you talk about the importance of presenting facts effectively. But whether this would be sufficient to get the point across assumes that everyone around the table looks at the data the same way. Different people have different personalities, which color their outlook. And people’s personality traits will determine in large part the job they hold. Scientists and engineers tend to be data-driven and analytical, and may always require additional data before they are comfortable with a final decision. This goes against the more decisive, hurried personality that is often characteristic of upper management leader types. To me the Boisjoly story seems a typical example of that conflict, with managers who ‘want to get things done’ looking down on engineers as overly cautious and indecisive. It’s very similar to what commonly occurs in the field of pharma safety.

  5. luysii says:

    Boisjoly wasn’t the only one. Billings, Montana, produced Allan McDonald, also working at Thiokol at the time, who spoke against the launch back then (for the same reason). He was quickly invited back as the graduation speaker at Billings Senior High, his alma mater. His further education was at Montana State. He later wrote a book about his experience at Thiokol: http://www.amazon.com/Truth-lies-rings-Challenger-disaster/dp/B003TV41DY/ref=cm_cr_pr_product_top.
    Montana is often looked down upon by bicoastals as a bunch of rubes, but having lived and practiced medicine there for 15 years, I can tell you nothing could be further from the truth. You quickly learn to treat everyone the same, as the guy in bib overalls may (1) be able to buy and sell you, or (2) be smarter than you are.

  6. bearing says:

    @johnnyboy: I don’t know, I have always found Edward Tufte’s argument that the communication was unclear *extremely* persuasive. Especially the part where he shows the graphs that should have been — when Tufte re-presents the data in a way that makes it abundantly clear just how far outside the usual operating range were the temperatures that morning.
    But then, the work I did after receiving my engineering doctorate was mostly technical editing, and of course Tufte is an expert in the use of visuals to communicate data. To a boy with a hammer, everything is a nail, yada yada yada.

  7. ba says:

    Derek and others,
    I strongly recommend you read Boisjoly’s account of the Challenger disaster, published in Science and Engineering Ethics in 2002:
    http://people.rit.edu/wlrgsh/FINRobison.pdf
    It speaks directly to whether or not “slick graphics” had anything to do with the decision that was ultimately made. Furthermore, anyone who has any experience as a practicing scientist – running experiments and interpreting data – will get shivers down their spine reading Boisjoly’s paper. You will immediately recognize the situation the engineers were in: trying to make a cogent recommendation on a critical issue with conflicting data – something we all do almost daily, sometimes with great stress, but likely never with the consequential weight of the Thiokol engineers.

  8. There’s a point that needs to be underscored about the data presentation: the Thiokol engineers presented the data in a manner which emphasized that there was conflicting data. It’s a boat I’ve often rowed.
    Tufte’s argument is that this was entirely the wrong take on the data — the key thing which should have been emphasized, and was not, was that there was NO data in the temperature regime Challenger launched in. Yes, management could have argued that no temperature-failure trend could be reliably pulled from the data, but a better presentation would have hammered home that nobody had a clue what the seals would do there.
    Damn tragic way to run the experiment to collect that data!

  9. GreedyCynicalSelfInterested says:

    “Take your engineer hat off and put your management hat on…”
    Management is about appearances and playing roles, like in theatre or children’s make-believe. The emphasis is on “believe.”
    I think a lot of them who have no technical background or training are just a bunch of phonies who are good at reading people, are tall, good-looking and have lots of charisma. They look the part so they get hired. Take acting lessons before or during the time you’re getting the MBA.

  10. Jonny says:

    There’s a lot of anti-management/MBA comment on this site, and while some of it is justified (as in most situations of extended criticism), some is not.
    I think this example is somewhat atypical (frankly, management could be extended to management/politics), but the tone on here can be a little unbalanced.

  11. ba says:

    @8 Keith-
    That point, that there was no data in the temperature regime they were set to launch in, was EXACTLY the point that the engineers tried forcefully to make. What management wanted was, “prove to me that the seals will fail at 29 F,” and the engineers could not provide that data because it did not exist. Read another account besides Tufte’s; it will blow your mind.
    As an aside, I find it amazing that Boisjoly could ever be considered at fault for allowing the launch to occur. The fact of the matter is, he told management that the launch shouldn’t happen. Management made the decision that it would happen anyway. To think he is somehow culpable because he wasn’t able to force management to make the right decision is mind-boggling to me.
    “I want to eat this donut.”
    “But if you eat it, you’ll get fat.”
    …10 years later…
    “I’m suing you for not trying hard enough to convince me that donuts would make me fat.”

  12. Another Kevin says:

    The baseline expectation is that the engineer will tell the manager what the manager wants to hear – and make it so. If the manager’s fantasies have no grounding in the real world, that fact is no defence; the engineer’s job is to alter the reality.
    M: “Why didn’t you warn me?”
    E: “I did. Repeatedly. In writing.”
    M: “Well, you should have warned me better, so that I would have believed it.”
    If the political power imbalance is such as it usually is, “you should have warned me better” is sufficient to deflect all blame for the bad decision away from the manager and onto the engineer.
    And perhaps this is entirely right. The engineer’s duty in a case like Boisjoly’s may be to resign in protest – and he is delinquent in that duty if he fails to convince all his colleagues to do the same.

  13. Dreamer says:

    A timely reminder that more often that not in science you get the result you get, not the result you want.

  14. Dreamer take 2 says:

    A timely reminder that more often than not in science you get the result you get, not the result you want.

  15. ReneeL says:

    I am a government contractor, and can tell you that personnel in government agencies, if they feel like it, do not listen to their contractors. Their attitude is that they are the customer, they are paying the bills, they control the contract, and their viewpoint is all that matters. You can have the most clearcut presentation in the world – it won’t matter. On top of this, each government agency is run by political appointees who are nominated by the President; you can thus imagine how much politics plays a role in decision-making.
    This is somewhat different than the typical corporate management/technical professional arena.
    There is nothing that Boisjoly could have done to convince his management and the NASA brass to come to a different decision. The management and NASA had decided to launch that morning, period.

  16. Chris Barts says:

    This is an example of the fact that people have a moral obligation to go outside the normal channels when the normal channels break. In this case, it could have been as simple as going a few levels higher within NASA. Maybe it would have necessitated going to the media. Maybe they should have gone to the astronauts themselves.
    Yes, right or wrong, you will lose your job, and some problems aren’t worth losing your job over. This one was.

  17. Abuzuzu says:

    If I remember correctly no NASA manager had his career killed or injured as a result of this incident.
    Boisjoly? Well that is another story.

  18. Anonymous BMS Researcher says:

    The whole concept of the Space Shuttle was deeply flawed because it arose from political expediency. The most dangerous parts of space travel are takeoff and reentry, for much the same reasons why the most dangerous parts of commercial aviation are takeoff and landing: things are happening quickly in those phases of the trip. If something fails during cruise phase of an airline flight, or the orbital phase of a space flight, there is TIME to think through how to fix it. Had that tank on Apollo 13 blown up during a critical stage of the mission, little could have been done.
    Therefore, human space travel should consist entirely of long-duration missions (to minimize trips up and down), and people should ascend and descend in small spacecraft that are dedicated to the job of carrying people. These should be as simple as possible, because simplicity is how you get reliability.
    Satellites, components of a space station, supplies, etc., etc., should be carried into orbit by heavy lift rockets with no crew on board. With no people on board, loss ratios become a matter of simple economics: would the increased cost for a rocket with a 1% failure rate be more or less than the increased cost of insuring the rocket with a 3% failure rate?
    The Shuttle was overly complex because it was supposed to be the one spaceship for all low-earth-orbit missions. The decision to make one spacecraft for all purposes was driven by the desire to avoid facing budgetary reality: NASA’s budget was vastly too small for its ambitions. Therefore, NASA should have told the politicians: we cannot do all these programs on that budget, so unless you give us more resources we will make drastic program cuts. Instead, NASA decided to build one ship for all missions because in theory that would cut costs enough to let them carry out a very ambitious program on a limited budget.
    However, quite aside from the disasters, the Space Shuttle never came close to attaining the promised efficiency. Every aspect of Shuttle operations took much longer and cost lots more than had been projected. Not only was its design overly complex, but nearly every component was pushed to the limits of its capabilities. A robust design has generous margins of safety; many aspects of the Shuttle had minimal margins of safety. Keeping it flying required continual tinkering.

  19. Twelve says:

    I recall that during the Congressional hearings on the Challenger disaster, one of the culpable managers was confronted with the facts that they had had in front of them just before the flight, and was asked if, in light of what they now knew, they regretted making the decision to launch the Space Shuttle that morning. The answer was ‘no, I would make the same decision again’.
    Sounds like some pharma managers we know.

  20. johnnyboy says:

    “Therefore, human space travel should consist entirely of long-duration missions (to minimize trips up and down)”
    Wrong. Human space travel should not be done, period. It is an utterly absurd and pointless waste of immense financial resources.

    1. fajensen says:

      It is still better to waste money on space than on the military and bank bailouts.

  21. Innovorich says:

    There’s a Harvard Business School case study on this that has become required reading in many MBA courses. The reason it is required reading (beyond the obvious “mismanagement to avoid”) is that the original government report (which included phone transcripts), while agreeing that Boisjoly did warn management, found that he didn’t properly explain the statistical basis for his argument as to why the o-ring temperature test results did not support a safe launch; hence management did not properly understand or believe him (as Keith noted above). If you take a look at the data he had, and at the transcripts, and you’re a good communicative scientist, you can see that there would have been ways to present the data that would have made what he was saying glaringly obvious. It’s clearly true that management somewhat ignored the risks, but it’s also true that they totally didn’t understand them, and Boisjoly failed to explain them.
    This is actually a classic problem in scientific institutions: scientists and management just don’t adequately communicate with each other, and that is the fault of both the management and the scientists.

  22. Paul says:

    “Wrong. Human space travel should not be done, period. It is an utterly absurd and pointless waste of immense financial resources.”
    Has Derek ever roasted the perennial space fan chestnut of using microgravity for protein crystallization? This was one of the justifications for building the space station. I’d love to see some industry folks cut this one down to size.

  23. ba says:

    @21 Innovorich-
    No. You are wrong. If you look at the data that Boisjoly had, (and I suspect you have NOT looked at exactly what data he had), you would see that he had essentially 3 data points to work with: an o-ring failure at 53 F, an o-ring failure at 75 F, and a single “ex vivo” experiment that suggested that the o-ring rubber got less squishy when cold. That’s it. The 75 degree o-ring failure was devastating since it disproved a causal relationship between o-ring temperature and failure.
    What Boisjoly had in hand was a bunch of conflicting data, and a strong belief that flying outside of previous field experience was a huge risk. He did not have the power to shut down the launch; Tufte did. Boisjoly had the power to present the data that he had, and he had the power to make a recommendation. Tufte ignored his recommendation, and a lot of people died.
    The famous scatter plot later constructed by Tufte (which I think you’re alluding to) was filled with erroneous and unavailable (at the time) data. Of course the trend looks obvious; the data was essentially fabricated.
    Finally, Boisjoly did not know at the time that he should stop the launch. He only knew that it was risky. He did his job in presenting the available data and making his recommendation. Tufte made the wrong decision.
    We pharmaceutical chemists face this kind of problem all the time. Do we kill this entire program because we think it might be a dead end? As a researcher, let’s say I present data to management that shows conflicting trends and recommend ending the program. Management looks at the same data and decides to continue. Several hundred million dollars later, the program turns out to be a dead end. Is it my fault that management didn’t take my recommendation? Of course not; it’s not my decision to make. The classic problem, in fact, is that rarely is the answer obvious. Science is full of conflicting data, and you either take a conservative approach, or you don’t. Tufte chose to take a risk, and it turned out to be the wrong decision.

  24. Stu West says:

    Maybe my favourite Feynman quote (and I only wish I could remember where I read it) was the one he made after working on the Challenger investigation, where he proclaimed that if he had his time over again he would devote himself to studying the problem of information flow in hierarchical organizations. He said, “It’s not rocket science: it’s much more complicated than that.”

  25. Joe T. says:

    #23, why do you think Tufte had anything to do with the decision to launch?

  26. cthulhu says:

    Challenger was a classic example of what is called the “normalization of deviance”: NASA believed that, because O-ring blow-by had happened before and the shuttle had not blown up, this was somehow proof that O-ring blow-by was not a serious safety risk. They were wrong, and seven fatalities resulted. Thiokol management was also somewhat culpable for their gutless caving-in to NASA’s bullying.
    Again, the issue was not temperature per se; the issue was that O-ring blow-by had been shown to happen at both low and nominal temperatures, and NASA had decided to take the risk because no catastrophic failure had happened to date. Tufte’s scatterplot is intellectually dishonest because it “relies on facts not in evidence” – facts not available to the engineers. My opinion of Tufte plummeted as a result of his grandstanding on this.
    Normalization of deviance is a powerful force in human psychology, and must be tamed in any safety-critical culture. Again, this is the true lesson of Challenger.

  27. ba says:

    @25 Whoops, correct, in my initial reading I thought Tufte was part of the management team – he was not. His analysis, however, is flawed based on the reasons stated above.

  28. Curt F. says:

    It’s awesome that the heroes who spoke out against the launch are being named so that we can honor their efforts. However, I also wish that the “management” at NASA that threw away the lives of the seven astronauts would be named, so that we can remember their folly.
