Science writer Matt Ridley had an interesting (but somewhat odd) column over the weekend in the Wall Street Journal. Titled “The Myth of Basic Science”, it maintains that “technological innovation has a momentum of its own”, and that basic research doesn’t really drive it.
I think that my brain must be about ninety degrees off of Ridley’s, because as I read the piece, I kept thinking that I was seeing sideways slices of the real situation. He has technology moving along by “a sort of inexorable, evolutionary progress” that we can’t do anything about, but this skips over the millennia (and the many human cultures) where it did nothing of the kind. He may well mean “once it gets going”. And it’s true that there are a number of discoveries and inventions that seem to have been in the air at about the same time – Newton and Leibniz with calculus, the number of people simultaneously working on radios, electric lights, and so on. Growing up, I had relatives living around Murray, Kentucky, which for many years has proclaimed itself “The Birthplace of Radio”, because of one Nathan Stubblefield, a pre-Marconi inventor whom I’d be willing to bet that very few have ever heard of (outside of Murray, that is).
So to some extent, I can see where Ridley is coming from. And he’s correct to say that many people have too simple an idea of how things work in the real world:
Politicians believe that innovation can be turned on and off like a tap: You start with pure scientific insights, which then get translated into applied science, which in turn become useful technology. So what you must do, as a patriotic legislator, is to ensure that there is a ready supply of money to scientists on the top floor of their ivory towers, and lo and behold, technology will come clanking out of the pipe at the bottom of the tower.
Although that may also be too simple an idea of how politicians work, to be fair about it. Not all of them are thrilled about the ivory-tower phase of things, and some of them would like to skip that stage and go right to the applied stuff. “We will now fund breakthroughs in Field X, because of its importance to the nation”, and so on. That doesn’t always work out so well, either. The standard examples are the Manhattan Project and the Apollo program, but what some people miss is that these were largely efforts in applied engineering. Huge, complex efforts, to be sure, with plenty of very difficult and expensive twists and turns – but we knew that nuclear fission was a reality when we started working on the bomb, and we knew that rockets (and Newton’s laws of motion) existed when we started going to the moon. (More recently, we knew what DNA was, and an awful lot about genes and chromosomes, when we started sequencing the human genome). I’m not trying to take away anything from these achievements by pointing this out, but what that does mean is that similar projects that don’t start on as firm a footing can go off track very quickly. Does anyone remember the big Japanese “fifth-generation” computer project? That’s what I mean – no one was really sure what such a machine was, what it would look like, or how anyone would know that it had really been built, so in the end (as far as I can see) a lot of money and effort was just sort of dispersed out as a fine mist.
Ridley brings up something that both Peter Medawar and Freeman Dyson have written about: that discoveries are influenced by the sort of technology and equipment that’s available to work on them. That, to me, is what makes it look as if “technology” is some sort of force by itself, an organism that we can watch but not tame. But where I think I part company with Ridley’s article is a thought experiment he doesn’t mention: what might happen if we ran the whole process again? How much of our current technology depends on accidents of invention and timing? Ridley makes it sound almost as if there’s something inevitable about where we ended up – at each stage, if one person hadn’t come up with Theory X or Invention Y, someone else would have. But I think that’s like saying that if you started biological evolution again, it would always lead to human beings, when actually the record is full of near-misses, lurching accidents, and chance encounters.
Experimental science took a really long time to get going. But if we assume, one way or another, that it would have (and I have no idea how warranted that assumption is), then it’s for sure that someone would have discovered (say) the laws of optics if Newton hadn’t, and so on. But I don’t think that the order in which things were discovered was inevitable, and if you start rearranging it, you could arrive at some very different places. There’s a lot of talk about how discoveries aren’t made (or aren’t appreciated) until the time is ripe for them, and while I can see some of that, I worry that that viewpoint is too post hoc to be useful.
I think one reason I had trouble initially parsing this article was that it’s about two things at once. Beyond the topic of technology driving itself, though, Ridley has some controversial things to say about the sources of funding for technological progress, much of it quoted from Terence Kealey, whose book The Economic Laws of Scientific Research has come up here before:
It follows that there is less need for government to fund science: Industry will do this itself. Having made innovations, it will then pay for research into the principles behind them. Having invented the steam engine, it will pay for thermodynamics. This conclusion of Mr. Kealey’s is so heretical as to be incomprehensible to most economists, to say nothing of scientists themselves.
For more than a half century, it has been an article of faith that science would not get funded if government did not do it, and economic growth would not happen if science did not get funded by the taxpayer. It was the economist Robert Solow who demonstrated in 1957 that innovation in technology was the source of most economic growth—at least in societies that were not expanding their territory or growing their populations. It was his colleagues Richard Nelson and Kenneth Arrow who explained in 1959 and 1962, respectively, that government funding of science was necessary, because it is cheaper to copy others than to do original research.
“The problem with the papers of Nelson and Arrow,” writes Mr. Kealey, “was that they were theoretical, and one or two troublesome souls, on peering out of their economists’ aeries, noted that in the real world, there did seem to be some privately funded research happening.” He argues that there is still no empirical demonstration of the need for public funding of research and that the historical record suggests the opposite.
. . .In 2003, the Organization for Economic Cooperation and Development published a paper on the “sources of economic growth in OECD countries” between 1971 and 1998 and found, to its surprise, that whereas privately funded research and development stimulated economic growth, publicly funded research had no economic impact whatsoever. None. This earthshaking result has never been challenged or debunked. It is so inconvenient to the argument that science needs public funding that it is ignored.
Now those are fighting words for many people. It really has been an article of faith, ever since Vannevar Bush’s “Endless Frontier”, that funding scientific research is an important function of the government. There’s no doubt that many discoveries and innovations have come out of such funding. The question – which I can’t answer, and neither can anyone else, unfortunately – is what discoveries and innovations might have come along without that funding. Don’t make the mistake of thinking about what would happen if we just suddenly defunded the NIH and the NSF; that’s not what we’re talking about. We’re talking about what would have happened if those agencies (or their like) had not existed in the first place. It would have been a different world, set up differently, and (as argued above) with different outcomes. It’s also not too useful to start listing inventions that can be traced back to government funding, because that says little or nothing about what would have happened in that alternate world.
And it’s easy, at this point, to play Greeks and Romans again – the curious Greek philosophers inventing geometry and atomic theory, while the hard-nosed Romans just wanted better recipes for cement. There are quite a few professors in this world who would assume that industrially-driven research would provide only seventy-seven more flavors of corn chips, with a different celebrity photo on each bag. But that’s as much of a caricature as the picture of hordes of absent-minded faculty members writing papers on obscure topics that no one else ever bothers to read. Blue-sky work would still have gotten funding. Even after the advent of government funding for basic research, industrial basic research used to be a much bigger thing than it later became (cue up the obligatory mention of Bell Labs here). Google and others may well be bringing us back around to that, which is another large and interesting topic in itself.
Another argument to be made against the quoted excerpts above is that there’s more to funding scientific research than economic growth. That’s surely true – when it comes to science, I’m sort of an ars gratia artis man myself. But that’s not quite what the people authorizing the funding think, for the most part. They fund the NIH and NSF because they believe that the grants involved will lead to jobs and economic progress (eventually), not so much because they have any burning curiosity about microtubules, hadron collisions, biosynthetic pathways, or metal-organic frameworks. Every one of those grants, therefore, has language in it (however specious) tying it to some sort of “real-world” outcome: climate change, cancer, the quest for the perfect egg salad recipe, what have you. You can assemble a long and cringe-inducing list of quotes from politicians in just this vein, voting for Project X or Agency Y on the assumption that they’re mostly some sort of fancy pork-barrel jobs programs.
So for me, that’s the part of Ridley’s article that I found most worth thinking about. I would have liked to see the technological-progress-momentum stuff sheared off into another op-ed, which I might not even read, and more space given to the public-versus-private research argument. The article isn’t so much about “The Myth of Basic Science” as about the myth of who should be funding it, and of what basic research really is.