
Technology and Funding: Myths and Alternate Worlds

Science writer Matt Ridley had an interesting (but somewhat odd) column over the weekend in the Wall Street Journal. Titled “The Myth of Basic Science”, it maintains that “technological innovation has a momentum of its own”, and that basic research doesn’t really drive it.

I think that my brain must be about ninety degrees off of Ridley’s, because as I read the piece, I kept thinking that I was seeing sideways slices of the real situation. He has technology moving along by “a sort of inexorable, evolutionary progress” that we can’t do anything about, but this skips over the millennia (and the many human cultures) where it did nothing of the kind. He may well mean “once it gets going”. And it’s true that there are a number of discoveries and inventions that seem to have been in the air at about the same time – Newton and Leibniz with calculus, the number of people simultaneously working on radios, electric lights, and so on. Growing up, I had relatives living around Murray, Kentucky, which for many years has proclaimed itself “The Birthplace of Radio”, because of one Nathan Stubblefield, a pre-Marconi inventor whom I’d be willing to bet that very few have ever heard of (outside of Murray, that is).

So to some extent, I can see where Ridley is coming from. And he’s correct to say that many people have too simple an idea of how things work in the real world:

Politicians believe that innovation can be turned on and off like a tap: You start with pure scientific insights, which then get translated into applied science, which in turn become useful technology. So what you must do, as a patriotic legislator, is to ensure that there is a ready supply of money to scientists on the top floor of their ivory towers, and lo and behold, technology will come clanking out of the pipe at the bottom of the tower.

Although that may also be too simple an idea of how politicians work, to be fair about it. Not all of them are thrilled about the ivory-tower phase of things, and some of them would like to skip that stage and go right to the applied stuff. “We will now fund breakthroughs in Field X, because of its importance to the nation”, and so on. That doesn’t always work out so well, either. The standard examples are the Manhattan Project and the Apollo program, but what some people miss is that these were largely efforts in applied engineering. Huge, complex efforts, to be sure, with plenty of very difficult and expensive twists and turns – but we knew that nuclear fission was a reality when we started working on the bomb, and we knew that rockets (and Newton’s laws of motion) existed when we started going to the moon. (More recently, we knew what DNA was, and an awful lot about genes and chromosomes, when we started sequencing the human genome). I’m not trying to take away anything from these achievements by pointing this out, but what that does mean is that similar projects that don’t start on as firm a footing can go off track very quickly. Does anyone remember the big Japanese “fifth-generation” computer project? That’s what I mean – no one was really sure what such a machine was, what it would look like, or how anyone would know that it had really been built, so in the end (as far as I can see) a lot of money and effort was just sort of dispersed out as a fine mist.

Ridley brings up something that both Peter Medawar and Freeman Dyson have written about: that discoveries are influenced by the sort of technology and equipment that’s available to work on them. That, to me, is what makes it look as if “technology” is some sort of force by itself, an organism that we can watch but not tame. But where I think I part company with Ridley’s article is a thought experiment he doesn’t mention: what might happen if we ran the whole process again? How much of our current technology depends on accidents of invention and timing? Ridley makes it sound almost as if there’s something inevitable about where we ended up – at each stage, if one person hadn’t come up with Theory X or Invention Y, someone else would have. But I think that’s like saying that if you started biological evolution again, it would always lead to human beings, when actually the record is full of near-misses, lurching accidents, and chance encounters.

Experimental science took a really long time to get going. But if we assume that, one way or another, it would have (and I have no idea how warranted that assumption is), then it’s a safe bet that someone would have discovered (say) the laws of optics if Newton hadn’t, and so on. But I don’t think the order in which things were discovered was anywhere near as inevitable, and if you start rearranging it, you could arrive at some very different places. There’s a lot of talk about how discoveries aren’t made (or aren’t appreciated) until the time is ripe for them, and while I can see some of that, I worry that that viewpoint is too post hoc to be useful.

I think one reason I had trouble initially parsing this article was that it’s about two things at once. Beyond the topic of technology driving itself, though, Ridley has some controversial things to say about the sources of funding for technological progress, much of it quoted from Terence Kealey, whose book The Economic Laws of Scientific Research has come up here before:

It follows that there is less need for government to fund science: Industry will do this itself. Having made innovations, it will then pay for research into the principles behind them. Having invented the steam engine, it will pay for thermodynamics. This conclusion of Mr. Kealey’s is so heretical as to be incomprehensible to most economists, to say nothing of scientists themselves.

For more than a half century, it has been an article of faith that science would not get funded if government did not do it, and economic growth would not happen if science did not get funded by the taxpayer. It was the economist Robert Solow who demonstrated in 1957 that innovation in technology was the source of most economic growth—at least in societies that were not expanding their territory or growing their populations. It was his colleagues Richard Nelson and Kenneth Arrow who explained in 1959 and 1962, respectively, that government funding of science was necessary, because it is cheaper to copy others than to do original research.

“The problem with the papers of Nelson and Arrow,” writes Mr. Kealey, “was that they were theoretical, and one or two troublesome souls, on peering out of their economists’ aeries, noted that in the real world, there did seem to be some privately funded research happening.” He argues that there is still no empirical demonstration of the need for public funding of research and that the historical record suggests the opposite.

. . .In 2003, the Organization for Economic Cooperation and Development published a paper on the “sources of economic growth in OECD countries” between 1971 and 1998 and found, to its surprise, that whereas privately funded research and development stimulated economic growth, publicly funded research had no economic impact whatsoever. None. This earthshaking result has never been challenged or debunked. It is so inconvenient to the argument that science needs public funding that it is ignored.

Now those are fighting words for many people. It really has been an article of faith, ever since Vannevar Bush’s “Endless Frontier”, that funding scientific research is an important function of the government. There’s no doubt that many discoveries and innovations have come out of such funding. The question – which I can’t answer, and neither can anyone else, unfortunately – is what discoveries and innovations might have come along without that funding. Don’t make the mistake of thinking about what would happen if we just suddenly defunded the NIH and the NSF; that’s not what we’re talking about. We’re talking about what would have happened if those agencies (or their like) had not existed in the first place. It would have been a different world, set up differently, and (as argued above) with different outcomes. It’s also not too useful to start listing inventions that can be traced back to government funding, because that says little or nothing about what would have happened in that alternate world.

And it’s easy, at this point, to play Greeks and Romans again – the curious Greek philosophers inventing geometry and atomic theory, while the hard-nosed Romans just wanting better recipes for cement. There are quite a few professors in this world who would assume that industrially-driven research would provide only seventy-seven more flavors of corn chips, with a different celebrity photo on each bag. But that’s as much of a caricature as the picture of hordes of absent-minded faculty members writing papers on obscure topics that no one else ever bothers to read. Blue-sky work would still have gotten funding. Even after the advent of government funding for basic research, industrial basic research used to be a much bigger thing than it was later on (cue up the obligatory mention of Bell Labs here). Google and others may well be bringing us back around to that, which is another large and interesting topic in itself.

Another argument to be made against the quoted excerpts above is that there’s more to funding scientific research than economic growth. That’s surely true – when it comes to science, I’m sort of an ars gratia artis man myself. But that’s not quite what the people authorizing the funding think, for the most part. They fund the NIH and NSF because they believe that the grants involved will lead to jobs and economic progress (eventually), not so much because they have any burning curiosity about microtubules, hadron collisions, biosynthetic pathways, or metal-organic frameworks. Every one of those grants, therefore, has language in it (however specious) tying it to some sort of “real-world” outcome: climate change, cancer, the quest for the perfect egg salad recipe, what have you. You can assemble a long and cringe-inducing list of quotes from politicians in just this vein, voting for Project X or Agency Y on the assumption that they’re mostly some sort of fancy pork-barrel jobs programs.

So for me, that’s the part of Ridley’s article that I found most worth thinking about. I would have liked to have seen the technological-progress-momentum stuff sheared off into another op-ed, which I might not even have read, and had more here about the public-versus-private research argument. The article isn’t so much about “The Myth of Basic Science” as about the myth of who should be funding it, and what basic research really is.

40 comments on “Technology and Funding: Myths and Alternate Worlds”

  1. exGlaxoid says:

    Having worked in academia, industry, and government funded labs now, I can see both sides of the issue of funding research. Many academic and government funded labs are doing a fair amount of obscure, pointless, or irrelevant research, some even done poorly, fraudulently, or wastefully. Some labs are much better and provide great science and new technologies. Many industrial labs are and were great (IBM, AT&T, Merck, etc), but they are vanishing or shrinking for a variety of reasons and they also do tend to focus on applied science much more, although I think more basic science has come from them than often admitted by academics.

    So it is easy to see that there needs to be some public funding of science, both in academia (to help educate and promote research) and in government-funded labs, in order to get some research done in areas either not profitable to industry or too complex/abstract for academic labs. But having seen and worked in several publicly funded labs over the years, I do think that there could be changes to improve the efficiency, productivity, and quality of many of them. Some should be closed, cut back, or cleaned up – just look at the messes at many atomic energy labs, as well as some other government labs, for a few examples. Anyone who blindly says to keep increasing public research spending is just as bad as the people who say that we should eliminate it entirely.

  2. Hap says:

    Weren’t some of the biggest islands of industrially-funded research (Bell Labs, in particular) at least partially used to justify monopoly or government protection of some sort (that this kind of research was the consequence of the ordered economics of their monopolies)? In that sense, government was funding the monopoly, or at least forcing others to do so. It seems like it would be better and more explicit to fund it directly.

    It also seems like industrial funding for anything longer-term than product development would require some sort of market security – if a company is worried about next quarter, they won’t fund research unless they have no other choice. Thus, an industrial research world would probably have booms and busts as companies’ fortunes shift upward and downward, perhaps more so than now.

    How would you train scientists? Would it be an apprenticeship system in industry, or would students pay for research work in grad schools, perhaps like other graduate majors?

  3. NMH says:

    Let’s say that technological innovation is proportional to the amount of good research out there: research that isn’t wrong, or fraudulent, or plain stupid. That is, let’s say that TI (technological innovation) is proportional to GP (good papers).

    Although there is a lot more research being done nowadays than there was in the 1960s and 70s, I would say a much higher fraction of it is garbage, caused in part by the extreme competition and careerism of the people doing it.

    So a question to ask is: do we have as much research that leads to technological innovation as we did earlier? In other words, is total GP after 1990, say, greater than total GP before 1990? I’m not sure.
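    A rough way to write that out, with P for the total volume of papers and f for the fraction that is good; these symbols just make the relation sketched above explicit, and aren’t from the article or the comment:

```latex
% TI = technological innovation, GP = good papers, P = total papers, f = "good" fraction
\[
  \mathrm{TI} \;\propto\; \mathrm{GP}, \qquad \mathrm{GP} = f \cdot P
\]
% The question then becomes whether
\[
  \sum_{t > 1990} \mathrm{GP}(t) \;\gtrless\; \sum_{t \le 1990} \mathrm{GP}(t),
\]
% i.e. whether the growth in P has outpaced any decline in f.
```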

  4. ex-academic says:

    I wasn’t a fan of the article in general (and the comments are, as expected, truly terrifying), but one of his specific examples really annoyed me:

    The discovery of the structure of DNA depended heavily on X-ray crystallography of biological molecules, a technique developed in the wool industry to try to improve textiles.

    That’s a very. . . creative interpretation of history. I can’t really take anything else in the column seriously after reading that.

  5. Anon says:

    I tend to agree with the original article, that innovation would probably run the same path in the same order at the same pace if we did it all over again. The precise order and timing of individual discoveries and developments may change slightly, but the overall picture would look the same. Just consider how many discoveries and inventions have been developed in parallel by different groups working independently, but building on what has come before. It seems everything would have come out within just a few years in any case, even if the original inventor had been run over by a bus. I wouldn’t say everything is fate (at least not at the detailed level), but there is certainly a pre-defined path of least resistance (maximum likelihood) based on conditional probabilities.

  6. Anon says:

    PS. Nature is full of examples of convergent evolution where different paths have led to the same result (e.g., insects, bats and birds all have wings), so why not scientific progress?

  7. Polynices says:

    The comments about basic research funding remind me of the line attributed to a department store magnate: “half the money I spend on advertising is wasted; the trouble is I don’t know which half.” Even if we grant that half (or more) of basic research is junk it’s not like we can only turn off that part. Maybe lots of junk is the price for the good stuff. Perhaps there’s room to move the needle a bit and produce a bit less junk but that’s the sort of incremental improvement that doesn’t make for interesting headlines or soundbites.

    With regard to the progress of technology, I am always struck by the counterfactual that Sir Humphry Davy could have invented surgical anesthesia 40 years earlier than Morton if he’d just looked a little bit harder at the effects of nitrous oxide on pain. Lots of stuff could have been invented earlier and lots of stuff could have been invented later, but we usually can’t tell.

  8. Biotechie says:

    The real danger about this article is how it will be misused/mischaracterized to divert funding away from basic research. Defining what is meant by innovation and what is meant by new ideas and creativity is very important. What is essential for innovation, both upstream and downstream? What are the metrics/outputs to be measured? If the output is products and technologies, then industry, rather than publicly funded academia, does lead the way. But industry isn’t as resourceful as academia at creativity or at coming up with new ideas. IBM, Xerox, Bell Labs and the old Janssen (not part of J&J) are all great examples of creative, innovative companies (that were). But they are long gone. Personally, I would say most of the creative work that leads to important translational work in industry happens in publicly funded research. Academia undervalues what industry does. And the original article shows how industry and innovation consultants often undervalue what academia does.

    1. Joe Q. says:

      Bell Labs may be long gone, but IBM and Xerox are certainly still alive, complete with multi-site research divisions.

  9. Anon says:

    The more you look at the big picture, the more it will follow a pre-defined path based on fundamental principles like the law of averages, and the law of diminishing returns, etc. Furthermore, the path of progress is largely driven by fundamental human needs, so on that basis one would expect a very similar path of development in a parallel universe.

  10. CMCguy says:

    I find Ridley’s views rather discouraging. Even though I can see how technological innovations can often experience periods of extensive replication and extension in certain areas, with the appearance of self-sustained momentum, that period is invariably preceded by long periods of knowledge acquisition that lay the foundations to make those innovations possible and reliable. Also, most technologies will mature to a point of slow incremental or stagnant progress until a new discovery, supported by basic research, spawns more dramatic growth. Just as in drug innovation, the relationship between academia and pharma can be complex and overlapping; because of the differences in their underlying objectives and missions, there really need to be advances and contributions from both to accomplish effective disease treatments. Cutting off or limiting basic research will surely hurt future innovations, and it is doubtful that industry would adequately step up (as it once did, to a more significant level), especially when it is not directly in the short-term interest of their shareholders.

  11. Mark says:

    Not wanting to be too ad hominem, but I’ve seen it pointed out that Matt Ridley is a big fan of government non-intervention, except for when the bank of which he was chairman (Northern Rock) went bankrupt and had to be bailed out at the start of the credit crunch.

    One big thing Ridley misses is all of the basic research that cannot and will not be funded by industry because it is antithetical to industrial interests or has no direct commercial value. Would the drug landscape be different if there were no independent clinical trials, especially given the large amount of evidence on the biases (explicit and implicit) that tend to occur when the funder of a trial has a financial interest in the outcome? Who funds research into sustainable fishing policies, or the effects of heavy metal contamination, or whether groundwater contamination by fracking is actually a problem or not? He also neglects the often enormous lead times between basic scientific understanding being developed and the practical application of that basic understanding: corporations do not fund research on multi-decade timescales, yet without the basic research having been done the technological applications would take much longer to arrive.

    1. “corporations do not fund research on multi-decade timescales”

      Don’t mortgages last for multiple decades?

  12. bad wolf says:

    Interesting comments today! Surely I’m not alone in having grown up with James Burke’s shows on PBS/cable, Connections and The Day the Universe Changed? They influenced me a lot in pointing out the way that many ‘obvious’ developments built in unexpected ways on prior work in other fields, and always had a random element.

    In any case, I always enjoyed such contemplations of alternative technologies, similar to genres like Steampunk exploring a “what-if” of developments already past.

  13. David Borhani says:

    @ #4, ex-academic: I haven’t read the article, and perhaps it is 90 degrees off from my own thinking as it was for Derek. But, in the instance you quote, I don’t think that’s a creative, or wrong, interpretation of history, albeit an incomplete one.

    William Astbury and his students at Leeds were indeed studying wool and other fibrous substances, using x-ray diffraction, to understand the structure of these materials, and to learn how that structure impacts their properties.

    In the 1930s, Astbury discovered both the “alpha” (helical) and the “beta” (extended) forms of protein structure; although he got the precise structural details wrong, he recognized that alpha was curled up (helical) and that beta was extended. Linus Pauling kept Astbury’s nomenclature when he proposed the correct alpha-helix structure in 1951.

    What’s more, Astbury did study the structure of DNA. Astbury & Bell published in 1938 (Nature 141:747) on the fibrous nature of DNA, as determined by x-ray diffraction, noting the strong repeat at 3.34 angstrom spacing; they understood correctly that the structure was a long fiber with the bases arranged perpendicular to the fiber axis (cf. the strong negative birefringence), at 3.34 Å spacing.

    He didn’t have the tools to go much further — Cochran, Crick, and Vand’s analysis of helical diffraction was still 14 years in the future — and Rosalind Franklin’s iconic Photograph 51 was the key information needed to realize that the fiber was truly helical. In this regard, however, it’s worth noting that Astbury’s student Elwyn Beighton produced B-form DNA photographs, remarkably similar to Franklin’s, in 1951.

    Correct, yes. The whole story, no.

  14. cirby says:

    The problem is that when a lot of people talk about “basic” research, they often confuse “basic” and “applied.”

    A lot of the successes of “basic” government-funded research are really in applied research – they were looking for specific answers to specific problems. The internet itself was one of those – they needed a practical solution for computer networking.

    Conversely, there’s a LOT of what’s called “basic” research that’s pretty much just trivia that ends up having no usefulness to anyone whatsoever. That sort could easily be disposed of just from reading the proposal closely in the first place.

  15. AQR says:

    You ask, “What might happen if we ran the whole process again?” It seems to me that we have already done the experiment. One could argue that 1000 years ago, China was at least as technologically advanced as Europe, and because of the difficulties in communication between East and West, the two can be treated as isolated systems. Why did the technology of one culture remain relatively stable while all of the advances mentioned in the article are the product of the other? According to Ridley’s position on this mystical concept of the “technium”, the two cultures should have developed in parallel.
    Where I think he misses the mark is that he totally dismisses the importance of the individual scientist. He thinks like an NFL coach – “If Newton goes down, we have Leibniz. If Leibniz goes down, we always have another guy on the bench.” It’s not just funding; it is the inspiration and creativity of the individual that leads to advances. Where this creativity happens and is fostered, we see advances in science and technology. I don’t know why this happened in the West and not the East, but it seems that advances in technology and science are not inevitable.

  16. JAB says:

    @AQR: Ian Morris recently published a book that deals with the differences between eastern and western culture and advancement. Some of it is economic, some technology:

    http://www.amazon.com/Why-West-Rules-Now-Patterns/dp/0312611692

  17. ex-academic says:

    David Borhani: the incompleteness is exactly what bothers me, because he’s discarded a huge number of data points that don’t fit his hypothesis. (Bragg? Laue? Hodgkin?) And if Ridley was referring to Astbury, I’m not sure how the work of a public university professor and FRS supports his argument. Maybe he got funding for his research from the textile industry? It’s still the kind of selective interpretation of evidence that gets junior scientists torn to shreds in group meeting.

  18. Andrewd says:

    Ridley’s position reminds me of Faraday’s comment:
    Titled and wealthy woman: “What use, pray, is electricity, Mr Faraday?”
    Faraday: “And what use is a new-born baby, Madam?”

  19. matt says:

    Balderdash and poppycock, I say. He started out with his ideology: that we need to slice government down to nothingness, or at least to Calvin Coolidge-may-his-name-be-ever-praised levels. Then he picked a handful of examples that can be made to fit his thesis, organized it so it hypothesizes about unknowable alternatives and thus cannot be disproven, and poof! There emerges his ideology again, shinier for his effort in polishing it up and confirming its rightness to himself.

    Without numbers, though, the turds and the gold nuggets are indistinguishable. Let’s stipulate that he’s right, that industry will go back and do the basic research. Certainly, under the pressure of profit/loss statements, they will seek to do it more efficiently–ideally by waiting until someone else has spent the money and buying them or their innovations up, of course. But what if the overall rate of progress is one tenth that provided by the less-efficiently spent but vastly more money being put in by government?

    All of a sudden, his inexorable wheel of progress grinds to a halt, as those industries get decimated by competitors in countries who haven’t been so clever as to shoot themselves in the leg. Even if no other country spends on basic R&D, to the extent the industry is more stagnant, it has less to spend on ALL R&D, and you may be sure that basic R&D will be cut much faster than applied R&D for very logical reasons.
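    To put toy numbers on that worry (entirely hypothetical figures, chosen only to show the shape of the argument; none of them come from the post or the comments):

```latex
% Effective research output ~ (efficiency per dollar) x (total spending).
% Suppose private-only funding is twice as efficient but one-fifth the size:
\[
  \underbrace{0.4 \times \$10\,\mathrm{B}}_{\text{private only}} = \$4\,\mathrm{B}
  \qquad \text{vs.} \qquad
  \underbrace{0.2 \times \$50\,\mathrm{B}}_{\text{public + private}} = \$10\,\mathrm{B}
\]
% Higher efficiency does not make up for a much smaller total spend, which is
% the scenario in which the "wheel of progress" slows down.
```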

    If drug companies and private foundations were the only entities exploring basic biology and identifying druggable targets, I can believe some research would still get done. And no doubt, since the money might be orders of magnitude less, it would be done more efficiently in some regards. (Not entirely, because some economies of scale would be lost.) But if R&D is already considered a poor use of money for pharma, imagine how much worse it would be if pharma had to shoulder the basic research! I’m certain the pharma industry would be much smaller.

    Finally, what the hell is a science writer doing writing what is essentially a historical musing / political opinion piece? Is this what passes for science in Rupert Murdoch’s enterprise? The WSJ is moving closer and closer to Fox News, it seems. There are already a handful of writers there whose byline I recognize and instantly ignore what they write, because they’ve shown a complete lack of understanding of the industries they cover. Sort of the evil opposites of a Matthew Herper, as clueless as he is clueful.

  20. matt says:

    And re: “Don’t make the mistake of thinking about what would happen if we just suddenly defunded the NIH and the NSF; that’s not what we’re talking about.”

    Except that’s exactly what he’s driving toward. Perhaps the myriad of alternatives once we changed some event in the past cannot be known, but we can certainly see how applying his clueless reasoning would be detrimental going forward.

    On a side note, what is the meaning of the “You are posting comments too quickly” message? I get that the first time I click the Post Comment button, even if it’s been days or months since I last posted, and hours since I started writing in the comments field.

  21. Daniel T says:

    As anyone who has ever been in academia knows, most of the government money spent on basic research is spent training junior scientists. The only way industrial research could take place is if industry funded this training, either directly via grants to universities, or indirectly via higher wages to scientists. Since the value of this training can’t be captured by industry (it walks out on two feet), no rational company will fund the training of junior scientists, and hence all industrial research will grind to a halt due to a lack of scientists. It does not matter how much you want to do applied research, or how much profit you can make if you do the research, if you don’t have, and can’t hire, staff who can actually do what you want.

  22. Peter says:

    It’s true to anyone but the Ivy elite that the basic system of research is broken; the wheels have come off and the self-driving car is careening towards the fuel depot. A third of adjuncts are on food stamps, while tenured faculty lounge around working part-time, summers-off hours while their soon-to-be-unemployed students come up with new ideas?

    Basic science would be an apt description.

    I wouldn’t decrease the net amount of science funding, but I would limit it to institutions which re-invest their massive endowments (Harvard’s is at $35 billion now) into education and science, instead of paying fees to investment analysts and giving the faculty raises.

    Classifying higher education and research as non-profit is the biggest joke of the century. It’s a for-profit business that is about as efficient as an old Soviet hat factory.

  23. Garrett Wollman says:

    Not sure I would introduce the 5th Viscount Ridley as “Science writer”, even if that is the context in which his article is being presented. (And yes, he does have academic credentials, and came about them honestly, but he is also a member of the British aristocracy, a hereditary peer, and a member of the Conservative party – not exactly a disinterested observer – and he has previously won awards from economic-libertarian organizations for his writing in support of their cause.)

  24. Todd Knarr says:

    I think Ridley goes off the rails with his comment about steam engines and thermodynamics, and his error’s the basic problem with his philosophy. Sure, once you’ve got the steam engine you’ll pay for advances in thermodynamics because they let you build a better steam engine than your competitors. But you won’t invent the steam engine without having a basic understanding of thermodynamics first because without that understanding you won’t realize you can build a steam engine. To get a feedback loop started you need that initial push that doesn’t depend on the feedback loop, and that’s what basic research supplies.

    We also tend to make basic discoveries out of sequence: we uncover something that seems to be useless because we don’t yet have the other discoveries, or the advances in tools, that we need to put it to use. But if we hadn’t discovered it earlier, then when we did get to the point where we could make use of it, we wouldn’t know to go looking for it, because it doesn’t itself follow from that point.

  25. gippgig says:

    It wasn’t until World War II that government funding of research really got started. Basic research got done just fine before then (such as the foundations for atomic energy, radio, & computers).
    Throwing money at a problem can certainly be effective (see Apollo) but it is seldom a good approach. Government finding of research has attracted people who are more interested in making money than making discoveries which is one of the worst things possible.
    Also note that the money government spends on research doesn’t appear out of thin air; it is taken from companies etc. who then have less money to spend on research.

  26. gippgig says:

    Oops, that should be funding not finding.

  27. Eric says:

    @gippgig

    I find your bald assertion “Government f[u]nding of research has attracted people who are more interested in making money than making discoveries” quite remarkable. I’d like to see your evidence for this. Have you looked at academic salaries around the world recently, for example? The academics I interact with in the UK are driven by curiosity, and work in academia precisely because they’re allowed to follow their interests rather than focus on boring cookie-cutter commercial activities – developing new synthetic routes, for example, rather than banging out 200 analogues via a single route. If you were to suggest to them that they’re in it for the money, the best you might come away with is a slight hearing deficiency from the loud, hollow laughter.

    On a wider point concerning the article: as has been pointed out, Viscount Ridley, who generated his own money and status by his own skill and ingenuity in being born into an extremely wealthy family and inheriting that wealth, is a vocal proponent of small government. That didn’t stop him taking an enormous amount of money from the UK government when the bank he was running imploded spectacularly at the height of the global financial crisis, due to the unsustainable business model it was following: borrowing money for 3-12 months and lending it for 25 years. I would have more respect for his opinions if he had the courage of his convictions and transferred to somewhere like Somalia, where I believe they are operating his dream zero-tax, zero-government society.

  28. Pierre Lebeaupin says:

    I can’t speak for chemistry or pharma. But in my field of computing, the most remarked-upon examples of basic research happening in private labs:
    1. were from companies that had some sort of monopoly situation (the AT&T monopoly on telephony funded Bell Labs; the monopoly situation granted by the photocopy patents funded Xerox PARC), and
    2. these companies often did not really know what to do with the results of this research, and it was often others who really reaped the benefits.

    So this basic research ended up benefitting others more than the company in question, and these companies could afford it anyway thanks to money they got from a publicly enforced situation. So we might as well reduce these monopoly situations (at least other than patents), cut out the middlemen, and leave basic research directly to public institutions, which is mostly what happened between the heyday of private labs and now.

    It’s something Pierre-Gilles de Gennes touched on in his book after his Nobel: as the head of a major physics and chemistry school, he would encourage his researchers to look for unsolved basic research problems in industry, e.g. in thin-film liquid flow, while also telling them that solving the industry’s more applied problems was not their job. So it makes sense for industry to fund basic research in part, so that it can have some say and stake in what is being researched, with most of the funding for basic research labs and institutions still coming from public funds.

    My two cents, anyway.

  29. Corporate labs can be just as much a boondoggle as government funded ones. HP Labs seems to be such a place. They’re working on projects that are perennially two years from commercialization (memristors) or are yesterday’s news (cloud computing). At least one wag has described their mission as solving yesterday’s problems… Tomorrow!

    And yet they seem to be doing OK on the funding side. For now.

    1. It sounds like HP labs is the sort of place that many people think cannot exist.

  30. anon says:

    @Pierre Lebeaupin: if that is so, how come so many of the major companies in Silicon Valley were spun out of Stanford?

  31. gippgig says:

    I definitely did not say that everyone doing government-funded research was in it for the money; people who want to do research will do so, money or no money. Spending billions of dollars on research will ALSO attract people who are in it for money.

  32. Tom Womack says:

    Thermodynamics followed the steam engine; but radio followed Maxwell’s equations.

    When was it that universities started to claim strong IP rights over their professors’ inventions?

  33. Shion Arita says:

    @ Todd Knarr:

    You touch upon a very deep point. I think that in all of what we humans do, the hardest thing for us is that moment of true innovation, where someone finds a good model that explains some strange results and shifts a paradigm, or, in the case of things like computers, steam engines, and, in more modern times, synthetic biology, the moment where one notices that it’s possible to do it at all: that in that sea of infinite possibility of things to do or attempt, THIS is something that can happen, and it is interesting. For the (also very important) subsequent steps of refinement and functional improvement, it feels to me like we have those methods down, and that they generalize to some extent. But I think that those innovation moments don’t really generalize. The circumstances, both in terms of a unique mental moment and the longer-term structure of ideas that set the conditions for it, that lead to someone coming up with the concept of, say, a programmable computer aren’t really that similar to those that led to evolution by natural selection. I do think that certain individuals have the knack for it in general, but I think that this mostly comes from a combination of high general intelligence and a propensity for just plain weird and bizarre thinking.

    What’s kind of strange is that even though this is the hardest thing for us, we’re the only things we know of that can do it at all, and it is this capacity more than anything else that distinguishes humanity from the rest of life on earth.

    A simpler version of this is the P vs NP dichotomy. My intuition says with high confidence that P doesn’t equal NP, and that this is due to a deeper law of reality. There’s something about the way things work that we don’t quite understand entirely (1: some forays into it are things like cellular automata, where chaotic and strange-seeming results are produced by methods that one might naively anticipate would be fairly predictable; 2: our lack of understanding results in us covering over some things with terms like ‘emergence’) that makes solutions vastly easier to verify than to produce, things easier to understand in hindsight, and true creativity so difficult.
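    A concrete, if toy, illustration of that verify-versus-produce asymmetry, using subset sum as the example problem; the function names and the tiny instance below are just illustrative, and brute force of course proves nothing about P vs NP one way or the other:

```python
from itertools import combinations

def verify(numbers, target, subset):
    """Checking a proposed answer is cheap: membership plus one summation."""
    return set(subset) <= set(numbers) and sum(subset) == target

def solve(numbers, target):
    """Producing an answer by brute force may examine up to 2^n subsets."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return list(combo)
    return None

nums = [3, 34, 4, 12, 5, 2]
print(solve(nums, 9))           # finds [4, 5], but only after trying many subsets
print(verify(nums, 9, [4, 5]))  # True, checked in a single linear pass
```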

  34. Steve says:

    @ gippgig, I think your ideology is getting in the way of the facts. It’s rather ridiculous to say that government spending attracts people who are in it for money. No one enters government work (e.g., NIH) or academia with big dollar signs in front of their eyes. It is equally ridiculous to say the money comes from corporate research. It comes from all of us, not directly from corporate research coffers. It’s equally ridiculous for Ridley to claim that you don’t need basic science to develop technology. If you trace the origins of biotechnology, for example, they come from academia and basic studies of DNA (were Watson and Crick technologists or basic scientists?). If you trace the origins of computers, they come from academia and basic math (e.g., Alan Turing). If you trace the origins of the chemical synthesis industry, they come from basic research in chemistry (e.g., Pauling). So the whole thesis is just a crock. Without basic science as a foundation for technology, there will be no new fields of invention, just variations on a theme for the old ones.

  35. “All of a sudden, his inexorable wheel of progress grinds to a halt, as those industries get decimated by competitors in countries who haven’t been so clever as to shoot themselves in the leg.”

    I understand there are many complaints about other nations copying us. We can’t copy them?

  36. Oliver H says:

    Derek, the notion that some “real-world” purpose is attached to scientific funding may be true in the US, but it’s far from generalizable. Germany has a constitutional guarantee of freedom of research, and as such, the major research organizations get funding from the federal and state governments that they can distribute as they see fit. Now, the government can, of course, want to push research in a certain area, but that needs to go ON TOP of the regular research budget, just like anyone else with a large enough wallet can build a research institute if they want to.

  37. Tom B says:

    I think that, especially in America, we are too alert to making distinctions between the acts of private organizations and government. Long-term, long-range planning of the sort that so much research and development requires (pharmaceuticals being an apt example) calls for such similar structures and maintenance that the differences tend to dissolve the larger the organization becomes, the longer the time range over which research must be performed, and the greater the sunk costs and risks of pursuing the research become.

    I am poorly paraphrasing the institutional economics viewpoint set out by Galbraith in his The New Industrial State. Worth a read if you’re interested in a deep dive into the management of risk and organization of research.
