I’ve been meaning to write about this paper in PNAS for a while. The authors (from Caltech and the Weizmann Institute) are calling for a more quantitative take on biological questions. They say that modern techniques are starting to yield meaningful numbers, and that we’re getting to the point where this perspective can be useful. A web site, BioNumbers, has been set up to provide ready access to data of this sort, and it’s well worth some time just for sheer curiosity’s sake.
But there’s more than that at work here. To pick an example from the paper, let’s say that you take a single E. coli bacterium and put it into a tube of culture medium, with only glucose as a carbon source. Now, think about what happens when this cell starts to grow and divide, but think like a chemist. What’s the limiting reagent here? What’s the rate-limiting step? Using the estimates for the size of a bacterium, its dry mass, a standard growth rate, and so on, you can arrive at a rough figure of about two billion sugar molecules needed per cell division.
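Just to make the arithmetic concrete, here’s a minimal sketch of that estimate in Python. The input values below (dry mass, carbon fraction) are rough textbook figures of my own choosing, not numbers taken from the paper, but they land within a factor of two of the two billion quoted above:

```python
# Back-of-the-envelope: glucose molecules needed to build one E. coli.
# The inputs are rough textbook values (assumptions, not from the paper).

AVOGADRO = 6.022e23

dry_mass_g = 3e-13        # dry mass of one E. coli, roughly 0.3 picograms
carbon_fraction = 0.5     # about half the dry mass is carbon
carbon_molar_mass = 12.0  # grams per mole of carbon

carbon_atoms = dry_mass_g * carbon_fraction / carbon_molar_mass * AVOGADRO
glucose_needed = carbon_atoms / 6  # each glucose carries six carbons

print(f"carbon atoms per cell: {carbon_atoms:.1e}")   # ~8e9, call it 10^10
print(f"glucose per division:  {glucose_needed:.1e}") # ~1.3e9, the same order as two billion
```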
Of course, bacteria aren’t made up of glucose molecules. How much energy gets used up just converting that carbon into amino acids and thence into proteins (the biggest item on the ledger by far, it turns out), into lipids, nucleic acids, and so on? What, in other words, is the energetic cost of building a bacterium? The estimate is about four billion ATPs. Compare that to those two billion sugar molecules, remember that you can get up to 30 ATPs per sugar under aerobic conditions, and you can see that there’s a ten- to twentyfold mismatch here: the cell takes in far more energy than the raw cost of its parts would seem to require.
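The mismatch itself is one line of arithmetic, using only the numbers already quoted above:

```python
# The energy mismatch, using only the numbers quoted in the post.

glucose_per_division = 2e9   # sugars consumed per cell division
atp_yield_aerobic = 30       # ATPs per glucose under aerobic conditions
atp_for_biosynthesis = 4e9   # estimated ATP cost of building the cell

atp_available = glucose_per_division * atp_yield_aerobic  # 6e10
mismatch = atp_available / atp_for_biosynthesis           # 15

print(f"ATP obtainable from that glucose: {atp_available:.0e}")
print(f"ATP needed for biosynthesis:      {atp_for_biosynthesis:.0e}")
print(f"mismatch: roughly {mismatch:.0f}-fold")
```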
Where’s all the extra energy going? The best guess is that a lot of it is used up in keeping the cell membrane going (and keeping its various concentration potentials as unbalanced as they need to be). What’s interesting is that a back-of-the-envelope calculation can quickly tell you that there’s likely to be some other large energy requirement out there that you may not have considered. And here’s another question that follows: if the cell is growing with only glucose as a carbon source, how many glucose transporters does it need? How much of the cell membrane has to be taken up by them?
Well, at the standard generation time in such media of about forty minutes, roughly 10 to the tenth carbon atoms need to be brought in. Glucose transporters work at a top speed of about 100 molecules per second. Compare the actual surface area of the bacterial cell with the estimated size of the transporter complex. (That’s about 14 square nanometers, if you’re wondering, and thinking of it in those terms gives you the real flavor of this whole approach.) At six carbons per glucose, then, it turns out that roughly 4% of the cell surface must be taken up with glucose transporters.
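Here’s the same calculation spelled out, again as a sketch: the cell surface area is my own assumed value (roughly 5 square microns, a common textbook figure), and with it the answer comes out nearer 2%, the same ballpark as the 4% above, with the exact figure hinging on that assumption:

```python
# Fraction of the cell surface occupied by glucose transporters.
# The surface area is an assumed textbook value, not from the paper.

generation_time_s = 40 * 60   # about forty minutes per division
carbon_atoms = 1e10           # carbons needed per division (from above)
transporter_rate = 100        # glucose molecules per second, top speed
transporter_area_nm2 = 14     # footprint of one transporter complex
cell_surface_nm2 = 5e6        # assumed E. coli surface area, ~5 square microns

glucose_per_s = carbon_atoms / 6 / generation_time_s  # ~7e5 glucose per second
n_transporters = glucose_per_s / transporter_rate     # ~7,000 of them
fraction = n_transporters * transporter_area_nm2 / cell_surface_nm2

print(f"transporters needed: {n_transporters:.0f}")
print(f"surface fraction:    {fraction:.1%}")  # ~2% with these inputs
```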
That’s quite a bit, actually. But is it the maximum? Could a bacterium run with a 10% load, or would another rate-limiting step (at the ribosome, perhaps?) make itself felt? I have to say, I find this manner of thinking oddly refreshing. The growing popularity of synthetic biology and systems biology would seem to be a natural fit for this kind of thing.
It’s all quite reminiscent of Yuri Lazebnik’s famous 2002 paper (PDF) “Can a Biologist Fix a Radio?”, which called (in a deliberately provocative manner) for just this kind of thinking. (The description of a group of post-docs figuring out how a radio works in that paper is not to be missed – it’s funny and painful in almost equal measure.) As the author puts it, responding to some objections:
One of these arguments postulates that the cell is too complex to use engineering approaches. I disagree with this argument for two reasons. First, the radio analogy suggests that an approach that is inefficient in analyzing a simple system is unlikely to be more useful if the system is more complex. Second, the complexity is a term that is inversely related to the degree of understanding. Indeed, the insides of even my simple radio would overwhelm an average biologist (this notion has been proven experimentally), but would be an open book to an engineer. The engineers seem to be undeterred by the complexity of the problems they face and solve them by systematically applying formal approaches that take advantage of the ever-expanding computer power. As a result, such complex systems as an aircraft can be designed and tested completely in silico, and computer-simulated characters in movies and video games can be made so eerily life-like. Perhaps, if the effort spent on formalizing description of biological processes would be close to that spent on designing video games, the cells would appear less complex and more accessible to therapeutic intervention.
But I’ll let the PNAS authors have the last word here:
“It is fair to wonder whether this emphasis on quantification really brings anything new and compelling to the analysis of biological phenomena. We are persuaded that the answer to this question is yes and that this numerical spin on biological analysis carries with it a number of interesting consequences. First, a quantitative emphasis makes it possible to decipher the dominant forces in play in a given biological process (e.g., demand for energy or demand for carbon skeletons). Second, order-of-magnitude BioEstimates merged with BioNumbers help reveal limits on biological processes (minimal generation time or human-appropriated global net primary productivity) or lack thereof (available solar energy impinging on Earth versus humanity’s demands). Finally, numbers can be enlightening by sharpening the questions we ask about a given biological problem. Many biological experiments report their data in quantitative form and in some cases, as long as the models are verbal rather than quantitative, the theory will lag behind the experiments. For example, if considering the input–output relation in a gene-regulatory network or a signal-transduction network, it is one thing to say that the output goes up or down, it is quite another to say by how much.”