
August 2009 Archives

Anatomically modern humans evolved in Africa nearly 200,000 years ago, but researchers have long debated why there seems to be a gap between when hominins started looking modern and when they began acting modern. Some of the most important indications of modernity, such as cave art and certain types of advanced tools, don’t show up in the archaeological record until about 40,000 years ago, mostly in Europe. That has led some scientists, notably archaeologist Richard Klein of Stanford University in Palo Alto, California, to argue that modern human cognition, including language and other complex symbolic behavior, needed the additional kick-start of a genetic mutation about 50,000 years ago.

Yet an increasing number of researchers have come to think that Homo sapiens was capable of modern behavior from the very beginning of its history. Whether those behaviors show up in the archaeological record, these researchers say, depends on a variety of factors unrelated to genetics, such as how big and widespread early human populations were and what environmental challenges they faced. A team led by archaeologist Francesco d’Errico of the University of Bordeaux in France will soon publish what it considers yet more evidence for this viewpoint online in the Proceedings of the National Academy of Sciences (PNAS). Although the paper has not yet appeared, PNAS has lifted its embargo; the paper will be available here when published. D’Errico and colleagues describe 25 shell beads, including those shown above, that they argue were used as personal ornaments; the beads come from four sites in Morocco dated to between 85,000 and 70,000 years ago.

If this sounds familiar, that’s probably because d’Errico and others have recently published numerous sightings and analyses of shell beads from sites in Morocco, Algeria, South Africa, and Israel as old as 100,000 years ago. Nearly all researchers, including Klein, agree that personal ornaments are solid indications of modern behavior. Wearing jewelry is a form of symbolic signaling of personal identity to other hominins. And the use of symbols, scientists also agree, is a hallmark of modern human behavior—even if there are indications that earlier hominins might also have ventured into this cognitive realm.

Yet Klein has questioned many of the claims made by d’Errico and others, arguing most recently in a 2008 review in Evolutionary Anthropology that the holes found in the shells, which presumably allowed them to be strung as beads, were the result of natural abrasion from the action of the sea. He also points out that there are very few such examples in Africa. Given the near total absence of shell beads in Africa between 70,000 and 40,000 years ago, Klein suggests that the relatively infrequent earlier sightings do not represent a real symbolic tradition that endured over time. The new PNAS paper cites Klein’s Evolutionary Anthropology critique and counters with a detailed analysis of the newly discovered shell beads from the four Moroccan sites: Taforalt, Rhafas, Ifri n’Ammar, and Contrebandiers, which are dated by various techniques, including optically stimulated luminescence, thermoluminescence, and Uranium-series isotopic measurements.

Nearly all of the shells, the team writes, are from the genus Nassarius, a sea snail that shows up in archaeological sites all over Africa, Asia, and Europe. Moreover, three of the four sites are located about 50 kilometers from a coast, reinforcing the suggestion that the shells were deliberately transported inland and used for symbolic purposes, the team says. The team also notes that fragments of gravel inside shells and signs of mechanical erosion from lying on the seabed suggest that many snails were dead before they were collected. Thus they weren’t transported inland to be eaten. Finally, at some sites, the team found signs of ochre pigment clinging to the outside of the shells, which many archaeologists consider evidence of symbolic behavior.

But if hominins could create personal ornaments as early as 100,000 years ago, why have researchers found so few of them between 70,000 and 40,000 years ago in Africa and the Near East? The team points out that after 70,000 years ago, the climate turned markedly colder in both the Northern and the Southern hemispheres. When the temperature was warmer, the team writes, modern human populations were growing and “marine shell beads may have been instrumental in creating and maintaining exchange networks between coastal and inland areas. …” In other words, ornaments were a means for hominin groups to communicate their identities to each other. But the arrival of harsher conditions, and the possible resulting population crashes, “may have disrupted these networks through the depopulation of some areas, thereby isolating hunter-gatherer populations to the extent that such social and exchange networks became untenable.”

The article is part of a PNAS Special Feature series on “Out of Africa,” edited by none other than Richard Klein himself, who told Science that the d’Errico et al. paper adds “balance” to the issue. Some papers are being published online ahead of time; the entire Special Feature is tentatively scheduled for publication in late September, according to the PNAS press office.

—Michael Balter

CREDIT: Copyright d'Errico/Vanhaeren

We are, fundamentally, a fusion. As I wrote in my essay for Science on the origin of eukaryotes, there's now a wealth of evidence that our cells evolved from the combination of two different microbes. The mitochondria that generate fuel for our cells started out as free-living bacteria. Today, they still retain traces of their origin in the bacterial DNA they carry, as well as their bacterial structure, including the membrane within a membrane that envelops them.

Scientists I spoke with as I worked on the essay agreed that this merging was a profound event in the history of life. No living eukaryote, whether animal, plant, fungus, or protozoan, has completely lost its mitochondria since that symbiotic milestone some 2 billion years ago. It wasn't the only time that two species merged, however. Plants, for example, descend from algae that engulfed a species of photosynthesizing bacteria. Many protozoans have swallowed up photosynthetic partners as well.

Yet in all these cases, eukaryotes did the swallowing. It's striking that scientists have such a hard time finding an example of a noneukaryote (a prokaryote such as Escherichia coli and other bacteria) hosting a prokaryote symbiont. Some scientists have gone so far as to argue that swallowing up a partner requires lots of intricate molecular systems that can create a pocket in the surface of a cell and can draw that pocket inside the cell as a bubble. Eukaryotes have this sort of cellular skeleton, and prokaryotes, it seems, don't. If that's true, then our ancestors swallowed up mitochondria only after they evolved the molecules necessary for the swallowing.

But today, there's a provocative new alternative to consider. Maybe a lot of today's prokaryotes are also the result of an ancient merger. The idea comes from James Lake of the University of California, Los Angeles, a veteran researcher on the early history of life. In my essay, I describe how Lake first proposed in the early 1980s that the host cell that gave rise to eukaryotes belonged to a lineage of prokaryotes he dubbed eocytes. Now, a quarter of a century later, new studies on genomes are strongly supporting his eocyte hypothesis. In today's issue of Nature, Lake questions whether we may be too quick to assume that only eukaryotes are the result of fusion. He observes that aphids depend on a species of bacterium called Buchnera to digest their food, and Buchnera in turn contains other bacteria on which its own survival depends. These two bacteria are still distinct enough from each other that we can tell them apart. But what if two bacteria joined together billions of years ago and their identities blurred together? How would we tell them apart?

To look for possible signs of ancient fusion, Lake compared proteins in over 3000 different prokaryote genomes. He concluded that a major group of bacteria known as Gram-negative bacteria is actually the result of a fusion of two different kinds of bacteria, known as Actinobacteria and Clostridia. Gram-negative bacteria, which include the ancestors of mitochondria, are unusual in many ways, but the most obvious one is their membranes. Whereas other bacteria are surrounded by a single membrane, Gram-negative bacteria are surrounded by two. It's possible, Lake argues, that the double-membrane structure of these bacteria is a vestige of one kind of bacterium living inside another.

How exactly they made that merger is one of many questions Lake's hypothesis raises. Buchnera's microbial residents may offer some clues. There are also predatory bacteria that push their way into other bacteria in order to feed on them from the inside; in some cases, these predators spare their victims and just live harmlessly inside them. Lake also points out that microbes don't actually have to take up residence with each other to mix their genomes.

When scientists dredge up muck from the ocean floor, for example, they often find different species of microbes living together in tight clumps. They have to live close to other species to survive because each species takes care of chemical reactions that its partners can't carry out on their own. That intimacy makes it easier for individual genes to move from host to host, as viruses infect different microbes or as microbes die and other microbes slurp up their genes.

It will be interesting to see if Lake's new hypothesis fares as well as his eocyte hypothesis has. If he's right, this symbiosis had an impact on the history of life on par with the origin of eukaryotes. Gram-negative bacteria were the first photosynthesizers, for example, and were then swallowed up by the ancestors of plants. And the same lineage also gave rise to the bacteria that became our own mitochondria. Our cells, in other words, are not just microbes within microbes; they are microbes within microbes within microbes: a true Russian doll of evolution.

—Carl Zimmer

August 18, 2009

The Origins of Handedness

About 85% of humans worldwide are right-handed, at least for most tasks they perform. Most of the rest are left-handed; true ambidexterity is very rare. Yet researchers are not sure just when in hominin evolution a tendency to use one hand over the other evolved—evidence for handedness in other animals is inconsistent and controversial—nor why handedness might have been favored by natural selection. (Why right-handed people dominate rather than left-handed people is a whole other question.) Chimpanzees, gorillas, and other nonhuman primates do not show a consistent bias for one hand or the other, which means that our handedness probably arose sometime after the chimp and human lines went their evolutionary ways about 5 million to 7 million years ago. Many researchers have suggested that there is a link between handedness and the evolution of language, because both involve asymmetries—or “lateralization”—in the brain. (Most people, including most who are left-handed, do their speech processing on the left side of their brains.)

One researcher hot on the trail of these issues is Natalie Uomini, an archaeologist at the University of Liverpool in the United Kingdom. In a new paper in press at the Journal of Human Evolution (JHE), Uomini reviews the evidence for the prehistory of handedness, including studies in nonhuman primates, and even offers a brief report on her own observations of human children and adults. Uomini also makes some interesting suggestions about why handedness evolved on the human line. The paper is part of a special JHE issue titled “Paleoanthropology Meets Primatology,” which is edited by researchers William McGrew and Robert Foley of the University of Cambridge.

Citing earlier work by other researchers, Uomini points out that handedness does not mean that one hand is “dominant” over the other. Rather, she writes, “both hands have different but equally important roles.” In right-handed people, for example, the right hand might be used for tasks requiring greater manual dexterity whereas the left hand might perform the more mundane but nevertheless crucial role of supporting an object. (Imagine eating dinner with just a knife but no fork, for example.) In human children, this kind of handedness begins to emerge between 7 and 13 months of age and is well-established by age 3. In nonhuman primates, however, there is no evidence of species-wide bias for one hand or the other, although some individual primate populations do show such trends. Yet even those trends are not consistent: Although researchers studying the captive chimpanzees at the Yerkes National Primate Research Center in Atlanta have found that they tend to be right-handed for many tasks, wild chimps at the Gombe Stream National Park in Tanzania tend to use their left hands to fish termites out of trees with sticks. These are trends only, Uomini points out, and they never reach the “extreme degree of consistency seen in humans.”

Figuring out when such consistency arose in hominins is not an easy task. In a small number of cases, it is possible to detect signs of handedness in early hominin fossils, such as the shoulder and arm bones of the Nariokotome Boy, a nearly complete 1.6-million-year-old Homo ergaster skeleton found in Kenya by Richard Leakey’s research team. The Nariokotome Boy was clearly right-handed, as indicated by the deeper bone insertions of his deltoid muscles in his clavicle (collar bone) and the greater length of his ulna. Yet Uomini notes that knowing the hand bias of one individual tells us nothing about the handedness of the population he belonged to, let alone that of his entire species. And she critiques the seminal 1980s studies by Nicholas Toth (now at the Stone Age Institute in Bloomington, Indiana) and other researchers who claimed to have detected a trend toward right-handedness in early hominin tools by looking at the direction in which flakes were struck off of a larger stone core. Uomini argues that such studies are limited by their assumption that the trends seen in an assemblage of flakes or tools directly represent the hominins who produced them. Because many of the tools might have been produced by one expert right-handed toolmaker, “it is not appropriate to treat each flake … as one data point,” she writes.

Uomini concludes that although hand bias may be hard to detect among early hominins, there is clear evidence from the large number of available Neandertal skeletons that our evolutionary cousins tended to be right-handed. Their right arms and shoulders show greater robusticity, possibly from throwing spears as they hunted wild animals. And studies of the teeth of both Neandertals and H. heidelbergensis, the presumed common ancestor of Neandertals and modern humans, show that the direction of their striations is consistent with biting on food held in the right hand.

If the trend toward right-handedness was indeed well-established by the era of H. heidelbergensis, which thrived in Europe and Africa about 500,000 years ago, what was its evolutionary advantage? To help get at that question, Uomini observed the spontaneous hand use of children and adults who visited the Lejre Historical Archaeological Research Center in Denmark, using techniques similar to those employed by scientists with nonhuman primates. Visitors were invited to try two different “prehistoric games,” one that involved cracking walnuts using anvils and stone hammers (right) and the other a puzzle that required the subjects to refit flakes back into a flint core from which they were struck. The visitors were also asked to sign their names so their writing hands could be determined. Uomini performed statistical analyses on 48 children and adults ranging in age from 3 to 49 years who performed at least six bouts of nut-cracking or puzzle-solving and found that although the great majority of nut-crackers held the hammer stone in their right hands (which corresponded to their writing hand), most of the puzzle-solvers used both hands equally.
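Uomini's paper reports the statistics only in summary form, but in primate-handedness research, bout counts like these are conventionally condensed into a handedness index, HI = (R - L)/(R + L), and checked against chance with an exact binomial test. A minimal sketch of that kind of calculation, using hypothetical bout counts rather than data from the study:

```python
from math import comb

def handedness_index(right, left):
    """Handedness index: +1.0 = fully right-handed, -1.0 = fully left-handed."""
    return (right - left) / (right + left)

def binomial_p_two_sided(right, total, p=0.5):
    """Two-sided exact binomial test for a bias away from 50/50 hand use."""
    # Probability of each possible outcome under the null hypothesis
    probs = [comb(total, k) * p**k * (1 - p)**(total - k) for k in range(total + 1)]
    observed = probs[right]
    # Sum the probabilities of all outcomes at least as extreme as the observed one
    return min(1.0, sum(q for q in probs if q <= observed + 1e-12))

# Hypothetical subject: 9 right-hand bouts, 1 left-hand bout out of 10
hi = handedness_index(9, 1)
p = binomial_p_two_sided(9, 10)
print(round(hi, 2), round(p, 3))  # prints: 0.8 0.021
```

For a subject with nine right-hand and one left-hand bout, the index is 0.8 with a two-sided p-value of about 0.02, the sort of result that would score an individual as right-biased; the names and numbers here are illustrative, not Uomini's.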

Uomini argues that these findings are similar to those found in studies of nonhuman primates, in which the degree of handedness varies depending on the complexity of the task they are facing and the level of skill required to perform it. The nut-cracking task required five basic actions, Uomini says—“grasp nut, place nut on anvil, grasp hammer, hit nut, and eat nut”—whereas the puzzle required at most three actions (grasp flake, grasp core, fit flake to core). And along with other researchers who have offered a similar hypothesis, she suggests that the increasing sophistication of hominin toolmaking technologies over time may have selected for a greater degree of handedness. Indeed, Uomini says, the “technology-dense lifestyles” of early hominins might have required our ancestors to more or less make up their minds about which hands they were going to use to perform complex tasks. Moreover, such hand bias could have aided the learning process as hominins taught each other toolmaking and other skills; a number of studies have shown that people learn manually difficult tasks, such as knot-tying, more easily when they use the same left- and right-hand movements as their teachers.

If the hypothesis that handedness was selected for because it made hominins better toolmakers is correct, Uomini concludes, it should be testable in the archaeological record: The more complex the tool, the more evidence there should be for handedness in its use. Yet a number of questions remain, including whether there are selective advantages for right- over left-handedness that are strong enough to explain why almost the entire human species skews to the right, and, if such advantages exist, why 15% of humans remain stubbornly left-handed. The answer, Uomini writes, may come with ongoing research into the genetics of brain asymmetries.

—Michael Balter

Photo credit: Lejre Land of Legends

August 14, 2009

Excitable Plants

In researching last month's Origins essay on the origin of the nervous system, I was struck by the range of behavior and electrical excitability exhibited by organisms that lack nerve or muscle cells. Some sponges, for example, have a sneeze-like reflex that flushes out sediment (see a video), whereas others generate electrical "action potentials" much like the impulses that convey information in nerves and brains. Electrical signals have been recorded even in the single-celled Paramecium, where they appear to play a role in escape and avoidance behaviors.

And it doesn't stop there. As far back as the 1870s, researchers had measured action potentials in two plants: the Venus flytrap (Dionaea muscipula) (left) and the touch-sensitive Mimosa pudica (right). To find out more, I recently visited Elizabeth Van Volkenburgh at the University of Washington (UW) in Seattle. She co-authored a provocative 2006 review paper on "plant neurobiology" in which she and colleagues argued that electrical excitability is just one of several signaling mechanisms used by both animals with nervous systems and plants to gather information about their environment and change their behavior accordingly. Yes, behavior. In plants.

In a tour of the UW greenhouse, Van Volkenburgh demonstrated two famous examples, the quick snap of a Venus flytrap and the folding leaves of a Mimosa plant (see video at bottom of page). Root growth is another well-studied example of plant behavior, Van Volkenburgh says. The tip of a growing root can seek out moisture and steer a course around rocks and other obstacles. "If you look at video clips of root growth, you see a lot of behavior that looks like worm behavior," she says. Charles Darwin made a similar observation in The Power of Movement in Plants, published in 1880: "It is hardly an exaggeration to say that the tip of the [root] … acts like the brain of one of the lower animals; the brain being seated within the anterior end of the body, receiving impressions from the sense‑organs, and directing the several movements."

More than a century later, scientists are still investigating the sensory abilities of plants. In 2006, researchers reported in Science that seedlings of the parasitic dodder plant can sniff out their preferred host, growing toward a tomato plant—or even a vial of tomato extract—and shunning wheat. Sensing light is crucial for plants, and Van Volkenburgh says there are likely dozens of plant photoreceptor proteins for detecting different wavelengths of light. "They're detecting the quality of light, the intensity of light, the direction light is coming from, … and they're doing it all over, in every single cell."

As in animals, photoreceptors stimulated by light can trigger electrical activity in plants, her lab has found. "When light is shined on a young leaf, one of the first things that happens is what looks like an action potential," she says. That electrical impulse kicks off a series of events that enable the leaf to grow bigger. Other researchers also have found evidence that plants use electrical signaling. A 1996 study, for example, found that damaging the leaf of a tomato plant evokes an action potential, followed by a boost in gene expression for a proteinase inhibitor that makes the plants indigestible to insects. Electrical stimulation alone, in the absence of leaf damage, had a similar effect. Earlier this year, researchers used arrays of fine microelectrodes like the ones used in some neurobiology experiments to demonstrate synchronized electrical activity in the root tips of maize plants.

Plants also make a number of compounds that function as neurotransmitters in animals, including glutamate, GABA, dopamine, and serotonin (though little is known about their roles in plant physiology), and some researchers see a parallel between neurotransmitter release at synapses and the release of the plant hormone auxin. Like neurotransmitters, auxin is packaged in vesicles that fuse with the cell membrane to unload their cargo into the cleft between cells. František Baluška, a botanist at Rheinische Friedrich-Wilhelms-University Bonn in Germany, has gone so far as to call this arrangement a plant synapse.

But Van Volkenburgh thinks that's reaching a bit. At synapses, an electrical signal triggers release of a neurotransmitter, which in turn triggers an electrical response in the neighboring cell. But that chain of events has not yet been demonstrated for auxin, she said. "I would love to see those questions answered."

The mix of plants, behavior, and neurobiology is a strange cocktail. Even Darwin sounds slightly less than sober when he conjures up the image of tiny brains in the tips of growing roots. All the same, as Darwin realized, and as Van Volkenburgh and others have argued more recently, plants are far more responsive and dynamic than most animal-centric researchers give them credit for. That they do what they do without a nervous system makes them all the more fascinating.

—Greg Miller

Credits (top to bottom): Noah Elhardt, Wikimedia Commons; Sten Porse, Wikimedia Commons; Wikimedia Commons

August 12, 2009

On the Origin of Genomes

We live in genome-centric times. Already high-throughput sequencing machines have unraveled genetic blueprints for thousands of organisms, as well as viruses and organelles, and cheaper, faster technology promises thousands more in the near future. But where did the first genomes come from—and how? What rules govern how they function and what they look like? And why do they vary so much in size from organism to organism?

In a special theme issue of the Philosophical Transactions of the Royal Society B, two dozen experts discuss genome evolution—starting from deep beginnings in the presumed RNA world, through rampant genome innovation via lateral gene transfer across species, to the divergent histories of particular genes and gene families. They touch upon an emerging concept of the supergenome, which describes the set of all genes in the local environment that a prokaryote has potential access to because of lateral-transfer processes. Based on genomes of primitive organisms, researchers are piecing together the genetic tool kit of the earliest bilateral animals. And one paper argues that viruses, being vast sources of genetic material, cannot be dismissed as nonliving material.

—Elizabeth Pennisi

Animals owe their survival to plants and other photosynthetic organisms. But according to a study published last week in Nature, photosynthesizers gave early animals another big assist, unleashing the Cambrian explosion, the big bang of animal evolution, 540 million years ago. The paper asserts that late in the Precambrian—the time before the Cambrian period—plants and other photosynthesizers settled the land, triggering a rise in atmospheric oxygen that shifted animal evolution into overdrive.

The Cambrian explosion was a burst of evolutionary creativity during which most of the modern groups of animals, such as arthropods, made their debuts. Anomalocaris (left) was one of the animals that swam the seas at the time. Scientists have ascribed the period's rapid diversification to everything from continental drift to the end of a global deep freeze.

Geochemist L. Paul Knauth of Arizona State University, Tempe, and geologist Martin Kennedy of the University of California, Riverside, posited a botanical explanation after analyzing the abundance of two rare isotopes, oxygen-18 and carbon-13. During the Precambrian, the key events took place in shallow seas, where limestone was forming. Sea level ups and downs mean that runoff from land sometimes infiltrates nascent rocks and changes their isotope balance—fresh water tends to be richer in carbon-12 derived from organic matter and lower in oxygen-18.

To check for signs of this freshwater infusion, Knauth and Kennedy pored over all published records of carbon and oxygen isotope measurements, plotting the relative ratios of carbon-13 and oxygen-18 against each other. They found that values for rocks from late in the Precambrian clustered with those for more recent rocks, which researchers know have been invaded by runoff laden with organic matter. That correspondence suggests that the land was lush during the late Precambrian. What Knauth and Kennedy term the “Precambrian greening” probably was under way by about 850 million years ago, the paper suggests. The identity of the early landlubbers is still uncertain, the most likely candidates being protists and small plants such as mosses and liverworts.
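The cross-plot at the heart of this analysis can be illustrated with a toy calculation: if limestones were altered by organic-rich fresh water, their carbon-13 and oxygen-18 values should co-vary, with altered samples trending together toward the depleted end of both axes. A minimal sketch of that covariation check, using invented isotope values rather than the published measurements:

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation between paired isotope measurements."""
    mx, my = mean(xs), mean(ys)
    # Sample covariance (n - 1 denominator, matching statistics.stdev)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical delta-13C and delta-18O values (per mil) for a set of samples;
# freshwater-altered rocks would slide toward low values on both axes together.
d13c = [-2.1, 0.5, 1.8, -4.0, 2.2, -1.0]
d18o = [-6.5, -4.0, -3.1, -8.2, -2.8, -5.5]
print(round(pearson_r(d13c, d18o), 3))
```

A correlation near +1 on such a plot is the kind of covariation Knauth and Kennedy read as a freshwater overprint; the actual study compares how values cluster across time periods rather than computing a single coefficient, and the numbers above are purely illustrative.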

That leaves one step: explaining how the colonization of land by photosynthetic organisms started the diversification of marine animals that occurred between about 540 million and 490 million years ago. As Knauth and Kennedy see it, terrestrial greening would hasten erosion and further enhance growth of photosynthesizers, leading to an increase in the amount of carbon buried in sediments on land and in the sea. In turn, this decline in readily available carbon would allow atmospheric levels of oxygen to surge, permitting the evolution of larger, more complex animals.

—Mitch Leslie

Credit: Arthur Weasley, Wikimedia Commons

August 7, 2009

On the Origin of Sleep

From an evolutionary perspective, sleep might seem like a dangerous waste of time. When animals slumber, they're less responsive to their surroundings and potentially more vulnerable to predators. And of course it's hard to reproduce when you're fast asleep. Many researchers assume that sleep must offer some adaptive advantage that more than makes up for its downsides. But what could that benefit be? Despite a great deal of research, there's no consensus.

That's because it's the wrong question to ask, according to Jerome Siegel, a neuroscientist at the University of California, Los Angeles. In an editorial published online this week in Nature Reviews Neuroscience, Siegel argues that sleep researchers have gotten off track in hunting for a universal benefit to sleep. In Siegel's view, the sheer diversity of sleep behavior, physiology, and neurochemistry argues against the idea that sleep confers the same benefit on all creatures.

A few examples:

*The big brown bat, one of the sleepiest animals known, dozes more than 20 hours a day. Giraffes sleep only about 4 hours.

*Migrating birds and newborn killer whales appear to forgo sleep for several weeks without having to make up for it later.

*REM sleep has been found in all terrestrial mammals tested (but not in dolphins and other cetaceans) and in birds (but not in reptiles, fish, or amphibians).

*Humans secrete more growth hormone during slow-wave sleep. Rats and dogs secrete growth hormone while they're awake.

*Male humans and rats experience erections during REM sleep. For male armadillos, erections occur only during non-REM sleep.

Instead of searching for a common thread through this confusing and contradictory jumble, Siegel argues, researchers should be thinking more about why different species sleep the way they do. He thinks it comes down to being active when it will do the most good. The brown bat, for example, wakes up at dusk just when the moths and mosquitoes it feeds on are most active. If it woke up any earlier, Siegel explained to me in a phone interview, it would risk attacks from agile raptors with keen daytime vision. If it stayed awake any later, fewer insects would be flying around. "Their awake time is exactly appropriate for their needs," Siegel says.

He thinks researchers should consider sleep as part of a continuum of inactive states found throughout the plant and animal kingdoms. Such periods of inactivity are adaptive mainly because they optimize the timing of particular behaviors (or other energy expenditures in the case of plants). Like a maple tree dropping its leaves or a bear settling in for a winter hibernation, sleep is just another way of making sure organisms spend their resources wisely, Siegel says. Thus, the reason sleep is so maddeningly diverse is that animals have such diverse ways of making a living. So, if you want to understand why an animal sleeps the way it does, you first have to understand what it does while it's awake. Siegel asserts.

"This strikes me as perhaps the most sensible thing I've seen written in a long time on the topic of what sleep is all about," says Ralph Greenspan, a neurogeneticist at the Neurosciences Institute in San Diego, California. The paper suggests to Greenspan that sleep arose first in some simple form and then gradually became more complex as more complex nervous systems evolved. His lab is just getting started on a project to investigate this idea more directly by looking for sleep-or something like it-in a handful of primitive organisms including jellyfish, which have a simple nervous system; Trichoplax, a simple marine creature with no nervous system; and single-celled Paramecium. They'll also examine several genes whose expression has been found to rise or fall during sleep in fruit flies and mice. "The main question is whether sleep requires a complex brain or whether sleep requires a brain at all," Greenspan says. Either way, the findings should provide interesting fodder for the discussion of what sleep is and how it evolved.

—Greg Miller

Puma: Ltshears - Trisha M Shears (wikimedia)

Moon Jelly:  Dante Alighieri

Toadstools, people, plants, and amoebae have strikingly similar cells. All these organisms keep their DNA coiled up in a nucleus. Their genes are interspersed with chunks of DNA that cells have to edit out to make proteins. Those proteins are shuttled through a maze of membranes before they can float out into the cell. And these cells all manufacture fuel in compartments called mitochondria.

All species with this arrangement are known as eukaryotes. The word is Greek for “true kernel,” referring to the nucleus. All other living things that lack a nucleus and mitochondria are known as prokaryotes. “It’s the deepest divide in the living world,” says William Martin of the University of Düsseldorf in Germany.

In this month’s Origins essay, Carl Zimmer looks at the evolution of the eukaryotic cell, one of the most important transitions in the history of life. Indeed, when you look at the natural world, most of what you see are these “true kernel” organisms.

Much of what we have learned about eukaryotes comes from studying their cell biology and their genomes. Through these efforts, researchers have made tremendous advances in the past 20 years in understanding that eukaryotes represent the merging of primitive microbes from both the archaeal and the bacterial worlds.

In addition to the essay, Zimmer talks about eukaryotes in a podcast.

—Elizabeth Pennisi

Image: Katharine Sutliff

But what about gorillas? That's what some may be wondering after they stop chuckling at The Daily Show's in-depth analysis of the academic battle over whether chimpanzees or orangutans are human's closest relative.


[Embedded video: The Daily Show with Jon Stewart, “Human's Closest Relative”]