
Friday, June 24, 2011

Biologists discover how yeast cells reverse aging

The gene they found can double yeast lifespan when turned on late in life.

A whole yeast (Saccharomyces cerevisiae) cell viewed by X-ray microscopy. Inside, the nucleus and a large vacuole (red) are visible. 

Human cells have a finite lifespan: They can only divide a certain number of times before they die. However, that lifespan is reset when reproductive cells are formed, which is why the children of a 20-year-old man have the same life expectancy as those of an 80-year-old man.

How that resetting occurs in human cells is not known, but MIT biologists have now found a gene that appears to control this process in yeast. Furthermore, by turning on that gene in aged yeast cells, they were able to double their usual lifespan.

If the human cell lifespan is controlled in a similar way, it could offer a new approach to rejuvenating human cells or creating pluripotent stem cells, says Angelika Amon, professor of biology and senior author of a paper describing the work in the June 24 issue of the journal Science.

“If we can identify which genes reverse aging, we can start engineering ways to express them in normal cells,” says Amon, who is also a member of the David H. Koch Institute for Integrative Cancer Research. Lead author of the paper is Koch Institute postdoc Elçin Ünal.

Rejuvenation

Scientists already knew that aged yeast cells look different from younger cells. (Yeast have a normal lifespan of about 30 cell divisions.) Those age-related changes include accumulation of extra pieces of DNA, clumping of cellular proteins and abnormal structures of the nucleolus (a cluster of proteins and nucleic acids in the cell nucleus where ribosomes, the cell's protein-making machinery, are assembled).

However, they weren’t sure which of these physical markers were actually important to the aging process. “Nobody really knows what aging is,” Amon says. “We know all these things happen, but we don’t know what will eventually kill a cell or make it sick.”

When yeast cells reproduce sexually, they undergo a special type of cell division called meiosis, which produces spores. The MIT team found that the signs of cellular aging disappear at the very end of meiosis. “There’s a true rejuvenation going on,” Amon says.

The researchers discovered that a gene called NDT80 is activated at the same time that the rejuvenation occurs. When they turned on this gene in aged cells that were not reproducing, the cells lived twice as long as normal.

“It took an old cell and made it young again,” Amon says.

In aged cells with activated NDT80, the nucleolar damage was the only age-related change that disappeared. That suggests that nucleolar changes are the primary force behind the aging process, Amon says.

The next challenge, says Daniel Gottschling, a member of the Fred Hutchinson Cancer Research Center in Seattle, will be to figure out the cellular mechanisms driving those changes. “Something is going on that we don’t know about,” says Gottschling, who was not involved in this research. “It opens up some new biology, in terms of how lifespan is being reset.”

The protein produced by the NDT80 gene is a transcription factor, meaning that it activates other genes. The MIT researchers are now looking for the genes targeted by NDT80, which likely carry out the rejuvenation process.

Amon and her colleagues are also planning to study NDT80’s effects in the worm C. elegans, and may also investigate the effects of the analogous gene in mice, p63. Humans also have p63, a close relative of the cancer-protective gene p53; the gene is found in the cells that make sperm and eggs.

Source MIT

Humans Guided Evolution of Dog Barks

It’s a question that tends to arise when a neighborhood mutt sees a cat at 3 a.m., or if you live in an apartment above someone who leaves their small, yapping dog alone all day: Why do dogs bark so much?
Perhaps because humans designed them that way.

“The direct or indirect human artificial selection process made the dog bark as we know it,” said Csaba Molnar, formerly an ethologist at Hungary’s Eotvos Lorand University.
Molnar’s work was inspired by a simple but intriguing fact: Barking is common in domesticated dogs, but infrequent if not downright absent in their wild counterparts. Wild dogs yip and squeal and whine, but rarely produce the repetitive acoustic percussion that is barking. Many people had made that observation, but Molnar and his colleagues were the first to rigorously investigate it.

Because anatomical differences between wild and domestic dogs don’t explain the barking gap, Molnar hypothesized a link to their one great difference: Domesticated dogs have spent the last 50,000 years in human company, being intensively bred to fit our requirements.
Evolution over such a relatively short time is difficult to pin down, but Molnar reasoned that if his hypothesis were correct, two facts would need to be true: Barks should contain information about dogs’ internal states or external environment, and humans should be able to interpret them.
To people who know dogs well, this might seem self-evident. But not every intuition is true. As Molnar’s research would show, sheepherders — people understandably confident in their ability to recognize their own dogs’ voices — actually couldn’t distinguish their dogs’ barks from those of other dogs.

Molnar tested his propositions in a series of experiments described in various journal papers between 2005 and 2010. The most high-profile, published in 2008 in the journal Animal Cognition, described using a computer program to classify dog barks.
At the time, many journalists — including this one — glibly interpreted the study as a halting step towards dog-to-human translation, but its significance was deeper. Molnar’s statistical algorithm showed that dog barks displayed common patterns of acoustic structure. In terms of pitch and repetition and harmonics, one dog’s alarm bark fundamentally resembled another dog’s alarm bark, and so on.
Intriguingly, the algorithm showed the most between-individual variation in barks made by dogs at play. According to Molnar, this is a hint of human pressure at work. People traditionally needed to identify alarm sounds quickly, but sounds of play were relatively unimportant.
By recording barks in various situations — confronting a stranger, at play, and so on — and playing them back to humans, Molnar’s group then showed that people could reliably identify the context in which barks were made. In short, we understand them.
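Molnar's actual software is not reproduced here, but the general approach he describes (extract a few acoustic features per bark, then train a supervised classifier to predict the recording context) can be sketched. In the toy Python example below, the feature names, the synthetic values and the choice of a random-forest classifier are illustrative assumptions, not the study's method.

```python
# Minimal sketch (not Molnar's actual software): classifying bark context from
# simple acoustic features. Feature values are synthetic and hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def synth_barks(n, pitch_hz, interval_s, harmonicity, label):
    """Generate n fake barks scattered around context-typical feature means."""
    feats = np.column_stack([
        rng.normal(pitch_hz, 60, n),       # fundamental frequency (Hz)
        rng.normal(interval_s, 0.05, n),   # gap between successive barks (s)
        rng.normal(harmonicity, 0.08, n),  # harmonic-to-noise ratio (0-1)
    ])
    return feats, [label] * n

# Hypothetical context-typical values: alarm barks lower-pitched, rapid, noisy;
# play barks higher-pitched, slower, more tonal.
X1, y1 = synth_barks(200, 700, 0.15, 0.3, "stranger/alarm")
X2, y2 = synth_barks(200, 1000, 0.40, 0.6, "play")
X, y = np.vstack([X1, X2]), y1 + y2

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated context-recognition accuracy: {scores.mean():.2f}")
```

On synthetic data this cleanly separated, the cross-validated accuracy is near-perfect; real barks are far noisier, which is what makes the reported result interesting.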

The findings support Molnar’s original hypothesis, though more work is needed. Molnar started to cross-reference a phylogenetic tree of dog breeds with their barking habits, looking for an evolutionary trajectory, but never finished. His student days ended with his thesis, and, unable to secure more funding, he is now a science journalist.
According to Eugene Morton, a zoologist and animal communication expert at the National Zoo, Molnar’s ideas are quite plausible. Morton noted that barking is a very useful type of sound, simple and capable of carrying over long distances. However, it could have been a side effect of humans favoring other, domestication-friendly traits in the wolves from which modern dogs descended.
“Barks are used by juvenile wolves, by pups. It’s neotenic — something derived from a juvenile stage, and kept in adults. That’s probably what we selected for,” said Morton. “We don’t want dogs who are dominant over us. The bark might go along with that breeding for juvenile behavior. Or it could have come with something else we selected, such as a lack of aggression.”

Molnar’s research is now a fascinating footnote waiting to be pushed forward by other researchers. In addition to that phylogenetic tree of barking, Molnar would like to see analyses of relationships between breeds’ bark characteristics and their traditional roles. If, as with the deep, frightening rumble of guard mastiffs, breeds’ barks tend to fit their jobs, it would further support the notion of human-guided bark evolution.
The ultimate evidence, said Molnar, would be if human knowledge of bark structure could be used to synthesize barks. “If these barks, played to dogs and humans, had the same effects, it would be awesome,” he said.

Source Wired

Thursday, June 23, 2011

Lab yeast make evolutionary leap to multicellularity

In just a few weeks single-celled yeast have evolved into a multicellular organism, complete with division of labour between cells. This suggests that the evolutionary leap to multicellularity may be a surprisingly small hurdle.

 One giant leap for yeastkind 

Multicellularity has evolved at least 20 times since life began, but the last time was about 200 million years ago, leaving few clues to the precise sequence of events. To understand the process better, William Ratcliff and colleagues at the University of Minnesota in St Paul set out to evolve multicellularity in a common unicellular lab organism, brewer's yeast.

Their approach was simple: they grew the yeast in a liquid and once each day gently centrifuged each culture, inoculating the next batch with the yeast that settled out on the bottom of each tube. Just as large sand particles settle faster than tiny silt, groups of cells settle faster than single ones, so the team effectively selected for yeast that clumped together.
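The logic of this daily selection regime is easy to caricature. The toy simulation below is not the actual experiment (parameter values are invented for illustration); it simply shows how transferring whatever settles fastest favours variants whose cells stay attached after division.

```python
# Toy model of settling selection: larger clumps are more likely to be
# transferred each day, so attachment-promoting variants spread.
import random

POP, DAYS, MUT_RATE = 1000, 60, 0.02

# Each individual is summarised by one number: the size of the clump it forms
# after division. Everyone starts out unicellular.
population = [1] * POP

for day in range(1, DAYS + 1):
    # Daily transfer: the chance of reaching the bottom of the tube grows with
    # clump size (larger "snowflakes" sink faster).
    settled = [c for c in population if random.random() < min(1.0, 0.05 * c)]
    if not settled:                      # keep the culture alive
        settled = random.sample(population, 10)
    # Regrow the next day's culture from the settled founders, with occasional
    # mutations that change how strongly cells stay attached after division.
    population = []
    while len(population) < POP:
        c = random.choice(settled)
        if random.random() < MUT_RATE:
            c = max(1, c + random.choice([-1, 1]))
        population.append(c)
    if day % 15 == 0:
        print(f"day {day:2d}: mean clump size = {sum(population) / POP:.2f}")
```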

Sure enough, within 60 days - about 350 generations - every one of their 10 culture lines had evolved a clumped, "snowflake" form. Crucially, the snowflakes formed not from unrelated cells banding together but from cells that remained connected to one another after division, so that all the cells in a snowflake were genetically identical relatives. This relatedness provides the conditions necessary for individual cells to cooperate for the good of the whole snowflake.

"The key step in the evolution of multicellularity is a shift in the level of selection from unicells to groups. Once that occurs, you can consider the clumps to be primitive multicellular organisms," says Ratcliff.
In some ways, the snowflakes do behave as if they are multicellular. They grow bigger by cell division and, when the snowflakes reach a certain size, a portion breaks off to form a daughter. This "life cycle" is much like the juvenile and adult stages of many multicellular organisms.

After a few hundred further generations of selection, the snowflakes also began to show a rudimentary division of labour. As the snowflakes reach their "adult" size, some cells undergo programmed cell death, providing weak points where daughters can break off. This lets the snowflakes make more offspring while leaving the parent large enough to sink quickly to the base of the tube, ensuring its survival. Snowflake lineages exposed to different evolutionary pressures evolved different levels of cell death. Since it is rarely to the advantage of an individual cell to die, this is a clear case of cooperation for the good of the larger organism. This is a key sign that the snowflakes are evolving as a unit, Ratcliff reported last week at a meeting of the Society for the Study of Evolution in Norman, Oklahoma.

Other researchers familiar with the work were generally enthusiastic. "It really seemed to me to have the elements of the unfolding in real time of a major transition," says Ben Kerr, an evolutionary biologist at the University of Washington in Seattle. "The fact that it happened so quickly was really exciting."
Sceptics, however, point out that many yeast strains naturally form colonies, and that their ancestors were multicellular tens or hundreds of millions of years ago. As a result, they may have retained some evolved mechanisms for cell adhesion and programmed cell death, effectively stacking the deck in favour of Ratcliff's experiment.

"I bet that yeast, having once been multicellular, never lost it completely," says Neil Blackstone, an evolutionary biologist at Northern Illinois University in DeKalb. "I don't think if you took something that had never been multicellular you would get it so quickly."
Even so, much of evolution proceeds by co-opting existing traits for new uses - and that's exactly what Ratcliff's yeast do. "I wouldn't expect these things to all pop up de novo, but for the cell to have many of the elements already present for other reasons," says Kerr.

Ratcliff and his colleagues are planning to address that objection head-on, by doing similar experiments with Chlamydomonas, a single-celled alga that has no multicellular ancestors. They are also continuing their yeast experiments to see whether further division of labour will evolve within the snowflakes. Both approaches offer an unprecedented opportunity to bring experimental rigour to the study of one of the most important leaps in our distant evolutionary past.

Source New Scientist

Wednesday, June 22, 2011

Researchers identify components of speech recognition pathway in humans

Finding suggests human speech perception evolved from sound-processing pathways present in other animals.


Washington, D.C. — Neuroscientists at Georgetown University Medical Center (GUMC) have defined, for the first time, three different processing stages that a human brain needs to identify sounds such as speech — and discovered that they are the same as ones identified in non-human primates.

In the June 22 issue of the Journal of Neuroscience, the researchers say their discovery — made possible with the help of 13 human volunteers who spent time in a functional MRI machine — could potentially offer important insights into what can go wrong when someone has difficulty speaking (which involves hearing the sounds generated by one's own voice) or understanding the speech of others.

But more than that, the findings help shed light on the complex, and extraordinarily elegant, workings of the "auditory" human brain, says Josef Rauschecker, PhD, a professor in the departments of physiology/ biophysics and neuroscience and a member of the Georgetown Institute for Cognitive and Computational Sciences at GUMC.

"This is the first time we have been able to identify three discrete brain areas that help people recognize and understand the sounds they are hearing," says Rauschecker. "These sounds, such as speech, are vitally important to humans, and it is critical that we understand how they are processed in the human brain."
Rauschecker and his colleagues at Georgetown have been instrumental in building a unified theory about how the human brain processes speech and language. They have shown that both human and non-human primates process speech along two parallel pathways, each of which runs from lower- to higher-functioning neural regions.

These pathways are dubbed the "what" and "where" streams and are roughly analogous to how the brain processes sight, but in different regions. The "where" stream localizes sound and the "what" pathway identifies the sound.
Both pathways begin with the processing of signals in the auditory cortex, located inside a deep fissure on the side of the brain underneath the temples - the so-called "temporal lobe." Information processed by the "what" pathway then flows forward along the outside of the temporal lobe, and the job of that pathway is to recognize complex auditory signals, which include communication sounds and their meaning (semantics). The "where" pathway is mostly in the parietal lobe, above the temporal lobe, and it processes spatial aspects of a sound — its location and its motion in space — but is also involved in providing feedback during the act of speaking.

Auditory perception - the processing and interpretation of sound information - is tied to anatomical structures; signals move from lower to higher brain regions, Rauschecker says. "Sound as a whole enters the ear canal and is first broken down into single tone frequencies, then higher-up neurons respond only to more complex sounds, including those used in the recognition of speech, as the neural representation of the sound moves through the various brain regions," he says.

In this study, Rauschecker and his colleagues — computational neuroscientist Maximilian Riesenhuber, Ph.D., and Mark Chevillet, a student in the Interdisciplinary Program in Neuroscience — identified the three distinct areas in the "what" pathway in humans that had been seen in non-human primates. Only two had been recognized in previous human studies.

The first, and lowest-level, is the "core", which analyzes tones at the basic level of simple frequencies. The second area, the "belt", wraps around the core, and integrates several tones, "like buzz sounds," that lie close to each other, Rauschecker says. The third area, the "parabelt," responds to speech sounds such as vowels, which are essentially complex bursts of multiple frequencies.

Rauschecker is fascinated by the fact that although speech and language are considered to be uniquely human abilities, the emerging picture of brain processing of language suggests "in evolution, language must have emerged from neural mechanisms at least partially available in animals," he says. "There appears to be a conservation of certain processing pathways through evolution in humans and nonhuman primates."

Source EurekAlert!

Tuesday, June 21, 2011

Human evolution: the long, winding road to modern man

Professor Chris Stringer tells how conflicting theories and new discoveries have shaped our understanding of humanity's past – and of how narrow the line is between survival and failure.

 Chris Stringer, head of human origins at the Natural History Museum.

Our species' origins have been a source of fascination for millennia and account for the huge range of creation myths that are recorded in different cultures. Linnaeus, that great classifier of living things, gave us our biological name Homo sapiens (meaning "wise man") and our high rounded skulls certainly make us distinctive, as do our small brow ridges and chins. However, we are also remarkable for our language, art and complex technology.

The question is: where did these features evolve? Where can humanity place its homeland? In terms of our earliest ancestors, the answer is generally agreed to be Africa. It was here that our first ape-like ancestors began to make their homes on the savannah. However, a fierce debate has continued about whether it was also the ultimate birthplace of our own species.
Forty years ago, no one believed that modern humans could have originated in Africa. In some cases this resistance was rooted in fading racist agendas. For example, in 1962, the American anthropologist Carleton Coon claimed that "If Africa was the cradle of mankind, it was only an indifferent kindergarten. Europe and Asia were our principal schools."

Part of the confusion was due to the lack of well-dated fossil and archaeological evidence. In the intervening years, however, I have been privileged to be involved in helping to accumulate data – fossil, chronological, archaeological and genetic – that show our species did have a recent African origin. But as the latest evidence shows, this origin was complex and in my new book, The Origin of Our Species, I try to make it clear what it means to be human and change perceptions about our origins.

I had been fascinated by ancient humans called Neanderthals even as a 10-year-old, and in 1971, as a 23-year-old student, I left London on a four-month research trip to museums and institutes in 10 European countries to gather data on the shapes of skulls of Neanderthals and of their modern-looking successors in Europe, the Cro-Magnons. My purpose was to test the then popular theory which held that Neanderthals and people like them in each region of the ancient world were the ancestors of people in those same regions today. I had only a modest grant, and so I drove my old car, sleeping in it, camping or staying in youth hostels – in Belgium I even spent one night in a shelter for the homeless. I survived border confrontations and two robberies, but by the end of my 5,000-mile trip I had collected one of the largest data sets of Neanderthal and early modern skull measurements assembled up to that time.

Over the next three years I added data on other ancient and modern samples, and the results were clear: Neanderthals had evolved their own special characteristics, and did not look like ancestors for the Cro-Magnons or for any modern people. The issue was: where had our species evolved? In 1974 I was unable to say, but taking up a research post at the Natural History Museum meant I could continue the quest.
My research uncovered clues, however, and over the next decade my work – along with that of a few others – focused on Africa as the most likely homeland of our species. We remained an isolated minority until 1987, when the paper "Mitochondrial DNA and Human Evolution" was published by Rebecca Cann, Mark Stoneking and Allan Wilson. It put modern human origins on the front pages of newspapers all over the world for the first time, for it showed that a tiny and peculiar part of our genome, inherited only through mothers and daughters, derived from an African ancestor about 200,000 years ago. This woman became known as Mitochondrial Eve. A furore followed, as anthropologists rowed over the implications for human evolution.

After that, the "out of Africa" theory – or as I prefer to call it "the recent African origin" model for our origins – really took off. My version depicted the following background. The ancient species Homo erectus survived in East Asia and Indonesia but evolved into Homo heidelbergensis in Europe and Africa. (This last species had been named from a 600,000-year-old jawbone found in Germany in 1907.) Then, about 400,000 years ago, H. heidelbergensis underwent an evolutionary split: north of the Mediterranean it developed into the Neanderthals, while to the south, in Africa, it became us, modern humans. Finally, about 60,000 years ago Homo sapiens began to leave Africa and by 40,000 years ago, with the advantages of more complex tools and behaviours, spread into Asia and Europe, where we replaced the Neanderthals and all the other archaic people outside of Africa. In other words, under our skins, we are all Africans.

Not every scientist agreed, however. One group continued to support the idea of multiregional evolution, an updated version of ideas from the 1930s. It envisaged deep parallel lines of evolution in each inhabited region of Africa, Europe, Asia and Australasia, stretching from local variants of H. erectus right through to living people in the same areas today. These lines did not diverge through time, since they were glued together by interbreeding across the ancient world, so modern features could gradually evolve, spread and accumulate, alongside long-term regional differences in things like the shape of the face and the size of the nose.
A different model, known as the assimilation model, took the new fossil and genetic data on board and gave Africa a key role in the evolution of modern features. However, this model envisaged a much more gradual spread of those features from Africa than did mine. Neanderthals and archaic people like them were assimilated through widespread interbreeding. Thus the evolutionary establishment of modern features was a blending process rather than a rapid replacement.

So who was right? Genetic data continued to accumulate through the 1990s in support of the recent African origin model, both from recent human populations and Neanderthal fossils. Recent massive improvements in recovery and analysis of ancient DNA have produced even more information, some of it very surprising. Fossil fragments from Croatia have yielded up a nearly complete Neanderthal genome, providing rich data that promise insights into their biology – from eye colour and hair type through to skull shape and brain functions.

These latest results have largely confirmed a separation from our lineage about 350,000 years ago. But when the new Neanderthal genome was compared in detail with modern humans from different continents, the results produced an intriguing twist to our evolutionary story: the genomes of people from Europe, China and New Guinea lay slightly closer to the Neanderthal sequence than did those of Africans. Thus if you are European, Asian or New Guinean, you could have 2.5% of Neanderthal DNA in your genetic make-up.
The most likely explanation for this discovery is that the ancestors of today's Europeans, Asians and New Guineans interbred with Neanderthals (or at least with a population that had a component of Neanderthal genes) in North Africa, Arabia or the Middle East, as they exited Africa about 60,000 years ago. That ancient human exodus may have involved only a few thousand people, so it would have taken the absorption of only a few Neanderthals into a group of H. sapiens for the genetic effect – greatly magnified as modern human numbers exploded – to be felt tens of thousands of years later.

The breakthrough in reconstructing a Neanderthal genome has been mirrored across Asia in equally remarkable work on the human group that has become known as the "Denisovans". A fossil finger bone, about 40,000 years old, found in Denisova Cave, Siberia, together with a huge molar tooth, could not be assigned to a particular human species, though it has also had much of its genome reconstructed. This has revealed a previously unrecognised Asian offshoot of the Neanderthal line, but again with a twist. These Denisovans are also related to one group of living humans – the Melanesians of southeast Asia (and probably their Australian neighbours too). These groups also carry about 5% of Denisovan DNA from another interbreeding event that must have happened as their ancestors passed through southern Asia over 40,000 years ago.

So where does this added complexity and evidence of interbreeding with Neanderthals and Denisovans leave my favoured Recent African Origins model? Has it been disproved in favour of the multiregional model, as some have claimed? I don't think so. As we have seen, back in 1970, no scientists held the view that Africa was the evolutionary home of modern humans; the region was considered backward and largely irrelevant, with the pendulum of scientific opinion strongly swinging towards non-African and Neanderthal ancestry models. Twenty years later, the pendulum was starting to move in favour of our African origins, as fossil evidence began to be reinforced by the clear signals of mitochondrial DNA. The pendulum swung even further with growing fossil, archaeological and genetic data in the 1990s.

Now, the advent of huge amounts of DNA data, including the Neanderthal and Denisovan genomes, has halted and even reversed that pendulum swing, away from absolute replacement. Instead we are looking at a mixed replacement-hybridisation or "leaky replacement" model. This dynamism is what makes studying human evolution so fascinating. Science is not about being right or wrong, but about gradually approaching truth about the natural world.

The big picture is that we are still predominantly of recent African origin (more than 90% of our genetic ancestry). But is there a special reason for this observation? Overall, the pre-eminence of Africa in the story of our origins does not involve a special evolutionary pathway but is a question of the continent's consistently large habitable areas which gave greater opportunities for morphological and behavioural variations, and for genetic and behavioural innovations to develop and be conserved. "Modernity" was not a package that had an origin in one African time, place and population, but was a composite whose elements appeared at different times and places, and then gradually coalesced to assume the form we recognise today.

My studies have led me to a greater recognition in recent human evolution of the forces of demography (the need for large populations and social networks to make progress), drift and contingency (chance events), and cultural rather than natural selection than I had considered before. It seems that cultural "progress" was a stop-start affair for much of our evolution, until human groups were large, had long-lived individuals, and wide social networks, all helping to maximise the chances that innovations would survive and accumulate.

Linnaeus said of Homo sapiens "know thyself". Knowing ourselves means a recognition that becoming modern is the path we perceive when looking back on our own evolutionary history. That history seems special to us, of course, because we owe our very existence to it. Those figures of human species (usually males, who become increasingly hairless and light-skinned) marching boldly across the page have illustrated our evolution in many popular articles, but they have wrongly enshrined the view that evolution was simply a progression leading to us, its pinnacle and final achievement.

Nothing could be further from the truth. There were plenty of other paths that could have been taken; many would have led to no humans at all, others to extinction, and yet others to a different version of "modernity". We can inhabit only one version of being human – the only version that survives today – but what is fascinating is that palaeoanthropology shows us those other paths to becoming human, their successes and their eventual demise, whether through failure or just sheer bad luck.

Sometimes the difference between failure and success in evolution is a narrow one. We are certainly on a knife-edge now, as we confront an overpopulated planet and the prospect of global climate change on a scale that humans have never faced before. Let's hope our species is up to the challenge.
Professor Chris Stringer is the research leader in human origins at the Natural History Museum, London

Source The Guardian

Thursday, June 16, 2011

Breeding with Neanderthals helped humans go global

WHEN the first modern humans left Africa they were ill-equipped to cope with unfamiliar diseases. But by interbreeding with the local hominins, it seems they picked up genes that protected them and helped them eventually spread across the planet.

The publication of the Neanderthal genome last year offered proof that Homo sapiens bred with Neanderthals after leaving Africa. There is also evidence that suggests they enjoyed intimate relations with other hominins including the Denisovans, a species identified last year from a Siberian fossil.
But what wasn't known is whether the interbreeding made any difference to their evolution. To find out Peter Parham of Stanford University in California took a closer look at the genes they picked up along the way.
He focused on human leukocyte antigens (HLAs), a family of about 200 genes that is essential to our immune system. It also contains some of the most variable human genes: hundreds of versions - or alleles - exist of each gene in the population, allowing our bodies to react to a huge number of disease-causing agents and adapt to new ones.

The humans that left Africa probably carried only a limited number of HLA alleles as they likely travelled in small groups. Worse, their HLAs would have been adapted to African diseases.
When Parham compared the HLA genes of people from different regions of the world with the Neanderthal and Denisovan HLAs, he found evidence that non-African humans picked up new alleles from the hominins they interbred with.

One allele, HLA-C*0702, is common in modern Europeans and Asians but never seen in Africans; Parham found it in the Neanderthal genome, suggesting it made its way into H. sapiens of non-African descent through interbreeding. HLA-A*11 had a similar story: it is mostly found in Asians and never in Africans, and Parham found it in the Denisovan genome, again suggesting its source was interbreeding outside of Africa.
Parham points out that because Neanderthals and Denisovans had lived outside Africa for over 200,000 years by the time they encountered H. sapiens, their HLAs would have been well suited to local diseases, helping to protect migrating H. sapiens too.

While only 6 per cent of the non-African modern human genome comes from other hominins, the share of HLAs acquired during interbreeding is much higher. Half of European HLA-A alleles come from other hominins, says Parham, and that figure rises to 72 per cent for people in China, and over 90 per cent for those in Papua New Guinea.

This suggests they were increasingly selected for as H. sapiens moved east. That could be because humans migrating north would have faced fewer diseases than those heading towards the tropics of south-east Asia, says Chris Stringer of the Natural History Museum in London.
Parham presented his work at a Royal Society discussion meeting on human evolution in London last week.

 Source New Scientist

Tuesday, June 14, 2011

Building a dinosaur from a chicken

Can scientists convince birds to evolve backward ... into dinosaurs?

 Archaeopteryx lithographica at the Museum für Naturkunde in Berlin, Germany. (This is the original fossil -- not a cast.)

One of the most controversial topics in science over many decades has been the debate over the origin of birds: did they evolve from dinosaurs or from other reptiles? The debate quieted down for a while until the discovery of an important new fossil in the nineteenth century. This fossil, known today as the Berlin specimen of Archaeopteryx (pictured above), led to fresh insights, reigniting the debate. Today, it is fairly well accepted by the scientific community that birds are a special lineage of theropod dinosaurs.


When you look closely at the above fossil, you can see similarities as well as clear morphological differences between Archaeopteryx and, say, a chicken. Archaeopteryx has clawed fingers on its wings, it has a long bony tail instead of a short bony nubbin and, if you look closely, you can also see that it has teeth -- all features that modern birds lack.

But ornithologists and birders are familiar with one peculiar South American bird, the hoatzin, Opisthocomus hoazin, whose chicks possess claws on two of their wing digits -- almost like Archaeopteryx! But hoatzins aren't unique: curious traits, traits that had been lost during evolution, sometimes pop up in domestic livestock and even in humans -- chickens with teeth, horses with extra toes and humans with tails, for example. These features, known as atavisms, result from errors in gene regulation: genes are either "turned on" (expressed) or "turned off" (suppressed) at the incorrect times during development. Atavistic traits are reminders of the evolutionary past.

Knowing this, renowned paleontologist Jack Horner has spent much of his career trying to turn back the evolutionary clock by reconstructing a dinosaur. He's found dinosaur fossils with extraordinarily well-preserved blood vessels and soft tissues, but never intact DNA. So instead of using the Jurassic Park method to recreate dinosaurs, he's taking a different approach. Mr Horner is taking a living descendant of the dinosaurs -- the chicken -- and genetically engineering it to reactivate ancestral traits -- including teeth, a tail, and even hands. He's making a "chickenosaurus". In this fascinating video, Mr Horner reviews recent dinosaur discoveries and talks about his plans for recreating a "chickenosaurus".


Jack Horner studied geology and zoology but did not complete his bachelor's degree due to his inability to pass the required foreign language courses (he is somewhat dyslexic and could not read adequately in German). However, he did complete his senior thesis on the fauna of the Bear Gulch Limestone in Montana, which is one of the most famous Mississippian fossil sites in the world. He currently is Curator of Paleontology at the Museum of the Rockies and also serves in a number of academic capacities. In recognition of his achievements and contributions to the field of paleontology, he was awarded an Honorary Doctorate of Science in 1986 by the University of Montana and in 2006 by the Pennsylvania State University. In 1986, he was also awarded the prestigious MacArthur Fellowship. Mr Horner further discusses his plans to reconstruct a "chickenosaurus" in his 2009 book, How to Build a Dinosaur: Extinction Doesn't Have to Be Forever [Amazon UK; Amazon US].


Source The Guardian

Sunday, June 12, 2011

Life-history traits may affect DNA mutation rates in males more than in females

For the first time, scientists have used large-scale DNA sequencing data to investigate a long-standing evolutionary assumption: DNA mutation rates are influenced by a set of species-specific life-history traits.

These traits include metabolic rate and the interval of time between an individual's birth and the birth of its offspring, known as generation time. The team of researchers led by Kateryna Makova, a Penn State University associate professor of biology, and first author Melissa Wilson Sayres, a graduate student, used whole-genome sequence data to test life-history hypotheses for 32 mammalian species, including humans. For each species, they studied the mutation rate, estimated by the rate of substitutions in neutrally evolving DNA segments -- chunks of genetic material that are not subject to natural selection. They then correlated their estimations with several indicators of life history. The results of the research will be published in the journal Evolution on 13 June 2011.

One of the many implications of this research is that life-history traits of extinct species now could be discoverable. "Correlations between life-history traits and mutation rates for existing species make it possible to develop a hypothesis in reverse for an ancient species for which we have genomic data, but no living individuals to observe as test subjects," Makova explained. "So, if we have information about how extant species' life history affects mutation rates, it becomes possible to make inferences about the life history of a species that has been extinct for even tens of thousands of years, simply by looking at the genomic data."

To find correlations between life history and mutation rates, the scientists first focused on generation time. "The expected relationship between generation time and mutation rate is quite simple and intuitive," Makova said. "The more generations a species has per unit of time, the more chances there are for something to go wrong; that is, for mutations or changes in the DNA sequence to occur." Makova explained that the difference between mice and humans could be used to illustrate how vastly generation time can vary from species to species. On the one hand, mice in the wild usually have their first litter at just six months of age, and thus their generation time is very short. Humans, on the other hand, have offspring when they are at least in their mid-teens or even in their twenties, and thus have a longer generation time. "If we do the math we see that, for mice, every 100 years equates to about 200 generations, whereas for humans, we end up with only five generations every 100 years," Makova said. After comparing 32 mammalian species, her team found that the strongest, most significant life-history indicator of mutation rate was, in fact, the average time between a species member's birth and the birth of its first offspring, accounting for a healthy 40% of mutation-rate variation among species.
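Makova's back-of-the-envelope comparison can be written out directly. The snippet below simply restates the arithmetic from the quote, assuming roughly half a year per wild-mouse generation and about 20 years per human generation.

```python
# Generations per century, using the illustrative generation times from the
# quote: ~0.5 years for wild mice, ~20 years for humans.
generation_time_years = {"mouse": 0.5, "human": 20.0}

for species, g in generation_time_years.items():
    print(f"{species}: {100 / g:.0f} generations per 100 years")

# mouse: 200 generations per 100 years
# human: 5 generations per 100 years
```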

Makova's team also found that generation time affects male mutation bias -- a higher rate of DNA mutation in the male sperm versus the female egg. "Females of a species are born with their entire lifetime supply of oocytes, or egg cells. These cells have to divide only once to become fertilizable," Makova explained. "However, males of a species produce sperm throughout their reproductive life, and, compared with egg cells, sperm cells undergo many more DNA replications -- many more chances for mutations to occur." Previous researchers had demonstrated a higher DNA mutation rate in mammalian males than in mammalian females, a phenomenon called male mutation bias. However, until now, no one had shown that generation time was the main determinant of this phenomenon.

The second life-history trait that Makova's team examined was metabolic rate -- the amount of energy expended by an animal daily -- and how it correlates with genetic mutations. Wilson Sayres explained that some of the team's 32 test species, such as shrews and rodents, fell into the high-metabolism category, while others, such as dolphins and elephants, fell into the low-metabolism category. Previous researchers had hypothesized that the higher the metabolic rate, the greater the number of mutations. "According to this idea, sperm cells should be more affected than egg cells by a higher metabolic rate," Wilson Sayres said. "A sperm cell is very active and constantly moving, and, in addition, its cell membrane is not very dense. But an egg cell basically sits there and does nothing, while being protected by a thicker membrane, much like a coat of armor." Wilson Sayres explained that the combination of high energy and meager protection leaves sperm cells more susceptible to bombardment by free radicals -- atoms or molecules with unpaired electrons -- and that these free radicals can increase mutations. "The hypothesis is that a high metabolism greatly increases this already volatile situation, especially for sperm; so, in our study, we expected stronger male mutation bias in organisms with high metabolic rate," Wilson Sayres said.

Makova's team found that, unlike generation time, metabolic rate appeared to be only a moderate predictor of mutation rates and of male mutation bias. "While this finding was not as significant as the generation-time result, I suspect that further studies may provide stronger evidence that metabolic rate exerts an important influence on mutation rates and male mutation bias," Makova said. She explained that the challenge is to disentangle metabolic rate as a separate factor from generation time. "The two factors strongly correlate with one another, so it's hard to get a clear fix on how metabolism might be acting independently of generation-time intervals."

Third, Makova and her team explored another life-history trait that other researchers had hypothesized might affect mutation rates -- sperm competition. "Sperm competition is just that -- the struggle between the sperm of different males to fertilize egg cells," Wilson Sayres said. "In a species such as the chimpanzee, where females mate with many different males during a given cycle, intense sperm competition results in large testicle size, and thus, high sperm production. But in a harem species such as the gorilla, where each female is basically exclusive to one male, sperm competition is much less relevant, and the result is small testicle size and low sperm production." Makova explained that sperm competition should, in theory, correlate positively with sperm mutation and thus a higher male mutation bias. "The more sperm that are produced, the more cell divisions are needed and the greater the chances are of mistakes during DNA copying, or replication," Makova said.

However, in the case of sperm competition, the results were surprising. "We did not find as strong an association between male mutation bias and sperm competition as other researchers had hypothesized, although we speculate that future studies might yield different results if the data on sperm competition are collected in different ways," Wilson Sayres explained.

Source EurekAlert!

Friday, June 10, 2011

Meteorite holds clues to organic chemistry of the early Earth

Washington, DC— Carbonaceous chondrites are a type of organic-rich meteorite that contain samples of the materials that took part in the creation of our planets nearly 4.6 billion years ago, including materials that were likely formed before our Solar System was created and may have been crucial to the formation of life on Earth. The complex suite of organic materials found in carbonaceous chondrites can vary substantially from meteorite to meteorite. New research from Carnegie's Department of Terrestrial Magnetism and Geophysical Laboratory, published June 10 in Science, shows that most of these variations are the result of hydrothermal activity that took place within a few million years of the formation of the Solar System, when the meteorites were still part of larger parent bodies, likely asteroids.

Organic material in carbonaceous chondrites shares many characteristics with organic matter found in other primitive samples, including interplanetary dust particles, comet 81P/Wild-2, and Antarctic micrometeorites. It has been argued by some that this similarity indicates that organic material throughout the Solar System largely originated from a common source, possibly the interstellar medium.
A test of this common-source hypothesis stems from its requirement that the organic diversity within and among meteorites be due primarily to chemical and thermal processing that took place while the meteorites were parts of their parent bodies. In other words, there should be a relationship between the extent of hydrothermal alteration that a meteorite experienced and the chemistry of the organic material it contains.
If--as many have speculated--the organic material in meteorites had a role to play in the origin of life on Earth, the attraction of the common-source hypothesis is that the same organic material would have been delivered to all bodies in the Solar System. If the common source was the interstellar medium, then similar material would also be delivered to any forming planetary system.

The research team—led by Christopher Herd of the University of Alberta, Canada, and including Carnegie's Conel Alexander, Larry Nittler, Frank Gyngard, George Cody, Marilyn Fogel, and Yoko Kebukawa—studied four meteorite specimens from the shower of stones, produced by the breakup of a meteoroid as it entered the atmosphere, that fell on Tagish Lake in northern Canada in January 2000. The samples are considered very pristine, because they fell on a frozen lake, were collected without hand contact within a few days of landing and have remained frozen ever since.

The samples were processed and analyzed on the microscopic level using a variety of sophisticated techniques. Examination of their inorganic components indicated that the specimens had experienced large differences in the extent of hydrothermal alteration, prompting an in-depth examination of their organic material. The team demonstrated that the insoluble organic matter found in the samples has properties that span nearly the entire range found in all carbonaceous chondrites and that those properties correlate with other measures of the extent of parent body alteration. Their finding confirms that the diversity of this material is due to processing of a common precursor material in the asteroidal parent bodies.

The team found large concentrations of monocarboxylic acids, or MCAs, which are essential to biochemistry, in their Tagish Lake samples. They attributed the high level of these acids to the pristine nature of the samples, which have been preserved below zero degrees Celsius since they were recovered. There was variety in the types of MCAs, which they determined could also be due to alterations that took place on the parent bodies.
The samples also contained amino acids—the essential-for-life organic building blocks used to create proteins. The types and abundances of amino acids contained in the samples are consistent with an extraterrestrial origin, and were clearly also influenced, albeit in a complex way, by the alteration histories of their host meteorites.

"Taken together these results indicate that the chemical and thermal processing common to the Tagish Lake meteorites likely occurred when the samples were part of a larger parent body that was created from the same raw materials that formed our Solar System," said Larry Nittler of Carnegie's DTM. "These samples can also provide important clues to the source of organic material, and life, on Earth."

Source  ScienceAlert!

Earth-bound asteroids carried ever-evolving, life-starting organic compounds

Detailed analysis of the most pristine meteorite ever recovered shows that the composition of the organic compounds it carried changed during the early years of the solar system

(Edmonton) Detailed analysis of the most pristine meteorite ever recovered shows that the composition of the organic compounds it carried changed during the early years of the solar system. Those changed organics were preserved through billions of years in outer space before the meteorite crashed to Earth.
The research team, led by University of Alberta geologist Chris Herd, analyzed samples of a meteorite that landed on Tagish Lake in northern British Columbia in 2000. Variations in the geology of the meteorite samples were visible to the naked eye and indicated the asteroid, from which the meteorite samples originated, had gone through substantial changes.

The researchers began looking for variations in the organic chemistry that corresponded with variations in the meteorite's geology. Herd says they found a surprising correlation, which gave researchers a snapshot of the process that altered the composition of organic material carried by the asteroid. Among the organic compounds studied were amino acids and monocarboxylic acids, two chemicals essential to the evolution of the first, simple life forms on Earth.

Herd says the finding shows the importance of asteroids to Earth's history.
"The mix of prebiotic molecules, so essential to jump-starting life, depended on what was happening out there in the asteroid belt," said Herd. "The geology of an asteroid has an influence on what molecules actually make to the surface of Earth."
Herd says that, when the asteroid was created by the accumulation of dust around the infant sun, it contained ice. The ice warmed and turned to water, which began percolating and altering the organic compounds buried in the rock.

The Tagish Lake meteorite is considered to be one-of-a-kind because of its landing and handling. It was January when the meteorite exploded at an altitude of 30 to 50 kilometres above Earth and rained meteorite fragments down on the frozen, snow-covered lake. The individual who recovered the samples consulted with experts beforehand and avoided any contamination issues.
Herd says the meteorite's pristine state enabled the breakthrough research. "The variations in the organic makeup are true to what was happening inside the asteroid," said Herd. "This is exactly what has been orbiting in the asteroid belt for the last 4.5 billion years."

Source EurekAlert!

Wednesday, June 8, 2011

Can evolution outpace climate change?

Animals and plants may not be able to evolve their way out of the threat posed by climate change, according to a UC Davis study of a tiny seashore animal. The work was published today (June 8) in the journal Proceedings of the Royal Society B.


The copepod Tigriopus californicus showed little evidence it could increase its heat tolerance, reports Eric Sanford, an associate professor of evolution and ecology at UC Davis. (Morgan Kelly/UC Davis photo)

The tide pool copepod Tigriopus californicus is found from Alaska to Baja California — but in a unique lab study, the animals showed little ability to evolve heat tolerance.
“This is a question a lot of scientists have been talking about,” said study co-author Eric Sanford, an associate professor of evolution and ecology at UC Davis and a researcher at the university’s Bodega Marine Laboratory. “Do organisms have the ability to adapt to climate change on a timescale of decades?”
UC Davis graduate student Morgan Kelly, the first author of the paper, collected copepods from eight locations between Oregon and Baja California in Mexico. The tiny shrimplike animals, about a millimeter long, live in tide pools on rocky outcrops high in the splash zone.

Kelly grew the short-lived copepods in the lab for 10 generations, subjecting them to increased heat stress to select for more heat-tolerant animals.
At the outset, copepods from different locations showed wide variability in heat tolerance. But within those populations, Kelly was able to coax only about a half-degree Celsius (about one degree Fahrenheit) of increased heat tolerance over 10 generations. And in most groups, the increase in heat tolerance had hit a plateau before that point.

In the wild, these copepods can withstand a temperature swing of 20 degrees Celsius a day, Kelly said. But they may be living at the edge of their tolerance, she said.
Although the copepods are widespread geographically, individual populations are very isolated, confined to a single rocky outcrop where wave splash can carry them between pools. That means there is very little flow of new genes across the population as a whole.

“It’s been assumed that widespread species have a lot of genetic capacity to work with, but this study shows that may not be so,” said co-author Rick Grosberg, professor of evolution and ecology at UC Davis. Many other species of animals, birds and plants face stress from climate change, and their habitats have also been fragmented by human activity -- perhaps more than we realize, he said.
“The critical point is that many organisms are already at their environmental limits, and natural selection won’t necessarily rescue them,” Grosberg said.

Source UC Davis

Saturday, June 4, 2011

Cryptic Mutations Could Be Evolution’s Hidden Fuel

The transformation of raw genetic material on a laboratory bench has provided a rare empirical demonstration of processes that may be universally crucial to evolution, but are only beginning to be understood.
The processes, called cryptic variation and preadaptation, involve mutations that don’t affect an organism when they first occur and so are initially exempt from the pressures of natural selection. As they accumulate, however, they can later combine to form the basis of complex, unpredictable new traits.
In the new study, the ability of evolving, chemical-crunching molecules called ribozymes to adapt in new environments proved directly related to earlier accumulations of cryptic mutations. The details are esoteric, but their implications involve the very essence of adaptation and evolution.

“It’s one of the more modern topics in evolutionary theory,” said mathematical biologist Joshua Plotkin of the University of Pennsylvania, author of a commentary on the experiment, which was described June 2 in Nature. “The idea has been around for a while, but direct evidence hasn’t been found until recently.”

The experiment was led by evolutionary biologists Eric Hayden and Andreas Wagner of Switzerland’s University of Zurich, who use ribozymes — molecules made from RNA, a single-stranded form of genetic material – to study evolutionary principles in the simplest possible way.
The principles of cryptic variation and preadaptation were first proposed in the mid-20th century and conceptually refined in the mid-1970s. They were logical answers to the question of how complex traits, seemingly far too complex to be explained by one or a few mutations, could arise.

But even as such leading thinkers as Stephen Jay Gould embraced the concept, it proved difficult to study in detail. The tools didn’t exist to interpret genetic data with the necessary rigor. The concept itself was also difficult to grasp, injecting long periods of accumulating, purposeless mutations into an evolutionary narrative supposedly driven by constant selection.
In recent years, however, with the advent of better tools and a growing appreciation for evolution’s sheer complexity, researchers’ attention has turned again to cryptic variation and preadaptation. Computer models and scattered observations in bacteria and yeast hinted at their importance. But definitive proof, combining exhaustive genetic observation with real-world evolution, was elusive.

“Cryptic variation addresses questions of innovation,” said Hayden. “How do new things come about in biology? There’s been a long history of this concept, but no concrete experimental demonstration.”
In the new study, Hayden and Wagner evolved ribozymes in test tubes of chemicals, then moved them to a new chemical substrate, a shift analogous to requiring animals to suddenly subsist on a new food source.
The ribozymes that flourished were those that had accumulated specific sets of cryptic mutations in their former environment. Those variations, seemingly irrelevant before, became the basis of newly useful adaptation. The researchers were able to measure every change in detail.
“It is a groundbreaking proof of principle,” said University of Arizona evolutionary biologist Joanna Masel, who wasn’t involved in the study. “This study is a clear demonstration that cryptic genetic variation can make evolution more effective.”
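
The logic of the experiment can be sketched with a toy simulation. The following Python snippet is a hypothetical illustration, not the ribozyme system or the authors' code: a population accumulates mutations that are neutral in its current environment, the environment then shifts so that one particular combination of those formerly silent mutations becomes beneficial, and the time to adapt is compared with that of a population that was never allowed to build up cryptic variation. The genome length, mutation rate and the chosen pair of sites are all arbitrary.

import random

L = 20            # genome length (bits); arbitrary
N = 200           # population size; arbitrary
MU = 0.01         # per-site mutation probability per generation; arbitrary

def mutate(g):
    return [b ^ 1 if random.random() < MU else b for b in g]

def fitness_env_b(g):
    # In the new environment, a specific two-mutation combination must be
    # present before any benefit appears (a crude stand-in for epistasis).
    return 1.5 if g[3] and g[7] else 1.0

def next_gen(pop, fit):
    weights = [fit(g) for g in pop]
    parents = random.choices(pop, weights=weights, k=N)
    return [mutate(g) for g in parents]

def generations_to_adapt(neutral_burn_in):
    pop = [[0] * L for _ in range(N)]
    # Phase 1: old environment, where every genotype is equally fit, so
    # mutations accumulate silently (cryptic variation).
    for _ in range(neutral_burn_in):
        pop = next_gen(pop, lambda g: 1.0)
    # Phase 2: new environment; count generations until half the
    # population carries the newly useful combination.
    for gen in range(1, 5001):
        pop = next_gen(pop, fitness_env_b)
        if sum(1 for g in pop if g[3] and g[7]) >= N // 2:
            return gen
    return None  # did not adapt within 5000 generations

random.seed(1)
print("no cryptic variation   :", generations_to_adapt(0))
print("200 neutral generations:", generations_to_adapt(200))

Runs of this kind typically show the pre-conditioned population reaching the adaptive combination sooner, which is the qualitative point of cryptic variation, though any single run of so small a toy model is noisy.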

According to Plotkin, cryptic variation and preadaptation may be crucial to the evolution of drug resistance and immune system evasion in pathogens. Rather than looking for straightforward mutations, researchers could search for combinations, perhaps developing an “advance warning system” to flag seemingly innocuous changes.
Another application could be in genetic engineering. Whereas virus and bacteria designers tend to “accept any mutations that get them closer to their intended outcome,” said Plotkin, “it might be important to take lateral steps as well as uphill steps.”

Cryptic variation and preadaptation could also be important to the evolution of animals, from the origin of multicellularity to complex features like eyes and language. Plotkin would like to see studies revisiting the evolution of Charles Darwin’s famous finch beaks, but with an eye toward these newly described processes.
Masel said that better understanding cryptic variation and preadaptation could help programmers of evolving computer systems, and perhaps explain why some systems are better able than others to evolve. “Why are biological systems so evolvable?” she said. “This dynamic may or may not be the essence of evolvability. That’s certainly one of the hypotheses out there, and I am enthusiastic about it.”

These processes could also help interpret genomic studies that loosely link hundreds or thousands of genetic mutations to disease and development, frustrating geneticists searching for genetic patterns of heritability, said Masel and Hayden. And at a social level, they could be instructive to people interested in fostering innovation.
“My prediction is that it is good to foster lots of variation,” said Masel, who likened cryptic variation and preadaptation to Google’s famous requirement that employees spend 20 percent of their time on projects of personal whimsy. Rather than focusing narrowly on ideas that are obviously good, she said, the goal should be to “foster circumstances where lots of non-terrible ideas are floating around.”

Source WIRED

The Evolutionary Errors of X-Men

Please, Magneto, stop blaming evolution for your anger issues.

In X-Men: First Class, the latest film about the popular comic book superheroes, one of the mutant characters goes by the nickname Darwin because he has the power of "reactive evolution." He instantly adapts to any threat: toss him in water and he sprouts gills; hit him with a club and his skin turns to armored plates.
Biology mavens in the audience may object that this form of evolution is more or less the opposite of what Charles Darwin proposed with his theory of natural selection. If anything, the mutant’s abilities are more in line with the rival, disproved theories of Jean-Baptiste Lamarck, who argued for the inheritance of acquired characteristics. But maybe to fans the name "Lamarck" would sound more like a maître d' than a mutant.


ALPHA X-MAN: James McAvoy plays Charles Xavier, aka Professor X, a powerful telepath who can read and control minds and who recruits other mutants to stop the greatest threat the world has ever known.

That misappropriation of Darwin’s identity is emblematic of the X-Men films’ tortured portrayals of key ideas in biology. The movies are of course meant to be fun, not factual, and it feels like the height of stodginess to warn: “SPOILER ALERT: This film about superpowered telepaths and shape-shifting blue women is not a science documentary." There’s probably no point in wasting time discussing how various powers conferred by the fictional X-gene mutations violate physical laws, because they are really fantasy devices like the spells in Harry Potter books.

Nevertheless, it is worth looking at some of the film’s errors about evolution and speciation because they may be reinforcing some popular misconceptions.
X-Men: First Class, like earlier movies in the series, repeatedly invokes the idea that its mutants and humans are engaged in an evolutionary struggle for dominance like the one between humans and Neandertals tens of thousands of years ago. Professor Xavier and Magneto talk about the Neandertals resentfully watching the superior new species move in, and about the moderns displacing and slaughtering the older species.
At least this movie has the excuse of being set in 1962, when such ideas about human evolution were more common. Neandertals were then typically portrayed as a species of mentally inferior brutes who could not compete with the smarter, more technologically and culturally advanced Homo sapiens who evolved later.

But today, the paleoanthropological picture of the relations between Neandertals and modern humans is completely different. Skeletal reconstructions show that Neandertals had brains larger than our own, and archaeological digs reveal that they had a distinct culture but sometimes used some of the same tools that our ancestors did. Indeed, studies published in 2010 by Svante Pääbo's group at the Max Planck Institute for Evolutionary Anthropology in Leipzig concluded that several percent of non-African people’s genes came from Neandertals, so Neandertals may not even have been a species apart.

Most important, little evidence supports the idea that Neandertals and modern humans were in much open conflict. During the last ice age, Neandertals may simply have fared poorly and gone extinct largely on their own, with modern humans later occupying their old territories and perhaps breeding with some stragglers. One recent controversial study has even suggested that Neandertals were essentially gone from Europe by 40,000 years ago, thousands of years before modern humans arrived. In any case, Professor X and Magneto had it wrong.

Beyond the particulars of Neandertals and modern humans, though, X-Men’s abundant speeches about the "next step in human evolution" only make sense if evolution is seen as the gradual realization of some design embedded in nature. Accordingly, the emergence of superior new species—not just new species in general but particular, better species—is supposed to happen, and woe betide the older ones that stand in the way of that progress.
Current conceptions of evolution reject those views, however. Smart evolutionary biologists avoid referring to hierarchies of species in which, say, apes count as higher organisms and snakes as lower ones. Species are understood to emerge only if environmental conditions allow new, distinct breeding populations to branch away from their predecessors. The prospects for new species to survive or replace ancestral ones are equally hit or miss.

Mutants in the X-Men films are always treated as a distinct species, but most of them can apparently pass as human and have children with humans. Those facts do not absolutely rule out the possibility that the mutants are a different species (related species do sometimes crossbreed to a limited extent in nature and bear fertile offspring, as wolves and coyotes do). Yet it also isn't clear that the mutants would preferentially breed among themselves, as they would have to under at least one widely used biological species concept.
Recognizable species also usually have a definable phenotype, or set of characteristic physical features. The X-Men mutants, in contrast, are a crazily diverse mélange of types (teleporters! banshees! living ray guns!) who are on average at least as different from one another as they are from the rest of humanity.

So contrary to the film’s heroes and villains, X-Men mutants are not innately a new species, just another variant of Homo sapiens. They cannot become a new species (or more than one) unless geophysical or other circumstances create an irrevocable barrier to their breeding with the rest of humanity. Tolerance of one another’s differences could be enough to prevent that outcome. So in that sense, notwithstanding the dodgy science along the way, the film's underlying message is probably right after all.

Source Scientific American

Monday, May 23, 2011

Happy guys finish last, says new study on sexual attractiveness

Women find happy guys significantly less sexually attractive than swaggering or brooding men, according to a new University of British Columbia study that helps to explain the enduring allure of "bad boys" and other iconic gender types.

The study – which may cause men to smile less on dates, and inspire online daters to update their profile photos – finds dramatic gender differences in how men and women rank the sexual attractiveness of non-verbal expressions of commonly displayed emotions, including happiness, pride, and shame.
Very few studies have explored the relationship between emotions and attraction, and this is the first to report a significant gender difference in the attractiveness of smiles. The study, published online today in the American Psychological Association journal Emotion, is also the first to investigate the attractiveness of displays of pride and shame.

"While showing a happy face is considered essential to friendly social interactions, including those involving sexual attraction – few studies have actually examined whether a smile is, in fact, attractive," says Prof. Jessica Tracy of UBC's Dept. of Psychology. "This study finds that men and women respond very differently to displays of emotion, including smiles."

In a series of studies, more than 1,000 adult participants rated the sexual attractiveness of hundreds of images of the opposite sex engaged in universal displays of happiness (broad smiles), pride (raised heads, puffed-up chests) and shame (lowered heads, averted eyes).

The study found that women were least attracted to smiling, happy men, preferring those who looked proud and powerful or moody and ashamed. In contrast, male participants were most sexually attracted to women who looked happy, and least attracted to women who appeared proud and confident.

"It is important to remember that this study explored first-impressions of sexual attraction to images of the opposite sex," says Alec Beall, a UBC psychology graduate student and study co-author. "We were not asking participants if they thought these targets would make a good boyfriend or wife – we wanted their gut reactions on carnal, sexual attraction." He says previous studies have found positive emotional traits and a nice personality to be highly desirable in a relationship partners.

Tracy and Beall say that other studies suggest that what people find attractive has been shaped by centuries of evolutionary and cultural forces. For example, evolutionary theories suggest females are attracted to male displays of pride because they imply status, competence and an ability to provide for a partner and offspring.
According to Beall, the pride expression accentuates typically masculine physical features, such as upper body size and muscularity. "Previous research has shown that these features are among the most attractive male physical characteristics, as judged by women," he says.

The researchers say more work is needed to understand the differing responses to happiness, but suggest the phenomenon can also be understood according to principles of evolutionary psychology, as well as socio-cultural gender norms.

For example, past research has associated smiling with a lack of dominance, which is consistent with traditional gender norms of the "submissive and vulnerable" woman but inconsistent with those of the "strong, silent" man, the researchers say. "Previous research has also suggested that happiness is a particularly feminine-appearing expression," Beall adds.

"Generally, the results appear to reflect some very traditional gender norms and cultural values that have emerged, developed and been reinforced through history, at least in Western cultures," Tracy says. "These include norms and values that many would consider old-fashioned and perhaps hoped that we've moved beyond."

Displays of shame, Tracy says, have been associated with an awareness of social norms and with appeasement behaviors, both of which elicit trust in others. This may explain shame's surprising attractiveness to both genders, she says, given that both men and women prefer a partner they can trust.

While this study focused on sexual attraction between heterosexual men and women in North America, the researchers say future studies will be needed to explore the relationship between emotions and sexual attractiveness among homosexual men and women and in non-Western cultures.

Overall, the researchers found that men ranked women more attractive than women ranked men.

Source  EurekaAlert!

Saturday, May 21, 2011

Errors in Protein Structure Sparked Evolution of Biological Complexity

ScienceDaily (May 21, 2011) — Over four billion years of evolution, plants and animals grew far more complex than their single-celled ancestors. But a new comparison of proteins shared across species finds that complex organisms, including humans, have accumulated structural weaknesses that may have actually launched the long journey from microbe to man.

The study, published in Nature, suggests that the random introduction of errors into proteins, rather than traditional natural selection, may have boosted the evolution of biological complexity. Flaws in the "packing" of proteins that make them more unstable in water could have promoted protein interactions and intracellular teamwork, expanding the possibilities of life.

"Everybody wants to say that evolution is equivalent to natural selection and that things that are sophisticated and complex have been absolutely selected for," said study co-author Ariel Fernández, PhD, a visiting scholar at the University of Chicago and senior researcher at the Mathematics Institute of Argentina (IAM) in Buenos Aires. "What we are claiming here is that inefficient selection creates a niche or an opportunity to evolve complexity."

"This is a novel bridge between protein chemistry and evolutionary biology," said co-author Michael Lynch, PhD, professor of biology at Indiana University. "I hope that it causes us to pause and think about how evolution operates in new ways that we haven't thought about before."

When mildly negative mutations arise in a species with a large population, such as the trillions of bacterial organisms that can fill a small area, they are quickly cleared out by selective forces. But when a new mutation appears in a species with a relatively small population, as in large mammals and humans, selection against the error is slower and less efficient, allowing the mutation to spread through the population.
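
The population-size effect described here can be made concrete with a textbook result from population genetics (Kimura's diffusion approximation), which is not a calculation from this paper but captures the same logic: a new mutation with selection coefficient s in a diploid population of effective size N fixes with probability roughly (1 - e^(-2s)) / (1 - e^(-4Ns)). For a mildly deleterious mutation this stays close to the neutral value 1/(2N) when N is small, but collapses toward zero when N is large. A minimal sketch, using an arbitrary example value of s:

import math

def p_fix(N, s):
    # Kimura's approximation for a new mutation starting at frequency 1/(2N).
    if s == 0:
        return 1.0 / (2 * N)
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

s = -0.0001  # mildly deleterious; illustrative value only
for N in (1_000, 50_000, 1_000_000):
    neutral = 1.0 / (2 * N)
    print(f"N={N:>9,}  P(fix)={p_fix(N, s):.3e}  "
          f"vs neutral 1/(2N): {p_fix(N, s) / neutral:.3e}")

In a population of a thousand this deleterious mutation still fixes at roughly 80 percent of the neutral rate, while in a population of a million its fixation probability is effectively zero, which is the sense in which selection against mild defects is "slower and less efficient" in small populations.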

To look at whether these mild defects accumulate in species with small populations, Fernández and Lynch compared over 100 proteins shared by 36 species of varying population size. Though these shared, "orthologous" proteins have essentially the same shape and function across species, genetic differences alter them in more subtle ways.
Fernández and Lynch focused on design flaws called "dehydrons," sites where the protein structure is vulnerable to chemical reactions with water. Proteins with more dehydrons are more "unwrapped" -- unstable in an aqueous environment, and therefore prone to bind with another protein to protect their vulnerable regions.

A computational analysis of 106 orthologous proteins confirmed their hypothesis that proteins from species with smaller populations were more vulnerable in water. The result suggests that structural errors accumulate in large organisms such as humans due to random genetic drift.
"We hate to hear that our structures are actually lousier," Fernández said. "But that has a good side to it. Because they are lousier, they are more likely to participate in complexes, and we have a much better chance of achieving more sophisticated function through teamwork. Instead of being a loner, the protein is a team player."
On their own, these unstable proteins might be expected to perform their cellular duties more poorly, possibly causing harm to the organism. But unstable proteins are also "stickier," more likely to form associations with other proteins that could introduce more flexibility and complexity into the cell. If these complexes create a survival advantage for the organism, forces of natural selection should take over and spread the new protein complex through the population.

"It's not an argument against selection, it's an argument for non-adaptive mechanisms opening up new evolutionary pathways that wouldn't have been there before," Lynch said. "It's those first little nicks getting into the protein armor that essentially open up a new selective environment."
To confirm that the accumulation of structural flaws in proteins preceded, rather than resulted from, the formation of complexes, Fernández and Lynch turned to a natural experiment. Some bacterial species have two types of populations: communities that live inside other organisms and larger populations living free in the environment. When orthologous proteins were compared between these two populations, the same pattern emerged -- proteins from the smaller populations were more flawed than those from the free-living bacteria of the same species.

Despite these accidental benefits, the accumulation of too many structural flaws can be dangerous to an organism. When highly reactive proteins such as prions, amyloid-beta, or tau are too sticky, they can clump into aggregates that kill cells and cause diseases such as Alzheimer's and encephalopathy.
The implication that complexity initially arose by accident may be provocative within the field of evolutionary biology, the authors said. The discovery that flawed proteins are more likely to form complexes could also revolutionize the growing field of bioengineering, where the tools of evolution are used to create stronger, self-assembling or self-repairing materials.

"Natural designs are often one notch more sophisticated than the best engineering," Fernández said. "This is another example: Nature doesn't change the molecular machinery, but somehow it tinkers with it in subtle ways through the wrapping."
The research was supported by the National Institutes of Health and the National Science Foundation.

Source  Science Daily

Species reemergence after collapse: Possible but different

New study shows how species can reemerge after collapse.

Species pairs that disappear through hybridization after human-induced changes to the environment can reemerge if the disturbance is removed, according to a new mathematical model that shows the conditions under which reemergence might happen.

The findings, published in the journal Evolution, are important for conservationists and ecosystem managers interested in preserving, or even restoring, systems that have been disturbed by human activity.
By simulating environmental disturbances that reduce the ability of individuals to identify and select mates from their own species, the model explores the mechanisms that cause hybridization between closely related species. Hybridization can lead to population decline and the loss of biodiversity. For instance, certain species of stickleback fish have collapsed into hybrid swarms as water clarity in their native lakes has changed, and certain species of tree frogs have collapsed as vegetation has been removed around their shared breeding ponds. Such hybrid swarms can replace the original species.

"What is happening isn't just speciation in reverse. The model shows that populations after collapse are likely to be different from the parental populations in ways that affect the future evolution of the system," said Tucker Gilman, postdoctoral fellow at the National Institute for Mathematical and Biological Synthesis and the paper's lead author.

According to the model, the reemergence of species pairs was more likely when disturbances were strong than when they were weak, and most likely when disturbances were quickly corrected. However, even temporary bouts of hybridization often led to substantial homogenization of species pairs. This suggests that ecosystem managers may be able to refill ecological niches, but probably won't be able to resurrect lost species after species collapse.

"The encouraging news from an ecosystems service point of view is that, if we act quickly, we may be able to refill ecological niches emptied by species collapse. However, even if we can refill the niches, we probably won't be able to bring back the same species that we lost," Gilman said.

Source  EurekaAlert!

Friday, May 20, 2011

Small insects attack and kill amphibians much bigger than themselves

New findings by researchers from Tel Aviv University show that predator-prey interactions between ground beetles of the genus Epomis and amphibians are much more complex than expected. The study was published in the open-access journal ZooKeys.

 This image shows the predation of amphibians by an adult Epomis beetle.

"Amphibians are typical insect predators and their diet may include adult beetles, ground beetles in particular. The recently filmed successful attacks of the beetles on toads and frogs brought new insights on the amphibian-insect interactions, and documented the uncommon phenomenon of invertebrates preying on vertebrate animals," said the senior author Gil Wizen.
 
Previous research has shown that Epomis larvae feed exclusively on amphibians and that this food source is essential for completion of their life cycle, while the diet of the adult beetles consists of terrestrial invertebrates as well as dead vertebrates. Wizen and Gasith's current study shows that adult Epomis beetles can prey upon live amphibians, in addition to their regular diet.

According to the study, the genus Epomis is represented in Israel by two species: E. dejeani and E. circumscriptus. In the central coastal plain these species have similar distribution but do not occur in the same sites. The researchers recorded Epomis sharing shelter with amphibians during the day, but preying on them during the night. In the laboratory, predation behaviour of the adult beetles on five amphibian species was observed: the Green Toad (Bufo viridis), the Savignyi's Frog (Hyla savignyi), the Levant Green Frog (Rana bedriagae), the Banded Newt (Triturus vittatus), and the Fire Salamander (Salamandra salamandra infraimmaculata). These observations showed that the diet of the two Epomis species overlaps only partially, with only one of the species (E. dejeani) preying on the Banded Newt.

The results of this study serve as additional evidence that Epomis beetles, both larvae and adults, are specialized predators of amphibians. Moreover, these beetles prey upon several amphibian species.

 Source EurekaAlert!

Monday, May 16, 2011

Anthropologist discovers new fossil primate species in West Texas

AUSTIN, Texas–Physical anthropologist Chris Kirk has announced the discovery of a previously unknown species of fossil primate, Mescalerolemur horneri, in the Devil's Graveyard badlands of West Texas.
Mescalerolemur lived during the Eocene Epoch about 43 million years ago, and would have most closely resembled a small present-day lemur. Mescalerolemur is a member of an extinct primate group – the adapiforms – that were found throughout the Northern Hemisphere in the Eocene. However, just like Mahgarita stevensi, a younger fossil primate found in the same area in 1973, Mescalerolemur is more closely related to Eurasian and African adapiforms than those from North America.

This is Mescalerolemur horneri's partial upper jaw (in two pieces, at left) and partial lower jaw (at right) (scales = 2 mm).

"These Texas primates are unlike any other Eocene primate community that has ever been found in terms of the species that are represented," says Kirk, associate professor in the Department of Anthropology at The University of Texas at Austin. "The presence of both Mescalerolemur and Mahgarita, which are only found in the Big Bend region of Texas, comes after the more common adapiforms from the Eocene of North America had already become extinct. This is significant because it provides further evidence of faunal interchange between North America and East Asia during the Middle Eocene."

  This is Mescalerolemur horneri's partial right lower jaw (scale = 2 mm).

By the end of the Eocene, primates and other tropically adapted species had all but disappeared from North America due to climatic cooling, so Kirk is sampling the last burst of diversity in North American primates. With its lower latitudes and more equable climate, West Texas offered warm-adapted species a greater chance of survival after the cooling began.

Kirk says Marie Butcher, a then undergraduate who graduated with degrees in anthropology and biology from The University of Texas at Austin, found the first isolated tooth of Mescalerolemur in 2005. Since that time, many more primate fossils have been recovered by Kirk and more than 20 student volunteers at a locality called "Purple Bench." This fossil locality is three to four million years older than the Devil's Graveyard sediments that had previously produced Mahgarita stevensi.

"I initially thought that we had found a new, smaller species of Mahgarita," Kirk says.
However, as more specimens were prepared at the Texas Memorial Museum's Vertebrate Paleontology Lab, Kirk realized he had discovered not just a new species, but a new genus that was previously unknown to science.

Fossils of Mescalerolemur reveal it was a small primate, weighing only about 370 grams. This body weight is similar to that of the living greater dwarf lemur. Mescalerolemur's dental anatomy reveals a close evolutionary relationship with adapiform primates from Eurasia and Africa, including Darwinius masillae, a German fossil primate previously claimed to be a human ancestor. However, the discovery of Mescalerolemur provides further evidence that adapiform primates like Darwinius are more closely related to living lemurs and bush babies than they are to humans.

For example, the right and left halves of Mescalerolemur's lower jaws were two separate bones with a joint along the midline, a common trait for lemurs and bush babies. Mahgarita stevensi, the closest fossil relative of Mescalerolemur, had a completely fused jaw joint like that of humans.
"Because Mescalerolemur and Mahgarita are close relatives, fusion of the lower jaws in Mahgarita must have occurred independently from that observed in humans and their relatives, the monkeys and apes" Kirk says.
The new genus is named Mescalerolemur after the Mescalero Apache, who inhabited the Big Bend region of Texas from about 1700-1880. The species name, horneri, honors Norman Horner, an entomologist and professor emeritus at Midwestern State University (MSU) in Wichita Falls, Texas. Horner helped to establish MSU's Dalquest Desert Research Site, where the new primate fossils were found.

Kirk and his colleague Blythe Williams of Duke University will publish their findings in the Journal of Human Evolution article, "New adapiform primate of Old World affinities from the Devil's Graveyard Formation of Texas."

Source  EurekaAlert!