Sunday 25 October 2009

Snakes helped us to sharpen our vision


By Kenneth Kidd
The need to detect snakes may have given humans an evolutionary nudge toward developing better vision.
When Eve encountered a certain serpent and was bedazzled into eating fruit from the Tree of Knowledge, nothing nice ensued. In the Old Testament, nothing nice ever ensued when God's wrath was aroused.

So while Eve's eyes were, indeed, opened, the whole episode ensured a life of toil for Adam and billions of painful childbirths for all women to come.

It now turns out that key elements of that story – serpent, fruit, vision and the large-headed babies that are the proximate cause of painful births – may have played similarly crucial roles in key stages of human evolution, with somewhat more generous results.

Lynne Isbell, a professor of anthropology and animal behaviour at the University of California at Davis (UCD), contends that snakes lie at the root of what makes humans, well, human.

It's a complex thesis, relying on everything from fossil records and primate behaviour to palaeogeography and modern advances in neuroscience. But at the risk of doing violence to that complexity, the essential argument is this:

The earliest predators of our mammalian ancestors were snakes, especially constrictors; venomous snakes can be equally lethal, even when striking defensively. Over the course of millions of years, the need to detect snakes thus gave an evolutionary nudge toward developing better vision.

So for some animals, sight started to become more important than a sense of smell. Better vision, in turn, made it easier for our ancestors to pick out fruits that were ripe, and therefore rich in glucose. That is why we can distinguish reds and oranges from greens.

Through neuroscience, we now know that a vast amount of the human brain is connected to vision, and we also know that glucose is crucial to brain development. Combine the two over millennia, and you end up with today's (painfully) big-headed babies.

Earlier this week, the Star spoke with Isbell, whose book, The Fruit, The Tree and The Serpent, has just been published. The following are edited excerpts from that conversation.

What made you suspect that snakes played such a key role in human evolution?

I was actually looking at a different question: Why is it that female primates in the New World (South America and Central America) are so willing to leave their home areas when there's a heavy cost to leaving? You're more susceptible to predation. But in the Old World, monkeys don't do that. The females stay put.

In my own experience (in Africa), leopards had just decimated the groups of monkeys that I studied in two different places. I started to look at, when did (large predatory) cats get to the New World compared with the Old World?

We know that constrictors are predators of primates, too, so when did constrictors get to the New World?

And it turns out that, not only did constrictors and venomous snakes evolve much earlier than leopards and raptors, they've co-existed with Old World primates for much longer.

They had this bio-geographical relationship with primates that sort of fit with what I knew then to be differences in their visual systems.

In that Old World primates have more advanced vision?

All I knew at the time was that there was a difference in colour vision and in the degrees of visual acuity.

There have been other theories about why primates developed better vision, but yours seems to be the first to identify the key role of snakes and to rely heavily on neuroscience. Do you expect neuroscience to start playing a bigger role in answering other evolutionary questions?

It would be great if there could be dialogue back and forth. If people from different disciplines could get together and talk, I think we could make some pretty interesting advances in all fields.

It used to be that mammals were classed into orders based mostly on physical characteristics. Now molecular research is shuffling the deck based on DNA, so some of our closest relatives also include flying lemurs, treeshrews, rabbits and rats. Why didn't our non-primate cousins develop similar vision?

All mammals would have had to deal with constricting snakes, so vision would have been useful to them. But not all mammals eat the kinds of foods that would let them benefit from the tradeoff between vision and olfaction. If you have good vision, then you're going to lose your sense of smell to some degree.

Because of their diets, then, rabbits and rats had to rely much more on smell to locate greens and seeds?

They eat foods that plants don't want them to eat (so) plants don't evolve ways to make those types of foods smellier. It's the animals that eat fruits that wouldn't suffer any loss if their sense of smell started to get worse. They could afford to expand their vision.

You note how, whenever you're doing field research, it's always the monkeys who detect snakes before anyone else. They get very agitated and even have special alarm calls that only refer to snakes, not other predators.

Yeah. I was just talking with a grad student yesterday. She's worked in Costa Rica and she said that when she's not with the monkeys, she rarely sees snakes, but when she's there with them, she sees them almost every other day. The monkeys point them out.

How much of our fear of snakes is instinctive and how much is socialized?

That's a really good question. Since we are Old World primates ourselves, our ancestors had to deal with snakes for a long time. It's possible that if we're primed to be afraid of snakes, all it takes is for us to learn to be afraid.

Rhesus macaques are not necessarily afraid of snakes the first time they see them, but if they see another rhesus macaque reacting fearfully, then they learn to fear the snakes.

There's an evolutionary preparedness to be afraid of snakes that doesn't exist for more innocuous objects, like flowers. It's probably the same for us.

Is that why snakes figure so prominently in so many myths? Even the mighty Thor gets felled by a snake in Norse mythology.

That's what I was wondering. Why do we focus so much on snakes? Why, unless there's something deep within us, a long evolutionary association with snakes that brings it out in myths and religion.

Malawi could be the cradle of humankind

Mabvuto Banda
KARONGA, Malawi (Reuters) – The latest discovery of prehistoric tools and remains of hominids in Malawi's remote northern district of Karonga provides further evidence that the area could be the cradle of humankind, a leading German researcher said.
Professor Friedemann Schrenk of the Goethe University in Frankfurt told Reuters that two students working on the excavation site last month had discovered prehistoric tools and a tooth of a hominid.
"This latest discovery of prehistoric tools and remains of hominids provides additional proof to the theory that the Great Rift Valley of Africa and perhaps the excavation site near Karonga can be considered the cradle of humankind," Schrenk said.
A hominid is a member of a family of primates which includes humans and their prehistoric ancestors.
The discovery was made at the Malema excavation site, 10 km (6 miles) from Karonga.
The site also contains some of the earliest dinosaurs which lived between 100 million and 140 million years ago and early hominids believed to have lived between a million and 6 million years ago.
Schrenk is leading a team of researchers from Europe and Africa to establish an African center for interdisciplinary studies on mammal and hominid evolution in the southern African nation.
Karonga is about 615 km (380 miles) north of the capital Lilongwe and is near the border with Tanzania.

Humans Still Evolving

Eben Harrell – Fri Oct 23
Modern Homo sapiens is still evolving. Despite the long-held view that natural selection has ceased to affect humans because almost everybody now lives long enough to have children, a new study of a contemporary Massachusetts population offers evidence of evolution still in action.

A team of scientists led by Yale University evolutionary biologist Stephen Stearns suggests that even though natural selection is no longer driven mainly by survival, it still operates through differences in women's fertility. "Variations in reproductive success still exist among humans, and therefore some traits related to fertility continue to be shaped by natural selection," Stearns says. That is, women who have more children are more likely to pass on certain traits to their progeny.


Stearns' team examined the vital statistics of 2,238 postmenopausal women participating in the Framingham Heart Study, which has tracked the medical histories of some 14,000 residents of Framingham, Mass., since 1948. Investigators searched for correlations between women's physical characteristics - including height, weight, blood pressure and cholesterol levels - and the number of offspring they produced. According to their findings, it was stout, slightly plump (but not obese) women who tended to have more children - "Women with very low body fat don't ovulate," Stearns explains - as did women with lower blood pressure and cholesterol levels. Using a sophisticated statistical analysis that controlled for any social or cultural factors that could impact childbearing, researchers determined that these characteristics were passed on genetically from mothers to daughters and granddaughters.

If these trends were to continue with no cultural changes in the town for the next 10 generations, by 2409 the average Framingham woman would be 2 cm (0.8 in) shorter, 1 kg (2.2 lb.) heavier, have a healthier heart, have her first child five months earlier and enter menopause 10 months later than a woman today, the study found. "That rate of evolution is slow but pretty similar to what we see in other plants and animals. Humans don't seem to be any exception," Stearns says.


Douglas Ewbank, a demographer at the University of Pennsylvania who undertook the statistical analysis for the study, which was published Oct. 21 in the Proceedings of the National Academy of Sciences (PNAS), says that because cultural factors tend to have a much more prominent impact than natural selection in the shaping of future generations, people tend to write off the effect of evolution. "Those changes we predict for 2409 could be wiped out by something as simple as a new school-lunch program. But whatever happens, it's likely that in 2409, Framingham women will be 2 cm shorter and 1 kg heavier than they would have been without natural selection. Evolution is a very slow process. We don't see it if we look at our grandparents, but it's there."

Other recent genetic research has backed up that notion. One study, published in PNAS in 2007 and led by John Hawks, an anthropologist at the University of Wisconsin at Madison, found that some 1,800 human gene variations had become widespread in recent generations because of their modern-day evolutionary benefits. Among those genetic changes, discovered by examining more than 3 million DNA variants in 269 individuals: mutations that allow people to digest milk or resist malaria and others that govern brain development.

But not all evolutionary changes make inherent sense. Since the Industrial Revolution, modern humans have grown taller and stronger, so it's easy to assume that evolution is making humans fitter. But according to anthropologist Peter McAllister, author of Manthropology: The Science of the Inadequate Modern Male, the contemporary male has evolved, at least physically, into "the sorriest cohort of masculine Homo sapiens to ever walk the planet." Thanks to genetic differences, an average Neanderthal woman, McAllister notes, could have whupped Arnold Schwarzenegger at his muscular peak in an arm-wrestling match. And prehistoric Australian Aborigines, who typically built up great strength in their joints and muscles through childhood and adolescence, could have easily beaten Usain Bolt in a 100-m dash.


Steve Jones, an evolutionary biologist at University College London who has previously held that human evolution was nearing its end, says the Framingham study is indeed an important example of how natural selection still operates through inherited differences in reproductive ability. But Jones argues that variation in female fertility - as measured in the Framingham study - is a much less important factor in human evolution than differences in male fertility. Sperm are far more likely than eggs to carry an error or mutation, especially among older men. "While it used to be that men had many children in older age to many different women, now men tend to have only a few children at a younger age with one wife. The drop in the number of older fathers has had a major effect on the rate of mutation and has at least reduced the amount of new diversity - the raw material of evolution. Darwin's machine has not stopped, but it surely has slowed greatly," Jones says.

Despite evidence that human evolution still functions, biologists concede that it's anyone's guess where it will take us from here. Artificial selection in the form of genetic medicine could push natural selection into obsolescence, but a lethal pandemic or other cataclysm could suddenly make natural selection central to the future of the species. Whatever happens, Jones says, it is worth remembering that Darwin's beautiful theory has suffered a long history of abuse. The bastard science of eugenics, he says, will haunt humanity as long as people are tempted to confuse evolution with improvement. "Uniquely in the living world, what makes humans what we are is in our minds, in our society, and not in our evolution," he says.

Wednesday 14 October 2009

Modern man a wimp, says anthropologist

Tue Oct 13, 2009
By John Mehaffey
LONDON (Reuters) - Many prehistoric Australian aboriginals could have outrun world 100 and 200 meters record holder Usain Bolt in modern conditions.

Some Tutsi men in Rwanda exceeded the current world high jump record of 2.45 meters during initiation ceremonies in which they had to jump at least their own height to progress to manhood.

Any Neanderthal woman could have beaten former bodybuilder and current California governor Arnold Schwarzenegger in an arm wrestle.

These and other eye-catching claims are detailed in a book by Australian anthropologist Peter McAllister entitled "Manthropology" and provocatively sub-titled "The Science of the Inadequate Modern Male."

McAllister sets out his stall in the opening sentence of the prologue.

"If you're reading this then you -- or the male you have bought it for -- are the worst man in history.

"No ifs, no buts -- the worst man, period...As a class we are in fact the sorriest cohort of masculine Homo sapiens to ever walk the planet."

Delving into a wide range of source material, McAllister finds evidence he believes proves that modern man is inferior to his predecessors in, among other fields, the basic Olympic athletics disciplines of running and jumping.

His conclusions about the speed of Australian aboriginals 20,000 years ago are based on a set of footprints, preserved in a fossilized claypan lake bed, of six men chasing prey.

FLEET-FOOTED ABORIGINALS

An analysis of the footprints of one of the men, dubbed T8, shows he reached speeds of 37 kph on a soft, muddy lake edge. Bolt, by comparison, reached a top speed of 42 kph during his then world 100 meters record of 9.69 seconds at last year's Beijing Olympics.
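To put those figures side by side, here is a quick back-of-the-envelope conversion (a sketch in Python; the speeds are the article's, the arithmetic and comparison are mine):

    # Convert the quoted speeds to metres per second for comparison.
    def kph_to_ms(kph):
        return kph * 1000.0 / 3600.0

    t8_top   = kph_to_ms(37)   # T8's estimated top speed on soft mud
    bolt_top = kph_to_ms(42)   # Bolt's top speed in Beijing
    bolt_avg = 100 / 9.69      # Bolt's average speed over the whole race

    print(f"T8 top:   {t8_top:.2f} m/s")    # about 10.3 m/s
    print(f"Bolt top: {bolt_top:.2f} m/s")  # about 11.7 m/s
    print(f"Bolt avg: {bolt_avg:.2f} m/s")  # about 10.3 m/s

In other words, T8's estimated top speed on mud roughly equals Bolt's average speed over the full 100 meters, which is the gap McAllister believes training, spikes and a rubberized track could close.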

In an interview in the English university town of Cambridge where he was temporarily resident, McAllister said that, with modern training, spiked shoes and rubberized tracks, aboriginal hunters might have reached speeds of 45 kph.

"We can assume they are running close to their maximum if they are chasing an animal," he said.

"But if they can do that speed of 37 kph on very soft ground I suspect there is a strong chance they would have outdone Usain Bolt if they had all the advantages that he does.

"We can tell that T8 is accelerating toward the end of his tracks."

McAllister said it was probable that any number of T8's contemporaries could have run as fast.

"We have to remember too how incredibly rare these fossilizations are," he said. "What are the odds that you would get the fastest runner in Australia at that particular time in that particular place in such a way that was going to be preserved?"

Turning to the high jump, McAllister said photographs taken by a German anthropologist showed young men jumping heights of up to 2.52 meters in the early years of last century.

STARK DECLINE

"It was an initiation ritual, everybody had to do it. They had to be able to jump their own height to progress to manhood," he said.

"It was something they did all the time and they lived very active lives from a very early age. They developed very phenomenal abilities in jumping. They were jumping from boyhood onwards to prove themselves."

McAllister said a Neanderthal woman had 10 percent more muscle bulk than modern European man. Trained to capacity she would have reached 90 percent of Schwarzenegger's bulk at his peak in the 1970s.

"But because of the quirk of her physiology, with a much shorter lower arm, she would slam him to the table without a problem," he said.

Manthropology abounds with other examples:

* Roman legions completed more than one-and-a-half marathons a day carrying more than half their body weight in equipment.

* Athens employed 30,000 rowers who could all exceed the achievements of modern oarsmen.

* Australian aboriginals threw a hardwood spear 110 meters or more (the current world javelin record is 98.48 meters).

McAllister said it was difficult to equate the ancient spear with the modern javelin but added: "Given other evidence of Aboriginal man's superb athleticism you'd have to wonder whether they couldn't have taken out every modern javelin event they entered."

Why the decline?

"We are so inactive these days and have been since the industrial revolution really kicked into gear," McAllister replied. "These people were much more robust than we were.

"We don't see that because we convert to what things were like about 30 years ago. There's been such a stark improvement in times, technique has improved out of sight, times and heights have all improved vastly since then but if you go back further it's a different story.

"At the start of the industrial revolution there are statistics about how much harder people worked then.

"The human body is very plastic and it responds to stress. We have lost 40 percent of the shafts of our long bones because we have much less of a muscular load placed upon them these days.

"We are simply not exposed to the same loads or challenges that people were in the ancient past and even in the recent past so our bodies haven't developed. Even the level of training that we do, our elite athletes, doesn't come close to replicating that.

"We wouldn't want to go back to the brutality of those days but there are some things we would do well to profit from."

Can Evolution Run in Reverse? A Study Says It’s a One-Way Street
By CARL ZIMMER
Evolutionary biologists have long wondered if history can run backward. Is it possible for the proteins in our bodies to return to the old shapes and jobs they had millions of years ago?
Examining the evolution of one protein, a team of scientists declares the answer is no, saying new mutations make it practically impossible for evolution to reverse direction. “They burn the bridge that evolution just crossed,” said Joseph W. Thornton, a biology professor at the University of Oregon and co-author of a paper on the team’s findings in the current issue of Nature.
The Belgian biologist Louis Dollo was the first scientist to ponder reverse evolution. “An organism never returns to its former state,” he declared in 1905, a statement later dubbed Dollo’s law.
To see if he was right, biologists have reconstructed evolutionary history. In 2003, for example, a team of scientists studied wings on stick insects. They found that the insects’ common ancestor had wings, but some of its descendants lost them. Later, some of those flightless insects evolved wings again.
Yet this study did not necessarily refute Dollo’s law. The stick insects may indeed have evolved a new set of wings, but it is not clear whether this change appeared as reverse evolution at the molecular level. Did the insects go back to the exact original biochemistry for building wings, or find a new route, essentially evolving new proteins?
Dr. Thornton and his colleagues took a close look at the possibility of reverse evolution at this molecular level. They studied a protein called a glucocorticoid receptor that helps humans and most other vertebrates cope with stress by grabbing a hormone called cortisol and then switching on stress-defense genes.
By comparing the receptor to related proteins, the scientists reconstructed its history. Some 450 million years ago, it started out with a different shape that allowed it to grab tightly to other hormones, but only weakly to cortisol. Over the next 40 million years, the receptor changed shape, so that it became very sensitive to cortisol but could no longer grab other hormones.
During those 40 million years, Dr. Thornton found, the receptor changed in 37 spots, only 2 of which made the receptor sensitive to cortisol. Another 5 prevented it from grabbing other hormones. When he made these 7 changes to the ancestral receptor, it behaved just like a new glucocorticoid receptor.
Dr. Thornton reasoned that if he carried out the reverse operation, he could turn a new glucocorticoid receptor into an ancestral one. So he and his colleagues reversed these key mutations to their old form.
To Dr. Thornton’s surprise, the experiment failed. “All we got was a completely dead receptor,” he said.
To figure out why they could go forward but not backward, Dr. Thornton and his colleagues looked closely again at the old and new receptors. They discovered five additional mutations that were crucial to the transition. If they reversed these five mutations as well, the new receptor behaved like an old one.
Based on these results, Dr. Thornton and his colleagues concluded that the evolution of the receptor unfolded in two chapters. In the first, the receptor acquired the seven key mutations that made it sensitive to cortisol and not to other hormones. In the second, it acquired the five extra mutations, which Dr. Thornton called “restrictive” mutations.
These restrictive mutations may have fine-tuned how the receptor grabbed cortisol. Or they may have had no effect at all. In either case, these five mutations added twists and tails to the receptor. When Dr. Thornton tried to return the receptor to its original form, these twists and tails got in the way.
Dr. Thornton argues that once the restrictive mutations evolved, they made it practically impossible for the receptor to evolve back to its original form. The seven key mutations could not be reversed first, because the receptor would be rendered useless. Nor could the five restrictive mutations be reversed first. Those mutations had little effect on how the receptor grabbed hormones, so there was no way that natural selection could favor individuals with reversed mutations.
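The logic of that one-way street is simple enough to capture in a toy model. The sketch below is illustrative only: the mutation counts come from the study, but the fitness values are invented to encode the argument, namely that reversing a key mutation first is lethal while reversing a restrictive mutation first is merely neutral, so no single backward step is ever favored by selection.

    # Toy model of the "burned bridge" (illustrative, not Thornton's data).
    KEY, RESTRICTIVE = 7, 5  # mutation counts reported in the study

    def fitness(key_present, restrictive_present):
        # Invented values: a receptor missing key mutations while any
        # restrictive mutation remains is dead; everything else works.
        if key_present < KEY and restrictive_present > 0:
            return 0.0
        return 1.0

    def favored_reversals(key_present, restrictive_present):
        """Single-step reversals that strictly increase fitness,
        i.e. the only moves natural selection would favor."""
        current = fitness(key_present, restrictive_present)
        steps = []
        if key_present > 0 and fitness(key_present - 1, restrictive_present) > current:
            steps.append("reverse a key mutation")
        if restrictive_present > 0 and fitness(key_present, restrictive_present - 1) > current:
            steps.append("reverse a restrictive mutation")
        return steps

    # From the derived receptor there is no favored first step back:
    print(favored_reversals(KEY, RESTRICTIVE))  # -> []

Key reversals are blocked because they kill the receptor; restrictive reversals are possible but pointless, so selection never starts down the return path.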
For now it is an open question whether other proteins have an equally hard time evolving backward. But Dr. Thornton suspects they do.
“I would never say evolution is never reversible,” Dr. Thornton said. But he thinks it can only go backward when the evolution of the trait is simple, like when a single mutation is involved. When new traits are produced by several mutations that influence one another, he argues, that complexity shuts off reverse evolution. “We know that kind of complexity is very common,” he said.
If this molecular Dollo’s law holds up, Dr. Thornton believes it says something important about the course of evolutionary history. Natural selection can achieve many things, but it is hemmed in. Even harmless, random mutations can block its path.
“The biology we ended up with was not inevitable,” he said. “It was just one roll of the evolutionary dice.”
Source: New York Times, Sep 29, 2009

Thursday 1 October 2009

Our oldest ancestor found


Before Lucy came Ardi, new earliest hominid found
Randolph E. Schmid, AP Science Writer
WASHINGTON – The story of humankind is reaching back another million years as scientists learn more about "Ardi," a hominid who lived 4.4 million years ago in what is now Ethiopia. The 110-pound, 4-foot female roamed forests a million years before the famous Lucy, long studied as the earliest skeleton of a human ancestor.

This older skeleton reverses the common wisdom of human evolution, said anthropologist C. Owen Lovejoy of Kent State University.

Rather than humans evolving from an ancient chimp-like creature, the new find provides evidence that chimps and humans evolved from some long-ago common ancestor — but each evolved and changed separately along the way.

"This is not that common ancestor, but it's the closest we have ever been able to come," said Tim White, director of the Human Evolution Research Center at the University of California, Berkeley.

The lines that evolved into modern humans and living apes probably shared an ancestor 6 million to 7 million years ago, White said in a telephone interview.

But Ardi has many traits that do not appear in modern-day African apes, leading to the conclusion that the apes evolved extensively since we shared that last common ancestor.

A study of Ardi, under way since the first bones were discovered in 1994, indicates the species lived in the woodlands and could climb on all fours along tree branches, but the development of their arms and legs suggests they didn't spend much time in the trees. And they could walk upright, on two legs, when on the ground.

Formally dubbed Ardipithecus ramidus — which means root of the ground ape — the find is detailed in 11 research papers published Thursday by the journal Science.

"This is one of the most important discoveries for the study of human evolution," said David Pilbeam, curator of paleoanthropology at Harvard's Peabody Museum of Archaeology and Ethnology.

"It is relatively complete in that it preserves head, hands, feet and some critical parts in between. It represents a genus plausibly ancestral to Australopithecus — itself ancestral to our genus Homo," said Pilbeam, who was not part of the research teams.

Scientists assembled the skeleton from 125 pieces.

Lucy, also found in Africa, thrived a million years after Ardi and was of the more human-like genus Australopithecus.

"In Ardipithecus we have an unspecialized form that hasn't evolved very far in the direction of Australopithecus. So when you go from head to toe, you're seeing a mosaic creature that is neither chimpanzee, nor is it human. It is Ardipithecus," said White.

White noted that Charles Darwin, whose research in the 19th century paved the way for the science of evolution, was cautious about the last common ancestor between humans and apes.

"Darwin said we have to be really careful. The only way we're really going to know what this last common ancestor looked like is to go and find it. Well, at 4.4 million years ago we found something pretty close to it," White said. "And, just like Darwin appreciated, evolution of the ape lineages and the human lineage has been going on independently since the time those lines split, since that last common ancestor we shared."

Some details about Ardi in the collection of papers:

• Ardi was found in Ethiopia's Afar Rift, where many fossils of ancient plants and animals have been discovered. Findings near the skeleton indicate that at the time it was a wooded environment. Fossils of 29 species of birds and 20 species of small mammals were found at the site.

• Geologist Giday WoldeGabriel of Los Alamos National Laboratory was able to use volcanic layers above and below the fossil to date it to 4.4 million years ago.

• Ardi's upper canine teeth are more like the stubby ones of modern humans than the long, sharp, pointed ones of male chimpanzees and most other primates. An analysis of the tooth enamel suggests a diverse diet, including fruit and other woodland-based foods such as nuts and leaves.

• Paleoanthropologist Gen Suwa of the University of Tokyo reported that Ardi's face had a projecting muzzle, giving her an ape-like appearance. But it didn't thrust forward quite as much as the lower faces of modern African apes do. Some features of her skull, such as the ridge above the eye socket, are quite different from those of chimpanzees. The details of the bottom of the skull, where nerves and blood vessels enter the brain, indicate that Ardi's brain was positioned in a way similar to modern humans, possibly suggesting that the hominid brain may have been already poised to expand areas involving aspects of visual and spatial perception.

• Ardi's hand and wrist were a mix of primitive traits and a few new ones, but they don't include the hallmark traits of the modern tree-hanging, knuckle-walking chimps and gorillas. She had relatively short palms and fingers which were flexible, allowing her to support her body weight on her palms while moving along tree branches, but she had to be a careful climber because she lacked the anatomical features that allow modern-day African apes to swing, hang and easily move through the trees.

• The pelvis and hip show the gluteal muscles were positioned so she could walk upright.

• Her feet were rigid enough for walking but still had a grasping big toe for use in climbing.

The research was funded by the National Science Foundation, the Institute of Geophysics and Planetary Physics of the University of California, Los Alamos National Laboratory, the Japan Society for the Promotion of Science and others.

Ardi, Humans and Primates
Between present humans and our earliest prehuman ancestor, there is a direct genetic and evolutionary link, a clear map of descent that includes the earliest common ancestors we share with other primates. We just don’t know what it looks like yet. Whether paleontologists will ever be able to fill in all the details on that map depends on discoveries like one made by a team of scientists led by Tim D. White from the University of California, Berkeley — the fossils of a species called Ardipithecus ramidus, or Ardi for short.

According to a report in the journal Science, Ardi pushes the hominid story back to 4.4 million years ago and to a site in the Afar Rift region of Ethiopia. She (the most complete skeleton is probably female) also pushes the human story into a different ecosystem than Australopithecus, the grassland ancestor who lived, in various subspecies, as long as 3.7 million years ago. Ardi, who was discovered in 1992, lived in a “woodland with small patches of forest,” a discovery that downplays the importance of open grassland to human evolution.

Like Australopithecus, she walked upright without most of the characteristic postures of chimpanzees and gorillas. Her skull is smaller than that of Australopithecus, about the same size as a bonobo's.

Paleontologists are not looking for a “missing link” between humans and present-day primates closest to us — gorillas, chimpanzees and bonobos. What they’re hoping to find is the earliest common ancestor from which the separate lines of development leading to humans and modern great apes emerged. Ardi is not that common ancestor. If anything, this find helps demonstrate how quickly early hominids moved down a separate path of evolution. It also suggests that living primates do not represent some primitive stage of a shared ancestry but are, as the scientists write, “highly specialized, but through very different evolutionary pathways.”

These are tremendously important discoveries, recasting the story of hominid evolution and making us eager for the next chapter.

Wednesday 9 September 2009

The Beagle returns to Montevideo!


This year will be unforgettable for all of us who appreciate the work of Darwin and the theory of evolution. Not only did we have the pleasure of a superb congress in Punta del Este last week, which included a presentation by philosopher Daniel Dennett, but we will also enjoy a historic visit.


A re-enactment of Darwin's expedition arrives in Montevideo in late October. It was launched on September 1 from Plymouth, England, and should be calling at Montevideo, Uruguay, at the end of the month. The ship, the Stad Amsterdam, will be playing the part of HMS Beagle.

As you surely know, on December 27, 1831, Darwin left Plymouth on the voyage that later formed the basis of his book On the Origin of Species by Means of Natural Selection, and at the beginning of this month the ambitious expedition set out along the same route – complete with one of Darwin's descendants, Sarah Darwin, among the crew.
The role of HMS Beagle is played by the Dutch clipper Stad Amsterdam, which will be carrying renowned historians, scientists and even the actor John Malkovich. The voyage will be filmed for a new series to be broadcast to millions on Dutch and Belgian television.
The departure marked the beginning of an incredible eight-month trip around the world. "Beagle, on the future of species" is the European project of the Dutch and Flemish public broadcasting companies VPRO and Canvas. As Charles Darwin contemplated the origin of species on Earth, the producers and their guests on the Beagle project will address their future.
This year marks the 200th anniversary of Darwin's birth and the 150th year since publication of his world-changing book.
The expedition will take the crew across the Atlantic, to Brazil and Patagonia, around Cape Horn and up the west coast of South America, from the Andes mountain range to the Galapagos Islands, across the Pacific to Australia and back to Europe via the Cape of Good Hope.
The tentative schedule for the Atlantic ocean: this week the "Beagle" should be arriving in Tenerife, Canary Islands; October 12 to 16, Rio de Janeiro; Montevideo, October 23/25; Buenos Aires, October 26/30; Bahía Blanca, November 2/5; Puerto Madryn and Punta Arenas, Chile, November 17/20.
More info: http://onthefutureofspecies.nl/

Monday 24 August 2009

Brian Eno and Richard Dawkins, art in evolution




It seems like September will be a great celebratory month, and not only because it is full of meetings.

Coming very soon: a dialogue between Brian Eno, one of the directors of the Long Now Foundation and the exquisite musician who gave us records like Before and After Science and Another Green World, and Richard Dawkins, probably the most important living scientist and philosopher, author of The Selfish Gene and The God Delusion.


The meeting will be at the Oxford Playhouse on September 4, and just six days later we will be able to enjoy the new Richard Dawkins opus: The Greatest Show on Earth: The Evidence for Evolution.


The diversity of evolutionary thinking continues to grow.





Friday 14 August 2009

Daniel Dennett in Uruguay!


It will be the best way to celebrate Charles Darwin's 200th anniversary in Punta del Este. Among several other scientists, the author of Darwin's Dangerous Idea will be arriving in Punta del Este, Uruguay, to give a lecture on September 4th.

Since I will have the great honour of helping with the event, I will not be posting as regularly.

Please find more information at http://www.darwin200.edu.uy/schedule.htm

Saturday 1 August 2009

Genes, memes and... a third kind of replicator?

The proposal comes from Sue Blackmore, the author of The Meme Machine and a superb thinker on Universal Darwinism. Should we consider technology as memes, or as a new and different kind of replicator?
Read for yourself:

Evolution's third replicator: Genes, memes, and now what?
31 July 2009 by Susan Blackmore
(Published in New Scientist magazine)

WE HUMANS have let loose something extraordinary on our planet - a third replicator - the consequences of which are unpredictable and possibly dangerous.

What do I mean by "third replicator"? The first replicator was the gene - the basis of biological evolution. The second was memes - the basis of cultural evolution. I believe that what we are now seeing, in a vast technological explosion, is the birth of a third evolutionary process. We are Earth's Pandoran species, yet we are blissfully oblivious to what we have let out of the box.

This might sound apocalyptic, but it is how the world looks when we realise that Darwin's principle of evolution by natural selection need not apply just to biology. Given some kind of copying machinery that makes lots of slightly different copies of the same information, and given that only a few of those copies survive to be copied again, an evolutionary process must occur and design will appear out of destruction. You might call it "design by death" since clever designs thrive because of the many failures that don't.
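That copy-vary-select loop is simple enough to simulate. Below is a minimal sketch in the spirit of Dawkins's well-known "weasel" program; the target string, population size and error rate are arbitrary choices for illustration, not anything Blackmore specifies:

    import random

    # "Replicators" are strings; copying is imperfect; only the copies that
    # best meet an arbitrary survival criterion get copied again.
    TARGET = "methinks it is like a weasel"
    ALPHABET = "abcdefghijklmnopqrstuvwxyz "

    def copy_with_errors(s, rate=0.01):
        """Imperfect copying: each character mutates with probability rate."""
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in s)

    def fitness(s):
        return sum(a == b for a, b in zip(s, TARGET))

    # Start from one random string; the loop does the designing.
    population = ["".join(random.choice(ALPHABET) for _ in TARGET)] * 100
    for generation in range(5000):
        # Copying with variation...
        population = [copy_with_errors(random.choice(population))
                      for _ in range(100)]
        # ...then selection: only a few copies survive to be copied again.
        population = sorted(population, key=fitness, reverse=True)[:20]
        if population[0] == TARGET:
            break
    print(generation, population[0])

Nothing in the loop "knows" the design it is building; the target emerges from nothing but imperfect copying and differential survival, which is the point about design by death.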

The information that is copied, varied and selected is called the replicator, and the process is well understood when applied to biology. Genes are copied, mutated and selected over and over again. Assemblages of genes are used to build vehicles that carry them around, protect them and propagate them. These vehicles - the lumbering robots, as Richard Dawkins calls them - are animals and plants, the prolific and exquisitely designed products of the first replicator.

About 4 billion years after the appearance of the first replicator, something extraordinary happened. Members of one species of lumbering robot began to imitate one another. Imitation is a kind of copying, and so a new evolutionary process was born. Instead of cellular chemistry copying the order of bases on DNA, a sociable species of bipedal ape began to use its big brain to copy gestures, sounds and other behaviours. This copying might not have been very accurate, but it was enough to start a new evolutionary process. Dawkins called the new replicators "memes". A living creature, once just a vehicle of the first replicator, was now the copying machinery for the next.

The idea of memes as a cultural analogue of genes has been much maligned, and most biologists still reject it. Yet memetics has much to offer in explaining human nature. According to meme theory, humans are radically different from all other species because we alone are meme machines. Human intelligence is not just a bit more or a bit better than other kinds of intelligence, it is something completely different, based on a new evolutionary process and a new kind of information.

The main difference between conventional theories and memetics is this: most biologists assume that culture and language evolved because they helped humans survive and pass on their genes, and that genes retain ultimate control. Memetics challenges that assumption. Although the capacity for imitation must once have been adaptive for the apes who started it, evolution has no foresight and could not have predicted the consequences of letting loose a new evolutionary process. Nor could it have retained control of memes once they began evolving in their own right.

So memes began to proliferate. What began as an adaptation soon became like a parasite - a new evolving entity that changed the apes and their world forever. Once memes were proliferating, individuals benefited from copying the latest and most successful ones, and then passed on any genes that helped them do so. This "memetic drive" forced their brains to get bigger and bigger, and to become adept at copying the most successful memes, eventually leading to language, art, music, ritual and religion - the successful designs of human culture.

This process was dangerous. Small brains are much more efficient if you don't have to copy anything, but once memes are around you cannot survive unless you do. So brains had to get bigger, and big brains are costly to produce, dangerous to give birth to and expensive to run.

There is also danger in what is copied. If you start copying anything at all then you might copy dangerous memes, like throwing yourself off a cliff or using up all your resources in pointless rituals. This creates an arms race between two selfish replicators - memes benefiting from brains that copy anything and everything; genes benefiting from brains that are smaller, more efficient and highly selective.

Either of these dangers might have finished our ancestors off, but they pulled through. The result was a compromise, with human brains being just about as big as our bodies could stand, and yet selective enough to avoid copying lethal memes. In the same way that parasites tend to co-evolve with their hosts to become less lethal, so memes co-evolved with us. Languages, religions, skills and fashions that began as parasites turned into symbionts. Not only do we get along with our memes now, we could not live without them.

There was also a cost to the rest of life on Earth. Wherever they went humans took memes with them, spreading agriculture and changing the landscape, obliterating some species, domesticating others and changing whole ecosystems. Then, much more recently, they began to build radically new kinds of technology, and the changes they effected dwarfed anything that had gone before. Was this just more of the same or something new?

In all my previous work in memetics I have used the term "meme" to apply to any information that is copied between people, including stories in books, ideas embodied in new technology, websites and so on. The reason was that there seemed no way of distinguishing between "natural" human memes, such as spoken words, habits, fashions, art and religions, and what we might call "artificial" memes, such as websites and high-tech goods. So on the grounds that a false distinction is worse than none I stuck to the term "meme". Yet an email encrypted in digital code, broken into tiny packets and beamed around the planet does seem qualitatively different from someone shaking hands and saying "Hi". Could there be a fundamental principle lurking here? If we ask what made memes different from genes, would that help us decide what would make a new replicator different from memes?

Putting it that way makes the answer easier to see. Memes are a new kind of information - behaviours rather than DNA - copied by a new kind of machinery - brains rather than chemicals inside cells. This is a new evolutionary process because all of the three critical stages - copying, varying and selection - are done by those brains. So does the same apply to new technology?

There is a new kind of information: electronically processed binary information rather than memes. There is also a new kind of copying machinery: computers and servers rather than brains. But are all three critical stages carried out by that machinery?

We're close. We may even be right on the cusp. Think of programs that write original poetry or cobble together new student essays, or programs that store information about your shopping preferences and suggest books or clothes you might like next. They may be limited in scope, dependent on human input and send their output to human brains, but they copy, select and recombine the information they handle.

Or think of Google. It copies information, selects what it needs and puts the selections together in new variations - that's all three. The temptation is to think that since we designed search engines and other technologies for our own use they must remain subservient to us. But if a new replicator is involved we must think again. Search results go not only to screens for people to look at, but to other programs, commercial applications and even viruses - that's machines copying information to other machines without the intervention of a human brain. From there, we should expect the system to grow rapidly beyond our control and for our role in it to change. We should also expect design to appear spontaneously, and it does. Much of the content on the web is now designed automatically by machines rather than people.

Memes work differently from genes, and digital information works differently from memes, but some general principles apply to them all. The accelerating expansion, the increasing complexity, and the improving interconnectivity of all three are signs that the same fundamental design process is driving them all. Road networks look like vascular systems, and both look like computer networks, because interconnected systems outcompete isolated systems. The internet connects billions of computers in trillions of ways, just as a human brain connects billions of neurons in trillions of ways. Their uncanny resemblance is because they are doing a similar job.

So where do we go from here? We humans were vehicles for the first replicator and copying machinery for the second. What will we be for the third? For now we seem to have handed over most of the storage and copying duties to our new machines, but we still do much of the selection, which is why the web is so full of sex, drugs, food, music and entertainment. But the balance is shifting.

Outnumbered
Last year Google announced that the web had passed the trillion mark, with more than 1,000,000,000,000 unique URLs. Many countries now have nearly as many computers as people, and if you count phones and other connected gadgets they far outnumber people. Even if we all spent all day reading this stuff it would expand faster than we could keep up.

Billions of years ago, free-living bacteria are thought to have become incorporated into living cells as energy-providing mitochondria. Both sides benefited from the deal. Perhaps the same is happening to us now. The growing web of machines we let loose needs us to run the power stations, build the factories that make the computers, and repair things when they go wrong - and will do for some time yet. In return we get entertainment, tedious tasks done for us, facts at the click of a mouse and as much communication as we can ask for. It's a deal we are not likely to turn down.

Yet this shift to a new replicator may be a dangerous tipping point. Our ancestors could have killed themselves off with their large brains and dangerous memes, but they pulled through. This time the danger is to the whole planet. Gadgets like phones and PCs are already using 15 per cent of household power and rising (New Scientist, 23 May, p 17); the web is using over 5 per cent of the world's entire power and rising. We blame ourselves for climate change and resource depletion, but perhaps we should blame this new evolutionary process that is greedy, selfish and utterly blind to the consequences of its own expansion. We at least have the advantage that we can understand what is happening. That must be the first step towards working out what, if anything, to do about it.


Replicators on other planets?
We are able to ask the question "Are we alone in the universe?" because our ancestors created memes, turning Earth into a "two replicator", or R2, planet, rich in language and culture. We are able to contemplate communicating with other worlds because Earth is fast becoming an R3 planet, rich in digital technology that passes information around at the speed of light, and with the potential to send it out into the galaxy. How many other planets have taken a similar course? And why haven't we heard from them yet?

The standard approach to answering that question focuses on the search for extraterrestrial intelligence. In 1961 Frank Drake proposed his famous equation for estimating the number of intelligent civilisations capable of communicating with us in our own galaxy. It includes the rate of star formation, the fraction of stars with planets, the fraction of planets that can sustain life and the fraction that get intelligent life and then technology.
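For reference, the standard form of Drake's equation simply multiplies those factors together:

    N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L

where R_* is the average rate of star formation in the galaxy, f_p the fraction of stars with planets, n_e the number of potentially life-supporting planets per star with planets, f_l the fraction of those on which life actually appears, f_i the fraction of those that develop intelligence, f_c the fraction that produce detectable signals, and L the length of time such civilisations keep signalling.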

Perhaps intelligence and civilisation are not what we should be concentrating on. My analysis based on Universal Darwinism suggests that instead we should be looking for R3 planets. The number of those in our galaxy will depend on the probability of a planet getting a stable first replicator, then a second, and then a third. Maybe each step is hard, or maybe each is easy but dangerous. This new and simpler equation won't tell us the answers, but by posing new questions it may help us understand why - so far - we have not heard from anyone else out there.
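Blackmore never writes her simpler equation out, but one plausible reading of her description (the notation here is mine, purely illustrative) would be:

    N_{R3} = N_p \cdot P(R_1) \cdot P(R_2) \cdot P(R_3)

where N_p is the number of suitable planets in the galaxy and P(R_1), P(R_2), P(R_3) are the probabilities of a planet acquiring a stable first, second and third replicator.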

Susan Blackmore is a writer and psychologist based in Devon, UK

Tuesday 28 July 2009

Singularity is coming


Machines evolve faster than we humans do.

Is that something to worry about? Remember HAL, from 2001: A Space Odyssey.

Some scientists think so.
In the photograph, a robot that plugs itself in when it perceives its battery reserves are low.
The report is from the New York Times.


July 26, 2009
Scientists Worry Machines May Outsmart Man
By JOHN MARKOFF
A robot that can open doors and find electrical outlets to recharge itself. Computer viruses that no one can stop. Predator drones, which, though still controlled remotely by humans, come close to a machine that can kill autonomously.

Impressed and alarmed by advances in artificial intelligence, a group of computer scientists is debating whether there should be limits on research that might lead to loss of human control over computer-based systems that carry a growing share of society’s workload, from waging war to chatting with customers on the phone.

Their concern is that further advances could create profound social disruptions and even have dangerous consequences.

As examples, the scientists pointed to a number of technologies as diverse as experimental medical systems that interact with patients to simulate empathy, and computer worms and viruses that defy extermination and could thus be said to have reached a “cockroach” stage of machine intelligence.

While the computer scientists agreed that we are a long way from Hal, the computer that took over the spaceship in “2001: A Space Odyssey,” they said there was legitimate concern that technological progress would transform the work force by destroying a widening range of jobs, as well as force humans to learn to live with machines that increasingly copy human behaviors.

The researchers — leading computer scientists, artificial intelligence researchers and roboticists who met at the Asilomar Conference Grounds on Monterey Bay in California — generally discounted the possibility of highly centralized superintelligences and the idea that intelligence might spring spontaneously from the Internet. But they agreed that robots that can kill autonomously are either already here or will be soon.

They focused particular attention on the specter that criminals could exploit artificial intelligence systems as soon as they were developed. What could a criminal do with a speech synthesis system that could masquerade as a human being? What happens if artificial intelligence technology is used to mine personal information from smart phones?

The researchers also discussed possible threats to human jobs, like self-driving cars, software-based personal assistants and service robots in the home. Just last month, a service robot developed by Willow Garage in Silicon Valley proved it could navigate the real world.

A report from the conference, which took place in private on Feb. 25, is to be issued later this year. Some attendees discussed the meeting for the first time with other scientists this month and in interviews.

The conference was organized by the Association for the Advancement of Artificial Intelligence, and in choosing Asilomar for the discussions, the group purposefully evoked a landmark event in the history of science. In 1975, the world’s leading biologists also met at Asilomar to discuss the new ability to reshape life by swapping genetic material among organisms. Concerned about possible biohazards and ethical questions, scientists had halted certain experiments. The conference led to guidelines for recombinant DNA research, enabling experimentation to continue.

The meeting on the future of artificial intelligence was organized by Eric Horvitz, a Microsoft researcher who is now president of the association.

Dr. Horvitz said he believed computer scientists must respond to the notions of superintelligent machines and artificial intelligence systems run amok.

The idea of an “intelligence explosion” in which smart machines would design even more intelligent machines was proposed by the mathematician I. J. Good in 1965. Later, in lectures and science fiction novels, the computer scientist Vernor Vinge popularized the notion of a moment when humans will create smarter-than-human machines, causing such rapid change that the “human era will be ended.” He called this shift the Singularity.

This vision, embraced in movies and literature, is seen as plausible and unnerving by some scientists like William Joy, co-founder of Sun Microsystems. Other technologists, notably Raymond Kurzweil, have extolled the coming of ultrasmart machines, saying they will offer huge advances in life extension and wealth creation.

“Something new has taken place in the past five to eight years,” Dr. Horvitz said. “Technologists are replacing religion, and their ideas are resonating in some ways with the same idea of the Rapture.”

The Kurzweil version of technological utopia has captured imaginations in Silicon Valley. This summer an organization called the Singularity University began offering courses to prepare a “cadre” to shape the advances and help society cope with the ramifications.

“My sense was that sooner or later we would have to make some sort of statement or assessment, given the rising voice of the technorati and people very concerned about the rise of intelligent machines,” Dr. Horvitz said.

The A.A.A.I. report will try to assess the possibility of “the loss of human control of computer-based intelligences.” It will also grapple, Dr. Horvitz said, with socioeconomic, legal and ethical issues, as well as probable changes in human-computer relationships. How would it be, for example, to relate to a machine that is as intelligent as your spouse?

Dr. Horvitz said the panel was looking for ways to guide research so that technology improved society rather than moved it toward a technological catastrophe. Some research might, for instance, be conducted in a high-security laboratory.

The meeting on artificial intelligence could be pivotal to the future of the field. Paul Berg, who was the organizer of the 1975 Asilomar meeting and received a Nobel Prize for chemistry in 1980, said it was important for scientific communities to engage the public before alarm and opposition becomes unshakable.

“If you wait too long and the sides become entrenched like with G.M.O.,” he said, referring to genetically modified foods, “then it is very difficult. It’s too complex, and people talk right past each other.”

Tom Mitchell, a professor of artificial intelligence and machine learning at Carnegie Mellon University, said the February meeting had changed his thinking. “I went in very optimistic about the future of A.I. and thinking that Bill Joy and Ray Kurzweil were far off in their predictions,” he said. But, he added, “The meeting made me want to be more outspoken about these issues and in particular be outspoken about the vast amounts of data collected about our personal lives.”

Despite his concerns, Dr. Horvitz said he was hopeful that artificial intelligence research would benefit humans, and perhaps even compensate for human failings. He recently demonstrated a voice-based system that he designed to ask patients about their symptoms and to respond with empathy. When a mother said her child was having diarrhea, the face on the screen said, “Oh no, sorry to hear that.”

A physician told him afterward that it was wonderful that the system responded to human emotion. “That’s a great idea,” Dr. Horvitz said he was told. “I have no time for that.”


Wednesday 22 July 2009

Darwin, psychology and the way we spend our money

Is Darwin Running Up Your Credit Cards?
by Laura Rowley
Posted on Wednesday, July 15, 2009, 12:00AM
If you're struggling with overspending and don't know where the money's going, Darwin may provide the answer.
In the new book "Spent: Sex, Evolution and Consumer Behavior," evolutionary psychologist Geoffrey Miller argues humans are instinctively driven to spend money in an effort to display winning qualities and high status to others. And, not surprisingly, that can result in dysfunctional spending behaviors.
Conspicuous consumption "is not an inevitable outcome of human nature, but it's an understandable way that human nature will try to display itself in a market economy," explains Miller, who teaches at the University of New Mexico. "So instead of trying to attract mates and friends by being the best mammoth hunter, we try to be the best lawyer or the most successful entrepreneur, and display success through the goods and services we buy." Miller's book doesn't examine purchases that are merely useful or pleasurable. I buy a certain kind of Nike running shoe because it minimizes the painful stabbing in my left foot (plantar fasciitis), not because I unconsciously strive to signal my fitness to potential mates (which could complicate my marriage).
The Trouble With Marketing
But Miller suggests a good chunk of spending is prompted by unconscious desires we have evolved over millennia to signal certain traits to others, such as youth (botox), fertility (Carrie Prejean's pre-pageant breast implants) and status (billionaire Larry Ellison's yacht, which, at 138 meters, apparently measures 10 meters longer than Paul Allen's). Other core traits humans attempt to display include openness, agreeableness, conscientiousness, stability and extraversion, as well as general intelligence, Miller says. "Spent" looks at the historic shift in business from a production orientation to a marketing orientation. Instead of making something and then figuring out how to convince you to buy it, smart companies are figuring out what you actually want from your products and supplying it. While it might result in happier consumers, Miller is not sure it bodes well for our collective soul.
"I think on the one hand marketing is absolutely wonderful. I'm really glad that Starbucks figured out that what you want from a coffee shop is not just decent coffee but comfortable chairs, good lighting, magazines, WiFi and a place to socialize and hang out," he says. "But the more seductive those consumer experiences are, the harder it is to save money and avoid debt and jump off the consumerist treadmill. It's an arms race of sophistication between marketers and consumers."
As marketers become increasingly savvy in associating their physical products with desirable display traits, it's easier to lose oneself in narcissism -- flaunting and chasing an endless array of fitness indicators. Consider the person who buys Glaceau Smart Water at $5.20 a gallon -- or 870 times the price of tap -- hoping to show off status and intelligence (now there's an irony).
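As an aside, that price multiple is easy to check. Here is a quick back-of-envelope calculation (my arithmetic, not the article's):

```python
# Implied price of tap water, given the article's figures.
smart_water = 5.20            # dollars per gallon, as quoted
tap = smart_water / 870       # "870 times the price of tap"
print(f"implied tap price: ${tap:.4f} per gallon")   # about $0.006
# That works out to roughly $6 per thousand gallons, in the same
# ballpark as typical US municipal water rates, so the 870x multiple
# is at least plausible.
```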
"Narcissism is a runaway personality disorder where somebody pours too much effort and energy into trait display, and not enough into following up relationships over the long term that may have been started by the effective trait display," Miller explains. "The narcissist will effectively keep investing all his time and energy and money in display and never reap the emotional rewards of long-term relationships that those displays lead to. Narcissists care a lot about getting deference and respect from strangers. but won't cultivate relationships with the strangers worth getting to know. It's exactly what marketers want them to do because it maximizes consumer spending."
Conversation Tops Consumerism
What's somewhat disturbing (or possibly hilarious) about all the money spent on consumer goods in pursuit of desirable trait-display is that most people simply don't notice. "Social psychologists have found we remember someone's age, sex, race, possibly how physically attractive they were or some impression of social class," says Miller. "But we typically do not remember the pants they wore, the specifics about their watch or the car they were driving. Even if you talk to them over dinner, you'll get mostly a general impression about their personality or level of intelligence."
And after relationships are established, we rightly focus on more important matters like character, action and words. "The fundamentalist consumer delusion that products and brands matter, that they constitute a reasonable set of life aspirations, seems … infantile, inhuman and essentially toxic," Miller writes.
Although evolution may be driving misguided materialistic displays in an attempt to communicate our fitness, Miller argues that it doesn't have to be so.
"The cool thing about signaling is it's very non-materialistic -- it's not about taking in energy and matter to support our health as an organism but about sending symbols and signals back and forth to others," he says. "It's also appreciating the full complexity of human nature and romance and friendship, and saying we're not just after fertility or youth, but we also care about moral virtues like kindness, agreeableness and intelligence and seek that in humans we like to hang out with. Evolutionary biology actually tries to offer a vision of human nature that's more consistent with the way mature adults actually socialize -- which is not caring about physical appearance or wealth."
Bottom line, human beings who are aware of their instinctual drives to impress others will recognize that it pays to shop less and talk more. "We already have the most powerful signaling methods evolved in any species, which is language," says Miller. "The added value you get from consumerism is pretty small. People fall in love mostly through conversation. Given the richness of that signaling, what you happen to wear or the brand you favor might add 10 percent to the information which is already conveyed."
Perhaps that's the secret of the people profiled in the classic book "The Millionaire Next Door." Authors Thomas Stanley and William Danko found that many millionaires are self-made businesspeople who live in the first home they bought, drive used cars and are modest in their material displays.
"Those men and women have figured out that attracting mates and friends happens through conversation anyway," Miller suggests, adding that instead of buying stuff to display their wealth, they can talk about their passion for business. "It provides the same information about success as owning all the trinkets, but it's a lot cheaper."

Thursday 16 July 2009

The evolution of politeness

How politeness evolved

By Alan Boyle


Taking turns isn't just a nice idea. It may be as much a part of the theory of evolution as survival of the fittest - at least that's the conclusion that British researchers reached after running a genetic simulation through thousands of generations of evolutionary change.

Turn-taking behavior seems to come naturally to humans, whether it's standing in line or deciding who's going to do the dishes tonight. But such behavior has been observed in a wide variety of other species as well: Chimps take turns grooming each other, for example, and penguins take turns minding their eggs.

"It is far from obvious how turn-taking evolved without language or insight in animals shaped by natural selection to pursue their individual self-interests," University of Leicester psychologist Andrew Colman said last week in a news release about the research.

Colman and his colleague Lindsay Browning looked into the evolution of politeness for a paper published in the September issue of the journal Evolutionary Ecology Research - not by studying actual monkeys, penguins or line-standers, but by setting up a series of genetic simulations where they could dictate the rules of the evolutionary game.

The experiment was as much an exercise in game theory as in evolutionary biology. Colman and Browning programmed a computer to play a variety of games in which the payoff varied depending on whether the simulated players made the same or different choices.

One of the best-known games in this genre is the Prisoner's Dilemma, in which two prisoners receive different penalties depending on whether they defect or stay loyal to each other. Under the most common rules of the game, the most frequent outcome is for the prisoners to rat on each other, even though they would have been better off if they had both stayed loyal.

"The Prisoner's Dilemma, which is being used to study cooperation almost exclusively to date, doesn't ever give any advantage to automata that take turns," Colman told me. "In fact, it's created a blind spot in studying this issue, in our opinion."

He and Browning mixed up the repertoire by using six games, including the Prisoner's Dilemma as well as variations of cooperative games known as the Battle of the Sexes and Stag Hunt. They also built in a little mathematical mutation to duplicate what biologists have found happens in real life. Then they ran the simulation through 2,000 evolutionary generations. Each 2,000-generation simulation was repeated 10 times to check the stability of the results.
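To make that setup concrete, here is a minimal sketch in Python (my illustration, not the authors' actual code) of one such genetic simulation: memory-one strategies playing a repeated Battle of the Sexes, with fitness-proportional selection and bit-flip mutation. The population size, payoffs, partner count and mutation rate are assumptions chosen for readability, not values from the paper.

```python
import random

# Battle of the Sexes payoffs, indexed by (row_move, col_move).
# Alternating between (0, 0) and (1, 1) pays both sides 1.5 on average,
# better than either player can guarantee by insisting on one outcome.
PAYOFFS = {(0, 0): (2, 1), (1, 1): (1, 2), (0, 1): (0, 0), (1, 0): (0, 0)}

POP, ROUNDS, PARTNERS, GENS = 60, 50, 3, 500   # illustrative, not the paper's values
MUTATION = 0.01                                # per-bit mutation probability

def move(genome, my_last, opp_last):
    """Memory-one strategy: an opening bit plus one bit per joint history."""
    if my_last is None:
        return genome[0]
    return genome[1 + 2 * my_last + opp_last]

def play(g_row, g_col):
    """One repeated game; returns each player's average payoff per round."""
    r_last = c_last = None
    r_total = c_total = 0
    for _ in range(ROUNDS):
        r = move(g_row, r_last, c_last)
        c = move(g_col, c_last, r_last)
        pr, pc = PAYOFFS[(r, c)]
        r_total += pr
        c_total += pc
        r_last, c_last = r, c
    return r_total / ROUNDS, c_total / ROUNDS

def evolve():
    pop = [[random.randint(0, 1) for _ in range(5)] for _ in range(POP)]
    for _ in range(GENS):
        fitness = [1e-9] * POP                 # tiny floor keeps weights valid
        for i in range(POP):
            for _ in range(PARTNERS):          # each agent meets a few partners
                j = random.randrange(POP)
                if i == j:
                    continue
                fi, fj = play(pop[i], pop[j])
                fitness[i] += fi
                fitness[j] += fj
        # Fitness-proportional selection, then bit-flip mutation.
        parents = random.choices(pop, weights=fitness, k=POP)
        pop = [[bit ^ (random.random() < MUTATION) for bit in g] for g in parents]
    return pop

def alternates(g1, g2):
    """True if a pair settles into strict (0,0)/(1,1) turn-taking."""
    history, a_last, b_last = [], None, None
    for _ in range(10):
        a = move(g1, a_last, b_last)
        b = move(g2, b_last, a_last)
        history.append((a, b))
        a_last, b_last = a, b
    return history[-4:] in ([(0, 0), (1, 1)] * 2, [(1, 1), (0, 0)] * 2)

if __name__ == "__main__":
    final = evolve()
    pairs = [(random.choice(final), random.choice(final)) for _ in range(200)]
    rate = sum(alternates(a, b) for a, b in pairs) / len(pairs)
    print(f"sampled pairs that settle into turn-taking: {rate:.0%}")
```

In runs of this kind, populations can lock into the alternating pattern without any individual "planning" it, which is the effect the researchers describe below.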

Here's how the experiment turned out: Under the right conditions, different players locked themselves into a pattern of mutually beneficial turn-taking that could sustain itself indefinitely.

"They didn't have the benefit of language to plan any strategy such as that," Colman said. "It could be something that just evolves through natural selection, just with hard wiring."

One factor was key, he said: "You've got to have two different types, because they've got to behave in different ways in the same situation in order to initiate this behavior. Without this genetic diversity, the behavior cannot evolve."

Even though game theorists may cast this diversity as a battle of the sexes (for example, she likes opera, he likes boxing), Colman emphasized that the diversity he had in mind was not necessarily a gender split, a la "Men Are From Mars, Women Are From Venus."

"I always tell my students, 'Women are from Earth, men are from Earth ... deal with it,'" he joked.

Rather, the diversity may take the form of different responses to environmental changes (for example, becoming more dormant to conserve energy vs. becoming more active to seek out new food sources). Colman said turn-taking appears to be an instance of the "invisible hand" of natural selection at work.

"The assumption in the early days of evolutionary theory was that evolution would tend to make all organisms conform to an optimal form, and this would tend to reduce diversity. ... That turned out to be a primitive idea and not sufficiently subtle," he told me.

The fact that so many species exhibit turn-taking behavior suggests that the genetic code for cooperative behavior goes way back, Colman said. And that's a good thing, whether you're a yeast organism trying to metabolize sugar, an eel hunting for food in a coral reef ... or a filmgoer standing in line to see the latest "Harry Potter" movie.

"Humans obviously engage in turn-taking behavior. Queueing is an elaborate example of it," Colman said. "What this shows is that it's probably deep in our DNA. You don't have to necessarily assume that this is something that developed recently just because we're a civilized species."

Now it's your turn: Does this research shed new light on evolutionary theory? Is it merely a case of scientists stating the obvious? Or do you think "survival of the fittest" really doesn't explain turn-taking and other forms of altruistic behavior? Feel free to weigh in with your comments below.

2009/07/14/1996118.aspx

Thursday 9 July 2009

Evolution is: live longer

If you want to know how evolved a country is, look at its life expectancy. The longer people live, the better that place surely is. But what about extending our lifetimes? Why can't we dream of a future in which dying is just an option? Maybe that is too long-term for you. But as of today we are one step closer.
Organ transplant drug extends life of older mice
By Seth Borenstein, AP Science Writer – Wed Jul 8, 2:53 pm ET
WASHINGTON – A drug used to prevent the rejection of organ transplants was found to significantly increase the life span of older mice, researchers report. The drug rapamycin is the first to work for both male and female mice, according to a study published online in the journal Nature. The National Institute on Aging is testing compounds that may extend the life span of mice.

The drug couldn't be used for that purpose in people. It suppresses the human immune system to prevent a transplant recipient's body from attacking the donated tissues, raising the odds of disease.

Researchers didn't start the medicine on the mice until they were about 600 days old, the equivalent of about 60 years for people. Despite that delay, the rapamycin seemed to work, said lead author David Harrison of the Jackson Laboratory in Bar Harbor, Maine.

That surprised and impressed gerontologist George Martin at the University of Washington, who was not part of the study.

Females fed rapamycin lived 14 percent longer than those that didn't take the drug. For males, it was 9 percent longer.

Randy Strong, a study co-author and professor of pharmacology at the University of Texas Health Science Center in San Antonio, said it is the equivalent of adding six extra years of life to men and eight years for women.
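Those two sets of numbers can be reconciled with a little arithmetic. Here is a quick check (my calculation, not the researchers'):

```python
# If a 9% gain equals 6 extra years for men and a 14% gain equals
# 8 extra years for women, what baselines do those figures imply?
for group, pct, years in [("men", 0.09, 6), ("women", 0.14, 8)]:
    print(f"{group}: {years} / {pct:.0%} = ~{years / pct:.0f} years baseline")
# men: ~67 years; women: ~57 years. The quoted equivalences evidently
# rest on assumptions beyond simple life expectancy at birth, which is
# worth keeping in mind when reading such translations to human years.
```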

Rapamycin had already been shown to extend life in yeast, worms and fruit flies.

"This is most promising," said Nancy Nadon, of the National Institute on Aging and another study co-author. She said the key is to find other compounds that target the same cellular pathway without the harmful side effects of rapamycin.

Earlier studies showed that resveratrol, which is in red wine, extended the life of obese mice. Unlike resveratrol, rapamycin worked on normal size mice of both genders, Harrison said.

Tuesday 7 July 2009

How well known is Darwin worldwide?

About 70% of the world's population knows of him. In Egypt he is almost unknown. In the United States he is widely known, but 42% of the population prefers not to believe his theory of evolution.
Lots of people think that evolution by natural selection and God are compatible; that is not my opinion.
The results come from the British Council's Darwin Now initiative.
http://www.britishcouncil.org/darwin-about-us.htm#survey

Monday 6 July 2009

Flu virus evolving faster than our science

Viruses continue to evolve faster than our intelligence. Beware of this new mutation. Here in Uruguay thousands of people have the new flu and are putting their confidence in Tamiflu.

Tamiflu-resistant swine flu patient found in Japan: govt
Thu Jul 2, 5:19 pm ET
TOKYO (AFP) – A genetic mutation of swine flu that is resistant to the anti-viral Tamiflu has been discovered in Japan, the first such case in the country, the health ministry said.
It was the second reported case of Tamiflu resistance linked to swine flu in less than a week.
The latest case was found in a patient who had been given the drug since first being diagnosed with A(H1N1) around two weeks ago, Kyodo news agency reported Thursday, citing the Health, Welfare and Labour Ministry.
The patient -- a woman in Osaka prefecture -- was recovering after having been given Relenza, an alternative anti-flu medication, the report said.
A spokeswoman for Swiss pharmaceuticals giant Roche, which makes Tamiflu, said the company had been informed of the case and called it "normal."
"It is absolutely normal," she said, adding that "0.4 percent of adults develop resistance" to Tamiflu.
She said the case does not indicate Tamiflu has become less effective against swine flu.
Danish authorities announced earlier this week they had discovered resistance to Tamiflu in a female patient. Relenza was also used successfully to treat her.
According to the latest World Health Organization figures, Japan has 1,266 reported cases of swine flu, but has so far recorded no fatalities.

Thursday 2 July 2009

The construction of the global brain

Evolution, Revolution and Punctuated Equilibrium
By Denis Pombriant
CRM Buyer
Part of the ECT News Network
07/01/09 4:00 AM PT

One of the big questions flying around the Enterprise 2.0 conference last week was whether we're looking at a revolution or an evolution. The answer "both" might sound like a cop-out, but in a larger scope, the two forces really are part of the same continuum. The name for it is "punctuated equilibrium" -- a revolution made possible by years of stealthy evolution.



Kudos to all those who participated in, organized or even attended the Enterprise 2.0 conference in drizzly Boston last week. There is a lot to write about.

The big ideas that I took away include disruption and evolution, ROI and a need to sharpen our focus. Here are a few thoughts on a very good show.


Disruption and Evolution
The Tuesday keynotes generated needless confusion by asking a simple question: Is Enterprise 2.0 a revolution or an evolution? Such a question is often resolved in a cowardly compromise to split the difference. As they used to say on SNL, "It's a floor wax and a dessert topping!"

But not so fast. In this case, splitting the difference by saying it's both is not far off the mark. It is both revolution and evolution, but only because some people prefer to see a difference between the two. In fact, evolution experts might tell you that the two are part of the same continuum. They even have a name for it: punctuated equilibrium. Enterprise 2.0 is a revolution (punctuation) made possible by years of stealthy evolution (equilibrium) -- small changes with incremental effects that, with critical mass, result in the revolution we see.

Many of the sessions I attended struck that tone. One of the best was "Networked: How the 2.0 Enterprise Makes Itself Transparent, Participatory and Collaborative," by the husband and wife team of Jessica Lipnack and Jeff Stamps from Netage.

The thought that sticks with me though is how hard it is to achieve punctuation as time goes on. Entrenched interests from the last revolution draw a lesson from their own success and work to prevent the same disruption from happening to them. Look at Iran, for example. No more street protests, thank you very much.

A good point made in a keynote by Matthew Fraser, co-author of Throwing Sheep in the Boardroom, is that the enterprise phase of Enterprise 2.0 failed to ignite because entrenched, hierarchical interests in corporations successfully thwarted it. The social networking revolution, which started at the grass roots, is the result. The question now is less whether than how social networking and social media will scale corporate walls.

ROI Is Not Important
I just love this one because I am a disruptive thinker, and showing the ROI analysis for me is like having to be constantly reminded to say "please" and "thank you" and to put my toys away. Your pants are on fire -- do I really need to say, "Pardon the interruption," before I get the extinguisher? Sheesh!

I am back.

Stowe Boyd, CEO of Edglings, said it best. It's not that ROI is unimportant -- many people in enterprises around the world would violently agree with that. However, there are certain times when ROI may be irrelevant, or at least an irrelevant barrier. That time is during a paradigm shift or revolution like the one cited above -- we are suddenly in an "ROI is not important" era.

A real paradigm shift happens quickly, and when it does, it wipes out what stood for business as usual and replaces it with something new. For example, an asteroid hit this planet about 65 million years ago and wiped out most of the dinosaurs. (I say most because there are scientists who think that birds are their direct descendants, and who am I to dicker?) The asteroid was not the paradigm shift, but it caused one, and the shift took many millions of years to fully roll out. However, it was the primary cause that made dinosaurs irrelevant and mammals ascendant.

In business, change is more rapid and usually less violent, but paradigms nonetheless shift dramatically. We are the mammals of earlier shifts. Boyd gave the example that after World War II, the major shift was to place a telephone on each employee's desk. No one at the time could produce an ROI analysis that would provide the justification, and many people worried that the phones would be abused for personal use (sound familiar?), but the shift proceeded. It was unstoppable, and today we couldn't imagine a time when a phone was optional or reserved for a chosen few.

Boyd's point was that placing the phones on desks was so important, such a game changer, that few people waited for the ROI analysis before beginning deployments. Of course there were laggards, but history does not record their names. We could trace the same trajectory through such radical deployments as typewriters, mimeo machines, copiers, faxes and, in our time, PCs. It took about a decade before the full productivity of having a PC on nearly everyone's desk produced the significant change (punctuated equilibrium again) that seemingly overnight produced one of the greatest productivity bursts in history.

Let's Sharpen Our Focus
It's fine to tout the importance of the social revolution taking place right now, the end-point of which is Enterprise 2.0, but there's still a lot of work to do. The greatest challenge is to accept the paradigm shift for what it is and not some kind of extension of an earlier paradigm. We see paradigm extension all the time in the cut-over from one paradigm to another. The old paradigm adopts some aspects of the new in an effort to forestall change. We see it right now in the silly argument about Software as a Service (SaaS) and multi-tenancy's centrality to it all.

At the recent Sales 2.0 conference, also held in Boston, I got a whiff of paradigm extension from most vendors, still confused about social media's centrality and purpose. Social media is not a way to turbocharge your spamming efforts or to round up more low-grade suspects for your pipeline. Sales, and a lot else, need to look at social media with open eyes and a minimum of pre-judgment if we are to be successful in bringing Enterprise 2.0 to life.

So there, in a large nutshell, is my first cut at what happened in Boston last week. Despite the rain, Boston was -- and remains -- a good place to start a revolution.



--------------------------------------------------------------------------------
Denis Pombriant is the managing principal of the Beagle Research Group, a CRM market research firm and consultancy. Pombriant's research concentrates on evolving product ideas and emerging companies in the sales, marketing and call center disciplines. His research is freely distributed through a blog and Web site. He is working on a book and can be reached at denis.pombriant@beagleresearch.com.

Source
http://www.ecommercetimes.com/story/67467.html?u=btreinen&p=ENNSS_38703fcd169dbba37a7cb047d38c0714

Know yourself

The evolutionary way of thinking makes it easier to see ourselves as part of the tree of life and the planet. We are mapping the Earth better and better. Will that mean that we will protect Gaia better in the future?

Most complete Earth map published
The most complete terrain map of the Earth's surface has been published.

The data, comprising 1.3 million images, come from a collaboration between the US space agency Nasa and the Japanese trade ministry.

The images were taken by Japan's Advanced Spaceborne Thermal Emission and Reflection Radiometer (Aster) aboard the Terra satellite.

The resulting Global Digital Elevation Map covers 99% of the Earth's surface, and will be free to download and use.

The Terra satellite, dedicated to Earth monitoring missions, has shed light on issues ranging from algal blooms to volcano eruptions.

For the Aster measurements, local elevation was mapped with each point just 30m apart.
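Those specifications imply an enormous dataset. Here is a rough estimate of the number of elevation postings involved (the land area and grid assumptions are mine, not figures from the article):

```python
# Back-of-envelope size of a 30 m global elevation grid.
land_km2 = 149e6                    # approximate land area of the Earth (assumed)
coverage = 0.99                     # fraction covered, as quoted
points_per_km2 = (1000 / 30) ** 2   # one posting every 30 m in each direction
postings = land_km2 * coverage * points_per_km2
print(f"~{postings:.1e} elevation postings")   # on the order of 1.6e11
```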

"This is the most complete, consistent global digital elevation data yet made available to the world," said Woody Turner, Nasa programme scientist on the Aster mission.

"This unique global set of data will serve users and researchers from a wide array of disciplines that need elevation and terrain information."

Previously, the most complete such topographic map was Nasa's Shuttle Radar Topography Mission, covering 80% of the Earth's surface. However, the mission's results were less accurate in steep terrain and in some deserts.

Nasa is now working to combine those data with the new Aster observations to further improve on the global map.

Source
http://news.bbc.co.uk/2/hi/science/nature/8126197.stm

Wednesday 1 July 2009

The evolution of media: Twitter

The extraordinary amount of news coverage the mainstream media has recently devoted to Twitter has led some to think the press is in love with the 3-year-old microblogging service. But it's a jealous love.

Twitter's constantly updating record of up-to-the-minute reaction has in some instances threatened to usurp media coverage of breaking news. It has also helped many celebrities, athletes and politicians bypass the media to get their message directly to their audience.

Make no mistake about it, Twitter has in many ways been a boon to the media. It's one more way a story might go viral and it's arguably the best way for a news outlet to get closer to its readership. Most outlets now have a presence on Twitter with a feed directing readers to their respective sites.

But even in an Internet world that has for years eroded the distance between media and consumer, Twitter is a jolt of democratization to journalism.

To date, the most salient, powerful example of Twitter's influence has been Iranian protesters using the service (among many other methods) to assemble marches against what they feel has been an unjust election.

Early in the protests, the State Department even urged Twitter to put off maintenance that would have temporarily cut off service. Twitter is difficult for governments to block because tweets — 140 characters or less — can be uploaded from mobile phones like a text message. (The Iranian government has nevertheless often succeeded in blocking Twitter, Facebook and other social networks.)

Further, many Americans were upset at what they considered CNN's thin early coverage of the revolution in Iran and voiced their complaints (where else?) on Twitter. Some said they preferred news on Twitter to the cable news network.

Twitter also produced eyewitness accounts of the Mumbai terrorist attacks last year. And when the US Airways jetliner crashed into New York's Hudson River, Twitter was among the first places photos of the landing were linked.

Many users have become accustomed to clicking on Twitter when news breaks. There, they can find a sea of reaction, commentary and links to actual articles.

The popular technology blog TechCrunch recently questioned whether Twitter is "the CNN of the new media generation."
Source: http://news.yahoo.com/s/ap/20090701/ap_on_hi_te/us_web_twitter_and_media

Tuesday 30 June 2009

Oceans rising with global warming

We change our environment without being conscious that we may be committing suicide as a species.

Oceans Rising Faster Than UN Forecast, Scientists Say

By Alex Morales
June 18 (Bloomberg) -- Polar ice caps are melting faster and oceans are rising more than the United Nations projected just two years ago, 10 universities said in a report suggesting that climate change has been underestimated.

Global sea levels will climb a meter (39 inches) by 2100, 69 percent more than the most dire forecast made in 2007 by the UN’s climate panel, according to the study released today in Brussels. The forecast was based on new findings, including that Greenland’s ice sheet is losing 179 billion tons of ice a year.
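The "69 percent more" figure also lets us recover the earlier projection. A quick sanity check (my arithmetic, not the report's):

```python
# What 2007 forecast is one metre "69 percent more" than?
new_projection = 1.0                    # metres by 2100, per the study
implied_2007 = new_projection / 1.69
print(f"implied 2007 upper bound: {implied_2007:.2f} m")
# ~0.59 m, which matches the top of the range the UN climate panel
# published in 2007 (about 0.18-0.59 m).
```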

“We have to act immediately and we have to act strongly,” Hans Joachim Schellnhuber, director of Germany’s Potsdam Institute for Climate Impact Research, told reporters in the Belgian capital. “Time is clearly running out.”

In six months, negotiators from 192 nations will meet in Copenhagen to broker a new treaty to fight global warming by limiting the release of greenhouse gases from burning fossil fuels and clearing forests.

“A lukewarm agreement” in the Danish capital “is not only inexcusable, it would be reckless,” Schellnhuber said.

Fossil-fuel combustion in the world's power plants, vehicles and heaters alone released 31.5 billion metric tons of carbon dioxide, the main greenhouse gas, in 2008, 1.8 percent more than in 2007, according to calculations from BP Plc data.

Look at the response we got today:

India Rejects Any Greenhouse-Gas Cuts Under New Climate Treaty

June 30 (Bloomberg) -- India said it will reject any new treaty to limit global warming that makes the country reduce greenhouse-gas emissions because that will undermine its energy consumption, transportation and food security.

Cutting back on climate-warming gases is a measure that instead must be taken by industrialized countries, and India is mobilizing developing nations to push that case, Environment Minister Jairam Ramesh told the media today in New Delhi.

“India will not accept any emission-reduction target -- period,” Ramesh said. “This is a non-negotiable stand.”

India, which has more than 800 million people living on less than $2 a day, is talking with Brazil, China and South Africa on taking a common stand in international negotiations that richer countries like the U.S. and Britain must reduce their emissions 45 percent by the year 2020 from 1990 levels.