Tuesday 28 July 2009

Singularity is coming


Machines evolve faster than we humans do.

Is that something to worry about? Remember HAL, from "2001: A Space Odyssey"?

Some scientists think so.
In the photograph, a robot that plugs itself in when it perceives that its battery reserves are low.
The report is from The New York Times.


July 26, 2009
Scientists Worry Machines May Outsmart Man
By JOHN MARKOFF
A robot that can open doors and find electrical outlets to recharge itself. Computer viruses that no one can stop. Predator drones, which, though still controlled remotely by humans, come close to a machine that can kill autonomously.

Impressed and alarmed by advances in artificial intelligence, a group of computer scientists is debating whether there should be limits on research that might lead to loss of human control over computer-based systems that carry a growing share of society’s workload, from waging war to chatting with customers on the phone.

Their concern is that further advances could create profound social disruptions and even have dangerous consequences.

As examples, the scientists pointed to a number of technologies as diverse as experimental medical systems that interact with patients to simulate empathy, and computer worms and viruses that defy extermination and could thus be said to have reached a “cockroach” stage of machine intelligence.

While the computer scientists agreed that we are a long way from Hal, the computer that took over the spaceship in “2001: A Space Odyssey,” they said there was legitimate concern that technological progress would transform the work force by destroying a widening range of jobs, as well as force humans to learn to live with machines that increasingly copy human behaviors.

The researchers — leading computer scientists, artificial intelligence researchers and roboticists who met at the Asilomar Conference Grounds on Monterey Bay in California — generally discounted the possibility of highly centralized superintelligences and the idea that intelligence might spring spontaneously from the Internet. But they agreed that robots that can kill autonomously are either already here or will be soon.

They focused particular attention on the specter that criminals could exploit artificial intelligence systems as soon as they were developed. What could a criminal do with a speech synthesis system that could masquerade as a human being? What happens if artificial intelligence technology is used to mine personal information from smart phones?

The researchers also discussed possible threats to human jobs, like self-driving cars, software-based personal assistants and service robots in the home. Just last month, a service robot developed by Willow Garage in Silicon Valley proved it could navigate the real world.

A report from the conference, which took place in private on Feb. 25, is to be issued later this year. Some attendees discussed the meeting for the first time with other scientists this month and in interviews.

The conference was organized by the Association for the Advancement of Artificial Intelligence, and in choosing Asilomar for the discussions, the group purposefully evoked a landmark event in the history of science. In 1975, the world’s leading biologists also met at Asilomar to discuss the new ability to reshape life by swapping genetic material among organisms. Concerned about possible biohazards and ethical questions, scientists had halted certain experiments. The conference led to guidelines for recombinant DNA research, enabling experimentation to continue.

The meeting on the future of artificial intelligence was organized by Eric Horvitz, a Microsoft researcher who is now president of the association.

Dr. Horvitz said he believed computer scientists must respond to the notions of superintelligent machines and artificial intelligence systems run amok.

The idea of an “intelligence explosion” in which smart machines would design even more intelligent machines was proposed by the mathematician I. J. Good in 1965. Later, in lectures and science fiction novels, the computer scientist Vernor Vinge popularized the notion of a moment when humans will create smarter-than-human machines, causing such rapid change that the “human era will be ended.” He called this shift the Singularity.

This vision, embraced in movies and literature, is seen as plausible and unnerving by some scientists like William Joy, co-founder of Sun Microsystems. Other technologists, notably Raymond Kurzweil, have extolled the coming of ultrasmart machines, saying they will offer huge advances in life extension and wealth creation.

“Something new has taken place in the past five to eight years,” Dr. Horvitz said. “Technologists are replacing religion, and their ideas are resonating in some ways with the same idea of the Rapture.”

The Kurzweil version of technological utopia has captured imaginations in Silicon Valley. This summer an organization called the Singularity University began offering courses to prepare a “cadre” to shape the advances and help society cope with the ramifications.

“My sense was that sooner or later we would have to make some sort of statement or assessment, given the rising voice of the technorati and people very concerned about the rise of intelligent machines,” Dr. Horvitz said.

The A.A.A.I. report will try to assess the possibility of “the loss of human control of computer-based intelligences.” It will also grapple, Dr. Horvitz said, with socioeconomic, legal and ethical issues, as well as probable changes in human-computer relationships. How would it be, for example, to relate to a machine that is as intelligent as your spouse?

Dr. Horvitz said the panel was looking for ways to guide research so that technology improved society rather than moved it toward a technological catastrophe. Some research might, for instance, be conducted in a high-security laboratory.

The meeting on artificial intelligence could be pivotal to the future of the field. Paul Berg, who was the organizer of the 1975 Asilomar meeting and received a Nobel Prize for chemistry in 1980, said it was important for scientific communities to engage the public before alarm and opposition becomes unshakable.

“If you wait too long and the sides become entrenched like with G.M.O.,” he said, referring to genetically modified foods, “then it is very difficult. It’s too complex, and people talk right past each other.”

Tom Mitchell, a professor of artificial intelligence and machine learning at Carnegie Mellon University, said the February meeting had changed his thinking. “I went in very optimistic about the future of A.I. and thinking that Bill Joy and Ray Kurzweil were far off in their predictions,” he said. But, he added, “The meeting made me want to be more outspoken about these issues and in particular be outspoken about the vast amounts of data collected about our personal lives.”

Despite his concerns, Dr. Horvitz said he was hopeful that artificial intelligence research would benefit humans, and perhaps even compensate for human failings. He recently demonstrated a voice-based system that he designed to ask patients about their symptoms and to respond with empathy. When a mother said her child was having diarrhea, the face on the screen said, “Oh no, sorry to hear that.”

A physician told him afterward that it was wonderful that the system responded to human emotion. “That’s a great idea,” Dr. Horvitz said he was told. “I have no time for that.”

Ken Conley/Willow Garage

Wednesday 22 July 2009

Darwin, psychology and the way we spend our money

Is Darwin Running Up Your Credit Cards?
by Laura Rowley
Posted on Wednesday, July 15, 2009, 12:00AM
If you're struggling with overspending and don't know where the money's going, Darwin may provide the answer.
In the new book "Spent: Sex, Evolution and Consumer Behavior," evolutionary psychologist Geoffrey Miller argues humans are instinctively driven to spend money in an effort to display winning qualities and high status to others. And, not surprisingly, that can result in dysfunctional spending behaviors.
Conspicuous consumption "is not an inevitable outcome of human nature, but it's an understandable way that human nature will try to display itself in a market economy," explains Miller, who teaches at the University of New Mexico. "So instead of trying to attract mates and friends by being the best mammoth hunter, we try to be the best lawyer or the most successful entrepreneur, and display success through the goods and services we buy." Miller's book doesn't examine purchases that are merely useful or pleasurable. I buy a certain kind of Nike running shoe because it minimizes the painful stabbing in my left foot (plantar fasciitis), not because I unconsciously strive to signal my fitness to potential mates (which could complicate my marriage).
The Trouble With Marketing
But Miller suggests a good chunk of spending is prompted by unconscious desires that we have adapted over millennia to signal certain traits to others, such as youth (botox), fertility (Carrie Prejean's pre-pageant breast implants) and status (billionaire Larry Ellison's yacht, which at 138 meters, apparently measures 10 meters longer than Paul Allen's). Other core traits humans attempt to display include openness, agreeableness, conscientiousness, stability and extraversion, as well as general intelligence, Miller says. "Spent" looks at the historic shift in business from a production orientation to marketing orientation. Instead of selling something and figuring out how to convince you to buy it, smart companies are figuring out what you actually want from your products and supplying it. While it might result in happier consumers, Miller is not sure it bodes well for our collective soul.
"I think on the one hand marketing is absolutely wonderful. I'm really glad that Starbucks figured out that what you want from a coffee shop is not just decent coffee but comfortable chairs, good lighting, magazines, WiFi and a place to socialize and hang out," he says. "But the more seductive those consumer experiences are, the harder it is to save money and avoid debt and jump off the consumerist treadmill. It's an arms race of sophistication between marketers and consumers."
As marketers become increasingly savvy in associating their physical products with desirable display traits, it's easier to lose oneself in narcissism -- flaunting and chasing an endless option of fitness indicators. Consider the person who buys Glaceau Smart Water at $5.20 a gallon -- or 870 times the price of tap -- hoping to show off status and intelligence (now there's an irony).
“Narcissism is a runaway personality disorder where somebody pours too much effort and energy into trait display, and not enough into following up relationships over the long term that may have been started by the effective trait display,” Miller explains. “The narcissist will effectively keep investing all his time and energy and money in display and never reap the emotional rewards of long-term relationships that those displays lead to. Narcissists care a lot about getting deference and respect from strangers, but won't cultivate relationships with the strangers worth getting to know. It's exactly what marketers want them to do because it maximizes consumer spending.”
Conversation Tops Consumerism
What's somewhat disturbing (or possibly hilarious) about all the money spent on consumer goods in pursuit of desirable trait-display is that most people simply don't notice. "Social psychologists have found we remember someone's age, sex, race, possibly how physically attractive they were or some impression of social class," says Miller. "But we typically do not remember the pants they wore, the specifics about their watch or the car they were driving. Even if you talk to them over dinner, you'll get mostly a general impression about their personality or level of intelligence."
And after relationships are established, we rightly focus on more important matters like character, action and words. "The fundamentalist consumer delusion that products and brands matter, that they constitute a reasonable set of life aspirations, seems … infantile, inhuman and essentially toxic," Miller writes.
Although evolution may be driving misguided materialistic displays in an attempt to communicate our fitness, Miller argues that it doesn't have to be so.
"The cool thing about signaling is it's very non-materialistic -- it's not about taking in energy and matter to support our health as an organism but about sending symbols and signals back and forth to others," he says. "It's also appreciating the full complexity of human nature and romance and friendship, and saying we're not just after fertility or youth, but we also care about moral virtues like kindness, agreeableness and intelligence and seek that in humans we like to hang out with. Evolutionary biology actually tries to offer a vision of human nature that's more consistent with the way mature adults actually socialize -- which is not caring about physical appearance or wealth."
Bottom line, human beings who are aware of their instinctual drives to impress others will recognize that it pays to shop less and talk more. "We already have the most powerful signaling methods evolved in any species, which is language," says Miller. "The added value you get from consumerism is pretty small. People fall in love mostly through conversation. Given the richness of that signaling, what you happen to wear or the brand you favor might add 10 percent to the information which is already conveyed."
Perhaps that's the secret of the people profiled in the classic book "The Millionaire Next Door." Authors Thomas Stanley and William Danko found that many millionaires are self-made businesspeople who live in the first home they bought, drive used cars and are modest in their material displays.
"Those men and women have figured out that attracting mates and friends happens through conversation anyway," Miller suggests, adding that instead of buying stuff to display their wealth, they can talk about their passion for business. "It provides the same information about success as owning all the trinkets, but it's a lot cheaper."

Thursday 16 July 2009

The evolution of politeness

How politeness evolved

By Alan Boyle


Taking turns isn't just a nice idea. It may be as much a part of the theory of evolution as survival of the fittest - at least that's the conclusion that British researchers reached after running a genetic simulation through thousands of generations of evolutionary change.

Turn-taking behavior seems to come naturally to humans, whether it's standing in line or deciding who's going to do the dishes tonight. But such behavior has been observed in a wide variety of other species as well: Chimps take turns grooming each other, for example, and penguins take turns minding their eggs.

"It is far from obvious how turn-taking evolved without language or insight in animals shaped by natural selection to pursue their individual self-interests," University of Leicester psychologist Andrew Colman said last week in a news release about the research.

Colman and a university colleague of his, Lindsay Browning, looked into the evolution of politeness for a paper published in the September issue of the journal Evolutionary Ecology Research - not by studying actual monkeys, penguins or line-standers, but by setting up a series of genetic simulations where they could dictate the rules of the evolutionary game.

The experiment was as much an exercise in game theory as in evolutionary biology. Colman and Browning programmed a computer to play a variety of games in which the payoff varied depending on whether the simulated players made the same or different choices.

One of the best-known games in this genre is the Prisoner's Dilemma, in which two prisoners receive different penalties depending on whether they defect or stay loyal to each other. Under the most common rules of the game, the most frequent outcome is for the prisoners to rat on each other, even though they would have been better off if they had both stayed loyal.

"The Prisoner's Dilemma, which is being used to study cooperation almost exclusively to date, doesn't ever give any advantage to automata that take turns," Colman told me. "In fact, it's created a blind spot in studying this issue, in our opinion."

He and Browning mixed up the repertoire by using six games, including the Prisoner's Dilemma as well as variations of cooperative games known as the Battle of the Sexes and Stag Hunt. They also built in a little mathematical mutation to duplicate what biologists have found happens in real life. Then they ran the simulation through 2,000 evolutionary generations. Each 2,000-generation simulation was repeated 10 times to check the stability of the results.

Here's how the experiment turned out: Under the right conditions, different players locked themselves into a pattern of mutually beneficial turn-taking that could sustain itself indefinitely.
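The mechanics of such a simulation can be sketched in a few lines. The code below is a minimal illustration, not the authors' actual model: each agent's genotype is simply a preferred action, pairs play an anti-coordination game in the Battle of the Sexes family (both players earn more when they choose different roles), reproduction is proportional to payoff, and a small mutation rate is applied each generation. The payoff values, population size, and mutation rate are invented for illustration.

```python
import random

# Payoffs for an anti-coordination game (Battle of the Sexes flavour):
# both players do better when they choose *different* actions.
PAYOFF = {  # (my_action, their_action) -> my payoff
    (0, 0): 1, (1, 1): 1,   # clash: low payoff
    (0, 1): 3, (1, 0): 2,   # complementary roles: high payoff
}

MUTATION = 0.01
POP, GENERATIONS = 100, 2000

def run():
    # Each agent's genotype is just its preferred action (0 or 1).
    pop = [random.randint(0, 1) for _ in range(POP)]
    for _ in range(GENERATIONS):
        # Pair agents at random; fitness = payoff earned in one encounter.
        random.shuffle(pop)
        fitness = []
        for i in range(0, POP, 2):
            a, b = pop[i], pop[i + 1]
            fitness += [PAYOFF[(a, b)], PAYOFF[(b, a)]]
        # Fitness-proportional reproduction, plus a little mutation.
        pop = random.choices(pop, weights=fitness, k=POP)
        pop = [g ^ 1 if random.random() < MUTATION else g for g in pop]
    return sum(pop) / POP  # final fraction of type-1 agents

share = run()
print(f"share of type-1 agents after {GENERATIONS} generations: {share:.2f}")
```

Because the game rewards complementary choices, frequency-dependent selection keeps both genotypes in the population rather than driving one to fixation; that persistent mix of two types behaving differently in the same situation is the genetic diversity Colman describes as necessary for turn-taking to get started.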

"They didn't have the benefit of language to plan any strategy such as that," Colman said. "It could be something that just evolves through natural selection, just with hard wiring."

One factor was key, he said: "You've got to have two different types, because they've got to behave in different ways in the same situation in order to initiate this behavior. Without this genetic diversity, the behavior cannot evolve."

Even though game theorists may cast this diversity as a battle of the sexes (for example, she likes opera, he likes boxing), Colman emphasized that the diversity he had in mind was not necessarily a gender split, a la "Men Are From Mars, Women Are From Venus."

"I always tell my students, 'Women are from Earth, men are from Earth ... deal with it,'" he joked.

Rather, the diversity may take the form of different responses to environment changes (for example, becoming more dormant to conserve energy vs. becoming more active to seek out new food sources). Colman said turn-taking appears to be an instance of the "invisible hand" of natural selection at work.

"The assumption in the early days of evolutionary theory was that evolution would tend to make all organisms conform to an optimal form, and this would tend to reduce diversity. ... That turned out to be a primitive idea and not sufficiently subtle," he told me.

The fact that so many species exhibit turn-taking behavior suggests that the genetic code for cooperative behavior goes way back, Colman said. And that's a good thing, whether you're a yeast organism trying to metabolize sugar, an eel hunting for food in a coral reef ... or a filmgoer standing in line to see the latest "Harry Potter" movie.

"Humans obviously engage in turn-taking behavior. Queueing is an elaborate example of it," Colman said. "What this shows is that it's probably deep in our DNA. You don't have to necessarily assume that this is something that developed recently just because we're a civilized species."

Now it's your turn: Does this research shed new light on evolutionary theory? Is it merely a case of scientists stating the obvious? Or do you think "survival of the fittest" really doesn't explain turn-taking and other forms of altruistic behavior? Feel free to weigh in with your comments below.


Thursday 9 July 2009

Evolution is: live longer

If you want to know how evolved a country is, look at its life expectancy. The longer its people live, the better that place surely is. But what about extending our lifetimes? Why can't we dream of a future in which dying is just an option? Maybe that sounds extremely long-term to you. But as of today we are one step closer.
Organ transplant drug extends life of older mice
By Seth Borenstein, AP Science Writer – Wed Jul 8, 2:53 pm ET
WASHINGTON – A drug used to prevent the rejection of organ transplants was found to significantly increase the life span of older mice, researchers report. The National Institute on Aging is testing compounds that may extend the life span of mice. The drug rapamycin is the first to work for both male and female mice, according to a study published online in the journal Nature.

The drug couldn't be used for that purpose in people. It suppresses the human immune system to prevent a transplant recipient's body from attacking the donated tissues, raising the odds of disease.

Researchers didn't start the medicine on the mice until they were about 600 days old, the equivalent of about 60 years for people. Despite that delay, the rapamycin seemed to work, said lead author David Harrison of the Jackson Laboratory in Bar Harbor, Maine.

That surprised and impressed gerontologist George Martin at the University of Washington, who was not part of the study.

Females fed rapamycin lived 14 percent longer than those that didn't take the drug. For males, it was 9 percent longer.

Randy Strong, a study co-author and professor of pharmacology at the University of Texas Health Science Center in San Antonio, said it is the equivalent of adding six extra years of life to men and eight years for women.

Rapamycin already extended life for yeast, worms and fruit flies.

"This is most promising," said Nancy Nadon, of the National Institute on Aging and another study co-author. She said the key is to find other compounds that target the same cellular pathway without the harmful side effects of rapamycin.

Earlier studies showed that resveratrol, which is in red wine, extended the life of obese mice. Unlike resveratrol, rapamycin worked on normal size mice of both genders, Harrison said.

Tuesday 7 July 2009

How much is Darwin known worldwide?

About 70% of the world's population has heard of him. In Egypt he is almost unknown. In the United States he is widely known, but 42% of the population prefers not to believe in his theory of evolution.
Many people think that evolution by natural selection and God are compatible; that is not my opinion.
The results are from the British Council's Darwin Now initiative.
http://www.britishcouncil.org/darwin-about-us.htm#survey

Monday 6 July 2009

Flu virus evolving faster than our science

Viruses continue to evolve faster than our science. Beware of this new mutation. Here in Uruguay thousands of people have the new flu and are counting on Tamiflu.

Tamiflu-resistant swine flu patient found in Japan: govt
Thu Jul 2, 5:19 pm ET
TOKYO (AFP) – A genetic mutation of swine flu that is resistant to the anti-viral Tamiflu has been discovered in Japan, the first such case in the country, the health ministry said.
It was the second reported case of Tamiflu resistance linked to swine flu in less than a week.
The latest case was found in a patient who had been given the drug since first being diagnosed with A(H1N1) around two weeks ago, Kyodo news agency reported Thursday, citing the Health, Welfare and Labour Ministry.
The patient -- a woman in Osaka prefecture -- was recovering after having been given Relenza, an alternative anti-flu medication, the report said.
A spokeswoman for Swiss pharmaceuticals giant Roche, which makes Tamiflu, said the company had been informed of the case and called it "normal."
"It is absolutely normal," she said, adding that "0.4 percent of adults develop resistance" to Tamiflu.
She said the case does not indicate Tamiflu has become less effective against swine flu.
Danish authorities announced earlier this week they had discovered resistance to Tamiflu in a female patient. Relenza was also used successfully to treat her.
According to the latest World Health Organization figures, Japan has 1,266 reported cases of swine flu, but has so far recorded no fatalities.

Thursday 2 July 2009

The construction of the global brain

Evolution, Revolution and Punctuated Equilibrium
By Denis Pombriant
CRM Buyer
Part of the ECT News Network
07/01/09 4:00 AM PT

One of the big questions flying around the Enterprise 2.0 conference last week was whether we're looking at a revolution or an evolution. The answer "both" might sound like a cop-out, but in a larger scope, the two forces really are part of the same continuum. The name for it is "punctuated equilibrium" -- a revolution made possible by years of stealthy evolution.



Kudos to all those who participated in, organized or even attended the Enterprise 2.0 conference in drizzly Boston last week. There is a lot to write about.

The big ideas that I took away include disruption and evolution, ROI and a need to sharpen our focus. Here are a few thoughts on a very good show.


Disruption and Evolution
The Tuesday keynotes generated needless confusion by asking a simple question: Is Enterprise 2.0 a revolution or an evolution? Such a question is often resolved in a cowardly compromise to split the difference. As they used to say on SNL, "It's a floor wax and a dessert topping!"

But not so fast. In this case, splitting the difference by saying it's both is not far off the mark. It is both revolution and evolution, but only because some people prefer to see a difference between the two. In fact, evolution experts might tell you that the two are part of the same continuum. They even have a name for it: punctuated equilibrium. Enterprise 2.0 is a revolution (punctuation) made possible by years of stealthy evolution (equilibrium) -- small changes with incremental effects that, with critical mass, result in the revolution we see.

Many of the sessions I attended struck that tone. One of the best was "Networked: How the 2.0 Enterprise Makes Itself Transparent, Participatory and Collaborative," by the husband and wife team of Jessica Lipnack and Jeff Stamps from Netage.

The thought that sticks with me though is how hard it is to achieve punctuation as time goes on. Entrenched interests from the last revolution draw a lesson from their own success and work to prevent the same disruption from happening to them. Look at Iran, for example. No more street protests, thank you very much.

A good point made in a keynote by Matthew Fraser, co-author of "Throwing Sheep in the Boardroom," is that the enterprise phase of Enterprise 2.0 failed to ignite because entrenched, hierarchical interests in corporations successfully thwarted it. The social networking revolution, which started at the grass roots, is the result. The question to be answered now is less whether but how social networking and social media will scale corporate walls.

ROI Is Not Important
I just love this one because I am a disruptive thinker, and showing the ROI analysis for me is like having to be constantly reminded to say "please" and "thank you" and to put my toys away. Your pants are on fire -- do I really need to say, "Pardon the interruption," before I get the extinguisher? Sheesh!

I am back.

Stowe Boyd, CEO, Edglings, said it best. It's not that ROI is unimportant -- many people in enterprises around the world would violently agree with that. However, there are certain times when ROI may be irrelevant, or at least an irrelevant barrier. That time is during a paradigm shift or revolution cited above -- we are in an "ROI is not important" era all of a sudden.

A real paradigm shift happens quickly, and when it does, it wipes out what stood for business as usual and replaces it with something new. For example, an asteroid hit this planet about 65 million years ago and wiped out most of the dinosaurs. (I say most because there are scientists who think that birds are their direct descendants, and who am I to dicker?) The asteroid was not the paradigm shift, but it caused one, and the shift took many millions of years to fully roll out. However, it was the primary cause that made dinosaurs irrelevant and mammals ascendant.

In business, change is more rapid and usually less violent, but paradigms nonetheless shift dramatically. We are the mammals of earlier shifts. Boyd gave the example that after World War II, the major shift was to place a telephone on each employee's desk. No one at the time could produce an ROI analysis that would provide the justification, and many people worried that the phones would be abused for personal use (sound familiar?), but the shift proceeded. It was unstoppable, and today we couldn't imagine a time when a phone was optional or reserved for a chosen few.

Boyd's point was that placing the phones on desks was so important, such a game changer, that few people waited for the ROI analysis before beginning deployments. Of course there were laggards, but history does not record their names. We could trace the same trajectory through such radical deployments as typewriters, mimeo machines, copiers, faxes and, in our time, PCs. It took about a decade before the full productivity of having a PC on nearly everyone's desk produced the significant change (punctuated equilibrium again) that seemingly overnight produced one of the greatest productivity bursts in history.

Let's Sharpen Our Focus
It's fine to tout the importance of the social revolution taking place right now, the end-point of which is Enterprise 2.0, but there's still a lot of work to do. The greatest challenge is to accept the paradigm shift for what it is and not some kind of extension of an earlier paradigm. We see paradigm extension all the time in the cut-over from one paradigm to another. The old paradigm adopts some aspects of the new in an effort to forestall change. We see it right now in the silly argument about Software as a Service (SaaS) and multi-tenancy's centrality to it all.

At the recent Sales 2.0 conference, also held in Boston, I got a whiff of paradigm extension from most vendors still confused about social media's centrality and purpose. Social media is not a way to turbo charge your spamming efforts or to round up more low-grade suspects for your pipeline. Sales and a lot else needs to look at social media with open eyes and a minimum of pre-judgment if we are to be successful in bringing Enterprise 2.0 to life.

So there, in a large nutshell, is my first cut at what happened in Boston last week. Despite the rain, Boston was -- and remains -- a good place to start a revolution.



--------------------------------------------------------------------------------
Denis Pombriant is the managing principal of the Beagle Research Group, a CRM market research firm and consultancy. Pombriant's research concentrates on evolving product ideas and emerging companies in the sales, marketing and call center disciplines. His research is freely distributed through a blog and Web site. He is working on a book and can be reached at denis.pombriant@beagleresearch.com.

Source
http://www.ecommercetimes.com/story/67467.html?u=btreinen&p=ENNSS_38703fcd169dbba37a7cb047d38c0714

Know yourself

The evolutionary way of thinking makes it easier to see ourselves as part of the tree of life and the planet. We are mapping the Earth better and better. Will that mean we will protect Gaia better in the future?

Most complete Earth map published
The most complete terrain map of the Earth's surface has been published.

The data, comprising 1.3 million images, come from a collaboration between the US space agency Nasa and the Japanese trade ministry.

The images were taken by Japan's Advanced Spaceborne Thermal Emission and Reflection Radiometer (Aster) aboard the Terra satellite.

The resulting Global Digital Elevation Map covers 99% of the Earth's surface, and will be free to download and use.

The Terra satellite, dedicated to Earth monitoring missions, has shed light on issues ranging from algal blooms to volcano eruptions.

For the Aster measurements, local elevation was mapped with each point just 30m apart.

"This is the most complete, consistent global digital elevation data yet made available to the world," said Woody Turner, Nasa programme scientist on the Aster mission.

"This unique global set of data will serve users and researchers from a wide array of disciplines that need elevation and terrain information."

Previously, the most complete such topographic map was Nasa's Shuttle Radar Topography Mission, covering 80% of the Earth's surface. However, the mission's results were less accurate in steep terrain and in some deserts.

Nasa is now working to combine those data with the new Aster observations to further improve on the global map.

Source
http://news.bbc.co.uk/2/hi/science/nature/8126197.stm

Wednesday 1 July 2009

The evolution of media Twitter

The extraordinary amount of news coverage the mainstream media has recently devoted to Twitter has led some to think the press is in love with the 3-year-old microblogging service. But it's a jealous love.

Twitter's constantly updating record of up-to-the-minute reaction has in some instances threatened to usurp media coverage of breaking news. It has also helped many celebrities, athletes and politicians bypass the media to get their message directly to their audience.

Make no mistake about it, Twitter has in many ways been a boon to the media. It's one more way a story might go viral and it's arguably the best way for a news outlet to get closer to its readership. Most outlets now have a presence on Twitter with a feed directing readers to their respective sites.

But even in an Internet world that has for years eroded the distance between media and consumer, Twitter is a jolt of democratization to journalism.

To date, the most salient, powerful example of Twitter's influence has been Iranian protesters using the service (among many other methods) to assemble marches against what they feel has been an unjust election.

Early in the protests, the State Department even urged Twitter to put off maintenance that would have temporarily cut off service. Twitter is difficult for governments to block because tweets — 140 characters or less — can be uploaded from mobile phones like a text message. (The Iranian government has nevertheless often succeeded in blocking Twitter, Facebook and other social networks.)

Further, many Americans were upset at what they considered CNN's thin early coverage of the revolution in Iran and voiced their complaints (where else?) on Twitter. Some said they preferred news on Twitter to the cable news network.

Twitter also produced eyewitness accounts of the Mumbai terrorist attacks last year. And when the US Airways jetliner crashed into New York's Hudson River, Twitter was among the first places photos of the landing were linked.

Many users have become accustomed to clicking on Twitter when news breaks. There, they can find a sea of reaction, commentary and links to actual articles.

The popular technology blog TechCrunch recently questioned whether Twitter is "the CNN of the new media generation."
Source: http://news.yahoo.com/s/ap/20090701/ap_on_hi_te/us_web_twitter_and_media