
Is technology poised to develop machines that can outsmart their human creators?

And what will happen to mere mortals if such superintelligent machines arise?

These will be among the questions pondered when experts in artificial intelligence, brain research and other futuristic fields gather at Stanford University on Saturday for what is being called the Singularity Summit.

Borrowed from physics, the term "singularity" suggests a horizon beyond which we can't see. It describes the point at which some form of intelligence spawned by technology gains the ability to rapidly improve its own programming -- becoming so powerful that we cannot predict what it might do. At that point, its capabilities could exceed even the power of our imaginations.

"This could be very, very good if we get it right, and very, very bad if we get it wrong,'' said Eliezer Yudkowsky, a research fellow with the Singularity Institute for Artificial Intelligence, a nonprofit group in Palo Alto that is co-sponsoring the event.

The lineup of speakers will include inventor and author Ray Kurzweil, whose recent book, "The Singularity Is Near," argues that...