Eliezer Yudkowsky, one of the founders of the Singularity Institute for Artificial Intelligence, believes that the singularity will lead to an "intelligence explosion" as super-intelligent machines design even more intelligent machines, with each generation repeating this process.
(Peter Singer)
[Ethics in the Real World: 82]