The Worst Lies You’ve Been Told About the Singularity

George Dvorsky in io9:

In a nutshell, the Technological Singularity is a term used to describe the theoretical moment in time when artificial intelligence matches and then exceeds human intelligence. The term was popularized by scifi writer Vernor Vinge, but full credit goes to the mathematician John von Neumann, who spoke of [in the words of Stanislaw Ulam] “ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”

By “not continue” von Neumann was referring to the potential for humanity to lose control and fall outside the context of its technologies. Today, this technology is assumed to be artificial intelligence, or more accurately, recursively-improving artificial intelligence (RIAI), leading to artificial superintelligence (ASI).

Because we cannot predict the nature and intentions of an artificial superintelligence, we have come to refer to this sociological event horizon as the Technological Singularity — a concept that’s open to wide interpretation, and consequently, gross misunderstanding. Here are the worst:

“The Singularity Is Not Going to Happen”

Oh, I wouldn’t bet against it. The onslaught of Moore’s Law appears to be unhindered, while breakthroughs in brain mapping and artificial intelligence continue apace. There are no insurmountable conceptual or technological hurdles awaiting us.

And what most ASI skeptics fail to understand is that we have yet to even enter the AI era, a time when powerful — but narrow — systems subsume many domains currently occupied by humans. There will be tremendous incentive to develop these systems, for both economic and security reasons. Superintelligence will eventually appear, likely the product of megacorporations and the military.

More here.