Joel Achenbach in The Washington Post:
The world’s spookiest philosopher is Nick Bostrom, a thin, soft-spoken Swede. Of all the people worried about runaway artificial intelligence, and killer robots, and the possibility of a technological doomsday, Bostrom conjures the most extreme scenarios. In his mind, human extinction could be just the beginning.

Bostrom’s favorite apocalyptic hypothetical involves a machine that has been programmed to make paper clips (although any mundane product will do). This machine keeps getting smarter and more powerful, but never develops human values. It achieves “superintelligence.” It begins to convert all kinds of ordinary materials into paper clips. Eventually it decides to turn everything on Earth — including the human race (!!!) — into paper clips. Then it goes interstellar.

“You could have a superintelligence whose only goal is to make as many paper clips as possible, and you get this bubble of paper clips spreading through the universe,” Bostrom calmly told an audience in Santa Fe, N.M., earlier this year. He added, maintaining his tone of understatement, “I think that would be a low-value future.”
Bostrom’s underlying concerns about machine intelligence, unintended consequences and potentially malevolent computers have gone mainstream. You can’t attend a technology conference these days without someone bringing up the A.I. anxiety. It hovers over the tech conversation with the high-pitched whine of a 1950s-era Hollywood flying saucer. People will tell you that even Stephen Hawking is worried about it. And Bill Gates. And that Elon Musk gave $10 million for research on how to keep machine intelligence under control. All that is true.

How this came about is as much a story about media relations as it is about technological change. The machines are not on the verge of taking over. This is a topic rife with speculation and perhaps a whiff of hysteria.
More here.