ON WEDNESDAY, HUNDREDS of feet below ground in Europe, a proverbial switch will be pulled on the Large Hadron Collider, a new multibillion-dollar machine designed to smash subatomic particles together at immense speeds. The device could help physicists rewrite the rules of the universe. It could also, just possibly, do something else: create a tiny black hole that would result in the end of all life as we know it.
Most scientists are confident that the danger is vanishingly small, and a number of research papers have concluded the experiment is safe. But are the potential gains to science really worth even a tiny risk of eradicating the earth? This question, writ large, is the province of a group of scholars who study potential global catastrophe. At the center of their work lies an almost unanswerable question: How should we deal with very unlikely threats that also carry the potential to extinguish human civilization?
This past July, specialists convened in Oxford, England, for the first Global Catastrophic Risks Conference. The group included philosophers, physicists, and sociologists; aside from the huge particle accelerator, they looked at the threat of massive asteroid collisions, gamma-ray bursts from supernovas that could sterilize the planet, man-made nanobots that could replicate and consume the earth’s surface, and out-of-control artificial intelligence.