Peter Conrad in The Guardian:
Revolutions usually leave ancient institutions tottering, societies shaken, the streets awash with blood. But what Walter Isaacson calls the “digital revolution” has kept its promise to liberate mankind. Enrichment for the few has been balanced by empowerment for the rest of us, and we can all – as the enraptured Isaacson says – enjoy a “sublime user experience” when we turn on our computers. Wikipedia gives us access to a global mind; on social media we can chat with friends we may never meet and who might not actually exist; blogs “democratise public discourse” by giving a voice to those who were once condemned to mute anonymity. Has heaven really come down to our wired-up, interconnected Earth?
What Isaacson sees as an eruption of communal creativity began with two boldly irreligious experiments: an attempt to manufacture life scientifically, followed by a scheme for a machine that could think. After Mary Shelley’s Frankenstein stitched together his monster, Byron’s bluestocking daughter Ada Lovelace described how Charles Babbage’s “analytical engine” could numerically replicate the “changes of mutual relationship” that occur in God’s creation. Unlike Shelley’s mad scientist, Lovelace stopped short of challenging the official creator: the apparatus, she insisted, had “no pretension to originate anything”.

A century later, political necessity quashed this pious dread. The computing pioneers of the 1930s and 40s, as Isaacson points out, served military objectives. At MIT, Vannevar Bush’s differential analyser churned out artillery firing tables, and at Bletchley Park, after the war began, an all-electronic computer called Colossus deciphered German codes. Later, the US air force and navy gobbled up all available microchips to guide warheads aimed at targets in Russia or Cuba; only when the price of chips dropped could they power consumer products rather than weapons.