Alan Turing in Three Words


Michael Saler in the TLS:

B. Jack Copeland’s new biography of the father of modern computing opens with an Alan Turing Test: “Three words to sum up Alan Turing?”. It’s a challenging question for such a multifaceted man. No doubt Watson, the polymath computer that won at Jeopardy! in 2011, could generate apt terms, but so, too, could more ordinary folk, especially given the wide publicity the English mathematician has received during this centenary year of his birth.

Yet Turing’s justified fame was unthinkable a little more than a generation ago. The man who helped defeat the Nazis and create the digital age was known among mathematicians and computer scientists, but few others. His crucial contributions to decrypting German codes during the war remained classified for decades after his death in 1954. And his signal concept of an all-purpose, stored-program computer – the model for our digital devices today – was often attributed to others, from Charles Babbage in the nineteenth century to John von Neumann in the twentieth. (With his unfinished “Analytical Engine”, Babbage was on the right track, but never posited the critical idea of storing programs in memory, enabling a single machine to execute multiple tasks – in effect becoming the “universal” machine residing on our desks and chirping in our pockets. Turing made this breakthrough, which in turn inspired von Neumann’s general architecture for electronic computers that became the industry standard.)

Until his classified war work became public knowledge and the genealogy of modern computing was sorted out, Turing would not have been associated with Newton, Darwin or Einstein – a comparison drawn by Barack Obama in an address to the Houses of Parliament in 2011 – or considered among the “leading figures in the Allied victory over Hitler”, as Copeland does here.

Picasso would have been an unlikely comparison also, but like the restless artist Turing was a fertile innovator, leaving new fields to sprout from his seedlings (computer science, artificial intelligence, mathematical biology), and making pioneering contributions to others (logic, cryptography, statistics).