George Dyson in Edge:
Data mining, on the scale now practiced by Google and the NSA, is the realization of what Alan Turing was getting at, in 1939, when he wondered “how far it is possible to eliminate intuition, and leave only ingenuity,” in postulating what he termed an “Oracle Machine.” He had already convinced himself of the possibility of what we now call artificial intelligence (in his more precise terms, mechanical intelligence) and was curious as to whether intuition could be similarly reduced to a mechanical procedure—although it might (indeed should) involve non-deterministic steps. He assumed, for sake of argument, that “we do not mind how much ingenuity is required, and therefore assume it to be available in unlimited supply.”
And, as if to discount disclaimers by the NSA that they are only capturing metadata, Turing, whose World War II work on the Enigma would make him one of the patron saints of the NSA, was already explicit that it is the metadata that count. If Google has taught us anything, it is that if you simply capture enough links, over time, you can establish meaning, follow ideas, and reconstruct someone's thoughts. It is only a short step from suggesting what a target may be thinking now, to suggesting what that target may be thinking next.
Does this not promise a safer world, protected not only from bad actors attempting to do dangerous things, but from bad actors developing dangerous thoughts? Yes, but at what cost? There's a problem, and it's the problem that Alan Turing was trying to answer when he first set us on this path. Turing delivered us into the digital age, as a 24-year-old graduate student, not by building a computer, but by writing a purely mathematical paper, "On Computable Numbers, with an Application to the Entscheidungsproblem," published in 1936. The Decision Problem, articulated by Göttingen's David Hilbert, concerned the abstract mathematical question of whether there could ever be any systematic mechanical procedure to determine, in a finite number of steps, whether any given string of symbols represented a provable statement or not.
The answer was no. In modern computational terms (which just happened to be how, in an unexpected stroke of genius, Turing framed his argument), no matter how much digital horsepower you have at your disposal, there is no systematic way to determine, in advance, what any given string of code is going to do except to let the code run and find out.
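The result Dyson is invoking is what we now call the undecidability of the halting problem, which Turing established by a diagonal argument. The following is only a rough sketch of that argument, not Turing's own construction; the names `would_halt` and `trouble` are hypothetical, introduced purely for illustration of how any supposed decide-in-advance procedure defeats itself:

```python
# Diagonal-argument sketch: suppose a function would_halt(program, data) could
# always tell, WITHOUT running it, whether program(data) eventually halts.
# The program `trouble` below turns any such decider against itself,
# so no such decider can exist.

def would_halt(program, data):
    """Hypothetical oracle: return True iff program(data) would eventually halt.
    Turing's 1936 result is that no total, always-correct procedure of this
    kind can exist."""
    raise NotImplementedError("provably impossible in general")


def trouble(program):
    # Ask the oracle what `program` does when fed its own text,
    # then do the opposite.
    if would_halt(program, program):
        while True:      # oracle says "halts" -> loop forever
            pass
    return "halted"      # oracle says "loops" -> halt immediately


# Feeding `trouble` to itself produces the contradiction: if would_halt
# predicts it halts, it loops; if would_halt predicts it loops, it halts.
# Either answer is wrong, so in general the only way to learn what a piece
# of code does is to run it and find out.
```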
Nicholas G. Carr's criticism can be found here, and Dyson's rejoinder here.