The Cul-de-Sac of the Computational Metaphor

Rodney A. Brooks at Edge:

I’m going to go over a wide range of things, and everyone will likely find something in them to disagree with. I want to start out by saying that I’m a materialist reductionist. As I talk, some people might get a little worried that I’m going off like Chalmers or something, but I’m not. I’m a materialist reductionist.

I’m worried that the crack cocaine of Moore’s law, which has given us more and more computation, has lulled us into thinking that that’s all there is. When you look at Claus Pias’s introduction to the Macy Conferences book, he writes, “The common precondition of the three foundational concepts of cybernetics—switching (Boolean) algebra, information theory and feedback—is digitality.” The conference participants go straight to digitality. He says, “We considered Turing’s universal machine as a ‘model’ for brains, employing Pitts’ and McCulloch’s calculus for activity in neural nets.” Anyone who has looked at the Pitts and McCulloch papers knows it presents a very primitive view of what is happening in neurons. But they adopted Turing’s universal machine.
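
To see how spare that calculus is, here is a minimal sketch of a McCulloch-Pitts unit in Python (the names and the gate examples are illustrative, not from the 1943 paper): binary inputs, a fixed threshold, and absolute inhibition, with no learning, timing, or chemistry.

```python
# Minimal McCulloch-Pitts threshold unit (the 1943 model), for illustration.
# Inputs and output are binary; the unit fires iff the weighted sum of its
# inputs meets a fixed threshold and no inhibitory input is active.

def mcp_neuron(inputs, weights, threshold, inhibitory=()):
    """Return 1 if the unit fires, else 0.

    inputs     -- sequence of 0/1 values
    weights    -- matching sequence of non-negative weights
    threshold  -- firing threshold
    inhibitory -- indices of inputs that veto firing when active
    """
    if any(inputs[i] for i in inhibitory):   # absolute inhibition
        return 0
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Logic gates fall out directly, which is exactly what made the model
# attractive to the cyberneticists:
AND = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=1)
assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
```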

How did Turing come up with Turing computation? In his 1936 paper, he talks about a human computer. Interestingly, he uses the male pronoun, although most human computers at the time were women. A human computer had a piece of paper, wrote things down, and followed rules; that was his model of computation, which we have come to accept.
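
Turing's formalization mechanizes exactly that picture: a tape of symbols, a read/write head, and a finite table of rules to follow. A minimal sketch in Python (the rule-table format and the bit-flipping program are illustrative, not from the 1936 paper):

```python
# Minimal Turing machine: a tape, a head, and a finite rule table --
# a mechanization of Turing's human computer following written rules.

def run(tape, rules, state="start", halt="halt"):
    tape = dict(enumerate(tape))           # sparse tape; blank cells read "_"
    pos = 0
    while state != halt:
        symbol = tape.get(pos, "_")
        new_symbol, move, state = rules[(state, symbol)]
        tape[pos] = new_symbol             # write
        pos += 1 if move == "R" else -1    # move head left or right
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Rule table: (state, symbol read) -> (symbol to write, head move, next state).
# This illustrative program flips every bit, then halts on the first blank.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", flip_bits))  # -> 0100
```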

More here.