A Conversation with Neil Gershenfeld at Edge.org:
Digital is one of the most widely misunderstood concepts. In computing there's a notion of a sign bit error, where you calculate something and you get one bit wrong, so the sign is the opposite of what it should be, which means everything you calculate is the opposite of what it's supposed to be. There's a sense in which that's happening right now in maybe three different areas.
Claude Shannon wrote the best master's thesis ever when he was at MIT, inventing digital. He went on to Bell Labs and did two core things. The one that's most interesting for me is he proved the first threshold theorem. What that means is I could send my voice to you today as a wave, or I could send it to you as a symbol. What he showed is if I send it to you as a symbol, then for a linear increase in the resources used to represent the symbol, there is an exponential reduction in the error in your receiving the symbol correctly, as long as the noise is below a threshold. If the noise is above the threshold, you're doomed. If it's below the threshold, a linear increase in the symbol gives you an exponential reduction in error. There are very few exponentials in engineering. That's the big one.
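Shannon's theorem covers channel coding in general, but the threshold behavior described above can be seen in the simplest possible code: repeat each bit n times and decode by majority vote. This is an illustrative sketch, not Shannon's construction; the function name and the example noise levels are my own choices. With per-copy flip probability p below 1/2, the decoding error shrinks exponentially as n grows linearly; above 1/2, more repetitions make things worse.

```python
from math import comb

def majority_error(p: float, n: int) -> float:
    """Probability that majority vote over n noisy copies of a bit
    decodes it wrongly, for odd n and per-copy flip probability p.
    This is the tail of a binomial: at least (n+1)/2 copies flipped."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

# Below the threshold (p = 0.1 < 0.5): linear growth in n,
# roughly exponential decay in error.
for n in (1, 11, 21, 31):
    print(f"p=0.1, n={n:2d}: error = {majority_error(0.1, n):.2e}")

# Above the threshold (p = 0.6 > 0.5): repetition only hurts.
for n in (1, 11, 21):
    print(f"p=0.6, n={n:2d}: error = {majority_error(0.6, n):.2e}")
```

At exactly p = 1/2 the vote is a coin flip no matter how large n is, which is the "doomed" side of the threshold in the passage above.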
What he showed is you can communicate reliably even though the communication medium is unreliable; that's what digital means. That's the essence of digital. It wasn't obvious; Claude Shannon got that. When I was at Bell Labs, Bob Lucky was still around and could tell me stories. Claude Shannon had this idea that we should communicate digitally. There was a real battle between analog communication and digital communication.
The sobering lesson from Bob Lucky is the resolution of the battle was death. The analog managers died and a new generation of digital managers took over. Then we had digital communication, and now the Internet. But the meaning of digital is this threshold property, this exponential scaling.