Jay Elwes in Prospect:
Each day, humans create 2.5 quintillion bytes of data. A byte is the amount of data needed by a computer to encode a single letter. A quintillion is one followed by 18 zeros. We float on an ocean of data.
You’d arrive at an even bigger number if you put it in terms of “bits”, the ultimate basic building block out of which every wonder of the digital age is built. A bit is simply a one or a zero or, equivalently, a single switch inside an electronic processor that must be either on or off. Put eight in a row, and you’ve got enough combinations to label and store every character on your keyboard—there are thus eight bits to the byte.
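As a quick illustration of the arithmetic (a small Python sketch, not part of the original article): eight switches, each on or off, give 2**8 = 256 distinct patterns, enough to assign one to every character on a keyboard.

```python
# Eight bits, each a one or a zero, yield 2**8 = 256 distinct patterns.
n_patterns = 2 ** 8
print(n_patterns)  # 256

# A single letter stored as one byte: 'A' is code point 65,
# i.e. the bit pattern 01000001.
letter = "A"
byte_value = ord(letter)
bits = format(byte_value, "08b")
print(byte_value, bits)  # 65 01000001

# Round trip: the same eight ons-and-offs decode back to the letter.
assert chr(int(bits, 2)) == letter
```

The round trip is the whole point: any sequence of eight ons and offs names exactly one of the 256 patterns, and the mapping back is unambiguous.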
These days your newspapers, your tax records, your shopping list and perhaps your love life are nothing more than a long series of “ons” and “offs” generated by the digital processors that lurk in your phone, your car, or your TV. The correct sequence of ones and zeros is all that computers need in order to control the traffic lights at the end of your street, run a nuclear power station, or find you a date for next Friday night. From one perspective, they are simply doing—on a vast scale—the tallying and reckoning we have always done on our fingers: on our digits.
The “digital age” is a colossal achievement of human ingenuity. But this world of ones and zeros is not an end state. Humankind has passed through other ages before: bronze, iron, the era of steam and then of the telegraph, each of which constituted a revolution, before being brought to a close by some further advance of human ingenuity. And that raises a question—if our present digital age will pass just like all the rest, what might come after it?
We are starting to see the answer to that question, and it looks as though the successor to the age of the digital computer will be a startlingly new kind of device—the quantum computer.
More here.