by Yohan J. John
We are routinely told that we live in a brave new Information Age. Every aspect of human life — commerce, entertainment, education, and perhaps even the shape of consciousness itself — seems to be undergoing an information-driven revolution. The tools for storing and sharing information are becoming faster, more ubiquitous, and less visible. Meanwhile, we are increasingly employing information as an explanation of phenomena outside the world of culture and technology — as the central metaphor with which to talk about the nature of life and mind. Molecular biology, for instance, tells us how genetic information is transferred from one generation to the next, and from one cell to the next. And neuroscience is trying to tell us how information from the external world and the body percolates through the brain, influencing behavior and giving rise to conscious experience.
But do we really know what information is in the first place? And is it really a helpful way to think about biological phenomena? I'd like to argue that explanations of natural phenomena that invoke information make inappropriate use of our latent, unexamined intuitions about interpersonal communication, blurring the line between what we understand and what we don't quite have a grip on yet.
People who use information technologies presumably have a working definition of information. We often treat it as synonymous with data: whatever can be stored on a hard drive or downloaded from the internet. This covers text, images, sound, and video — anything that can be represented in bits and bytes. Vision and hearing are the senses we rely on most for communication, so it's easy to forget that there are still experiences we cannot really communicate yet, like textures, odors, or tastes. (Smellevision still seems a long way off.)
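To see what "represented in bits and bytes" means concretely, here is a minimal Python sketch (the payload string is just an arbitrary example of my own): text becomes bytes, and bytes are nothing more than patterns of bits. Whether those bits encode a word, a pixel, or an audio sample is a matter of convention, not a property of the bits themselves.

```python
# A toy illustration: "data" is just bytes, and bytes are just bits.
# The word "information" here is an arbitrary choice of payload.
text = "information"
raw = text.encode("utf-8")                     # 11 bytes on disk or on the wire
bits = "".join(f"{byte:08b}" for byte in raw)  # the same thing, as 88 bits
print(list(raw))   # [105, 110, 102, 111, ...]
print(bits)        # 0110100101101110...
```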
The data-centric conception of information is a little over half a century old, and it sits alongside a much older sense of the word. 'Information' comes from the verb 'inform', which derives from the Old French informer, meaning 'instruct' or 'teach'. That word in turn derives from the Latin informare, 'to give shape or form to'. The concept of form is closely linked to this older sense of information: when something is informative, it creates a specific form or structure in the mind of the receiver — one that is presumably useful.
But there is a tension between seeing information as a unit of communication and seeing it as something that allows a sender to create a desired effect in the mind of a receiver. This tension goes back to the origins of information theory. Claude Shannon introduced the modern technical notion of information in 1948, in a paper called "A Mathematical Theory of Communication". He framed his theory in terms of a transmitter, a channel, and a receiver. The mathematical results he derived showed how any signal can be coded as a series of discrete symbols and transmitted with arbitrarily high fidelity between sender and receiver, even when the channel is noisy. But for the purposes of the theory, the meaning or content of the information was irrelevant. The theory explained how to send symbols efficiently from point A to point B, but had nothing to say about what was actually done with those symbols. All that mattered was that the sender and receiver agreed on a system of encoding and decoding. Information theory, and all the technologies that emerged in its wake, allows us to communicate more and communicate faster, but it doesn't really tell us everything we would like to know about communication.
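Shannon's sender-channel-receiver picture can be made concrete with a toy example. The Python sketch below (the repetition code and the 10% noise level are my illustrative choices, not Shannon's far more efficient constructions) sends bits through a channel that randomly flips some of them, and recovers the message by adding redundancy. Notice that nothing in the code knows or cares what the bits mean.

```python
import random

def encode(bits, r=5):
    # Repetition code: send each bit r times. Redundancy is what buys
    # reliability on a noisy channel (at the cost of transmission rate).
    return [bit for bit in bits for _ in range(r)]

def noisy_channel(bits, p=0.1):
    # Flip each transmitted bit independently with probability p.
    return [bit ^ 1 if random.random() < p else bit for bit in bits]

def decode(received, r=5):
    # Majority vote over each block of r received bits.
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

message = [1, 0, 1, 1, 0, 0, 1, 0]   # what these bits *mean* is irrelevant
received = decode(noisy_channel(encode(message)))
print(message == received)            # True the vast majority of the time
```

Raising r pushes the error rate toward zero, but only by sending ever more bits. Shannon's surprising result was that far cleverer codes can make errors arbitrarily rare while staying at a fixed, efficient rate, so long as that rate stays below the channel's capacity.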


