The American Institute of Physics website contains a fascinating and enjoyably dense history of the science of climatology. Particularly pleasurable is the website’s explication of related developments in chaos theory, computer science, and hermeneutics, all of which deeply affected climatology’s evolution as a science:
The more people worked with computers, the more examples they found of oddly unstable results. Start two computations with exactly the same initial conditions, and they must always come to precisely the same conclusion. But make the slightest change in the fifth decimal place of some initial number, and as the machine cycled through thousands of arithmetic operations the difference might grow and grow, in the end giving a seriously different result. Of course people had long understood that a pencil balanced on its point could fall left or right depending on the tiniest difference in initial conditions, to say nothing of the quantum uncertainties. Scientists had always supposed that this kind of situation only arose under radically simplified circumstances, far from the stable balance of real-world systems like global climate. It was not until the 1950s, when people got digital machines that could do many series of huge computations, that a few began to wonder whether their surprising sensitivity pointed to some fundamental difficulty.
In 1961, an accident cast new light on the question. Luck in science comes to those in the right place and time with the right set of mind, and that was where Edward Lorenz stood. He was at the Massachusetts Institute of Technology, where development of computer models was in the air, and intellectually he was one of a new breed of professionals who were combining meteorology with mathematics. Lorenz had devised a simple computer model that produced impressive simulacra of weather patterns. One day he decided to repeat a computation in order to run it longer from a particular point. His computer worked things out to six decimal places, but to get a compact printout he had truncated the numbers, printing out only the first three digits. Lorenz entered these digits back into his computer. After a simulated month or so, the weather pattern diverged from the original result. A difference in the fourth decimal place was amplified in the thousands of arithmetic operations, spreading through the computation to bring a totally new outcome. “It was possible to plug the uncertainty into an actual equation,” Lorenz later recalled, “and watch the things grow, step by step.”
Lorenz was astonished. While the problem of sensitivity to initial numbers was well known in abstract mathematics, and computer experts were familiar with the dangers of truncating numbers, he had expected his system to behave like real weather. The truncation errors in the fourth decimal place were tiny compared with any of a hundred minor factors that might nudge the temperature or wind speed from minute to minute. Lorenz had assumed that such variations could lead only to slightly different solutions for the equations, “recognizable as the same solution a month or a year afterwards… and it turned out to be quite different from this.” Storms appeared or disappeared from the weather forecasts as if by chance.
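The divergence Lorenz stumbled on is easy to reproduce on a laptop today. Below is a minimal sketch of the same effect; it is not the 1961 weather model from the anecdote (whose details the quote doesn’t give), but the three-variable system Lorenz published in 1963, with the standard parameters σ=10, ρ=28, β=8/3. Two copies of the system start with initial conditions differing only in the fourth decimal place, roughly the truncation from six digits to three that Lorenz describes, and the gap between them is printed as it grows.

```python
# Sensitive dependence on initial conditions, illustrated with the Lorenz '63
# system (not the weather model from the 1961 anecdote). A perturbation in the
# fourth decimal place eventually grows until the two runs no longer resemble
# each other.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz '63 equations."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt):
    """Advance the state by one fourth-order Runge-Kutta step of size dt."""
    k1 = lorenz(state)
    k2 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = lorenz(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Two runs, identical except for a 0.0001 difference in the first coordinate --
# the kind of difference introduced by printing six decimals as three.
a = (1.000000, 1.000000, 1.000000)
b = (1.000100, 1.000000, 1.000000)

dt = 0.01
for step in range(1, 5001):
    a = rk4_step(a, dt)
    b = rk4_step(b, dt)
    if step % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * dt:5.1f}   separation = {gap:.6f}")
```

Run it and the separation stays tiny at first, then balloons to the size of the attractor itself, which is the whole story of the truncated printout in miniature.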
Full article here.
For the American Petroleum Institute’s wholly predictable take on the unpredictability inherent in forecasting drastic climate change, click here.
A shorter article from Physics Today can be found here.