On “the sequencing of the mathematical genome”
Mathematics is funnier than it gets credit for, and the best laugh I ever had about math involved a friend in college and a course so intimidating he almost quit his mathematics major after hearing the name. “Advanced Calculus—A rigorous approach,” it was called, a title that betrayed a martinet attitude. Whereas your average multivariable calc class was flabby and slack-ass, here you’d finally get some goddamn discipline, boy. You will throw up.
Word around the chalkboard was that every homework problem in “Advanced Calculus—A rigorous approach” required six sheets of paper, because you wrote out every nitpicky step and made every assumption explicit, no matter how obvious—not even arithmetic was taken for granted. For some reason I took endless delight in terrorizing my friend Mark by pointing out all the horrid, spindly-legged theorems in other books he would have to dissect in “Advanced Calculus—A rigorous approach,” predicting the logic would drive him actually mad. Every time I mentioned the class, I used its full draconian title, “Advanced Calculus—A rigorous approach,” and fear of it drove him to the brink both of hyperventilation and of dropping his major, which probably would have meant him dropping out of college.
Mark was spared by an administrative overhaul of the department, so he never took the class. For my part, I’d almost forgotten the whole incident until I came across a curious bundle of papers in a recent issue of the Notices of the American Mathematical Society—four treatises on the future of mathematical proofs, and specifically on how computers were going to take over a large share of the work in them. However unpromising that topic sounds, it soon had my thoughts dilating like a marijuana smoker’s into all sorts of wild conjectures, because it turns out (1) “Advanced Calculus—A rigorous approach” was flabby and slack-ass compared to what’s coming in formal mathematics, and (2) the idea of mathematical beauty might soon be extinct.
First, a few lines of history (cribbed from the papers, natch): The first great revolution in math came from Pythagoras, Euclid, and the rest of the Greeks, who introduced the concept of proofs. You saw examples of this in your high-school geometry class. Then, in the 1800s, came the next big thing, rigor. Oddly, rigor in math is most easily recognized as a feeling: the scrotum-shrinking embarrassment that even people really, really good at college math feel upon realizing that some people are way the hell smarter. Namely, people who do original work in rigorous mathematics.
The next and latest revolution in math was the subject of the Notices papers—formalization. Formalization means tracing math back to fundamental axioms, the kind of migrainous set theory that takes pages to explicate just why two is the successor of one. It turns out there’s currently a movement in mathematics—one the authors of the quartet of papers claim most mathematicians don’t even realize exists, or at least won’t admit to—to make all mathematical proofs fully formal. To basically take a six-page homework problem from “Advanced Calculus—A rigorous approach” and apply even more destructive methods to every single line of that problem, expanding the amount of rigor geometrically if not exponentially.
Why? Because when you tease apart every atom of every line, you can actually convert very hard mathematical concepts into a series of very simple steps. The steps might seem stunted and overly obvious and useless, but, lo and behold, they add up to something in the end. It’s like someone explaining how a car engine works by starting with the theory of bolts screwing onto threads and going into incredibly arcane detail about it, and then repeating that nut-and-bolt explanation every time you come across another bolt. You’d get pissed off, but you’d also probably understand how a car engine worked if he kept breaking everything down to concepts that simple. That’s formal mathematics. And once you’ve checked all the ticky-tack steps between lines of a proof, you can be pretty darn sure it’s correct. One paper’s author called this “the sequencing of the mathematical genome.”
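In a proof assistant, that nut-and-bolt pedantry is literal: you rebuild even the natural numbers from nothing but zero and a successor function, and the machine checks every step. Here’s a toy sketch in Lean (my own illustration, not from the papers—the names are made up), spelling out why one plus one is the successor of one:

```lean
-- Toy Peano-style naturals, built from nothing but zero and successor
inductive N where
  | zero : N
  | succ : N → N

open N

-- Addition, defined one successor at a time
def add : N → N → N
  | a, zero   => a
  | a, succ b => succ (add a b)

-- "One plus one is two": both sides unfold, ticky-tack step by
-- ticky-tack step, to succ (succ zero), so the checker accepts rfl
theorem one_plus_one : add (succ zero) (succ zero) = succ (succ zero) :=
  rfl

-- Even something as "obvious" as zero being a left identity
-- is not free: it takes an induction, with every case spelled out
theorem zero_add (a : N) : add zero a = a := by
  induction a with
  | zero => rfl
  | succ b ih => simp [add, ih]
```

Nothing here is assumed—no built-in arithmetic, no “obviously”—which is exactly the point: the checker will only believe what unfolds from the definitions.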