Four decades have passed since the first small step on the dusty surface of our nearest neighbor in the solar system in 1969. It has been almost that long since the last man to walk on the Moon did so in late 1972. The Apollo missions were a stunning technological achievement and a significant Cold War victory for the United States. However, despite the hope of observers at the time—and despite the nostalgia and mythology that now cloud our memory—Apollo was not the first step into a grand human future in space.

From the perspective of forty years, Apollo, for all its glory, can now be seen as a detour away from a sustainable human presence in space. By and large, the NASA programs that succeeded Apollo have kept us heading down that wrong path: Toward more bureaucracy. Toward higher costs. And away from innovation, from risk-taking, and from any concept of space as a useful place.

In a sense, Apollo occurred too soon. Had you asked the boldest science fiction writers in, say, 1954 whether men would walk on the Moon within a decade and a half, they would have scoffed—and justifiably so. Even though writers of fiction and nonfiction alike had theorized for decades about putting objects into orbit, and even though work was already underway in 1954 to put the first small unmanned satellites into orbit, the notion that we could develop so rapidly the capability to put men on the Moon on a politically feasible budget would have seemed ludicrous.
More from Rand Simberg at The New Atlantis.