by Chris Knight
Noam Chomsky is the world’s most prominent anti-militarist campaigner and, wearing a different hat, the acknowledged founder of modern scientific linguistics. Any attempt to understand Chomsky’s huge influence on modern thought must appreciate the connection between these two roles. And to do this we must begin with a paradoxical fact: Chomsky has spent his career at the Massachusetts Institute of Technology, working in what was originally a military lab. As he himself says about MIT in the 1960s:
[It] was about 90 per cent Pentagon funded at that time. And I personally was right in the middle of it. I was in a military lab. If you take a look at my early publications, they all say something about Air Force, Navy, and so on, because I was in a military lab, the Research Lab for Electronics [RLE].
The Pentagon showed a keen interest in linguistics during this period. Colonel Gaines of the US Air Force recalled why during an interview in 1971. Having referred to the military’s computerised systems of command and control, both for defense against nuclear missiles and for use in Vietnam, he complained about the difficulties of teaching computer languages to military personnel. “We sponsored linguistic research”, he explained, “in order to learn how to build command and control systems that could understand English queries directly.”
If Chomsky’s linguistics was being funded for this purpose, it seems all the more remarkable that he ended up publicly opposing the Vietnam War and denouncing the very military institutions that were sponsoring his research.
Modestly, Chomsky has always downplayed any moral dilemmas, suggesting that the military had no interest in his work. But I have recently come across restricted-access documents which refer to Chomsky as a “consultant” to a project funded by the Air Force in order “to establish natural language as an operational language for command and control.”
The military’s direct interest in Chomsky is recalled by former Air Force Colonel Anthony Debons, writing in 1971:
Much of the research conducted at MIT by Chomsky and his colleagues [has] direct application to the efforts undertaken by military scientists to develop … languages for computer operations in military command and control systems.
In a 1965 article, another Air Force officer, Jay Keyser, expressed his hope that the military’s artificial computer languages might be replaced by an English command and control language based on Chomsky’s ground-breaking theories. Keyser illustrated his article with “aircraft”, “missile” and similar words plus sample sentences such as “The bomber the fighter attacked landed safely.”
One of Chomsky’s students working on this Air Force-sponsored project was Barbara Partee. She has confirmed to me that Chomsky was involved until at least 1965. According to her testimony, the head of the project, Donald Walker, convinced the military to hire Chomsky’s students on the basis that “in the event of a nuclear war, the generals would be underground with some computers trying to manage things, and that it would probably be easier to teach computers to understand English than to teach the generals to program.”
In the end, the Air Force never succeeded in making Chomsky’s theories work. Barbara Partee hints that such uselessness may have been intentional – an instance of “benign subversion of the military-industrial complex.” Clearly, notions of that kind may have helped soothe her and her colleagues’ consciences. What we can say is that from then on, Chomsky never again worked directly on a military project. Indeed, from 1965, he threw himself into political activism, campaigning tirelessly against the Vietnam War.
At one point, Chomsky considered resigning from MIT, which he said was “more than any other university associated with the activities of the Department of ‘Defense’.” But MIT had always been very supportive of his linguistic research, enabling him to become the most influential linguist of his generation. So Chomsky remained at the university. To really appreciate his situation there, we need to understand the nature of MIT during this period.
The Cognitive Revolution
Chomsky had originally been hired to work at MIT in 1955 by Jerome Wiesner, an influential military scientist who was proud of the fact that his laboratory had made “major scientific and technical contributions to the continuing and growing military technology of the United States.” He was also proud to have “helped get the United States ballistic missile program established in the face of strong opposition.” Wiesner later became a nuclear strategy adviser to both Presidents Eisenhower and Kennedy. Then, during the Vietnam War, he brought together dozens of scientists from MIT and elsewhere in a huge project to deploy sensors, mines and cluster bombs along the border between North and South Vietnam.
As MIT’s provost and president, Wiesner was, in effect, Chomsky’s boss for over 20 years. He also played a crucial role in setting up MIT’s linguistics program. As Chomsky says: “Modern linguistics developed as part of what’s sometimes called the ‘cognitive revolution’ of the 1950s, actually to a large extent here in [RLE’s] Building 20, thanks initially to Jerry Wiesner’s initiatives.”
Although fans of Chomsky’s anti-militarist politics tend to be unaware of this, his status among academics stems from his crucial contribution to this “cognitive revolution”, which was essentially the reflection in psychology and philosophy of the impact of digital computers. Without Chomsky, the stock-in-trade of computer engineering – all that stuff about software versus hardware, inputs and outputs – would have languished unseen within the fields of electronics and telecommunications. It was Chomsky who convinced people that a digital computer – he called it the Language Acquisition Device – existed inside the human brain. Children, he said, are able to quickly acquire the grammar of their first language because inside each mind is a digital computer module wired up from the outset in the necessary way.
Chomsky thereby removed linguistics from its former home within the social sciences and repositioned it firmly within intellectual territory familiar to computer engineers. It was not long before this intellectual revolution spread like wildfire across the human sciences until not one discipline was left untouched.
To grasp why the cognitive revolution seemed so seductive, we need to understand the unique quality of digital communication. We are all familiar with the idea that if you make a photocopy and then copy that photocopy, it degrades with each step until the image is completely lost. From a digital original, however, you can make a million copies of copies of copies, each as perfect as the first. That is because each digital signal is either fully on or fully off, with no intermediate positions, and so it cannot degrade.
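As a minimal illustration of this point (my own sketch, not anything drawn from Chomsky’s or the military’s work), here is a short Python simulation in which an “analog” signal accumulates a little noise at every generation of copying, while a “digital” signal is snapped back to 0 or 1 each time, so even the thousandth copy matches the original exactly:

```python
# Illustrative sketch only: repeated copying of an "analog" signal lets noise
# accumulate, while a "digital" signal is re-thresholded to 0 or 1 at every
# copy, so later generations remain identical to the original.
import random

original = [0.0, 1.0, 1.0, 0.0, 1.0]   # the starting signal levels
analog = original[:]
digital = original[:]

for _ in range(1000):                   # copy the copy, a thousand times
    # analog copying: each generation adds a small amount of noise
    analog = [level + random.gauss(0, 0.01) for level in analog]
    # digital copying: the same noise is added, but each value is snapped
    # back to fully on (1.0) or fully off (0.0) before the next copy
    digital = [1.0 if level + random.gauss(0, 0.01) > 0.5 else 0.0
               for level in digital]

print(analog)    # has drifted away from the original values
print(digital)   # still matches the original: [0.0, 1.0, 1.0, 0.0, 1.0]
```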
Digital information is autonomous with respect to the material in which it is encoded. Or you could say that information now floats free of its material embodiment. When philosophers in the United States discussed the implications, they imagined that science had at last solved a problem they had been struggling with since the ancient Greeks: how such an intangible thing as the soul could possibly exert influence over the material body. Now, they believed, they had the solution to the mystery. If the mind could be seen as software and the body as hardware, all was clear. It even meant that we might be able in the future to discard our hardware – our bodies – while still remaining who we really are.
Among the most famous of the scientists running with these ideas was Marvin Minsky – brilliant co-founder in 1959 of MIT’s Artificial Intelligence laboratory. Chomsky had been the first to claim with any credibility that a digital device really does operate inside the human brain, and it was in large measure this exciting claim which enabled Minsky to fly so high.
If the mind really is a digital computer, concluded Minsky, then our bodies no longer really matter. Our arms, legs and brain cells are all just imperfect and perishable hardware, essentially irrelevant to the weightless and immortal software – the information – that constitutes who we really are. Minsky even dreamed of banishing death by downloading consciousness into a computer. As he put it:
The most important thing about each person is the data, and the programs in the data that are in the brain. And some day you will be able to take all that data, and put it on a little disk, and store it for a thousand years, and then turn it on again and you will be alive in the fourth millennium.
Now, these ideas would have been of interest only to technicians and engineers had it not been for Chomsky. It was Chomsky who connected all this with what it means to be human, doing so by persuading an awed intellectual community that the mind itself – and in particular the human language faculty – is best understood as a digital device.
Specialists in the structure of the human brain are today united in saying that the human mind is emphatically not a digital computer, but works on entirely different principles. Our minds are far from being just information or software, running on any old hardware. Emotional intelligence matters. We breathe, think and live in our bodies, and do so through our relationships with one another, not simply in our heads.
But ideas about emotions, bodies and relationships were of limited interest to the military in the 1960s. The Pentagon wanted new ways to coordinate and program their computer-controlled weaponry. Hence their enthusiasm for Chomsky’s work.
At first, this enthusiasm may have appeared relatively harmless. After all, Chomsky was originally employed by Jerome Wiesner to work on machine translation. Although this project was of considerable interest both to the military and to the CIA, it was not directly concerned with weapons command and control. But once Chomsky found himself working on an Air Force-sponsored project of direct military use, it must have been impossible to ignore the moral and political implications. Other scientists may have been able to ignore their consciences and continue to do research designed to kill people. But Chomsky was too principled for this. He had to do something and, if leaving MIT was not an option, he would have to find another way to stop his linguistics from being used for military purposes.
Chomsky’s Choice
From the beginning of his career at MIT, Chomsky had always felt most comfortable with concepts of language so abstract and formal that linguistics had come to resemble mathematics. Typically, colleagues and students would try to tinker with his latest theory to make it more realistic. That is what his students were doing on that Air Force-sponsored command and control project. Chomsky seems to have gone along with this for a while, but then resolved to retreat back to pure abstractions. At all times, his preferred choice was to satisfy his moral conscience by treating language as something so utterly abstract and other-worldly – so completely removed from social usage or any practical application – that no matter what insights he came up with, nothing could possibly be used to kill anyone.
Chomsky succeeded in the sense that none of his research was ever found workable or useful by the US military, despite their best efforts. Unfortunately, Chomsky’s insistence on keeping faith with classical Greek philosophy – with pure, eternal and therefore harmless abstract forms – also meant that, in the view of most contemporary linguists, half a century of theorizing failed to make any genuine intellectual breakthrough.
In order to disconnect language from dubious social usage or politics, Chomsky pushed individualism and genetic determinism to unprecedented extremes. To give one example involving my own specialism, language origins: Chomsky claims that language did not gradually evolve but suddenly appeared in the head of one single individual. Once equipped with language, this individual then talked to itself. He goes on to claim that the concepts we put together in sentences – for example “book” or “carburettor” – became specified in the human genome when our species first evolved, many thousands of years before such things had been invented.
Chomsky’s polemical skills allow him to make these arguments seem credible at first sight, but on reflection they make no sense. Yet there has to be a reason why Chomsky so passionately espouses them. I am not claiming any privileged access to Chomsky’s psychology, but what I can say is that such strange ideas categorically remove linguistics from all contact with the real world, situating the entire field in a realm of eternal abstractions. This successfully removes linguistic theory from any dubious military use, but it also means, as he himself readily admits, that his theoretical models have nothing to say about how sentences are really used by people, how words are able to refer to things in the external world, or how languages change over historical time.
For anyone who admires Chomsky as I do, it feels risky to say things that can so easily be misunderstood. No part of my account can detract from Chomsky’s record as a tireless anti-militarist campaigner. Neither can it detract from his persistence in withstanding the institutional pressures that he must have endured at MIT. Had he resigned in disgust in 1967, when he was thinking of doing so, he might never have gained the platform he needed to signal his dissidence across the world. There are times when all of us have to make compromises, some more costly than others. My argument is that it was Chomsky’s science, rather than his politics, that bore the brunt of those damaging pressures and costs.
* * *
Chris Knight is currently Senior Research Fellow in the Department of Anthropology at University College London, exploring what it means to be human by focusing on the evolutionary emergence of language and symbolic culture. His latest book is Decoding Chomsky: Science and revolutionary politics (Yale University Press 2018). His website is www.scienceandrevolution.org.