Saturday Poem

Arriving Shortly

When amma came
to New York City,
she wore unfashionably cut
salwar kurtas,
mostly in beige,
so as to blend in,
her body
a puzzle that was missing a piece –
the many sarees
she had left behind:
that peacock blue
Kanjeevaram,
that nondescript nylon in which she had raised
and survived me,
the stiff chikan saree
that had once held her up at work.

When amma came to
New York City,
an Indian friend
who swore by black
and leather,
remarked in a stage whisper,

“This is New York, you know –
not Madras.
Does she realise?”

Ten years later,
transiting through L.A. airport
I find amma
all over again
in the uncles and aunties
who shuffle past the Air India counter
in their uneasily worn, unisex Bata sneakers,
suddenly brown in a white space,
louder than ever in their linguistic unease
as they look for quarters and payphones.
I catch the edge of amma’s saree
sticking out
like a malnourished fox’s tail
from underneath
some other woman’s sweater
meant really for Madras’ gentle Decembers.

by K. Srilata
from Arriving Shortly
Publisher: Writers Workshop, Kolkata, © 2011

Look, I Made a Hat

It might be that the stage musical is now pretty well over as a form. Certainly, the gloomy parade of ‘juke-box’ musicals through the West End doesn’t give one much hope for the future. It is difficult to pick out a worst offender, but the Ben Elton We Will Rock You, confected from the Queen catalogue, is as bad as any. Its premise, of taking the work of a curious-looking, homosexual, Parsi, excessive genius like Freddie Mercury and turning it into an idiotic story about two clean-cut stage-school kids Putting the Show on Right Now says something truly terrible about the musical: it says that it can only deal with conventional views of conventional subjects. The demonstration of just how untrue that really is comes with the collected works of Stephen Sondheim, who is surely the greatest figure in the entire history of the stage musical. In his long career, he has not hesitated to address difficult subjects. It’s certainly true that other classics in the genre have dealt with some serious issues — race relations in Showboat, the Anschluss in The Sound of Music, even trade union movements in The Pyjama Game and urban prostitution in Sweet Charity. When Sondheim takes on themes of colonial exploitation (Pacific Overtures), political assassinations (Assassins) or Freudian psychological depths (pretty well the whole oeuvre), he is not stepping outside the previously established limits of the form.

more from Philip Hensher at The Spectator here.

thinking through ows

Protests do not write policy. And something as loosely formed as the OWS action shouldn’t be drafting white papers. What protests can do most effectively is to alter the common sense understanding of what is right and wrong. In this case, the OWS action makes other sufferers of debt and disenfranchisement feel that their problems are political—not a symptom of personal shortcomings, and not just the unfortunate side effect of a passing miscalculation by the Peter Orszags of the world. The real “goal” of OWS is to rally together everyone who is willing to say to Washington, “American democracy cannot bear this inequality.” This movement may prove to be adept at waging ideological war against the disastrous free-marketeers, occupying the airwaves as well as the streets—but it will indeed fall to others to write legislation and to organize economic priorities in debt-wracked communities. The OWS protests should operate in concert with such efforts (OWSers have assisted foreclosure resistance in Queens, for instance), and should put up new forms of protests that keep the public’s eyes on the culprits. Bank occupations have already begun. Major campaigns are now successfully exhorting citizens to move their savings and checking accounts from big banks to local credit unions. The black box of high finance has finally been pried open and exposed for the unregulated machine of destruction that it is, and the alternatives being proposed in the tumult of Occupy Wall Street sound pretty smart to me.

more from Sarah Leonard at Bookforum here.

agatha

Agatha Christie was not cozy. She earned the title the Queen of Crime the old-fashioned way — by killing off a lot of people. Although never graphic or gratuitous, she was breathtakingly ruthless. Children, old folks, newlyweds, starlets, ballerinas — no one is safe in a Christie tale. In “Hallowe’en Party,” she drowns a young girl in a tub set up for bobbing apples and, many chapters later, sends Poirot in at the very last minute to prevent a grisly infanticide. In “The ABC Murders,” she sets up one of the first detective-taunting serial killers. The signature country home aside, Christie’s literary world was far from homogenous. Her plots, like her life, were international, threading through urban and pastoral, gentry and working class, dipping occasionally into the truly psychotic or even supernatural. Christie murders were committed for all the Big Reasons — love, money, ambition, fear, revenge — and they were committed by men, women, children and in one case, the narrator. Some of her books are truly great — “Death on the Nile,” “And Then There Were None,” “The Secret Adversary,” “Murder on the Orient Express,” “Curtain” to name a few — and some are not. But even the worst of them (“The Blue Train,” “The Big Four”) bear the hallmarks of a master craftsman. Perhaps not on her best day, but the failures make us appreciate the successes, and the woman behind them, that much more.

more from Mary McNamara at the LA Times here.

The Logic of Deceit and Self-Deception

Drew DeSilver in The Seattle Times:

Back in 1982, Air Florida Flight 90 was attempting to take off from Washington, D.C., in a blinding snowstorm. Though the co-pilot was concerned the plane's wings hadn't been thoroughly de-iced and his instrument panel wasn't displaying the correct airspeed, the pilot dismissed his concerns until seconds before the plane crashed into the Potomac River, killing all but five aboard.

The crash, as cockpit voice recordings later showed, was primarily the result of the pilot's overconfidence leading him to ignore or minimize a whole series of warning signs that his more observant, but less assertive, colleague had pointed out to him. It's one of the most dramatic illustrations of the costs of self-deception in Robert Trivers' new book, “The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life” (Basic Books, 397 pp., $28).

Trivers, an evolutionary biologist who teaches at Rutgers, starts by asking one of those questions that seems obvious once someone else asks it: Why should our brains — whose job, after all, is to make sense of everything we see, hear, touch, taste and smell — be so prone to self-deception? Natural selection would seem to work against creatures who persistently fail to see the world as it is, yet self-deception seems to be deeply embedded in our psyches.

Trivers' answer, which he first advanced in 1976 and has been elaborating since, is that we deceive ourselves the better to deceive others. If we can convince ourselves that we are stronger, smarter, more skillful, more ethical or better drivers than others, we're a long way toward convincing other people too.

More here.

Africa Unleashed: Explaining the Secret of a Belated Boom

Edward Miguel in Foreign Affairs:

It is well known that the 1970s, 1980s, and 1990s were a disaster for the countries of sub-Saharan Africa. In a period when other underdeveloped regions, especially Asia, were experiencing steady economic growth, Africa as a whole saw its living standards plummet. Nearly all Africans lived under dictatorships, and millions suffered through brutal civil wars. Then, in the 1990s, the HIV/AIDS epidemic exploded, slashing life expectancy and heightening the sense that the region had reached rock bottom. It was no surprise when an intellectual cottage industry of Afro-pessimists emerged, churning out a stream of plausible-sounding explanations for Africa's stunning decline. The verdict was simple: Africa equaled failure.

What is less well known is that Africa's prospects have changed radically over the past decade or so. Across the continent, economic growth rates (in per capita terms) have been positive since the late 1990s. And it is not just the economy that has seen rapid improvement: in the 1990s, the majority of African countries held multiparty elections for the first time since the heady postindependence 1960s, and the extent of civic and media freedom on the continent today is unprecedented. Even though Africa's economic growth rates still fall far short of Asia's stratospheric levels, the steady progress that most African countries have experienced has come as welcome news after decades of despair. But that progress raises a critical question: what happened?

More here.

How It Went

Christopher Buckley in the New York Times Book Review:

Kurt Vonnegut died in 2007, but one gets the sense from Charles J. Shields’s sad, often heartbreaking biography, “And So It Goes,” that he would have been happy to depart this vale of tears sooner. Indeed, he did try to flag down Charon the Ferryman and hitch a ride across the River Styx in 1984 (pills and booze), only to be yanked back to life and his marriage to the photographer Jill Krementz, which, in these dreary pages, reads like a version of hell on earth. But then Vonnegut’s relations with women were vexed from the start. When he was 21, his mother successfully committed suicide — on Mother’s Day.

It’s a truism that comic artists tend to hatch from tragic eggs. But as Vonnegut, the author of zesty, felicitous sci-fi(esque) novels like “Cat’s Cradle” and “Sirens of Titan” and “Breakfast of Champions,” might put it, “So it goes.”

Vonnegut’s masterpiece was “Slaughterhouse-Five,” the novelistic account of being present at the destruction of Dresden by firebombing in 1945. Between that horror (his job as a P.O.W. was to stack and burn the corpses); the mother’s suicide; the early death of a beloved sister, the only woman he seems truly to have loved; serial unhappy marriages; and his resentment that the literary establishment really considered him (just) a writer of juvenile and jokey pulp fiction, Vonnegut certainly earned his status as Man of Sorrows, much as Mark Twain, to whom he is often compared, earned his.

Was Kurt Vonnegut, in fact, just that — a writer one falls for in high school and college and then puts aside, like one of St. Paul’s “childish things,” for sterner stuff?

More here.

New Analysis Deals Critical Blow to Faster-than-Light Results

Natalie Wolchover in Live Science:

Those famous neutrinos that appeared to travel faster than light in a recent experiment probably did not, a group of scientists say, because they failed to emit a telltale type of radiation.

According to one physicist in the group, “it's hard to argue against” this latest objection to the controversial faster-than-light result produced by other scientists in the same Italian laboratory.

In a paper posted to the physics pre-print site arXiv.org, the group, which runs the ICARUS (Imaging Cosmic and Rare Underground Signals) experiment based at Gran Sasso Laboratory (LNGS) outside Rome, argues that any faster-than-light particles would be expected to emit a particular type of radiation as they traveled. Because they didn't detect any of this coming from the neutrinos — and because the particles didn't seem to be shedding energy in the form of undetected radiation — they must have been traveling at or below the speed of light.

Ultimately, the ICARUS group is arguing that the OPERA group, which ran the experiment that measured neutrinos making a trip from CERN Laboratory in Switzerland to LNGS in Italy 60 nanoseconds faster than light would have done, must have made some mistake in its timekeeping.

The Justice Cascade: Six Questions for Kathryn Sikkink

Scott Horton interviews Kathryn Sikkink in Harper's, via Andrew Sullivan:

1. You start your work by examining the collapses of brutal military dictatorships in Europe’s southern tier (Greece, Spain and Portugal), and point out that although political and social processes led to accountability in Greece and Portugal, they didn’t in Spain. Will accountability for the horrendous crimes of the Franco period be avoided forever, or have they merely been delayed?

Based on charges filed by associations of victims and their families, Spanish Judge Baltasar Garzón opened an investigation in 2008 into more than 100,000 cases of executions and disappearances that took place from 1936 to 1951. So, we are talking here about executions and disappearances that happened between sixty and seventy-five years ago. My book is about the trend toward individual criminal accountability, which requires that cases be brought against specific living perpetrators. Virtually all of the suspected perpetrators in Spain are now dead. Although individual criminal accountability for human rights violations from that period is no longer possible, other forms of accountability are needed. In particular, many family members still hope to locate the remains of their relatives, to rebury those remains, and to know more about the circumstances that led to the deaths. Such truth-telling is still necessary and possible, even if individual criminal accountability is not.

2. Samuel Huntington wrote that if accountability trials were to be conducted, they had to occur immediately in the wake of transition or not at all. His view seems to have been the received wisdom of political scientists twenty years ago. Have the intervening events tended to sustain or to refute him?

The single most forceful finding of my research is that on this issue, Huntington was completely wrong. Justice comes slowly — often painfully, unacceptably slowly in the eyes of victims — but surprisingly it often does come. Domestic courts in Uruguay took twenty years to sentence former authoritarian leaders Juan María Bordaberry and General Gregorio Álvarez for ordering the murders of political opponents. The Extraordinary Chambers in Cambodia issued its first conviction last year, more than thirty years after the horrors of the killing fields.

Beautiful Minds

Niranjan Rajadhyaksha reviews Sylvia Nasar's Grand Pursuit: The Story of Economic Genius in Live Mint:

The man who first described economics as a dismal science was a defender of the slave trade. Thomas Carlyle, a Scottish historian and writer of the 19th century, claimed that slavery was a superior institution to the market, and that the liberation of slaves in the Caribbean islands had led to the moral decline of “the Negroes”. He was attacked by economists such as John Stuart Mill for this bizarre argument. Adam Smith had written much earlier about the common humanity of the street porter and the philosopher.

The human condition has been one of the central concerns of the best economists. Sylvia Nasar has chosen an opportune moment to remind us about this, at a time when economists have been criticized for being too engrossed in technical trivialities even as the world economy was rolling towards its deepest crisis in more than seven decades. Grand Pursuit: The Story of Economic Genius is an ambitious book by a writer who won well-deserved praise for A Beautiful Mind, her dazzling biography of John Nash, the tormented genius who revolutionized game theory but then fell prey to schizophrenia.

Nasar starts her story with Charles Dickens rather than Adam Smith. “Political economy is a mere skeleton unless it has a little human covering and filling out,” Dickens wrote in the first issue of a magazine he edited. “A little human bloom upon it, and a little human warmth in it.” It was a call to humanize economics.

Dickens was writing at a time when economists took a dim view of human progress. The clergyman Thomas Malthus believed that extreme poverty was the inevitable situation of “nine parts in ten of the whole human race”. The sexual drive was at fault, said Malthus, as mindless procreation would ensure that the human population would tend to outstrip available food supply, with disease and famine helping to correct the imbalance. It was this dire prognosis that earned economics the moniker of being a dismal science. The economic historian James Henderson has argued that A Christmas Carol, the famous moral tale written by Dickens, with its descriptions of abundant food, is an attack on Malthus.

Habermas, the Last European

Georg Diez in Spiegel:

Jürgen Habermas is angry. He's really angry. He is nothing short of furious — because he takes it all personally.

He leans forward. He leans backward. He arranges his fidgety hands to illustrate his tirades before allowing them to fall back to his lap. He bangs on the table and yells: “Enough already!” He simply has no desire to see Europe consigned to the dustbin of world history.

“I'm speaking here as a citizen,” he says. “I would rather be sitting back home at my desk, believe me. But this is too important. Everyone has to understand that we have critical decisions facing us. That's why I'm so involved in this debate. The European project can no longer continue in elite modus.”

Enough already! Europe is his project. It is the project of his generation.

Jürgen Habermas, 82, wants to get the word out. He's sitting on stage at the Goethe Institute in Paris. Next to him sits a good-natured professor who asks six or seven questions in just under two hours — answers that take fewer than 15 minutes are not Habermas' style.

Usually he says clever things like: “In this crisis, functional and systemic imperatives collide” — referring to sovereign debts and the pressure of the markets.

Sometimes he shakes his head in consternation and says: “It's simply unacceptable, simply unacceptable” — referring to the EU diktat and Greece's loss of national sovereignty.

'No Convictions'

And then he's really angry again: “I condemn the political parties. Our politicians have long been incapable of aspiring to anything whatsoever other than being re-elected. They have no political substance whatsoever, no convictions.”

It's in the nature of this crisis that philosophy and bar-room politics occasionally find themselves on an equal footing.

It's also in the nature of this crisis that too many people say too much, and we could definitely use someone who approaches the problems systematically, as Habermas has done in his just published book.

But above all, it is in the nature of this crisis that the longer it continues, the more confusing it gets.

Cigarettes May be Useful for Distance Runners?!? (or, How to Prove Anything with a Review Article)

Travis Saunders in Obesity Panacea:

Could smoking really be beneficial for distance runners like myself?

Here are Ken’s arguments:

1. Serum hemoglobin is related to endurance running performance. Smoking is known to enhance serum hemoglobin levels and (added bonus), alcohol may further enhance this beneficial adaptation.

2. Lung volume also correlates with running performance, and training increases lung volume. Guess what else increases lung volume? Smoking.

3. Running is a weight-bearing sport, and therefore lighter distance runners are typically faster runners. Smoking is associated with reduced body weight, especially in individuals with chronic obstructive pulmonary disease (these folks require so much energy just to breathe that they often lose weight).

In the discussion, Ken goes on to point out that:

Cigarette smoking has been shown to increase serum hemoglobin, increase total lung capacity and stimulate weight loss, factors that all contribute to enhanced performance in endurance sports. Despite this scientific evidence, the prevalence of smoking in elite athletes is actually many times lower than in the general population. The reasons for this are unclear; however, there has been little to no effort made on the part of national governing bodies to encourage smoking among athletes.

Now at this point I assume that people are wondering how something this insane came to be published in a respected medical journal (as of 2010, CMAJ was ranked 9th out of 40 medical journals, with an impact factor of 9). The answer, of course, is that the point of Ken’s article was to illustrate how you can fashion a review article to support almost any crazy theory if you’re willing to cherry-pick the right data.

Lynn Margulis, Evolution Theorist, Dies at 73

Bruce Weber in the New York Times:

Lynn Margulis, a biologist whose work on the origin of cells helped transform the study of evolution, died on Tuesday at her home in Amherst, Mass. She was 73.

She died five days after suffering a hemorrhagic stroke, said Dorion Sagan, a son she had with her first husband, the cosmologist Carl Sagan.

Dr. Margulis had held the title of distinguished university professor of geosciences at the University of Massachusetts, Amherst, since 1988. She drew upon earlier, ridiculed ideas when she first promulgated her theory, in the late 1960s, that cells with nuclei, which are known as eukaryotes and include all the cells in the human body, evolved as a result of symbiotic relationships among bacteria.

The hypothesis was a direct challenge to the prevailing neo-Darwinist belief that the primary evolutionary mechanism was random mutation.

Rather, Dr. Margulis argued that a more important mechanism was symbiosis; that is, evolution is a function of organisms that are mutually beneficial growing together to become one and reproducing. The theory undermined significant precepts of the study of evolution, underscoring the idea that evolution began at the level of micro-organisms long before it would be visible at the level of species.

More here.

Friday Poem

Letter From a Shortsighted Girl

To Daniel

My hushed voice cannot reach you
My shortsighted eye cannot see you.

Maybe it is better like this.

Today I didn't have too much to tell you
Just that in the afternoon I went out for a walk.
It started raining.
Kissing in the rain, what a silly cliché
I thought, as I was searching for a shelter.

If I put all my courage together I would have told you
that in the last year I have learned to miss you reasonably,
while remembering the traps of the happy days.
Otherwise, I would have spoken about traveling and books.

Once I had a dream about you.
You were writing our embraces
on a piece of my unwrinkled skin.
In the morning, you wrapped it back around my body.

Last week I bought a green sun umbrella and a lily,
and put them on the balcony, in the place where I like to read.
From there I can see the horizon, stretching its back like a cat
ready to jump into my lap.

I don't miss you. It is just me,
that I don't understand anymore.

by Yodie

‘We’re blind to our blindness. We have very little idea of how little we know. We’re not designed to’

From The Independent:

Daniel Kahneman, 77, is the Eugene Higgins Professor of Psychology Emeritus at Princeton University. In 2002, he was awarded the Nobel Prize in Economics for his analyses of decision-making and uncertainty, developed with the late Amos Tversky. His work has influenced not only psychology and economics, but also medicine, philosophy, politics and the law. In his new book, Thinking, Fast and Slow, Kahneman explains the ideas that have driven his career over the past five decades, providing an unrivalled insight into the workings of our own minds. Nassim Nicholas Taleb has called it “a landmark book in social thought”.

“Fast” and “Slow” thinking is a distinction recognised in psychology under various names, such as system one [intuitive thought] and system two [deliberate thought]. The subtitle for my talks on the subject is: “The marvels and the flaws of intuitive thinking.” We act intuitively most of the time. System one learns how to navigate the world, and mostly it does so very well. But when system one doesn't have the answer to a question, it answers another, related question.

A study was done after there were terror incidents in Europe. It asked people how much they would be willing to pay for an insurance policy that covered them against death, for any reason, during a trip abroad. Another group of people were asked how much they would pay for a policy that covered them for death in a terrorist incident during the trip. People paid substantially more for the second than for the first, which is absurd. But the reason is that we're more afraid when we think of dying in a terrorist incident than we are when we think simply of dying. You're asked how much you're willing to pay, and you answer something much simpler, which is: “How afraid am I?”

Some students were asked two questions: “How happy are you?” and “How many dates did you go on last month?” If you ask the questions in that order, the answers are completely uncorrelated. But if you reverse the order, the correlation is very high. When you ask people how many dates they had last month, they have an emotional reaction: if they went on dates, then they're happier than if they went on none. So if you then ask them how happy they are, that emotional reaction is going on already, and they use it as a substitute for the answer to the question.

On the most elementary level, what we feel is a story. System one generates interpretations, which are like stories. They tend to be as coherent as possible, and they tend to suppress alternatives, so that our interpretation of the world is simpler than the world really is. And that breeds overconfidence.

More here.

What Scientists Can Be Grateful for on Thanksgiving

Adam Ruben in Science:

At Thanksgiving, we identify the usual culprits. We’re thankful for family, we’re thankful for friends, we’re thankful for the food itself. We’re thankful that Farting Cousin Barry’s flight was delayed. But do we ever stop and express our appreciation for science? No, says Google: A search for “Thanksgiving science” yields only articles about whether turkey really makes you sleepy. So let’s do it now.

• We are thankful for our families who don’t flinch when we say that we need to go into the lab at midnight, even though the gist of this sentiment is that we’re choosing bacterial cultures over them.

• We are thankful that some branches of science have produced some pretty useful things, because their success allows the other branches to keep working on fun, pointless crap below the radar.

• We are thankful for the goggles that keep our eyeballs intact, albeit at the expense of long-lasting dark lines on our foreheads.

• We are thankful for the big words that make us sound smart.

• We are thankful that our profession inspires an entire branch of wonderfully inventive fiction.

• We are thankful to the funding agencies that support our research. Without them, we’d be at home experimenting on our cats.

• We are thankful for high-quality journals that allow us to share our advances with the world, like Science — and there’s this other one, I think, a British one that starts with an “N”. Nurture? Neighbors? I don’t remember.

More here.

Religion’s Truce with Science Can’t Hold

Julian Baggini in Comment is Free from a month ago (with responses by many, including Keith Ward, Jerry Coyne, Jim P. Houston, Ophelia Benson, Jean Kazez, and Russell Blackford):

One of the most tedious recurring questions in the public debate about faith has been “is religion compatible with science?” Why won't it just go away?

I'm convinced that one reason is that the standard affirmative answer is sophisticated enough to persuade those willing to be persuaded, but fishy enough for those less sure to keep sniffing away at it. That defence is that religion and science are compatible because they are not talking about the same things. Religion does not make empirical claims about how the universe works, and to treat it as though it did is to make a category mistake of the worst kind. So we should just leave science and religion to get on with their different jobs free from mutual molestation.

The biologist Stephen Jay Gould made just this kind of move when he argued that science and religion have non-overlapping magisteria (noma). In Rocks of Ages, Gould wrote that science deals with “the empirical realm: what the universe is made of (fact) and why does it work in this way (theory). The magisterium of religion extends over questions of ultimate meaning and moral value. These two magisteria do not overlap, nor do they encompass all inquiry.” In short, science is empirical, religion is ethical.

A version of this strategy was also adopted by the physicist John Polkinghorne and the mathematician Nicholas Beale in their book, Questions of Truth. As they put it: “Science is concerned with the question, How? – By what process do things happen? Theology is concerned with the question, Why? – Is there a meaning and purpose behind what is happening?”

It sounds like a clear enough distinction, but maintaining it proves to be very difficult indeed. Many “why” questions are really “how” questions in disguise.

The Minds of Machines

Namit Arora in Philosophy Now:

René Descartes held that science and math would one day explain everything in nature. Early AI researchers embraced Hobbes’ view that reasoning was calculating, Leibniz’s idea that all knowledge could be expressed as a set of primitives [basic ideas], and Kant’s belief that all concepts were rules. At the heart of Western rationalist metaphysics – which shares a remarkable continuity with ancient Greek and Christian metaphysics – lay Cartesian mind-body dualism. This became the dominant inspiration for early AI research. Early researchers pursued what is now known as ‘symbolic AI’. They assumed that our brain stored discrete thoughts, ideas, and memories at discrete points, and that information is ‘found’ rather than ‘evoked’ by humans. In other words, the brain was a repository of symbols and rules which mapped the external world into neural circuits. And so the problem of creating AI was thought to boil down to creating a gigantic knowledge base with efficient indexing, ie, a search engine extraordinaire. That is, the researchers thought that a machine could be made as smart as a human by storing context-free facts, and rules which would reduce the search time effectively. Marvin Minsky of MIT’s AI lab went as far as claiming that our common sense could be produced in machines by encoding ten million facts about objects and their functions.
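
The “facts plus rules plus indexed lookup” picture Arora describes is easy to make concrete. Below is a minimal, purely illustrative Python sketch (not from the essay; every fact, relation name, and function is invented for the example) of the kind of knowledge base the symbolic-AI paradigm assumed: discrete facts stored as symbols, a rule that derives new facts from old ones, and “thinking” reduced to retrieval from the store.

```python
# Illustrative sketch only -- not from Arora's essay. All facts, relation
# names, and functions are invented to show the "symbolic AI" picture:
# knowledge as stored symbols, rules as symbol manipulation, and
# intelligence as indexed lookup over the resulting store.

facts = {
    ("cup", "is_a", "container"),
    ("cup", "made_of", "ceramic"),
    ("container", "used_for", "holding_liquids"),
}

def inherit_uses(kb):
    """Rule: if X is_a Y and Y used_for Z, then X used_for Z."""
    derived = set()
    for (x, rel1, y) in kb:
        if rel1 != "is_a":
            continue
        for (a, rel2, z) in kb:
            if a == y and rel2 == "used_for":
                derived.add((x, "used_for", z))
    return derived

kb = set(facts)
kb |= inherit_uses(kb)  # apply the rule once to enlarge the fact base

# On this view, answering a question is just retrieval from the store.
print(("cup", "used_for", "holding_liquids") in kb)  # True
```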

It is one thing to feed millions of facts and rules into a computer, another to get it to recognize their significance and relevance. The ‘frame problem’, as this last problem is called, eventually became insurmountable for the ‘symbolic AI’ research paradigm. One critic, Hubert L. Dreyfus, expressed the problem thus: “If the computer is running a representation of the current state of the world and something in the world changes, how does the program determine which of its represented facts can be assumed to have stayed the same, and which might have to be updated?” (‘Why Heideggerian AI Failed and how Fixing it would Require making it more Heideggerian’).
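
Dreyfus's question can be shown in miniature. The toy sketch below (again invented for illustration, not drawn from his paper) represents a world as a handful of stored facts and applies one action; the awkward part is exactly what he names: the program itself must decide, fact by fact and action by action, which entries still hold.

```python
# Toy illustration of the frame problem -- invented for this note, not
# taken from Dreyfus. The representation is a dict of facts; after one
# action, the program must say explicitly which facts changed and which
# (implicitly, by copying) stayed the same.

world = {
    "robot_location": "kitchen",
    "cup_location": "kitchen",
    "cup_full": True,
    "light_on": True,
}

def apply_action(state, action):
    """Return the represented world after the named action."""
    new_state = dict(state)  # everything not touched below is assumed unchanged
    if action == "carry_cup_to_hall":
        new_state["robot_location"] = "hall"
        new_state["cup_location"] = "hall"  # moved because the robot carried it
        # What about cup_full and light_on? They persist only because we copied
        # them. With thousands of facts and many possible actions, spelling out
        # (or justifying) these non-changes is the bookkeeping Dreyfus points to.
    return new_state

print(apply_action(world, "carry_cup_to_hall"))
```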

GOFAI – Good Old Fashioned Artificial Intelligence – as symbolic AI came to be called, soon turned into what philosophers of science call a degenerative research program – reduced to reacting to new discoveries rather than making them. It is unsettling to think how many prominent scientists and philosophers held (and continue to hold) such naïve assumptions about how human minds operate. A few tried to understand what went wrong and looked for a new paradigm for AI.