From the Naturalism Workshop, Part I


Massimo Pigliucci reports on the Naturalism Workshop conceived of by Sean Carroll, over at Rationally Speaking:

During the roundtable introductions, Dawkins (as well as the rest of us) was asked what he would be willing to change his mind about; he said he couldn’t conceive of a sensible alternative to naturalism. Rosenberg, interestingly, brought up the (hypothetical) example of finding God’s signature in a DNA molecule (just like Craig Venter has actually done). Dawkins admitted that that would do it, though immediately raised the more likely possibility that that would be a practical joke played by a superhuman — but not supernatural — intelligence. Coyne then commented that there is no sensible distinction between superhuman and supernatural, in a nod to Clarke’s third law.

There appeared to be some interesting differences within the group. For instance, Rosenberg clearly has no problem with a straightforward functionalist computational theory of the mind; DeDeo accepts it, but feels uncomfortable about it; and Deacon outright rejects it, without embracing any kind of mystical woo. Steven Weinberg asked the question of whether — if a strong version of artificial intelligence is possible — it follows that we should be nice to computers.

The first actual session was about the nature of reality, with an introduction by Alex Rosenberg. His position is self-professedly scientistic, reductionist and nihilist, as presented in his The Atheist’s Guide to Reality. (Rationally Speaking published a critical review of that book, penned by Michael Ruse.) Alex thinks that complex phenomena — including of course consciousness, free will, etc. — are not just compatible with, but determined by and reducible to, the fundamental level of physics. (Except, of course, that there appears not to be any such thing as the fundamental level, at least not in terms of micro-things and micro-bangings.)

A Matter of Taste?


William Deresiewicz in the NYT:

Foodism has taken on the sociological characteristics of what used to be known — in the days of the rising postwar middle class, when Mortimer Adler was peddling the Great Books and Leonard Bernstein was on television — as culture. It is costly. It requires knowledge and connoisseurship, which are themselves costly to develop. It is a badge of membership in the higher classes, an ideal example of what Thorstein Veblen, the great social critic of the Gilded Age, called conspicuous consumption. It is a vehicle of status aspiration and competition, an ever-present occasion for snobbery, one-upmanship and social aggression. (My farmers’ market has bigger, better, fresher tomatoes than yours.) Nobody cares if you know about Mozart or Leonardo anymore, but you had better be able to discuss the difference between ganache and couverture.

Young men once headed to the Ivy League to acquire the patina of high culture that would allow them to move in the circles of power — or if they were to the manner born, to assert their place at the top of the social heap by flashing what they already knew. Now kids at elite schools are inducted, through campus farmlets, the local/organic/sustainable fare in dining halls and osmotic absorption via their classmates from Manhattan or the San Francisco Bay Area, into the ways of food. More and more of them also look to the expressive possibilities of careers in food: the cupcake shop, the pop-up restaurant, the high-end cookie business. Food, for young people now, is creativity, commerce, politics, health, almost religion.

It took me some effort to explain to a former student recently that no, my peers did not talk about food all the time when we were her age, unless she meant which diner we were going to for breakfast. “But food is everything!” she said.

jacques barzun (1907-2012)


Unlike many of his colleagues, Professor Barzun showed little interest in taking overtly political positions. This was partly because he became a university administrator and had to stand above the fray, and partly because he approached the world with a detached civility and a sardonic skepticism about intellectual life. “The intellectuals’ chief cause of anguish,” he wrote in “The House of Intellect” (1959), “are one another’s works.” If Mr. Barzun kept the political issues of the day at arm’s length, he nonetheless developed a reputation as a cultural conservative after the student protests at Columbia in the late 1960s. He later argued that the “peoples of the West” had “offered the world a set of ideas and institutions not found earlier or elsewhere.”

more from Edward Rothstein at the NY Times here.

On Henry James


These days, the rumblings of the James industry are louder than those of the Hawthorne industry, the Hemingway industry and even—mirabile dictu!—the Faulkner industry. But only the bulk of the industry’s output, if not its spirit or letter, is registered on ground level. V.S. Naipaul, for example, has remained deaf to the claims of the post-revival Jamesians, dismissing James on the ground that he “never went out in the world…ever risked anything…ever exposed himself to anything…ever thought he should mingle with the crowd.” But to the figure usually identified as “that mythical creature, ‘the Common Reader,’” James has become a solidly major figure, one of a handful of Big Names, as Michael Gorra’s thorough, level-headed new book, Portrait of a Novel: Henry James and the Making of an American Masterpiece, suggests. A scholarly (or fanatical) love letter, it reads like a biography of Portrait of a Lady—its gestation, development, reception—or perhaps a well-researched novel about Henry James that favors the early period, where Lodge and others favored the late.

more from Leo Robson at The Nation here.

georgia on the mind


Georgia’s historical experience differs from that of other small nations such as the Baltic states and Finland, which fell under Russian or Soviet rule but eventually made a more complete escape. Georgia was absorbed into the tsarist empire in 1801, its royal family deported to Russia and its language replaced with Russian in public life. An opportunity for freedom arose after the February 1917 revolution, which overthrew the tsar, but after declaring independence in May 1918, the Georgians proved unable to sustain their state for more than three years. Rayfield points out that whereas Vladimir Lenin let the Balts and Finns go their own way, similar forbearance was unlikely in Georgia. The impulse to conquest was strong among Moscow-based Bolsheviks and thuggish Georgian comrades such as Josef Stalin and Sergo Orjonikidze.

more from Tony Barber at the FT here.

new gorey


The three titles just published are presumably about some certain thing but are really always about something else. Consider “The Osbick Bird”: It’s a sweetly melancholy surrealist fable about Edwardian eccentric Emblus Fingby’s odd (but fond) friendship with the even odder bird of the title. Gorey, an only child and lifelong solitary, was famously an odd bird himself — tall, long-legged and gawky like the bird in the book, with the luxurious beard of a Victorian literatus and a sense of style that ran to fur coats and tennis shoes. Inevitably described as “flamboyant,” he was often suspected of being a closeted gay, but when pointedly asked if he was, replied, “I’m neither one thing nor the other particularly.” It’s hard not to read the story as a daydream, at once wistful and resigned, about What Might Have Been, had he had a partner.

more from Mark Dery at the LA Times here.

Reading and Guilty Pleasure

From The New York Times:

As we move into the summer season of beach and hammock reading, many of us reach for books that we describe as “guilty pleasures.” This notion has become an important category in our thinking about literature. Two prominent examples are NPR’s regular feature “My Guilty Pleasure” and Arthur Krystal’s recent New Yorker essay, “Easy Writers: Guilty pleasures without guilt.”

Reading Krystal’s subtle and savvy piece, it struck me that our talk of guilty pleasures involves two controversial assumptions: that some books (and perhaps some genres) are objectively inferior to others and that “better” books are generally not very enjoyable. Combined, the two assumptions lead to a view under which, to pick up Krystal’s metaphor, we think of books the way we often think of foods: there are those that are “good for you” and those that merely “taste good.” Here I want to reflect on the viability of these two assumptions. Are some books objectively better than others, or are literary preferences ultimately just matters of subjective taste? In our democratic society, many take a relativist position: you can’t argue about taste, because there are no standards that allow us to establish higher quality as an objective fact. If I think that Proust’s “In Search of Lost Time” is a magnificent probing of the nature of time and subjectivity and you think it is overwritten self-indulgent obscurantism, we both have a right to our opinions. So doesn’t it follow that each opinion is only relatively right (right for me, right for you)?

More here.

How Did Milk Help Found Western Civilization?


Benjamin Phelan in Slate:

To repurpose a handy metaphor, let's call two of the first Homo sapiens Adam and Eve. By the time they welcomed their firstborn, that rascal Cain, into the world, 2,000 centuries of evolution had established how his infancy would play out. For the first few years of his life, he would take his nourishment from Eve's breast. Once he reached about 4 or 5 years old, his body would begin to slow its production of lactase, the enzyme that allows mammals to digest the lactose in milk. Thereafter, nursing or drinking another animal's milk would have given the little hell-raiser stomach cramps and potentially life-threatening diarrhea; in the absence of lactase, lactose simply rots in the guts. With Cain weaned, Abel could claim more of his mother's attention and all of her milk. This kept a lid on sibling rivalry—though it didn't quell the animus between these particular sibs—while allowing women to bear more young. The pattern was the same for all mammals: At the end of infancy, we became lactose-intolerant for life.

Two hundred thousand years later, around 10,000 B.C., this began to change. A genetic mutation appeared, somewhere near modern-day Turkey, that jammed the lactase-production gene permanently in the “on” position. The original mutant was probably a male who passed the gene on to his children. People carrying the mutation could drink milk their entire lives. Genomic analyses have shown that within a few thousand years, at a rate that evolutionary biologists had thought impossibly rapid, this mutation spread throughout Eurasia, to Great Britain, Scandinavia, the Mediterranean, India and all points in between, stopping only at the Himalayas. Independently, other mutations for lactose tolerance arose in Africa and the Middle East, though not in the Americas, Australia, or the Far East.

In an evolutionary eye-blink, 80 percent of Europeans became milk-drinkers; in some populations, the proportion is close to 100 percent. (Though globally, lactose intolerance is the norm; around two-thirds of humans cannot drink milk in adulthood.) The speed of this transformation is one of the weirder mysteries in the story of human evolution, more so because it's not clear why anybody needed the mutation to begin with.

A Math Genius’s Sad Calculus


Adam Kirsch in Tablet Magazine [h/t: Tunku Varadarajan]:

Mandelbrot’s life work was to develop mathematical tools able to measure that kind of fiendishly difficult, real-world complexity. The challenge facing The Fractalist is that it is almost impossible for a non-mathematician to advance beyond these generalities and understand what precisely it is that Mandelbrot accomplished. Knowing this, he allows no mathematical formulas or notation in the book—the formula for the Mandelbrot set is the sole exception. It is clear enough, however, that the mathematics Mandelbrot worked with has nothing to do with the kind most of us learned in school; it is infinitely more creative and exciting. His own gift, he writes, was an intuitive ability to “see” complex shapes. As a student, he could solve difficult problems much faster than the rest of the class by turning equations into mental geometry: “In no time, searching for and studying symmetry became central to my work … hopelessly complicated problems of integral calculus could be ‘reduced’ to familiar shapes that made them easy to resolve.”

For this reviewer, reading The Fractalist is rather like reading about a poet who wrote in a foreign language for which no adequate translation is available. You know Mandelbrot is up to exciting things, but you have to take them mostly on faith. What he can share, and does copiously, are the steps of his worldly career: the professorial appointments, the job as a researcher at IBM, the papers published and colleagues courted and impressed. There is so much of this kind of thing in the second half of The Fractalist that it comes to read like an annotated CV, and it has the effect of making Mandelbrot seem very vain. But then, this is a man who decided early in life that he wanted to be a second Kepler, founding a new field of study and revolutionizing humanity’s picture of the world. (In his own view, he accomplished this: “In my Keplerian quest I faced many challenges. The good news is that I succeeded.”) All of this sits oddly with his later declaration that “a memoir is a lesson in humility.”

Your Brain on Pseudoscience: the Rise of Popular Neurobollocks


Steven Poole in New Statesman:

[A] new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

Remembering Sylvia Kristel


Rosemary Hill in the LRB blog:

Like many of my contemporaries I saw Emmanuelle in its much-censored British version at the Prince Charles Cinema off Leicester Square. I went with my first long-term boyfriend. We were both working in Foyles in our gap year, commuting in from Sevenoaks or thereabouts and I suspect that beneath the somewhat laconic discussion afterwards we were a bit shocked by it. I know for a fact that I was.

It must have been almost exactly ten years later that I met Sylvia Kristel when she opened her front door to me in Ghent. It was just after Christmas. Ghent, which I had never seen before, was looking like a scene from Breughel, the snow thick on the hump-back bridges over the canals, the cafes brightly lit and inside them tables covered with richly coloured Turkey carpets. I had just got married and was there with my husband, Christopher Logue. Christopher was an old friend of the Belgian writer Hugo Claus, with whom Sylvia had lived, on and off, since the 1970s. She looked at first sight a bit of a mess, hair on end, puffy eyes, and still in the afternoon wearing a lumpy towelling robe and big pink fluffy slippers, cigarette in hand. I was hugely relieved.

The Moral Case for Silence


Norman Pollack makes the case in Counterpunch:

Herman Melville’s story, “Bartleby, the Scrivener,” written 160-odd years ago, is now more relevant than ever. Bartleby faces out to a blank wall–the subtitle is, “A Story of Wall Street”–his highest assertion of self being: “I prefer not to.” Melville, perhaps America’s greatest writer, was making an important statement: meaningful choice has been circumscribed, even by the mid-19th century, in American society. Not only was the heroic turned against itself, but a pervasive condition of alienation defined the individual’s inner life and relations to others. One encountered reality through basic compromises of the ideal vision of a democratic polity, so that engagement became complicity in the renewal of one’s alienation. This, Melville resolutely opposed.

So, too, did Sherwood Anderson seventy years later. (By coincidence, today the New York Times focuses on Elyria, Ohio, his birthplace and the locale for Winesburg, which remains essentially unchanged.) Anderson also captures the loneliness and sadness of American life, which finds the individual enclosed within walls, so that one’s highest affirmation becomes to say “No” to the materialism that trades in false values and destroys the human soul. From Melville to Anderson to the present, America is still in the same condition, only now in more intensified form in that we no longer recognize alienation and willingly accept complicity in a life devoid of self-knowledge and the cooperative social bonds which alone confer dignity on human beings.

Making the moral case for silence as imperative in the coming election may seem difficult. Liberals and many but not all progressives regard the choice as crystal-clear: Romney, the Republican party, and the Tea Partiers in its midst represent retrograde social forces affecting all sectors of American life. The indictment is merited. Romney seeks a return to the Dark Ages of American capitalism. Both regulation and the social safety net would be severely impaired, and individual privacy would be invaded by a heightened puritanical zeal. Hester Prynne would lurk in every shadow. As for foreign policy, bluntness would rule the waves. One suspects that the Pentagon would be given a blank check to wage perpetual war founded on the belief that America, a pristine land of freedom, is surrounded by enemies, domestic and foreign. From the liberals’ standpoint, what could possibly be worse?

I submit, perhaps Barack Obama could be worse.

Remembering the Cuban Missile Crisis


US Air Force Col. Charles G. Simpson remembers the Cuban Missile Crisis, in the Bulletin of the Atomic Scientists:

We kept our nine missiles ready and watched and listened to the news, waiting to see what the resolution of this crisis would be. The base was a different world — most of the families of the bomber and tanker crews left, either heading home to parents or friends or packing up campers and driving into the Idaho mountains. For the most part, only families of the missile squadron and base security and support units remained.

My wife, Carol, was about eight months pregnant and stayed close to our home during the crisis. Since my one-person position was augmented by two other officers during the crisis, I went home every night to rest for the next day. Still, while I was working, my wife had to deal with some serious problems during the crisis. For example, the military doctor who lived above us became distraught; my wife heard strange noises in his apartment, went to investigate, and found him threatening to kill himself by jumping off the balcony. She contacted the hospital commander and the security police, who quickly responded and took care of the situation.

And retired Col. Valery Yarynich, of the Soviet armed forces:

The escalation of the crisis in relations between the United States and Soviet Union in October 1962 had a most direct impact on the lives of the staff officers for the Kirov rocket corps, named after the city nearest its bases in the Ural Mountains. On October 23, I received orders to go to one of the two divisions of our corps in which intercontinental ballistic missiles (ICBMs) of the 8K64 type — SS-7 in American terminology — had recently been put on combat duty. Each of these missiles could deliver a nuclear warhead with an explosive yield of three megatons a distance of 8,100 miles. They were the first strategic missiles with which Soviet Premier Nikita Khrushchev could threaten America. Of course, the Soviet Union was not yet making ICBMs like sausages, which would become the case as the Cold War continued, but there were enough at the time — 25 in two divisions — to create a new Armageddon. On this trip, I had a specific task: I was to take all possible measures to ensure that the division received its orders from Moscow, and, if necessary, launched its missiles in a timely fashion. As a representative of the rocket corps staff, I was endowed with adequate powers for this purpose.

self-abnegating vessels


Sometimes it happens that an author can live long enough to see this fate befall his own work, and to bear witness to that special form of powerlessness which authorship eventually invites. Viktor Shklovsky’s Bowstring: On the Dissimilarity of the Similar and Energy of Delusion — two works of literary theory recently translated and published for the first time in English by the Dalkey Archive Press — are remarkable demonstrations of this kind of melancholy self-awareness. Written in Shklovsky’s old age, they sketch a theory of literary history at a time when his ideas had grown unfashionable. Thrown into the light of the present, these books are more than mere artifacts of time: they are also self-conscious reflections on literature’s relation to its past, and on the way literary forms (genres, plots, tropes) become, as Shklovsky puts it, the “self-abnegating” vessels of their own untimeliness.

more from Jonathan Foltz at the LA Review of Books here.

proto-Elamite hope


This international research project is already casting light on a lost bronze age middle eastern society where enslaved workers lived on rations close to the starvation level. “I think we are finally on the point of making a breakthrough,” says Jacob Dahl, fellow of Wolfson College, Oxford and director of the Ancient World Research Cluster. Dr Dahl’s secret weapon is being able to see this writing more clearly than ever before. In a room high up in the Ashmolean Museum in Oxford, above the Egyptian mummies and fragments of early civilisations, a big black dome is clicking away and flashing out light. This device, part sci-fi, part DIY, is providing the most detailed and high quality images ever taken of these elusive symbols cut into clay tablets. This is Indiana Jones with software.

more from Sean Coughlan at the BBC here.

the history of spaces


The first history we write is a history of races. Our tribe’s myth is here, yours is over there, our race is called “the people” and blessed by the gods, and yours, well, not so blessed. Next comes the history of faces: history as the epic acts of bosses and chiefs, pharaohs and emirs, kings and Popes and sultans in conflict, where the past is essentially the chronicle of who wears the crown first and who wears it next. Then comes the history of places, where the ingathering of people and classes in a single city or state makes a historical whole bigger than any one face within it. Modern history is mostly place history, of an ambitious kind: what all the little faces were doing while the big faces were looking at each other. Modern place history has produced scholarly masterpieces, like Emmanuel Le Roy Ladurie’s “Montaillou,” the densely inhabited tale of one region in France in medieval times, and a lot of collective social history “from below.” (It has also produced great pop writing: Robert Hughes on Barcelona, Peter Ackroyd on London.) But beyond, or beneath, these histories is the history of spaces: the history of terrains and territories, a history where plains and rivers and harbors shape the social place that sits above them or around them.

more from Adam Gopnik at The New Yorker here.

Friday Poem

Moonchild

whatever slid into my mother's room that
late june night, tapping her great belly,
summoned me out roundheaded and unsmiling.
is this the moon, my father used to grin,
cradling me? it was the moon
but nobody knew it then.

the moon understands dark places.
the moon has secrets of her own.
she holds what light she can.

we girls were ten years old and giggling
in our hand-me-downs. we wanted breasts,
pretended that we had them, tissued
our undershirts. jay johnson is teaching
me to french kiss, ella bragged, who
is teaching you? how do you say; my father?

the moon is queen of everything.
she rules the oceans, rivers, rain.
when I am asked whose tears these are
I always blame the moon.

by Lucille Clifton
from Blessing the Boats: New and Selected Poems 1988-2000
BOA Editions, 2000

Steven Pinker: Why Are States So Red and Blue?

Steven Pinker in the New York Times:

Regardless of who wins the presidential election, we already know now how most of the electoral map will be colored, which will be close to the way it has been colored for decades. Broadly speaking, the Southern and Western desert and mountain states will vote for the candidate who endorses an aggressive military, a role for religion in public life, laissez-faire economic policies, private ownership of guns and relaxed conditions for using them, less regulation and taxation, and a valorization of the traditional family. Northeastern and most coastal states will vote for the candidate who is more closely aligned with international cooperation and engagement, secularism and science, gun control, individual freedom in culture and sexuality, and a greater role for the government in protecting the environment and ensuring economic equality.

But why do ideology and geography cluster so predictably? Why, if you know a person’s position on gay marriage, can you predict that he or she will want to increase the military budget and decrease the tax rate, and is more likely to hail from Wyoming or Georgia than from Minnesota or Vermont? To be sure, some of these affinities may spring from coalitions of convenience. Economic libertarians and Christian evangelicals, united by their common enemy, are strange bedfellows in today’s Republican party, just as the two Georges — the archconservative Wallace and the uberliberal McGovern — found themselves in the same Democratic Party in 1972.

But there may also be coherent mindsets beneath the diverse opinions that hang together in right-wing and left-wing belief systems.

More here.

Mothering Hypes


Sady Doyle reviews Jessica Valenti's Why Have Kids?, in In These Times:

Valenti’s empathy for mothers is matched by her impatience for platitudes about motherhood. Half of the book is headed “Lies” and questions such sacred cows as “children make you happy” (studies show that parenting decreases satisfaction with life) and the idea that being a full-time parent is “the hardest job in the world.” (If it’s so difficult and so all-important, Valenti wonders, why aren’t more men volunteering to prove themselves by undertaking it? And why isn’t it paid?)

These myths not only saddle women with unfair guilt, Valenti argues, but also prevent progressive mobilization around the work of parenthood itself.

“It seems to me that a lot of the political ambivalence around parenting issues come from this idea that the parenting is a reward in and of itself,” she writes to me in an email. “That we don’t need things like subsidized child care or paid leave because our kids are ‘our problem’ and besides, they’re such a joy anyway, what do we have to complain about!? It’s a way to maintain the status quo.”

“But the truth is,” she continues, “that parenting is really hard. It isn’t always rewarding. And it doesn’t always bring you joy. That’s OK! Who said it was kids’ job to make you happy? I think if we’re more honest about the struggles of parenting and what parenting really looks like, we can be more upfront about what we need to make everyday parenting easier.”

Valenti suggests a few commonsense solutions, many of which have been promoted by feminists and progressives for some time: paid parenting, extended maternal leave, a community-based approach to raising children rather than a strictly individualistic, Mom-or-nothing focus. But, she admitted in our conversation, “We just haven’t had much luck mobilizing women around the issue. I see great feminists and feminist organizations doing work on motherhood, but it doesn’t get the same attention that something like abortion rights or violence against women does.”

An Interview with Stuart Hall


Zoe Williams in The Guardian:

He became one of the seminal figures of the New Left Review in the 50s (alongside Ralph Miliband, whose rolling or otherwise in his political grave, let's leave aside); it is interesting to note that the memorable ideas from that publication, into the Thatcher years and beyond, were often Hall's coinage. Beatrix Campbell, in a letter to the London Review of Books this January, mentions Thatcher's “retrogressive modernisation”, as described by Hall. But his greatest mark in terms of popular thinking was in the field of multiculturalism, as a faculty member and later director of the Centre for Contemporary Cultural Studies at Birmingham University.

Until very recently, Hall's articulation of the multicultural society looked like the one fixed advance of the 60s, the one improvement that no amount of political rhetoric or social polarisation could undo. He mildly rejects the idea that academia was the engine of the new world order. “We drew the line in the 60s. We were here. They were there. It wasn't going to look like Dunkirk. It was never going to look like that again. I think some advances were made academically, but it was more what I think of as a multicultural drift, just having them [people from other cultures] around, they weren't going to eat you, they didn't have tails. The smartest guy in the store is probably black. You turn on the television and the guy singing is probably black. That mattered a lot in accustoming people to think about it.”

And he still maintains that this country, which has adored Bob Marley for three decades, is a very different place to the one he arrived in. And yet, he says, “I'm more politically pessimistic than I've been in 30 years.”

This pessimism is not down to the failure of multiculturalism, or rather, that speech last year in which David Cameron claimed it had failed – Hall takes a benign, if dismissive, attitude to Conservative posturing here, commenting mildly that Cameron is talking about equal-opportunities legislation, as he perceives it, rather than multiculturalism as part of the culture. No, it's the state of the left that strikes him as the most problematic. “The left is in trouble. It's not got any ideas, it's not got any independent analysis of its own, and therefore it's got no vision. It just takes the temperature: 'Whoa, that's no good, let's move to the right.' It has no sense of politics being educative, of politics changing the way people see things.”