Martin Amis trashes David Bowie in 1973

Martin Amis at The New Statesman:

When Glam-Rock superstar David Bowie flounced on to the Hammersmith Odeon stage last Monday night, recognisably male and not even partially naked, it seemed that we would be denied the phenomenon-of-our-times spectacle which your reporter was banking on. The preludial ambience, too, was discouragingly humdrum: behind me in the audience upper-class slummers boomingly voiced their fears of having to endure a “really grotty” supporting band; in front of me teenage couples snogged with old-fashioned – not to say reactionary – zeal; beside me a joint was lit and furtively extinguished; and on stage, prior to curtain-up, a fat old teddy-boy appeared, asked Hammersmith if it was feeling good, wanted a louder answer, got one, and left us with a lie about the anticipated time lapse before Mr Bowie’s arrival. Once under way, admittedly, that musician went through various stages of déshabillé – now in orange rompers, now a miniskirt, now in hot-pants, now a leotard – but we never got to see the famous silver catsuit and pink jockstrap. Bowie did, it’s true, have a habit of turning away from the audience and sulkily twitching his backside at it before floating off to arouse each aisle in turn with his silky gaze – but there was no sign of the celebrated sodomistic routine involving lead guitarist Mick Ronson, no acts of stylised masturbation and fellatio with microphone and mikestand. Perhaps Mr Bowie just wasn’t feeling up to it that evening, or perhaps Mr Bowie was just a mild fad hystericised by “the media”, an entrepreneur of camp who knew how little, as well as how much, he could get away with.

More here.



How David Bowie spoke for me

David Bennun in More Intelligent Life:


Everybody has a Bowie story. This is mine. I’m a child living in Nairobi, Kenya – a world away, in the days before the Internet, from the pop music that already fascinates me. A visitor from America brings me a C-90 cassette. On one side is “The Rise and Fall of Ziggy Stardust and the Spiders from Mars”, which I do not then know to be David Bowie’s 1972 breakthrough record. Ten years after an earlier generation’s collective jaw dropped when he played “Starman” on “Top of the Pops”, this is my own “Starman” moment. To me, this fictive arrival from another planet is to all intents and purposes a real one. From the eerie syncopated pulse of “Five Years”, and that strange, strained vocal with its apocalyptic images of panic and catastrophe, to the sweeping finale of “Rock’n’Roll Suicide”, enveloping me like an exotic comfort blanket with the message that more of my kind are out there (“Just turn on with me and you’re not alone!”), I am transfixed by this alien artefact. It fills my Walkman earphones several times a day. At this point I barely have any idea who Bowie is or what he looks like (“Let’s Dance”, a global hit that reaches even the African equator, is still a year away). I grasp only that, as one outlander, I have somehow connected to another who speaks for and to me.

A quarter of a century later I watch what turns out to be Bowie’s last appearance on a British stage, when he gives a surprise guest performance with David Gilmour at the Royal Albert Hall in May 2006. Bowie sings Pink Floyd’s “Arnold Layne”, and suddenly it seems the most natural thing in the world, his unmistakable London drawl lighting up this other-worldly yet utterly English song. My friend and I gawp at each other in our seats. This is the nearest thing to the second coming we are ever likely to encounter. Everybody has a Bowie story, and that’s the point here. The bereavement so many are feeling today may be one of the last shared experiences our increasingly atomised pop culture will know. We no longer undergo “Starman” moments – those pan-generational shocks which galvanise the young and appal their elders – and we are no longer producing musical artists who alter the entire course of their art, let alone the wider world, as Bowie did in his pomp, which lasted even longer than that of the only British pop act as momentous as he is, the Beatles.

More here.

The nineteenth-century obsession with premature burial

Dan Piepenbring at The Paris Review:

Premature Burial runs to more than five hundred pages, and its most gripping sections are given over to accounts of interment gone awry, along with the many anxieties of the nineteenth-century deathbed. There’s the man who sank into such a prolonged lethargy that he was thought dead until he “broke into a profuse sweat” in his coffin; the young woman whose corpse was exhumed for reburial only to be discovered “in the middle of the vault, with disheveled hair and the linen torn to pieces … gnawed in her agony”; the man whose fear of premature burial was so severe that he instructed his family to leave his body undisturbed for ten days after death, “with the face uncovered, and watched night and day. Bells were to be fastened to his feet. And at the end of the second day veins were to be opened in the arm and leg.”

Tebb draws some of his most abject cases, fittingly enough, from The Undertakers’ and Funeral Directors’ Journal, a veritable storehouse of medical malfeasance. The Journal ran at least one story of a pregnant woman who gave birth in the grave. It also contains an episode with one of the only happy endings in the whole book:

“Mrs. Lockhart, of Birkhill, who died in 1825, used to relate to her grandchildren the following anecdote of her ancestor, Sir William Lindsay, of Covington, towards the close of the seventeenth century:—‘Sir William was a humorist and noted, moreover, for preserving the picturesque appendage of a beard at a period when the fashion had long passed away.

More here.

Genetic Flip Helped Organisms Go From One Cell to Many

Carl Zimmer in The New York Times:

Narwhals and newts, eagles and eagle rays — the diversity of animal forms never ceases to amaze. At the root of this spectacular diversity is the fact that all animals are made up of many cells — in our case, about 37 trillion of them. As an animal develops from a fertilized egg, its cells may diversify into a seemingly limitless range of types and tissues, from tusks to feathers to brains. The transition from our single-celled ancestors to the first multicellular animals occurred about 800 million years ago, but scientists aren’t sure how it happened. In a study published in the journal eLife, a team of researchers tackles this mystery in a new way. The researchers resurrected ancient molecules that once helped single-celled organisms thrive, then recreated the mutations that helped them build multicellular bodies.

The authors of the new study focused on a single molecule called GK-PID, which animals depend on for growing different kinds of tissues. Without GK-PID, cells don’t develop into coherent structures, instead growing into a disorganized mess and sometimes even turning cancerous. GK-PID’s job, scientists have found, is to link proteins so cells can divide properly. “I think of it as a molecular carabiner,” said Joseph W. Thornton, an evolutionary biologist at the University of Chicago and a co-author of the new study. When a cell divides, it first has to make an extra copy of its chromosomes, and then each set of chromosomes must be moved into the two new cells. GK-PID latches onto proteins that drag the chromosomes, then attaches to anchor proteins on the inner wall of the cell membrane. Once those proteins are joined by GK-PID, the dragging proteins pull the chromosomes in the correct directions. Bad things happen if the chromosomes head the wrong way. Skin cells, for example, form a stack of horizontal layers. New cells need to grow in the same direction so skin can continue to act as a barrier. If GK-PID doesn’t ensure that the chromosomes move horizontally, the cells end up in a jumble, like bricks randomly set at different angles.

More here.

Tuesday Poem

In That Year

And in that year my body was a pillar of smoke
and even his hands could not hold me.

And in that year my mind was an empty table
and he laid his thoughts down like dishes of plenty.

And in that year my heart was the old monument,
the folly, and no use could be found for it.

And in that year my tongue spoke the language
of insects and not even my father knew me.

And in that year I waited for the horses
but they only shifted their feet in the darkness.

And in that year I imagined a vain thing;
I believed that the world would come for me.

And in that year I gave up on all the things
I was promised and left myself to sadness.

And then that year lay down like a path
and I walked it, I walked it, I walk it.

by Kim Moore
from The Art of Falling
Seren Books, Bridgend

Monday, January 11, 2016

Sunday, January 10, 2016

You Can’t Trust What You Read About Nutrition

Christie Aschwanden in FiveThirtyEight:

As the new year begins, millions of people are vowing to shape up their eating habits. This usually involves dividing foods into moralistic categories: good/bad, healthy/unhealthy, nutritious/indulgent, slimming/fattening — but which foods belong where depends on whom you ask.

The U.S. Dietary Guidelines Advisory Committee recently released its latest guidelines, which define a healthy diet as one that emphasizes vegetables, fruits, whole grains, low- or nonfat dairy products, seafood, legumes and nuts while reducing red and processed meat, refined grains, and sugary foods and beverages. Some cardiologists recommend a Mediterranean diet rich in olive oil, the American Diabetes Association gives the nod to both low-carbohydrate and low-fat diets, and the Physicians Committee for Responsible Medicine promotes a vegetarian diet. Ask a hard-bodied CrossFit aficionado, and she may champion a “Paleo” diet based on foods our Paleolithic ancestors (supposedly) ate. My colleague Walt Hickey swears by the keto diet.

Who’s right? It’s hard to say. When it comes to nutrition, everyone has an opinion. What no one has is an airtight case. The problem begins with a lack of consensus on what makes a diet healthy. Is the aim to make you slender? To build muscles? To keep your bones strong? Or to prevent heart attacks or cancer or keep dementia at bay? Whatever you’re worried about, there’s no shortage of diets or foods purported to help you. Linking dietary habits and individual foods to health factors is easy — ridiculously so — as you’ll soon see from the little experiment we conducted.

More here.

The Stone Reader: A Review


David Edmonds reviews The Stone Reader edited by Peter Catapano and Simon Critchley, in The Philosophers' Magazine:

Many newspapers have regular columns on science. But few of these columns are dedicated to a discussion of the nature or purpose of science. Almost all newspapers have regular pages devoted to sport. But it would be unusual to have an article that grappled with the meaning of sport. Yet in various ways several philosophers in The Stone Reader—a collection of short philosophy essays from the New York Times’s philosophy blog The Stone—seek to address the existential questions, “What Is Philosophy?” and “What Is A Philosopher?”

What is a Philosopher? is the title of the first essay in this volume, written by Professor Simon Critchley, who also acts as The Stone Reader’s co-editor (alongside Peter Catapano of The New York Times). Critchley’s academic career began in the UK, where he developed an interest in thinkers from the continental tradition, such as Heidegger and Derrida. Just over a decade ago he moved to the New School for Social Research, where he has continued to write prolifically, on a wide variety of subjects – a recent book was on suicide – with an essayistic style that again owes more to a European than an Anglo-American tradition of philosophy.

The What is Philosophy? question hits a vulnerable spot not because philosophy has fuzzy borders. All disciplines have fuzzy borders. Science has fuzzy borders. Even activities like sport have fuzzy borders. The International Olympic Committee classifies the card-game bridge as a sport. A recent High Court decision in the UK has determined that it is not (a decision that matters, because bridge clubs and bridge tournaments cannot now appeal for funding from a pot of money reserved for sport). Still, philosophy is notoriously difficult to define. A few years ago, on the philosophy podcast I co-run with Nigel Warburton (www.philosophybites.com), we put the ‘What Is Philosophy?’ question to 50 different philosophers and received 50 different answers. There isn’t quite the same difficulty in demarcating the rough edges of maths or French literature.

Part of the problem is historical. The meaning of philosophy has been in a state of perpetual evolution. There was a time when the sciences were situated within philosophy. Aristotle wrote about biology, and in his own day Sir Isaac Newton was described as a “natural philosopher”. Since then, bit by bit, chunks of academic territory have split off, leaving the original land mass in danger of appearing like a shrunken and barren island. Physics went; then, in the 19th century, so did biology. Psychology made a successful bid for independence in the early 20th century, as too did linguistics. Broadly, philosophy has become the analysis of a set of issues that cannot be resolved empirically.

More here.

The Hedonistic Utilitarian


Richard Marshall interviews Torbjörn Tännsjö in 3:AM Magazine:

3:AM: Why are you a moral realist and what difference does this make to how you go about investigating morals from, for example, a non-realist?

TT: I am indeed a moral realist. In particular, I believe that one basic question, what we ought to do, period (the moral question), is a genuine one. There exists a true answer to it, which is independent of our thought and conceptualisation. My main argument in defence of the position is this. It is true (independently of our conceptualisation) that it is wrong to inflict pain on a sentient creature for no reason (she doesn’t deserve it, I haven’t promised to do it, it is not helpful to this creature or to anyone else if I do it, and so forth). But if this is a truth, existing independently of our conceptualisation, then at least one moral fact (this one) exists and moral realism is true. We have to accept this, I submit, unless we can find strong reasons to think otherwise. Moral nihilism comes with a price, we can now see. It implies that it is not wrong (independently of our conceptualisation) to do what I describe above; this does not mean that it is all right to do it either, of course, and yet, for all this, I find this implication from nihilism hard to digest. It is not only difficult to accept for moral reasons. If it is false both that it is wrong to perform this action and that it is right to perform it, then we need to engage in difficult issues in deontic logic as well. So we should not accept moral nihilism unless we find strong arguments to do so. So are there any good arguments in defence of moral nihilism? I think not, and I try to defend this claim in my From Reasons to Norms. On the Basic Question in Ethics (2010). It is of note that for a long time moral nihilism was a kind of unquestioned default position in analytic moral philosophy. What initiated the interest in moral realism was the fact that, in 1977, two authors, John Mackie and Gilbert Harman, independently of one another, put forward arguments in defence of the nihilist position. This triggered an interest in what had up to then been a non-issue. When thinking carefully about their arguments for nihilism I didn’t find them convincing. I was not alone. At first there was a trend towards moral realism in its “Cornell”, i.e. naturalist, style. In my book Moral Realism (1990) I didn’t take a stand on the naturalist/non-naturalist issue. I am now a decided non-naturalist realist. And today we may even speak of a trend towards non-naturalist moral realism (for example Derek Parfit, David Enoch, apart from myself).

Being a moral realist, I see normative ethics as a search for the truth about our obligations and a search for explanation; the idea is that moral principles can help us to a moral explanation of our particular obligations.

More here.

Literary Women Pay Homage to Zora Neale Hurston on Her 125th Birthday


Janelle Harris in The Root:

She was born in Notasulga, Ala., but she didn’t like the way her story started, so she rewrote it and claimed Eatonville, Fla., as her birthplace instead. She wasn’t too partial to 1891, the year her mother delivered her, so she remixed it, and for the rest of her life, she took liberties with the mathematics of her age, knocking as many as 10 years off if the notion felt good to her.

Zora Neale Hurston was a master of creative invention and reinvention, from the personal details of her own life to her artistic catalog, which included four novels, two books of folklore, an autobiography, and dozens of short stories, essays, articles and plays. She was an original black girl unboxed.

It’s appropriate today, on what would be Mother Zora’s 125th birthday, to honor the social and cultural freedoms she cleared for black female writers who stand on her platform and use our words to tell our own stories instead of allowing them to be told to and for us. She made it OK to be bold and conflicted, to wrestle with our identities and explore our differences as we chip away at the monolith, even to sometimes contradict ourselves and swerve, midaction, without apology.

Toni Morrison and Gloria Naylor, both literary geniuses, have credited Hurston as an inspiration, as do others, the famous and not so famous among us, who strip away pretense and dig into our personal wells of realness when we sit at a keyboard. We awe at the musicality of her prose and absorb what she said even in between the lines. This is what Hurston taught us, the black women creatives who came up in her shine.

You don’t need anybody’s permission to love who you uniquely are.

“My mother had a number of books from the canon of black women’s literature. Among them was I Love Myself When I Am Laughing … and Then Again When I Am Looking Mean and Impressive, Alice Walker’s anthology of Hurston’s work. Just the book cover and the quote did so much to shift my thinking of what it means to be a woman. Her whole damn self is inspiring, a woman who loved herself at a time when self-hatred was expected of her. I find her to be contrary, instructive, insightful, bold and a perfect guide of who I can be if I dare.” —Writer and painter Kiini Ibura Salaam

More here.

Is it Time for France to Abandon Laïcité?


Elizabeth Winkler in The New Republic:

As France marks the one-year anniversary of the terrorist attack on the offices of Charlie Hebdo, French officials are stepping up efforts to counter violent extremism. One measure involves widening police powers to conduct raids and detain suspected terrorists. The Supreme Court is reviewing a draft bill that would make these temporary, state-of-emergency tools, implemented after the November 2015 attacks on multiple sites in Paris, permanent. The state-backed Conseil Français du Culte Musulman (French Council of Islam) has also announced its intention to issue certificates to imams who acknowledge French values and demonstrate their non-radical credentials.

Some worry that such measures will play into the hands of the Islamic State and other extremist groups. Increasing police powers could endanger respect for civil liberties, while imposing governmental control of Islam in France could drive more Muslims to radical sects. It would arguably be more effective to focus efforts on improving the integration of Muslims, many of whom feel alienated in French society. This would involve examining educational and career opportunities for immigrants and their families—paths that will offer them upward mobility and a better chance to assimilate into the workforce. But it would also mean revisiting a pillar of France’s political and cultural identity: the policy of laïcité.

Laïcité is France’s principle of secularism in public affairs, aimed at fostering a post-religious society. It developed during the French Revolution, which sought to dismantle the Catholic Church in France along with the monarchy, and was enshrined in the 1905 law on the Separation of the Churches and the State. Broadly, the idea refers to the freedom of citizens and of public institutions from the influence of organized religion. (“Laïcité” derives from the French term for laity—non-clergy or lay people.)

It goes further than the separation of church and state in other nations, however, by prohibiting religious expression in the public sphere. In early 20th-century France—a fairly homogenous, Christian nation—this was a straightforward attempt to protect government from the sway of the Catholic Church. But in modern France—a decidedly more heterogeneous and multi-religious society—this insistence on secularism is thorny. As a critic argued in Le Figaro, laïcité is unintelligible and even shocking to many Muslims, who view it as “an injunction to abandon their religion.” Instead of enhancing social harmony, it may actually be exacerbating religious and racial tensions.

More here.

The Saudi-Iranian crisis reveals a deep power struggle in Tehran


Kamran Matin in The Conversation:

Ever since Saudi Arabia’s execution of Shia dissident Nimr al-Nimr was met with violent protests at the Saudi embassy in Iran, the two already hostile countries have been at diplomatic loggerheads. But while Saudi Arabia’s actions suggest a unity of purpose at the highest level, the Iranian reaction has not been uniform.

The Iranian government has severely criticised the attacks on Saudi Arabia’s diplomatic missions. President Hassan Rouhani attributed the attacks to “rogue elements” who “want to damage the dignity of the Islamic Republic”.

By contrast, the supreme leader, Ayatollah Khamenei, and officials close to his office such as Tehran’s leader of Friday prayers, have tacitly or openly supported the protesters.

The official media, meanwhile, is similarly divided, with reformist and pro-government newspapers and websites taking a critical but more measured line, while conservative media and those close to the security and intelligence establishments have adopted a more aggressive tone.

These conflicting reactions stem from the deep ambivalence at the core of the Iranian state, which combines centres of power both popular and divine.

That contradiction is reflected in the country’s official name: the ‘Islamic Republic’ of Iran. This means that different and often competing ideological factions are constantly trying to dominate the state and its vast economic resources, and to shape Iran’s strategic direction both internally and externally.

Such rivalries can become particularly intense at critical domestic junctures and produce unintended consequences.

More here.

Did dark matter kill the dinosaurs?

Lewis Dartnell in The Telegraph:

Dark matter is one of the most intensely studied subjects in particle physics and cosmology today. It’s certainly curious stuff, behaving in ways unlike anything in our everyday experience. Indeed, dark matter has never actually been isolated or studied in the lab. Its very existence has only been inferred indirectly by its influence on things we can see. But what is dark matter? When astronomers look at a rotating galaxy, they can work out the mass of the stars and gas clouds within it, and thus the strength of that galaxy’s gravity, as well as how fast it is spinning and so how much gravitational force is required to hold it all together. The problem is that the two figures don’t match. Galaxies like our own apparently contain far more mass than the visible matter of stars and nebulae, or else they would have flung themselves apart in their rapid twirl. The implication is that there must also be something unseen, dark matter, the gravity of which holds it all together. We infer dark matter, as Lisa Randall puts it rather wonderfully in Dark Matter and the Dinosaurs, in the same way we would deduce the invisible presence of a celebrity in the midst of an excited, jostling crowd.
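To make that mass mismatch concrete, here is the standard Newtonian back-of-the-envelope behind the rotation-curve argument (a sketch of the textbook reasoning, not a passage from Randall’s book): a star moving at speed $v$ in a roughly circular orbit of radius $r$ has its centripetal force supplied by the gravity of the mass $M(r)$ enclosed within its orbit, so

\[
\frac{v^2}{r} = \frac{G\,M(r)}{r^2}
\qquad\Longrightarrow\qquad
M(r) = \frac{v^2\,r}{G}.
\]

Measured rotation speeds stay roughly flat far beyond the visible disc, so the enclosed mass $M(r)$ keeps growing with $r$ long after the starlight runs out; that surplus is what astronomers attribute to dark matter.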

…The history of life on Earth is punctuated by mass extinctions, including that notorious asteroid or comet collision 65 million years ago. Various studies over recent decades have looked for a periodicity in the impact rate – the possibility that impacts on the Earth come in regular waves – and have hypothesised triggers, such as the elongated orbit of an unseen Planet X or perhaps a companion star to the Sun, dubbed Nemesis. A huge swarm of icy bodies, known as the Oort cloud, encircles the Sun at the very outer edges of the solar system; if some of these were to have their orbits disturbed by an external gravitational influence, they could swoop down into the inner solar system as comets, and potentially slam into the Earth. But none of these proposed triggers has stood up to further study. Could there be a link, Randall posits, between the rate of impacts on the Earth and dark matter in the galaxy? Our galaxy, the Milky Way, is a great rotating spiral of stars, and as our sun orbits the centre, it also bobs up and down through the flat plane of the galaxy: our solar system’s overall motion around the galaxy is like that of a fairground carousel horse. It is this motion, claims Randall, that could cause regular waves of impacts on Earth.

More here.

Obnoxiousness Is the New Charisma

Frank Bruni in The New York Times:

Making light of so much bloodshed, Cruz told Iowans the story of a Texas woman who was pulled over by a police officer. She supposedly informed the officer that she had a Glock affixed to her hip, a .38 revolver in one boot, a single-shot derringer in the other and a double-barrel shotgun under the seat. “Goodness gracious,” the officer said. “What on earth are you afraid of?” “Not a dang thing,” the woman responded. Cruz is unsettling enough in isolation, but it’s the combination of him and Trump that really galls. And it galls not just Democrats but other Republicans. “At some point, we have to deal with the fact that there are at least two candidates who could utterly destroy the Republican bench for a generation if they became the nominee,” Josh Holmes, a former chief of staff to the Senate majority leader, Mitch McConnell, told Politico’s Alex Isenstadt recently. In the current issue of Time magazine, David Von Drehle put it this way: “The G.O.P. has awakened less than a month from the Iowa caucuses and the New Hampshire primary to find itself in bed between a bombshell and a kamikaze.” It’s a king-size bed, and they’re all under an eiderdown of obnoxiousness. From the moment Trump announced his candidacy, he chose a potty mouth over a silver tongue, and a shocking number of Americans thrilled to that, regarding crudeness as the greatest form of candor. From the moment Cruz arrived in the United States Senate, he chose tirades over teamwork, becoming “so unpopular that at one point not a single Republican senator would support his demand for a roll-call vote,” The Times’s Jennifer Steinhauer wrote last month, adding that he was left “standing on the Senate floor like a man with bird flu, everyone scattering to avoid him.”

But what repelled Republican senators is somehow beckoning Republican voters: In a Gallup survey released on Friday, 61 percent of them said that they had a favorable impression of him, while only 16 percent said that they had an unfavorable one, giving him a “net favorable” rating of plus 45, the best in the Republican field. I guess bird flu is the new catnip. Many analysts explain all of this in terms of a potent anger among Americans. They say Trump and Cruz lend voice to it. But that’s not exactly right. Anger can have a noble dimension — as a response to injustice, as the grist for change — and neither Trump nor Cruz projects much nobility or tries to, for that matter. They’re not so much angry as petulant, impudent. When Trump tells rivals on the debate stage that they’re ugly or unpopular and when he ridicules a journalist’s disability, he’s not being angry. He’s just being a jerk. And when he crows incessantly about his deal-making genius, his billions and his poll numbers, he’s not stoking constructive passions. He’s just stroking himself.

More here.

Sunday Poem

I Called God

I called God but there's no
God. But there was
God because I called
God. And if there wasn't
God what
did I call? And I called, I called
God. And there was, for a split
second, and there will be. He won’t
die. As long as
I keep calling

God.

by Adi Assis
from Haneshek Haham (Firearms)
publisher: Helicon, Tel Aviv, 2009

Translation: 2015, Shelly Marder

Saturday, January 9, 2016

Bard-ji on the beach: Postcolonial artists write back to Shakespeare

Claire Chambers in Dawn:

Shakespeare’s 1603 tragedy Othello has long been ripe for adaptation and postcolonial rewritings. As the Pakistani novelist Zulfikar Ghose observes in his book This Mortal Knowledge, Othello is a truly noble man, in contrast to the calumny of “lascivious Moor” with which Iago taints him. In fact, if Othello has a fault, Ghose suggests that it is his “sexual frugality”, which leads him to make too great a distinction between body and spirit. This enables Iago to work on both Othello’s jealousy about his wife and on the “base racial instinct” Iago shares with his fellow white Venetians. The consequence is that a “beast with two backs” is created — not through sexual union but the conjoining of Desdemona and Othello in death. With its Molotov cocktail of false friendship, racism, military careers, and extreme sexual possessiveness, Othello proves irresistible to many artists from postcolonial backgrounds.

In 1966, the Sudanese author Tayeb Salih published the Arabic-language novel Mawsim al-Hijra ila al-Shamal. It was translated into English as Season of Migration to the North in 1969 and is now a Penguin Modern Classic. In this cornerstone text for postcolonialism, Salih depicts the cultural conflict that ensues when two rural Sudanese Muslims move to Britain and then return to Africa.

More here.

How Is the Economy Doing? It May Depend on Your Party, and $1


Neil Irwin in the NYT's The Upshot:

Did unemployment get better or worse during Ronald Reagan’s presidency? In a 1988 survey, some 80 percent of dedicated Republicans accurately said it had improved, compared with 30 percent of loyal Democrats. In the 1990s, the pattern reversed on a range of factual questions about economic and fiscal issues. In a 1997 survey, for example, Republicans were far less likely than Democrats to acknowledge that the budget deficit had declined during the Bill Clinton administration.

As an economics writer, I see the same thing anecdotally. When I wrote articles recently about the unemployment rate’s dip to 5 percent, I received vehement responses from conservatives convinced that the Obama administration was cooking the numbers. They were not so different from responses I received from liberals when the jobless rate was at that level in 2005, during the George W. Bush administration.

In other words, when you ask people about the economy, the answers are less a statement of objectivity and more like what they’d say if you’d asked which pro football team was the best. That has important implications for democracy. How can people judge whether a party is effective if there is no sense of objective truth? And it could even have implications for the economy itself if, for example, conservative-leaning business executives freeze hiring or investment when the president doesn’t share their politics.

But new research from two teams of political scientists adds a wrinkle to these findings. It turns out that the partisan bias in how people answer factual questions about the economy is diminished by this one weird trick: Pay people.

That is a conclusion reached in two new papers in The Quarterly Journal of Political Science, one from four scholars led by John G. Bullock at the University of Texas at Austin, the other by Markus Prior of Princeton and two colleagues.

When survey respondents were offered a small cash reward — a dollar or two — for producing a correct answer about the unemployment rate and other economic conditions, they were more likely to be accurate and less likely to produce an answer that fit their partisan biases.

More here.

Do We Need a New Constitutional Convention?


Richard Kreitner and Sanford Levinson in Boston Review:

When the country’s most prominent critic of the Constitution writes a commentary on the most famous defense of that Constitution, it is an event. When the publication of that commentary comes at a time when the system of government that Constitution provides is, by all accounts, under serious strain, it is an event very much worth noting.

Sanford Levinson’s An Argument Open To All: Reading The Federalist in the 21st Century is an engaging interpretation of the eighty-five original Federalist essays written in 1787 and 1788 by Alexander Hamilton, James Madison, and John Jay in support of the ratification of the new Constitution. For several decades now, in books such as Our Undemocratic Constitution: Where the Constitution Goes Wrong (And How We the People Can Correct It) (2005) and Framed: America’s 51 Constitutions and the Crisis of Governance (2012), Levinson has argued that nothing less than a second Constitutional Convention is sufficient to fix the original charter’s problems. In his new book, Levinson offers a close reading of each of the original Federalist essays in order to see what that classic tract can tell us about how government works more than two centuries later—or why, often, it does not.

In the following conversation, which has been condensed and edited for clarity, Levinson talks about what’s wrong with Elizabeth Warren’s claim that “the system is rigged,” the continuing centrality of fear to American politics and constitutionalism, and what Publius—the pen name Hamilton, Madison, and Jay used in The Federalist—would think of the 2016 presidential aspirants.
—Richard Kreitner

Kreitner: Why this book? And why now?

Levinson: The Federalist plays a curious role not only in the intellectual canon of American thought, but also in the popular canon. We have tickets to Hamilton in December and I was listening to the soundtrack last night; part of Hamilton is his participation in The Federalist.

But one of the things I’ve discovered, as I teach a course this semester at Harvard Law School, is that very few people have actually read The Federalist in toto. In that sense it may be like the Bible or other canonical documents that are much more often cited or evoked than actually read. Moreover, when I asked my students what they had read of The Federalist, only one person in my class had read the book in its entirety.

Another reason is that my previous book, Framed: America’s 51 Constitutions and the Crisis of Governance, reflected my growing belief that the Constitution, as is true of most constitutions, is really more important for the structures it establishes than for the rights it professes to protect. Over the years I’ve become more analytically sympathetic to Madison’s argument that rights protections tend to be what he called “parchment barriers,” rather than truly effective levees against the desires of politically established majorities to suppress rights.

More here.