Weddings & beheadings

Hanif Kureishi in Prospect Magazine:

I have gathered the equipment together and now I am waiting for them to arrive. They will not be long; they never are.

You don’t know me personally. My existence has never crossed your mind. But I would bet you’ve seen my work: it has been broadcast everywhere, on most of the news channels worldwide. Or at least parts of it have. You could find it on the internet, right now, if you really wanted to. If you could bear to look.

Not that you’d notice my style, my artistic signature or anything like that. I film beheadings, which are common in this war-broken city, my childhood home.

It was never my ambition, as a young man who loved cinema, to film such things. Nor was it my wish to do weddings, though there are fewer of those these days. Ditto graduations and parties. My friends and I have always wanted to make real films, with living actors and dialogue and jokes and music, as we began to do as students. Nothing like that is possible any more. Every day we are ageing, we feel shabby. The stories are there, waiting to be told; we’re artists. But this stuff, the death work, it has taken over.

More here.

Defiant Iran

Christopher de Bellaigue in the New York Review of Books:

At the beginning of 2002, President George W. Bush tried to punish Iran for supporting anti-Israel militants, for refusing to adopt a Western-style democracy, and for allegedly trying to produce weapons of mass destruction. He included Iran, along with Iraq and North Korea, in the “axis of evil.” Among foreign diplomats and journalists in Tehran, it became fashionable to speak of the coming “implosion” of the Islamic Republic, Iran’s revolutionary state. Weakened by a power struggle between reformists and conservative hard-liners, Iran was now, or so it was said, acutely vulnerable to the sort of threat that the United States, whose forces had easily toppled the Taliban and scattered al-Qaeda, seemed to represent.

The fear of intervention by the US in Iran became more urgent among Iran’s leaders when America invaded Iraq the following year. Indeed, it later became known that, in early 2003, the Iranian Foreign Ministry quietly sent Washington a detailed proposal for comprehensive negotiations, in which the Iranian government said it was prepared to make concessions about its nuclear program and to address concerns about its ties to groups such as Hezbollah and Islamic Jihad, in return for an agreement from the White House to refrain from destabilizing the Islamic Republic and start lifting long-in-effect sanctions. The US rejected this overture out of hand. It seemed that Bush didn’t want to offer guarantees to a regime that he intended, at a later date, to try to destroy.

Nowadays, it is hard to imagine the Iranian government repeating this sort of offer.

More here.

Human species ‘may split in two’

From the BBC:

Humanity may split into two sub-species in 100,000 years’ time as predicted by HG Wells, an expert has said.

Evolutionary theorist Oliver Curry of the London School of Economics expects a genetic upper class and a dim-witted underclass to emerge.

The human race would peak in the year 3000, he said – before a decline due to dependence on technology.

People would become choosier about their sexual partners, causing humanity to divide into sub-species, he added.

The descendants of the genetic upper class would be tall, slim, healthy, attractive, intelligent, and creative – a far cry from the “underclass” humans, who would have evolved into dim-witted, ugly, squat goblin-like creatures.

More here.

Finger Forecasts

Jennifer Huget in the Washington Post:

You’ve checked out your lover’s zodiac sign, and you know what her sleep number is. Before taking the next step, though, you might want to have a close look at her fingers.

A new study in a British medical journal finds a link between the relative length of a woman’s index and ring fingers and her athletic prowess. The research takes its place among dozens of other studies tying that ratio — known in finger-measurement circles as 2D:4D (the relationship between the length of the second digit, 2D, and the fourth) — to all manner of physical and psychological traits, from breast cancer risk to schizophrenia.

Digit ratio, as the measurement is called, has been found to relate to left-handedness and autism, to hyperactivity and bullying in children, to eating disorders in women and to depression in men.

More here.

A Manifesto for American Liberals

In response to Tony Judt’s recent piece on the “Strange Death of Liberal America” in the LRB, Bruce Ackerman and Todd Gitlin have issued a manifesto for liberals, which has also been signed by a number of prominent liberal intellectuals, including Kenneth Arrow, Joshua Cohen, Robert A. Dahl, James K. Galbraith, Arlie Hochschild, Margaret Levi, Robert B. Reich, Elaine Scarry, and Charles Tilly. (Via Crooked Timber.) In The American Prospect:

As right-wing politicians and pundits call us stooges for Osama bin Laden, Tony Judt charges, in a widely discussed and heatedly debated essay in the London Review of Books, that American liberals — without distinction — have “acquiesced in President Bush’s catastrophic foreign policy.” Both claims are nonsense on stilts.

Clearly this is a moment for liberals to define ourselves. The important truth is that most liberals, including the undersigned, have stayed our course throughout these grim five years. We have consistently and publicly repudiated the ruinous policies of the Bush administration, and our diagnosis, alas, has been vindicated by events. The Bush debacle is a direct consequence of its repudiation of liberal principles. And if the country is to recover, we should begin by restating these principles.

We have all opposed the Iraq war as illegal, unwise, and destructive of America’s moral standing. This war fueled, and continues to fuel, jihadis whose commitment to horrific, unjustifiable violence was amply demonstrated by the September 11 attacks as well as the massacres in Spain, Indonesia, Tunisia, Great Britain, and elsewhere. Rather than making us safer, the Iraq war has endangered the common security of Americans and our allies.

Working invisibility cloak created at last

Justin Mullins in New Scientist:

An invisibility cloak that works in the microwave region of the electromagnetic spectrum has been unveiled by researchers in the US. The device is the first practical version of a theoretical set-up first suggested in a paper published earlier in 2006.

The cloak works by steering microwave light around an object, making it appear to an observer as if it were not there at all. Materials that bend light in this way do not exist naturally, so have to be engineered with the necessary optical properties.

Earlier in 2006, John Pendry, a theoretical physicist at Imperial College London, UK, and colleagues showed how such an invisibility cloak could, in theory, be made (see Physicists draw up plans for real ‘cloaking device’). Now David Smith and colleagues at Duke University in North Carolina, US, have proved the idea works.

In recent years, materials scientists have made rapid progress in making so-called “metamaterials”, which can have exotic electromagnetic properties unseen in nature. These are made up of repeating structures of simple electronic components such as capacitors and inductors.

In 2001, Smith built a metamaterial with a negative refractive index, which bends microwaves in a way impossible for ordinary lenses. Now he has gone one step further.

More here.

The Revolution that Never Came

Shadi Hamid in Qahwa Sada:

The long-awaited “Arab spring” had arrived. Or so it appeared. On January 20, 2005, President George W. Bush declared in his inaugural address that “all who live in tyranny and hopelessness can know: the United States will not ignore your oppression, or excuse your oppressors. When you stand for your liberty, we will stand with you.” Less than two weeks later, the world stood in collective awe, as Iraqis braved terrorist threats to cast their ballots for the first time in their lives. For those who had been waiting decades to see something as simple as a free election, the moment was moving and emotional. Not long after, in March, former Lebanese prime minister Rafiq Hariri was killed. A nation grieved as it witnessed, yet again, a visionary figure cut down by the scourge of terror. Lebanon erupted in grief and then anger as close to one million Lebanese demanded self-determination on the streets of their war-torn capital. Then, in April, 50,000 Bahrainis – one-eighth of the total population – rallied for constitutional reform.

For a short while, it seemed that the Middle East was witnessing “a democratic moment,” one that would, in due time, render the region’s haunting past (and present) of tyranny a distant memory. However, it was not to be.

More here.

A Dangerous New Order

Editorial in today’s New York Times:

Once President Bush signed the new law on military tribunals, administration officials and Republican leaders in Congress wasted no time giving Americans a taste of the new order created by this unconstitutional act.

Within hours, Justice Department lawyers notified the federal courts that they no longer had the authority to hear pending lawsuits filed by attorneys on behalf of inmates of the penal camp at Guantánamo Bay. They cited passages in the bill that suspend the fundamental principle of habeas corpus, making Mr. Bush the first president since the Civil War to take that undemocratic step.

Not satisfied with having won the vote, Dennis Hastert, the speaker of the House, quickly issued a statement accusing Democrats who opposed the Military Commissions Act of 2006 of putting “their liberal agenda ahead of the security of America.” He said the Democrats “would gingerly pamper the terrorists who plan to destroy innocent Americans’ lives” and create “new rights for terrorists.”

This nonsense is part of the Republicans’ scare-America-first strategy for the elections.

More here.

nutty professors

Anyone who has ever taught at a college or university must have had this experience. You’re in the middle of something that you do every day: standing at a lectern in a dusty room, for example, lecturing to a roomful of teen-agers above whom hang almost visible clouds of hormones; or running a seminar, hoping to find the question that will make people talk even though it’s spring and no one has done the reading; or sitting in a department meeting as your colleagues act out their various professional identities, the Russian historians spreading gloom, the Germanists accidentally taking Poland, the Asianists grumbling about Western ignorance and lack of civility, and the Americanists expressing surprise at the idea that the world has other continents. Suddenly, you find yourself wondering, like Kingsley Amis’s Lucky Jim, how you can possibly be doing this. Why, in the age of the World Wide Web, do professors still stand at podiums and blather for fifty minutes at unruly mobs of students, their lowered baseball caps imperfectly concealing the sleep buds that rim their eyes? Why do professors and students put on polyester gowns and funny hats and march, once a year, in the uncertain glory of the late spring? Why, when most of our graduate students are going to work as teachers, do we make them spend years grinding out massive, specialized dissertations, which, when revised and published, may reach a readership that numbers in the high two figures? These activities seem both bizarre and disconnected, from one another and from modern life, and it’s no wonder that they often provoke irritation, not only in professional pundits but also in parents, potential donors, and academic administrators.

more from The New Yorker here.

pollock as post-pollock

SOMETIMES THE SMALLEST things create the most arresting aesthetic experiences—an observation resoundingly reconfirmed for me at “No Limits, Just Edges,” the Jackson Pollock works-on-paper exhibition recently on view at the Solomon R. Guggenheim Museum in New York (and before that at the Guggenheim Foundation’s outposts in Berlin and Venice). As I walked through the show’s expansive last room, my eyes gravitated, almost magnetically, to the lower right-hand corner of an untitled 1951 drawing, where, beneath the slashing arrows and scrawled numerals soaked into the fibers of the absorbent Japanese paper Pollock favored that year, lay one of the artist’s most remarkable, if diminutive, passages: the letters P-o-l-l-o-c-k fashioned out of his trademark drips. I have long had a special interest in post-1950 Pollock, and although I was familiar with this particular work, the crystal-clear logic with which the artist applied his signature style to his signature itself remained striking. Indeed, the dripped signature, strangely, seemed less the result of an artist’s simply working within his own given mode than an act of self-conscious appropriation. That is, the way Pollock used his painterly mark to play on the technique he made famous looked almost like one artist parodying another’s style. Here, at the crucial juncture of his career, when he was moving beyond the dripped abstractions so indelibly associated with his name, Pollock seemed to step outside himself, to begin to address issues of artistic authorship and individual style with an amazing acuity and critical distance. This sly gesture, which is, in fact, typical of Pollock in these years and yet very much at odds with the popularly accepted image of him as an unintellectual, intuitive shaman, reminded me again of how unexplored the artist’s late works are, even now, on the fiftieth anniversary of his death.

more from Artforum here.

kierkegaard v. andersen

At first sight it seems unlikely that Hans Christian Andersen could have much to do with Søren Kierkegaard, beyond the fact that both of them lived in nineteenth-century Copenhagen, and that they are the only Danish authors who are famous outside Denmark. What do fairy tales have to do with philosophy? What could the creator of “The Emperor’s New Clothes”, “The Ugly Duckling”, “The Little Mermaid” or “The Snow Queen” have in common with the author of Either/Or, Fear and Trembling, The Concept of Anxiety and Sickness unto Death? And what could connect the most popular storyteller in history, whose 200th anniversary last year was celebrated all round the world, to the grim Lutheran who came fourteenth in a BBC poll to find our favourite philosopher, with a derisory 1.65 per cent of the vote?

more from the TLS here.

Written out of history

From Guardian:

Ted Hughes’s wife, Sylvia Plath, famously killed herself. But what of his mistress, who four years later did the same? For the first time, Yehuda Koren and Eilat Negev tell the story of the woman whom the poet tried to hide. In May 1962, Assia and her third husband, the Canadian poet David Wevill, were invited to spend a weekend with Plath and Hughes, who were then living in the village of North Tawton in Devon. It was on that weekend, as Hughes later wrote in a poem, that “The dreamer in me fell in love with her”. Six weeks passed before he and Wevill met alone for the first time, when he came to London for a meeting at the BBC.

But Plath was quick to discover the budding affair. She ordered him out, and he was happy to comply. The following day he knocked on the Wevills’ door carrying four bottles of champagne. Wevill made no secret of Hughes’s ferocious lovemaking among her office friends. Equally repelled and fascinated, she told Edward Lucie-Smith, “You know, in bed he smells like a butcher.” In the next two months he shuttled between the two women.

In mid-September he and Plath took a holiday in Ireland. On the fourth day he disappeared. His whereabouts remained a mystery not only to Plath but also to subsequent biographers and scholars. However, in our research we discovered that when Hughes embarked on the Irish trip, he already had a ticket to another destination. Ditching Plath in Ireland, he hurried to London to meet Wevill, and the two of them headed south for a 10-day fiesta in Spain. He and Plath had spent their honeymoon there, and she hated the country. For him and Wevill, the trip was a delight, providing them with a creative boost: a film script that they had started writing together.

When he returned home, Hughes had a terrible row with Plath; he refused to give up his mistress and left for London permanently. Two months later, Plath moved to London as well. Hughes and Wevill were no longer making a secret of their affair. They were seen everywhere, so much so that many people mistakenly thought that they were actually living together.

On February 11 1963, Plath ended her life. Two days later, Myers came for a condolence visit and found Wevill resting in Plath’s bed. A month later Hughes and Wevill decided to abort the child that Wevill was carrying.

More here.

Asleep at the Memory Wheel

From Science:

Neuroscientist Matthew Walker of Harvard University and his colleagues paid 10 undergraduate students to forgo a night’s sleep. The next day, the students viewed a series of 30 words, and two days later–after having two nights to catch up on their sleep–the students returned to the lab and took a test to see how well they remembered the words they’d seen.

The students recalled about 40% fewer words overall than a group of 10 students who had slept normally, Walker reported here yesterday at the annual meeting of the Society for Neuroscience. But the researchers also found that the emotional content of the words made a big difference in what people remembered. Previous studies have found that both positive and negative emotions bolster memory, but in the current study, negatively charged words (such as cancer or jail) seemed to penetrate the sleep-deprived brain more deeply than positive ones (such as happy or sunshine). Indeed, sleep-deprived students were only 19% worse than their well-rested counterparts at remembering negative words, but 59% worse for positive words. Walker suspects the difference may reflect an evolutionary safeguard against forgetting potential threats.

More here.

Wednesday, October 18, 2006

A friend is one who stabs you in the front

However desirable it is to have neat definitions of important ideas, the fact is that most of them are too internally complex to be caught in a formula. “Friendship” is one such. There are many kinds of friendship, achieved by many different routes, and the most they have in common is that – somewhere in the ideal version of them – loyalty, sympathy and affection standardly figure.

Not everyone agrees that friendship is the summit of human relationship. Literature and the movies conspire to give this place to romantic love, while another convention yields the distinction to parent-child relationships. But each of these is successful only if it matures into friendship at last, which is why sages of quite different traditions extol friendship as the highest, the most central, the most necessary link in the social web. Given that humans are essentially social beings, friendship thus turns out to be a defining component of life worth living.

It is an interesting coincidence, and perhaps more, that both Mencius in ancient China and Aristotle in Greece taught that a friend is “another self”. If one cares fully about another person, they said, his good matters as much to oneself as one’s own: so a pair of true friends are “one mind in two bodies”.

more from AC Grayling at the Financial Times here.

the masters of doomsday décor

If the international situation has you fretting about Armageddon, cheer up: It turns out the apocalypse is going to be great fun, after all. At least that’s the vision according to art installations currently in Chelsea. With shows that inaugurate their respective dealers’ new or expanded galleries, Matthew Ritchie’s takes its title, “The Universal Adversary,” from our government’s collective term for worst-case-scenario crisis prediction, while Barnaby Furnas explodes his trademark motif of shed blood to biblically epic proportions.

Their art is as photogenic as the glossies frequently prove the young art stars themselves to be – for all the portentousness of their subject matter, neither prophet is a grizzled old man with a beard. Cheerful palette, sprightly markmaking, sumptuous overload and dexterous skill are the pervasive qualities of both exhibitions. These are the masters of doomsday décor.

more from Artcritical here.

dawkins: a bit glib, but insightful

“He’s a brilliant man,” one of my colleagues once said of Richard Dawkins, “but so impolite.” I agree, but think he chose the wrong conjunction: If I had to identify Dawkins’ cardinal virtues, I would say that he is brilliant, articulate, impassioned and impolite. As Emerson famously said, “Your goodness must have some edge to it — else it is none.” “The God Delusion” is a fine and significant book, and this is largely due to Dawkins’ willingness to employ the sharp edges of his intellect to cut through a paralyzing propriety whose main effect is to stifle conversations — about religion, about intellectual responsibility, about politics — that we very much need, at this particular moment in our history, to be having.

Some will accuse Dawkins of being not just impolite but also intolerant. He is indeed a kind of crusading atheist, and makes no bones about his opposition not just to religious extremism but also to all species of religious faith — a phenomenon he regards as fundamentally irrational and deeply dangerous.

more from the SF Chronicle Review here.

The Wages of Whiteness

Via Belle Waring over at Crooked Timber, a new study by Joni Hersch at Vanderbilt Law School suggests that a very classic and vulgar racism is alive and well. In the Washington Post:

Vanderbilt University economist Joni Hersch found that legal immigrants to the United States who had darker complexions or were shorter earned less money than their fair-skinned or taller counterparts with similar jobs, training and backgrounds. Even swarthy whites from abroad earned less than those with lighter skin.

Immigrants with the lightest complexions earned, on average, about 8 to 15 percent more than those with the darkest skin tone after controlling for race and country of origin as well as for other factors related to earnings, including occupation, education, language skills, work history, type of visa and whether they were married to a U.S. citizen.

In fact, Hersch estimated that the negative impact of skin tone on earnings was equal to the benefit of education, with a particularly dark complexion virtually wiping out the advantage of education on earnings.

Hersch’s paper can be found here.

Economics and Evolution

In Scientific American, Stuart Kauffman on why economics should be inspired more by biology than by physics.

As economics attempts to model increasingly complicated phenomena, however, it would do well to shift its attention from physics to biology, because the biosphere and the living things in it represent the most complex systems known in nature. In particular, a deeper understanding of how species adapt and evolve may bring profound–even revolutionary–insights into business adaptability and the engines of economic growth.

One of the key ideas in modern evolutionary theory is that of preadaptation. The term may sound oxymoronic, but its significance is perfectly logical: every feature of an organism, in addition to its obvious functional characteristics, has others that could become useful in totally novel ways under the right circumstances. The forerunners of air-breathing lungs, for example, were swim bladders with which fish maintained their equilibrium; as some fish began to move onto the margins of land, those bladders acquired a new utility as reservoirs of oxygen. Biologists say that those bladders were preadapted to become lungs. Evolution innovates nonalgorithmically, in ways that cannot be prestated, by drafting and recombining existing entities for new purposes – shifting them from their existing function to some adjacent novel function – rather than inventing features from scratch.

Mexico’s Institutional Crisis

In the New Left Review, Al Giordano on the Mexican Presidential elections.

For Mexicans, the events of this summer inevitably recalled another stolen election, eighteen years ago. In July 1988, Cuauhtémoc Cárdenas—son of the populist president Lázaro Cárdenas (1934–40), who had instituted land reforms and nationalized oil—ran for the presidency against the PRI’s Carlos Salinas de Gortari. Cárdenas and his left-reformist supporters within the party had broken from the PRI in 1987, having despaired of reforming the priísta machine from within. Together with former PRI chairman Porfirio Muñoz Ledo and a range of small left parties, he founded the National Democratic Front (FDN) early in 1988 to contest that year’s election. When the returns came in on July 6th, Cárdenas was in the lead: the 55 per cent of tally sheets in the possession of FDN poll workers showed Cárdenas with 40 per cent to Salinas’s 36; government tabulations showed similar results. But then came the moment that has defined public responses to the current electoral crisis: the PRI interior minister announced on national TV that the vote-counting computer had crashed. When the system was back up again later that night, suddenly Salinas was ahead.

Millions took to the streets to protest the fraud. The PRI regime flatly refused to make the remaining precinct tally sheets public, but when 30,000 ballots marked for Cárdenas were found dumped in rivers and forests in the southern state of Guerrero, popular anger erupted. During a demonstration in the Zócalo attended by upwards of three million people, some of Cárdenas’s aides pressed him to seize the National Palace. But he recoiled from such a radical course, opting to negotiate with Salinas in private. In exchange for some concessions, including the formation in 1990 of the Federal Electoral Institute, Cárdenas dropped his challenge, prompting bitter divisions within the FDN that continue to haunt the party formed from its demoralized components in 1989, the PRD.

Should the Nobel Peace Prize Take a Break?

The Economist suggests that the Nobel Peace Prize might want to take a hiatus.

Withholding the prize for a year, or possibly five, might seem rather callous. But the institute would not be suggesting that the world has become sufficiently peaceful now. Some do argue that wars are generally in decline. Last year a think-tank in Canada released a “Human Security Report” which noted that 100-odd wars have expired since 1988. Their study found that wars and genocides have become less frequent since 1991, that the value of the international arms trade has slumped by a third (between 1990 and 2003), and that refugee numbers have roughly halved (between 1992 and 2003). Yet, despite all that, there are clearly enough problems today—Darfur, Sri Lanka, Somalia, Afghanistan, Iraq, international terrorism—to keep the hardest-working peace promoters busy.

The reason for the institute to withhold the prize, instead, would be to preserve its value. There is a risk that its worth is being eroded as the institute scrambles to find an eye-catching recipient every year. There is the problem of Buggins’s turn, an expectation (as with some other prizes) that the award should rotate between regions of the world. This year it is Asia, last year the recipient was from the Middle East, the year before from Africa.