A Manifesto for American Liberals

In response to Tony Judt's recent piece on the "Strange Death of Liberal America" in the LRB, Bruce Ackerman and Todd Gitlin have issued a manifesto for liberals, which has also been signed by a number of prominent liberal intellectuals, including Kenneth Arrow, Joshua Cohen, Robert A. Dahl, James K. Galbraith, Arlie Hochschild, Margaret Levi, Robert B. Reich, Elaine Scarry, and Charles Tilly. (Via Crooked Timber.) In The American Prospect:

As right-wing politicians and pundits call us stooges for Osama bin Laden, Tony Judt charges, in a widely discussed and heatedly debated essay in the London Review of Books, that American liberals — without distinction — have “acquiesced in President Bush’s catastrophic foreign policy.” Both claims are nonsense on stilts.

Clearly this is a moment for liberals to define ourselves. The important truth is that most liberals, including the undersigned, have stayed our course throughout these grim five years. We have consistently and publicly repudiated the ruinous policies of the Bush administration, and our diagnosis, alas, has been vindicated by events. The Bush debacle is a direct consequence of its repudiation of liberal principles. And if the country is to recover, we should begin by restating these principles.

We have all opposed the Iraq war as illegal, unwise, and destructive of America’s moral standing. This war fueled, and continues to fuel, jihadis whose commitment to horrific, unjustifiable violence was amply demonstrated by the September 11 attacks as well as the massacres in Spain, Indonesia, Tunisia, Great Britain, and elsewhere. Rather than making us safer, the Iraq war has endangered the common security of Americans and our allies.



Working invisibility cloak created at last

Justin Mullins in New Scientist:

An invisibility cloak that works in the microwave region of the electromagnetic spectrum has been unveiled by researchers in the US. The device is the first practical version of a theoretical set-up first suggested in a paper published earlier in 2006.

The cloak works by steering microwave light around an object, making it appear to an observer as if it were not there at all. Materials that bend light in this way do not exist naturally, so have to be engineered with the necessary optical properties.

Earlier in 2006, John Pendry, a theoretical physicist at Imperial College London, UK, and colleagues showed how such an invisibility cloak could, in theory, be made (see Physicists draw up plans for real ‘cloaking device’). Now David Smith and colleagues at Duke University in North Carolina, US, have proved the idea works.

In recent years, materials scientists have made rapid progress in making so-called “metamaterials”, which can have exotic electromagnetic properties unseen in nature. These are made up of repeating structures of simple electronic components such as capacitors and inductors.

In 2001, Smith built a metamaterial with a negative refractive index, which bends microwaves in a way impossible for ordinary lenses. Now he has gone one step further.
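
To see what a negative index means in practice, here is a minimal sketch, not from the article, applying Snell's law (n1 sin θ1 = n2 sin θ2) with an illustrative n2 < 0: the refracted ray emerges on the same side of the surface normal as the incoming ray, something no ordinary lens material can manage.

    import math

    def refraction_angle(n1, n2, incident_deg):
        """Snell's law: n1*sin(t1) = n2*sin(t2). Returns the refraction
        angle in degrees; a negative result means the ray bends to the
        *same* side of the surface normal as the incident ray."""
        s = n1 * math.sin(math.radians(incident_deg)) / n2
        if abs(s) > 1:
            return None  # total internal reflection, no transmitted ray
        return math.degrees(math.asin(s))

    # Ordinary glass (n = 1.5) versus a hypothetical negative-index
    # metamaterial (n = -1.5), both hit at 30 degrees from air (n = 1).
    for n2 in (1.5, -1.5):
        print(n2, refraction_angle(1.0, n2, 30.0))
    # n =  1.5 -> about +19.5 degrees (normal refraction)
    # n = -1.5 -> about -19.5 degrees (bent to the "wrong" side)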

More here.  [Image for illustrative purposes only.]

The Revolution that Never Came

Shadi Hamid in Qahwa Sada:

The long-awaited "Arab spring" had arrived. Or so it appeared. On January 20, 2005, President George W. Bush declared in his inaugural address that "all who live in tyranny and hopelessness can know: the United States will not ignore your oppression, or excuse your oppressors. When you stand for your liberty, we will stand with you." Less than two weeks later, the world stood in collective awe, as Iraqis braved terrorist threats to cast their ballots for the first time in their lives. For those who had been waiting decades to see something as simple as a free election, the moment was moving and emotional. Not long after, on February 14, former Lebanese prime minister Rafiq Hariri was killed. A nation grieved as it witnessed, yet again, a visionary figure cut down by the scourge of terror. Lebanon erupted in grief and then anger as close to one million Lebanese demanded self-determination on the streets of their war-torn capital. Then, in April, 50,000 Bahrainis – one-eighth of the total population – rallied for constitutional reform.

For a short while, it seemed that the Middle East was witnessing “a democratic moment,” one that would, in due time, render the region’s haunting past (and present) of tyranny a distant memory. However, it was not to be.

More here.

A Dangerous New Order

Editorial in today’s New York Times:

Once President Bush signed the new law on military tribunals, administration officials and Republican leaders in Congress wasted no time giving Americans a taste of the new order created by this unconstitutional act.

Within hours, Justice Department lawyers notified the federal courts that they no longer had the authority to hear pending lawsuits filed by attorneys on behalf of inmates of the penal camp at Guantánamo Bay. They cited passages in the bill that suspend the fundamental principle of habeas corpus, making Mr. Bush the first president since the Civil War to take that undemocratic step.

Not satisfied with having won the vote, Dennis Hastert, the speaker of the House, quickly issued a statement accusing Democrats who opposed the Military Commissions Act of 2006 of putting “their liberal agenda ahead of the security of America.” He said the Democrats “would gingerly pamper the terrorists who plan to destroy innocent Americans’ lives” and create “new rights for terrorists.”

This nonsense is part of the Republicans’ scare-America-first strategy for the elections.

More here.

nutty professors

Anyone who has ever taught at a college or university must have had this experience. You’re in the middle of something that you do every day: standing at a lectern in a dusty room, for example, lecturing to a roomful of teen-agers above whom hang almost visible clouds of hormones; or running a seminar, hoping to find the question that will make people talk even though it’s spring and no one has done the reading; or sitting in a department meeting as your colleagues act out their various professional identities, the Russian historians spreading gloom, the Germanists accidentally taking Poland, the Asianists grumbling about Western ignorance and lack of civility, and the Americanists expressing surprise at the idea that the world has other continents. Suddenly, you find yourself wondering, like Kingsley Amis’s Lucky Jim, how you can possibly be doing this. Why, in the age of the World Wide Web, do professors still stand at podiums and blather for fifty minutes at unruly mobs of students, their lowered baseball caps imperfectly concealing the sleep buds that rim their eyes? Why do professors and students put on polyester gowns and funny hats and march, once a year, in the uncertain glory of the late spring? Why, when most of our graduate students are going to work as teachers, do we make them spend years grinding out massive, specialized dissertations, which, when revised and published, may reach a readership that numbers in the high two figures? These activities seem both bizarre and disconnected, from one another and from modern life, and it’s no wonder that they often provoke irritation, not only in professional pundits but also in parents, potential donors, and academic administrators.

more from The New Yorker here.

pollock as post-pollock

SOMETIMES THE SMALLEST things create the most arresting aesthetic experiences—an observation resoundingly reconfirmed for me at “No Limits, Just Edges,” the Jackson Pollock works-on-paper exhibition recently on view at the Solomon R. Guggenheim Museum in New York (and before that at the Guggenheim Foundation’s outposts in Berlin and Venice). As I walked through the show’s expansive last room, my eyes gravitated, almost magnetically, to the lower right-hand corner of an untitled 1951 drawing, where, beneath the slashing arrows and scrawled numerals soaked into the fibers of the absorbent Japanese paper Pollock favored that year, lay one of the artist’s most remarkable, if diminutive, passages: the letters P-o-l-l-o-c-k fashioned out of his trademark drips. I have long had a special interest in post-1950 Pollock, and although I was familiar with this particular work, the crystal-clear logic with which the artist applied his signature style to his signature itself remained striking. Indeed, the dripped signature, strangely, seemed less the result of an artist’s simply working within his own given mode than an act of self-conscious appropriation. That is, the way Pollock used his painterly mark to play on the technique he made famous looked almost like one artist parodying another’s style. Here, at the crucial juncture of his career, when he was moving beyond the dripped abstractions so indelibly associated with his name, Pollock seemed to step outside himself, to begin to address issues of artistic authorship and individual style with an amazing acuity and critical distance. This sly gesture, which is, in fact, typical of Pollock in these years and yet very much at odds with the popularly accepted image of him as an unintellectual, intuitive shaman, reminded me again of how unexplored the artist’s late works are, even now, on the fiftieth anniversary of his death.

more from Artforum here.

kierkegaard v. andersen

At first sight it seems unlikely that Hans Christian Andersen could have much to do with Søren Kierkegaard, beyond the fact that both of them lived in nineteenth-century Copenhagen, and that they are the only Danish authors who are famous outside Denmark. What do fairy tales have to do with philosophy? What could the creator of "The Emperor's New Clothes", "The Ugly Duckling", "The Little Mermaid" or "The Snow Queen" have in common with the author of Either/Or, Fear and Trembling, The Concept of Anxiety and The Sickness unto Death? And what could connect the most popular storyteller in history, whose 200th anniversary last year was celebrated all round the world, to the grim Lutheran who came fourteenth in a BBC poll to find our favourite philosopher, with a derisory 1.65 per cent of the vote?

more from the TLS here.

Written out of history

From Guardian:

Ted Hughes's wife, Sylvia Plath, famously killed herself. But what of his mistress, who four years later did the same? For the first time, Yehuda Koren and Eilat Negev tell the story of the woman that the poet tried to hide. In May 1962, Assia and her third husband, the Canadian poet David Wevill, were invited to spend a weekend with Plath and Hughes, who were then living in the village of North Tawton in Devon. It was on that weekend, as Hughes later wrote in a poem, that "The dreamer in me fell in love with her". Six weeks passed before he and Wevill met alone for the first time, when he came to London for a meeting at the BBC.

But Plath was quick to discover the budding affair. She ordered him out, and he was happy to comply. The following day he knocked on the Wevills' door carrying four bottles of champagne. Among her office friends, Wevill made no secret of Hughes's ferocious lovemaking. Equally repelled and fascinated, she told Edward Lucie-Smith, "You know, in bed he smells like a butcher." In the next two months he shuttled between the two women.

In mid-September he and Plath took a holiday in Ireland. On the fourth day he disappeared. His whereabouts remained a mystery not only to Plath but also to subsequent biographers and scholars. However, in our research we discovered that when Hughes embarked on the Irish trip, he already had a ticket to another destination. Ditching Plath in Ireland, he hurried to London to meet Wevill, and the two of them headed south for a 10-day fiesta in Spain. He and Plath had spent their honeymoon there, and she hated the country. For him and Wevill, the trip was a delight, providing them with a creative boost: a film script that they had started writing together.

When he returned home, Hughes had a terrible row with Plath; he refused to give up his mistress and left for London permanently. Two months later, Plath moved to London as well. Hughes and Wevill were no longer making a secret of their affair. They were seen everywhere, so much so that many people mistakenly thought that they were actually living together.

On February 11 1963, Plath ended her life. Two days later, Myers came for a condolence visit and found Wevill resting in Plath’s bed. A month later Hughes and Wevill decided to abort the child that Wevill was carrying.

More here.

Asleep at the Memory Wheel

From Science:

Neuroscientist Matthew Walker of Harvard University and his colleagues paid 10 undergraduate students to forgo a night's sleep. The next day, the students viewed a series of 30 words, and two days later–after having two nights to catch up on their sleep–the students returned to the lab and took a test to see how well they remembered the words they'd seen.

The students recalled about 40% fewer words overall than a group of 10 students who had slept normally, Walker reported here yesterday at the annual meeting of the Society for Neuroscience. But the researchers also found that the emotional content of the words made a big difference in what people remembered. Previous studies have found that both positive and negative emotions bolster memory, but in the current study, negatively charged words (such as cancer or jail) seemed to penetrate the sleep-deprived brain more deeply than positive ones (such as happy or sunshine). Indeed, sleep-deprived students were only 19% worse than their well-rested counterparts at remembering negative words, but 59% worse for positive words. Walker suspects the difference may reflect an evolutionary safeguard against forgetting potential threats.
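
The percentages here are relative deficits of the form (rested - deprived) / rested. A tiny sketch with made-up recall counts, not Walker's actual data, shows how figures like 19% and 59% would be computed:

    # Hypothetical recall counts out of 30 studied words, invented only
    # to illustrate the arithmetic behind the reported percentages.
    rested   = {"negative": 16, "positive": 17}
    deprived = {"negative": 13, "positive": 7}

    for kind in rested:
        deficit = (rested[kind] - deprived[kind]) / rested[kind]
        print(f"{kind}: {deficit:.0%} worse when sleep-deprived")
    # negative: 19% worse; positive: 59% worse, roughly as reported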

More here.

Wednesday, October 18, 2006

A friend is one who stabs you in the front

However desirable it is to have neat definitions of important ideas, the fact is that most of them are too internally complex to be caught in a formula. “Friendship” is one such. There are many kinds of friendship, achieved by many different routes, and the most they have in common is that – somewhere in the ideal version of them – loyalty, sympathy and affection standardly figure.

Not everyone agrees that friendship is the summit of human relationship. Literature and the movies conspire to give this place to romantic love, while another convention yields the distinction to parent-child relationships. But each of these is successful only if it matures into friendship at last, which is why sages of quite different traditions extol friendship as the highest, the most central, the most necessary link in the social web. Given that humans are essentially social beings, friendship thus turns out to be a defining component of life worth living.

It is an interesting coincidence, and perhaps more, that both Mencius in ancient China and Aristotle in Greece taught that a friend is “another self”. If one cares fully about another person, they said, his good matters as much to oneself as one’s own: so a pair of true friends are “one mind in two bodies”.

more from AC Grayling at the Financial Times here.

the masters of doomsday décor

If the international situation has you fretting about Armageddon, cheer up: it turns out the apocalypse is going to be great fun, after all. At least that's the vision according to art installations currently on view in Chelsea. With shows that inaugurate their respective dealers' new or expanded galleries, Matthew Ritchie's takes its title, "The Universal Adversary," from our government's collective term for worst-case-scenario crisis prediction, while Barnaby Furnas explodes his trademark motif of shed blood to biblically epic proportions.

Their art is as photogenic as the glossies frequently prove the young art stars themselves to be—for all the portentousness of their subject matter, neither prophet is a grizzled old man with a beard. Cheerful palette, sprightly mark-making, sumptuous overload and dexterous skill are the pervasive qualities of both exhibitions. These are the masters of doomsday décor.

more from Artcritical here.

dawkins: a bit glib, but insightful

“He’s a brilliant man,” one of my colleagues once said of Richard Dawkins, “but so impolite.” I agree, but think he chose the wrong conjunction: If I had to identify Dawkins’ cardinal virtues, I would say that he is brilliant, articulate, impassioned and impolite. As Emerson famously said, “Your goodness must have some edge to it — else it is none.” “The God Delusion” is a fine and significant book, and this is largely due to Dawkins’ willingness to employ the sharp edges of his intellect to cut through a paralyzing propriety whose main effect is to stifle conversations — about religion, about intellectual responsibility, about politics — that we very much need, at this particular moment in our history, to be having.

Some will accuse Dawkins of being not just impolite but also intolerant. He is indeed a kind of crusading atheist, and makes no bones about his opposition not just to religious extremism but also to all species of religious faith — a phenomenon he regards as fundamentally irrational and deeply dangerous.

more from SF Chronicle Review here.

The Wages of Whiterness

Via Belle Waring over at Crooked Timber, a new study by Joni Hersch of Vanderbilt Law School suggests that a classic and vulgar form of racism is alive and well. In the Washington Post:

Vanderbilt University economist Joni Hersch found that legal immigrants to the United States who had darker complexions or were shorter earned less money than their fair-skinned or taller counterparts with similar jobs, training and backgrounds. Even swarthy whites from abroad earned less than those with lighter skin.

Immigrants with the lightest complexions earned, on average, about 8 to 15 percent more than those with the darkest skin tone after controlling for race and country of origin as well as for other factors related to earnings, including occupation, education, language skills, work history, type of visa and whether they were married to a U.S. citizen.

In fact, Hersch estimated that the negative impact of skin tone on earnings was equal to the benefit of education, with a particularly dark complexion virtually wiping out the advantage of education on earnings.
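
In regression terms, "controlling for" those factors means estimating something like a log-wage equation in which skin tone enters alongside the other covariates. The sketch below shows that setup; the variable names and data file are hypothetical, not Hersch's actual specification.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Assume one row per immigrant, with a skin-darkness scale plus the
    # controls the article lists (hypothetical column names throughout).
    df = pd.read_csv("immigrant_wages.csv")

    model = smf.ols(
        "log_wage ~ skin_darkness + height + years_education"
        " + english_ability + C(race) + C(country_of_origin)"
        " + C(occupation) + C(visa_type) + married_citizen",
        data=df,
    ).fit()

    # A negative coefficient on skin_darkness is the per-unit earnings
    # penalty for darker skin, holding the other characteristics fixed.
    print(model.params["skin_darkness"])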

Hersch’s paper can be found here.

Economics and Evolution

In Scientific American, Stuart Kauffman on why economics should be inspired more by biology than by physics.

As economics attempts to model increasingly complicated phenomena, however, it would do well to shift its attention from physics to biology, because the biosphere and the living things in it represent the most complex systems known in nature. In particular, a deeper understanding of how species adapt and evolve may bring profound–even revolutionary–insights into business adaptability and the engines of economic growth.

One of the key ideas in modern evolutionary theory is that of preadaptation. The term may sound oxymoronic but its significance is perfectly logical: every feature of an organism, in addition to its obvious functional characteristics, has others that could become useful in totally novel ways under the right circumstances. The forerunners of air-breathing lungs, for example, were swim bladders with which fish maintained their equilibrium; as some fish began to move onto the margins of land, those bladders acquired a new utility as reservoirs of oxygen. Biologists say that those bladders were preadapted to become lungs. Evolution can innovate in ways that cannot be prestated and is nonalgorithmic by drafting and recombining existing entities for new purposes–shifting them from their existing function to some adjacent novel function–rather than inventing features from scratch.

Mexico’s Institutional Crisis

In the New Left Review, Al Giordano on the Mexican Presidential elections.

For Mexicans, the events of this summer inevitably recalled another stolen election, eighteen years ago. In July 1988, Cuauhtémoc Cárdenas—son of the populist president Lázaro Cárdenas (1934–40), who had instituted land reforms and nationalized oil—ran for the presidency against the PRI's Carlos Salinas de Gortari. Cárdenas and his left-reformist supporters within the party had broken from the PRI in 1987, having despaired of reforming the priísta machine from within. Together with former PRI chairman Porfirio Muñoz Ledo and a range of small left parties, he founded the National Democratic Front (FDN) early in 1988 to contest that year's election. When the returns came in on July 6th, Cárdenas was in the lead: the 55 per cent of tally sheets in the possession of FDN poll workers showed Cárdenas with 40 per cent to Salinas's 36; government tabulations showed similar results. But then came the moment that has defined public responses to the current electoral crisis: the PRI interior minister announced on national TV that the vote-counting computer had crashed. When the system was back up again later that night, suddenly Salinas was ahead.

Millions took to the streets to protest the fraud. The PRI regime flatly refused to make the remaining precinct tally sheets public, but when 30,000 ballots marked for Cárdenas were found dumped in rivers and forests in the southern state of Guerrero, popular anger erupted. During a demonstration in the Zócalo attended by upwards of three million people, some of Cárdenas's aides pressed him to seize the National Palace. But he recoiled from such a radical course, opting to negotiate with Salinas in private. In exchange for some concessions, including the formation in 1990 of the Federal Electoral Institute, Cárdenas dropped his challenge, prompting bitter divisions within the FDN that continue to haunt the party formed from its demoralized components in 1989, the PRD.

Should the Nobel Peace Prize Take a Break?

The Economist suggests that the Nobel Peace Prize might want to take a hiatus.

Withholding the prize for a year, or possibly five, might seem rather callous. But the institute would not be suggesting that the world has become sufficiently peaceful now. Some do argue that wars are generally in decline. Last year a think-tank in Canada released a “Human Security Report” which noted that 100-odd wars have expired since 1988. Their study found that wars and genocides have become less frequent since 1991, that the value of the international arms trade has slumped by a third (between 1990 and 2003), and that refugee numbers have roughly halved (between 1992 and 2003). Yet, despite all that, there are clearly enough problems today—Darfur, Sri Lanka, Somalia, Afghanistan, Iraq, international terrorism—to keep the hardest-working peace promoters busy.

The reason for the institute to withhold the prize, instead, would be to preserve its value. There is a risk that its worth is being eroded as the institute scrambles to find an eye-catching recipient every year. There is the problem of Buggins’s turn, an expectation (as with some other prizes) that the award should rotate between regions of the world. This year it is Asia, last year the recipient was from the Middle East, the year before from Africa.

A Discussion of Jihad, McWorld, and Modernity

In Salmagundi, excerpts from Benjamin Barber, Martha Nussbaum, Peter Singer, Breyten Breytenbach, Orlando Patterson, Guity Nashat, Akeel Bilgrami, James Miller, Vladimir Tismaneanu, and Carolyn Forché's discussion on "Jihad, McWorld, Modernity", a symposium about the "Clash of Civilizations".

Benjamin Barber:

Though we framed this debate to some degree in terms of the clash of civilizations, and that is certainly a provocative term which the events of 9/11 would seem to inspire, I take that phrase, the “clash of civilizations,” to be little more than an expression of parochial bigotry. It speaks in no way to the world we live in and is, frankly, hardly even worth discussing, although some people here may strongly disagree. It’s the kind of language that is redolent of a world of 18th century imperialism, a world of “us and them,” and it clarifies nothing. I would just remind those of you who are enamored of Sam Huntington’s phraseology that, in the book that gave us this expression, he argues not only that there is a clash of civilizations, but that the clash is aided and abetted by a fifth column in the United States made up of African Americans, who are undermining the West and its ideals. So if you’ve taken that book seriously, I suggest you read it again more carefully and revise your estimation.

The more serious charge, though, is that there is a special problem called Islam, and that Islam has created a world in which fundamentalists regard not just the West, but democracy, pluralism, freedom, and global markets as the enemy of an ancient, militant, intolerant doxology and that the West’s destruction is necessary to the survival of that doxology. That is an argument that’s been put in somewhat more civil and polite terms by a variety of thinkers, including Bernard Lewis, but there are others as well. Paul Berman, for one, has made a rather peculiar argument that Islamic fundamentalism is a new form of totalitarianism not entirely unlike the Soviet and fascist variants. I reject that charge in its entirety, and note that all religions stand in a tension with secular society and that every civilization the world has known has had the task of working out that tension, adjudicating the relevant differences and stresses. That is the essence of what a civilization is about—and though some cultures have been more successful than others in maintaining a healthy balance between religious and secular demands, there is an essential pattern we can see at work in contemporary Islamic societies.

Agatha, we all owe you

From Guardian:

The "disappearing act" by Agatha Christie over 11 days in 1926 has always been a subject of huge curiosity and mystery. Why did a famous and successful woman cut and run, leaving her car abandoned in a way that suggested self-injury, to fetch up in a genteel hotel in Harrogate – where she remained oblivious to newspaper headlines and a national hunt to find her while acting perfectly normally as a guest? There may well have been another ingredient in the mystery, namely envy. Agatha Christie was already famous, so it followed that what she did was simply for publicity. She must be seeking higher sales figures and pity.

I have to say that her driving off into the night seems to me the most natural thing in the world. She had recently lost a beloved mother, and all bereaved daughters know that this is worse than anything a blunt instrument can inflict. Then comes the stab wound, when her adored husband says he’s leaving her for someone else and never loved her anyway. Suddenly she’s on the edge of an abyss of loneliness and self-loathing; nothing she has done is worth a damn. It would be the action of a thoroughly ordered mind to shut down and hide, like a wounded animal seeking oblivion.

More here.

What’s behind those fall colors?

From MSNBC:

For years, scientists have studied how leaves prepare for the annual show of fall color. The molecules behind bright yellows and oranges are well understood, but brilliant reds remain a bit of a mystery. In response to chilly temperatures and fewer daylight hours, leaves stop producing their green-tinted chlorophyll, which allows them to capture sunlight and make energy. Because chlorophyll is sensitive to the cold, certain weather conditions like early frosts will turn off production more quickly.

Meanwhile, orange and yellow pigments called carotenoids—also found in orange carrots—shine through the leaves' washed-out green. "The yellow color has been there all summer, but you don't see it until the green fades away," said Paul Schaberg, a U.S. Forest Service plant physiologist. "In trees like aspens and beech, that's the dominant color change."

Scientists know less about the radiant red hues that pepper northern maple and ash forests in the fall. The red color comes from anthocyanins, which, unlike carotenoids, are only produced in the fall. They also give color to strawberries, red apples, and plums. On a tree, these red pigments act as a beneficial sunscreen, blocking harmful radiation and shading the leaf from excess light. They also serve as an antifreeze, protecting cells from freezing too readily, and they act as antioxidants.

More here.

Tuesday, October 17, 2006

The Trouble with Deepak Chopra, Part 2

From Respectful Insolence:

Alright, I'll come right out and admit it up front. There was no part one to this piece. Well, there was, but it wasn't on this blog, and I didn't write it. PZ did, in response to some really idiotic arguments from ignorance that Deepak Chopra displayed as part of an "argument" (and I use the term loosely) that some mystical quality other than genes explains life. He paraded a litany of arguments that so conclusively demonstrated that he had no clue about even the basics of molecular biology that I as a physician cringed and hid my head in shame when I read it, given that Dr. Chopra is, at least nominally, a medical doctor. PZ did a fine job of fisking Chopra's nonsense (with one minor quibble that I mentioned in the comments). Even the people leaving comments on Chopra's article were in general pretty hostile to his drivel and pointed out the large number of misstatements of our understanding of genetics, logical fallacies, and credulous arguments from ignorance that flew hither and yon from Chopra's keyboard. I thought that, having thoroughly embarrassed himself once, Chopra would slink away for a while before dropping another woo-bomb onto an unsuspecting blogosphere. I even thought that Chopra had a shred of self-respect that would prevent him from embarrassing himself again so soon.

I was mistaken.

He’s back, with The Trouble With Genes, Part II (also found here).

More here.