Terry Eagleton’s God Talk

Stanley Fish in the New York Times:

In the opening sentence of the last chapter of his new book, “Reason, Faith and Revolution,” the British critic Terry Eagleton asks, “Why are the most unlikely people, including myself, suddenly talking about God?” His answer, elaborated in prose that is alternately witty, scabrous and angry, is that the other candidates for guidance — science, reason, liberalism, capitalism — just don’t deliver what is ultimately needed. “What other symbolic form,” he queries, “has managed to forge such direct links between the most universal and absolute of truths and the everyday practices of countless millions of men and women?”

Eagleton acknowledges that the links forged are not always benign — many terrible things have been done in religion’s name — but at least religion is trying for something more than local satisfactions, for its “subject is nothing less than the nature and destiny of humanity itself, in relation to what it takes to be its transcendent source of life.” And it is only that great subject, and the aspirations it generates, that can lead, Eagleton insists, to “a radical transformation of what we say and do.”

More here.

Why Obama cited Churchill on torture

Christopher Hitchens in Slate:

He didn't get the attention he deserved for it, but President Obama was very cleverly fusing liberal principles with an appeal to the basic conservative values of “Old Europe” when, in his 100th-day press conference, he used Winston Churchill to justify his opposition to water-boarding and other “enhanced methods.” He told his audience that, even at a time when London was being “bombed to smithereens” and the British government held hundreds of Nazi agents in an internment center, there was a prime-ministerial view that torture was never permissible.

It would be reassuring to think that somebody close to Obama had handed him a copy of a little-known book called Camp 020: MI5 and the Nazi Spies. This was published by the British Public Record Office in 2000 and describes the workings of Latchmere House, an extraordinary British prison on Ham Common in the London suburb of Richmond, which housed as many as 400 of Hitler's operatives during World War II. Its commanding officer was a man named Col. Robin Stephens, and though he wore a monocle and presented every aspect of a frigid military martinet (and was known and feared by the nickname “Tin-Eye”), he was a dedicated advocate of the nonviolent approach to his long-term guests. To phrase it crisply—as he did—his view was and remained: “Violence is taboo, for not only does it produce answers to please, but it lowers the standard of information.”

More here.

Tuesday Poem

Counterman
Paul Violi

—What’ll it be?

Roast beef on rye, with tomato and mayo.

—Whaddaya want on it?

A swipe of mayo.
Pepper, but no salt.

—You got it. Roast beef on rye.
You want lettuce on that?

No, just tomato and mayo.

—Tomato and mayo. You got it.
. . . Salt and pepper?

No salt, just a little pepper.

—You got it. No salt.
You want tomato.

Yes. Tomato. No lettuce.

—No lettuce. You got it.
. . . No salt, right?

Right. No salt.

—You got it.
. . . Pickle?

No, no pickle. Just tomato and mayo.
And pepper.

—Pepper.

Yes, a little pepper.

—Right. A little pepper.
No pickle.

Right. No pickle.

—You got it.
Next!

Roast beef on whole wheat, please,
With lettuce, mayonnaise, and a center slice
Of beefsteak tomato.
The lettuce splayed, if you will,
In a Beaux Arts derivative of classical acanthus,
And the roast beef, thinly sliced, folded
In a multifoil arrangement
That eschews Bragdonian pretensions
Or any idea of divine geometric projection
For that matter, but simply provides
A setting for the tomato
To form a medallion with a dab
Of mayonnaise as a fleuron.
And—as eclectic as this may sound—
If the mayonnaise can also be applied
Along the crust in a Vitruvian scroll
And as a festoon below the medallion,
That would be swell.

—You mean like in the Cathedral St. Pierre in Geneva?

Yes, but the swag more like the one below the rosette
At the Royal Palace in Amsterdam.

—You got it.
Next!

A Match Made in Heaven

From The Washington Post:

Exhibit A: the match that took place July 20, 1937, on Wimbledon's Centre Court. The occasion was the Davis Cup Interzone Final between the United States and Germany. On one side of the net was Don Budge, a lanky redhead from Oakland, Calif., with a bludgeoning serve and a fabled backhand. On the other side, Baron Gottfried von Cramm, “the very embodiment of style, grace, and sportsmanship,” with a counterpunching game that was likened to chamber music. Cramm took the first two sets; Budge swept the next two; and as the combatants played on into the London twilight, the crowd of 14,000 realized that something extraordinary was happening. “The two white figures began to set the rhythms of something that looked more like ballet than a game where you hit a ball,” wrote radio journalist Alistair Cooke. “People stopped asking other people to sit down. The umpire gave up stopping the game to beg for silence during rallies.”

Each player hit twice as many winners as errors — an ungodly percentage — and the match was concluded by a spectacular running passing shot that the winning player, stumbling as he hit it, never saw land. Whereupon “a British crowd forgot its nature,” Cooke reported. “It stood on benches” and made the “deep kind of roar” that “does not belong on any tennis court.” The U.S. team captain later said, “No man, living or dead, could have beaten either man that day.” Indeed, the question of who ultimately prevailed — I won't spoil it by telling you here — is almost irrelevant.

More here.

Ear Plugs to Lasers: The Science of Concentration

John Tierney in The New York Times:

Imagine that you have ditched your laptop and turned off your smartphone. You are beyond the reach of YouTube, Facebook, e-mail, text messages. You are in a Twitter-free zone, sitting in a taxicab with a copy of “Rapt,” a guide by Winifred Gallagher to the science of paying attention. The book’s theme, which Ms. Gallagher chose after she learned she had an especially nasty form of cancer, is borrowed from the psychologist William James: “My experience is what I agree to attend to.” You can lead a miserable life by obsessing on problems. You can drive yourself crazy trying to multitask and answer every e-mail message instantly.

Or you can recognize your brain’s finite capacity for processing information, accentuate the positive and achieve the satisfactions of what Ms. Gallagher calls the focused life. It can sound wonderfully appealing, except that as you sit in the cab reading about the science of paying attention, you realize that … you’re not paying attention to a word on the page.

More here.

No Exhibit for Old Men

Celebrating the new worlds of 50 artists under 33.

Our own Morgan Meis in The Smart Set:

Every once in a while you get an epiphany. Something you've been meaning to say for a long time jumps, crystal clear, to the front of your brain. You've always known it, but you've never been able to say it.

This happened to me while reading an essay by Sasha Frere-Jones about Lady Gaga. Frere-Jones opens the piece with the following thought:

Dedicated fans of popular music have a certain conversation at least once a year. Call it The Question of Endurance. You and your friends are talking about music, and the conversation turns to a popular band. You express support. A friend voices her opinion, maybe as favorable as yours, but appends a qualifier: “I like them, but will they be around in 10 years?” You may feel compelled to defend whomever it is you’re talking about, covering the present moment and the future with your positive take. After trying this approach, though, you realize that pop music has no Constitution and doesn’t operate like a de facto Supreme Court: Precedent is not always established, and isn’t even necessary. Pop rarely accretes in a tidy, serial manner — it zigs, zags, eats itself, and falls over its shoelaces.

It's a smart point, and it applies, as far as I'm concerned, to pretty much everything in the realm of what we like to call “culture.” I would take it even a step further in regard to contemporary art. I don't care whether or not any specific work of art will be around in 10 years, or a hundred, or a thousand. I'm utterly uninterested in trying to judge whether this or that work will “stand the test of time.” I don't think there is a “test of time.” Time doesn't “test” things. Longevity and quality have no intrinsic connection. Time does not slowly sift out the truth from the lies — it just moves along, usually in directions we could never have fathomed. Civilization isn't stable and progressive and never has been. For the critic, 10 years from now ought not exist, 100 years from now ought doubly not.

More here.

Scientists make molecules that evolve, compete, mimic behavior of Darwin’s finches

From PhysOrg.com:

Two years ago, Voytek managed to develop a second, unrelated enzymatic RNA molecule that also can continuously evolve. This allowed her to set the two RNAs in evolutionary motion within the same pot, forcing them to compete for common resources, just like two species of finches on an island in the Galapagos.

In the new study, the key resource or “food” was a supply of molecules necessary for each RNA's replication. The RNAs will replicate only if they have catalyzed the attachment of themselves to these food molecules. So long as the RNAs have ample food, they will replicate, and as they replicate, they will mutate. Over time, as these mutations accumulate, new forms emerge — some fitter than others.

When Voytek and Joyce pitted the two RNA molecules in a head-to-head competition for a single food source, they found that the molecules that were better adapted to use a particular food won out. The less fit RNA disappeared over time. Then they placed the two RNA molecules together in a pot with five different food sources, none of which they had encountered previously. At the beginning of the experiment each RNA could utilize all five types of food — but none of these were utilized particularly well. After hundreds of generations of evolution, however, the two molecules each became independently adapted to use a different one of the five food sources. Their preferences were mutually exclusive — each highly preferred its own food source and shunned the other molecule's food source.

In the process, the two molecules arrived at different evolutionary strategies for achieving their ends.
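The dynamic described here (replication gated on a food supply, random mutation, and competition for shared resources) is easy to caricature in code. Below is a minimal, purely illustrative Python sketch of this kind of frequency-dependent selection; every name and parameter in it is invented, and it models none of the actual RNA chemistry. Run for a few hundred generations, the two lineages usually drift toward different foods, loosely echoing the mutually exclusive preferences the study reports.

```python
import random

FOODS = 5          # five food sources, as in the experiment
POP = 200          # individuals per lineage, held fixed each generation
GENERATIONS = 500  # "hundreds of generations of evolution"
MUT_RATE = 0.2     # chance an offspring's affinities drift (invented)
MUT_SD = 0.1       # size of that drift (invented)

def founder():
    # Start like the real RNAs: able to use all five foods, none of them
    # particularly well. Normalizing imposes a trade-off, so improving on
    # one food costs ability on the others.
    v = [random.uniform(0.8, 1.2) for _ in range(FOODS)]
    total = sum(v)
    return [x / total for x in v]

def mutate(prefs):
    # Occasionally nudge the affinity for one random food, then
    # renormalize to preserve the trade-off.
    child = prefs[:]
    if random.random() < MUT_RATE:
        i = random.randrange(FOODS)
        child[i] = max(0.0, child[i] + random.gauss(0.0, MUT_SD))
        total = sum(child) or 1.0
        child = [x / total for x in child]
    return child

lineages = {"A": [founder() for _ in range(POP)],
            "B": [founder() for _ in range(POP)]}

for _ in range(GENERATIONS):
    # How heavily each food is contested, summed over BOTH lineages:
    # a food that many replicators already use pays less to everyone.
    demand = [sum(ind[i] for pop in lineages.values() for ind in pop)
              for i in range(FOODS)]
    next_gen = {}
    for name, pop in lineages.items():
        # Fitness discounts each affinity by congestion, so replicators
        # profit most from foods their competitors ignore.
        weights = [1e-9 + sum(ind[i] / (1.0 + demand[i])
                              for i in range(FOODS))
                   for ind in pop]
        parents = random.choices(pop, weights=weights, k=POP)
        next_gen[name] = [mutate(p) for p in parents]
    lineages = next_gen

# Typically each lineage ends up concentrated on a different food.
for name, pop in lineages.items():
    mean = [round(sum(ind[i] for ind in pop) / POP, 2) for i in range(FOODS)]
    print(f"lineage {name}: mean affinity per food = {mean}")
```

The one deliberate modeling choice is the shared demand term: because a crowded food source pays less, selection keeps pushing each lineage toward whatever its rival neglects, which is the same character-displacement logic the article draws from Darwin's finches.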

More here.

Juan Cole’s summary of the situation in Pakistan

Juan Cole in his blog, Informed Comment:

As Pakistani President Asif Ali Zardari arrived in Washington for talks with President Obama and Afghan President Hamid Karzai, fighting intensified in Pakistan's northwest.

On Tuesday morning, the Pakistani Taliban deployed a suicide bomber to attack Pakistani security forces near Peshawar, killing 5 and wounding 9 people, among them schoolchildren who were standing nearby.

WaPo says that fighting had intensified Monday in the Swat Valley between the Pakistani Taliban and government troops, as well as in Buner, the district into which they recently made an incursion and from which the government has been attempting to dislodge them. So far some 80 militants have been killed in the Buner campaign, and 20,000 civilians have been displaced.

Tony Karon at Time explains that the Pakistani military establishment disagrees with Washington that the Taliban are an existential threat to the Pakistani state, and why.

Convinced that Pakistan's problems are in part rooted in economic issues, Sens. John Kerry and Dick Lugar introduced legislation Monday aimed at tripling US foreign aid to Islamabad.

Meanwhile, on the diplomatic front, Secretary of Defense Robert Gates is calling on Saudi Arabia to help Pakistan crush the Pakistani Taliban. The Saudis have developed a fear of the vigilante radicals they once supported back in the 1980s, and spent 2003-2006 suppressing them at home, so by now Gates's idea may make some sense.

More here. [Video: Obama says Pakistan is toughest U.S. challenge.]

A Natural History of the Flu

Carl Zimmer in the New York Times:

The current outbreak shows how complex and mysterious the evolution of viruses is. That complexity and mystery are all the more remarkable because a virus is life reduced to its essentials. A human influenza virus, for example, is a protein shell measuring about five-millionths of an inch across, with 10 genes inside. (We have about 20,000.)

Some viruses use DNA, as we do, to encode their genes. Others, like the influenza virus, use single-stranded RNA. But viruses all have one thing in common, said Roland Wolkowicz, a molecular virologist at San Diego State University: they all reproduce by disintegrating and then re-forming.

A human flu virus, for example, latches onto a cell in the lining of the nose or throat. It manipulates a receptor on the cell so that the cell engulfs it, whereupon the virus’s genes are released from its protein shell. The host cell begins making genes and proteins that spontaneously assemble into new viruses. “No other entity out there is able to do that,” Dr. Wolkowicz said. “To me, this is what defines a virus.”

The sheer number of viruses on Earth is beyond our ability to imagine. “In a small drop of water there are a billion viruses,” Dr. Wolkowicz said. Virologists have estimated that there are a million trillion trillion viruses in the world’s oceans.

More here.

Sunday, May 3, 2009

The 2012 Apocalypse — And How to Stop It

A perhaps alarmist piece by Brandon Keim in Wired:

For scary speculation about the end of civilization in 2012, people usually turn to followers of cryptic Mayan prophecy, not scientists. But that’s exactly what a group of NASA-assembled researchers described in a chilling report issued earlier this year on the destructive potential of solar storms.

Entitled “Severe Space Weather Events — Understanding Societal and Economic Impacts,” it describes the consequences of solar flares unleashing waves of energy that could disrupt Earth’s magnetic field, overwhelming high-voltage transformers with vast electrical currents and short-circuiting energy grids. Such a catastrophe would cost the United States “$1 trillion to $2 trillion in the first year,” concluded the panel, and “full recovery could take 4 to 10 years.” That would, of course, be just a fraction of global damages.

Good-bye, civilization.

Worse yet, the next period of intense solar activity is expected in 2012, and coincides with the presence of an unusually large hole in Earth’s geomagnetic shield. But the report received relatively little attention, perhaps because of 2012’s supernatural connotations. Mayan astronomers supposedly predicted that 2012 would mark the calamitous “birth of a new era.”

Whether the Mayans were on to something, or this is all just a chilling coincidence, won't be known for several years. But as Lawrence Joseph, author of “Apocalypse 2012: A Scientific Investigation into Civilization's End,” put it: “I've been following this topic for almost five years, and it wasn't until the report came out that this really began to freak me out.”

Wired.com talked to Joseph and John Kappenman, CEO of electromagnetic damage consulting company MetaTech, about the possibility of geomagnetic apocalypse — and how to stop it.

Visible Young Man

In the NYT, a review of Colson Whitehead's Sag Harbor:

Now that we’ve got a post-black president, all the rest of the post-blacks can be unapologetic as we reshape the iconography of blackness. For so long, the definition of blackness was dominated by the ’60s street-fighting militancy of the Jesses and the irreverent one-foot-out-the-ghetto angry brilliance of the Pryors and the nihilistic, unrepentantly ghetto, new-age thuggishness of the 50 Cents. A decade ago they called post-blacks Oreos because we didn’t think blackness equaled ghetto, didn’t mind having white influencers, didn’t seem full of anger about the past. We were comfortable employing blackness as a grace note rather than as our primary sound. Post-blackness sees blackness not as a dogmatic code worshiping at the altar of the hood and the struggle but as an open-source document, a trope with infinite uses.

The term began in the art world with a class of black artists who were adamant about not being labeled black artists even as their work redefined notions of blackness. Now the meme is slowly expanding into the wider consciousness. For so long we were stamped inauthentic and bullied into an inferiority complex by the harder brothers and sisters, but now it’s our turn to take center stage. Now Kanye, Questlove, Santigold, Zadie Smith and Colson Whitehead can do blackness their way without fear of being branded pseudo or incognegro.

So it’s a perfect moment for Whitehead’s memoiristic fourth novel, “Sag Harbor,” a coming-of-age story about the Colsonesque 15-year-old Benji, who wishes people would just call him Ben. He’s a Smiths-loving, Brooks Brothers-wearing son of moneyed blacks who summer on Long Island and recognize the characters on “The Cosby Show” as kindred spirits.

Sunday Poem

Found
Ron Koertge

My wife waits for a caterpillar
to crawl onto her palm so she
can carry it out of the street
and into the green subdivision
of a tree.

Yesterday she coaxed a spider
into a juicier corner. The day
before she hazed a snail
in a half-circle so he wouldn’t
have to crawl all the way
around the world and be 2,000
years late for dinner.

I want her to hurry up and pay
attention to me or go where I
want to go until I remember
the night she found me wet
and limping, felt for a collar
and tags, then put me in
the truck where it was warm.

Without her I wouldn’t
be standing here in these
snazzy alligator shoes.

A Queen for the Ages

From The Washington Post:

More than two millennia after it took place, the story of Cleopatra has lost none of its grip on the world's imagination. It has inspired great plays (Shakespeare, Shaw and Sardou), novels, poems, movies (Elizabeth Taylor!), works of art, musical compositions both serious (Handel and Samuel Barber) and silly (“Comin' Atcha,” by Cleopatra), and of course histories and biographies. Yet for all this rich documentation and interpretation, it remains at least as much legend and mystery as historical record, which has allowed everyone who tells it to play his or her own variations on the many themes it embraces.

The latest to take it on is Diana Preston, a British writer of popular history. On the evidence of “Cleopatra and Antony,” I'd say she's a thoroughgoing pro. Her research is careful and deep; her prose is lively and graceful; her sympathy for her central character is strong but wholly without sentimentality; her depiction of the worlds in which Cleopatra lived is detailed, textured and evocative. If there is a better book about Cleopatra for today's reader, I don't know what it is.

She calls her book “Cleopatra and Antony,” thus reversing the order as immortalized by Shakespeare. History and legend have usually given priority to the two great men in the Egyptian queen's life, Julius Caesar and Mark Antony, but Preston argues that “Cleopatra perhaps deserves first place” because “her tenacity, vision and ambition would have been remarkable in any age but in a female ruler in the ancient world they were unique.” She was “a charismatic, cultured, intelligent ruler,” yet thanks to the propaganda put about by Octavian — later the Emperor Augustus but in the fourth decade B.C. Mark Antony's rival for control of the Roman Empire — she “was transformed into a pleasure-loving houri, the very epitome of fatal beauty and monstrous depravity, bent on bringing animal gods, barbarian decadence and despotism to the sacred halls of Rome's Capitol.”

More here.

Why can’t we concentrate?

Laura Miller in Salon:

Here's a fail-safe topic when making conversation with everyone from cab drivers to grad students to cousins in the construction trade: Mention the fact that you're finding it harder and harder to concentrate lately. The complaint appears to be universal, yet everyone blames it on some personal factor: having a baby, starting a new job, turning 50, having to use a BlackBerry for work, getting on Facebook, and so on. Even more pervasive than Betty Friedan's famous “problem that has no name,” this creeping distractibility and the technology that presumably causes it have inspired such cris de coeur as Nicholas Carr's much-discussed “Is Google Making Us Stupid?” essay for the Atlantic Monthly and diatribes like “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future,” a book published last year by Mark Bauerlein.

You don't have to agree that “we” are getting stupider, or that today's youth are going to hell in a handbasket (by gum!), to mourn the withering away of the ability to think about one thing for a prolonged period of time. Carr (whose argument was grievously mislabeled by the Atlantic's headline writers as a salvo against the ubiquitous search engine) reported feeling the change “most strongly” while he was reading. “Immersing myself in a book or a lengthy article used to be easy,” he wrote. “Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text.” For my own part, I now find it challenging to sit still on my sofa through the length of a feature film. The urge to, for example, jump up and check the IMDb filmography of a supporting actor is well-nigh irresistible, and once I'm at the computer, why not check e-mail? Most of the time, I'll wind up pausing the DVD player before the end of the movie and telling myself I'll watch the rest tomorrow.

More here.

The Big Similarities & Quirky Differences Between Our Left and Right Brains

Carl Zimmer in Discover Magazine:

There is nothing more humbling or more perception-changing than holding a human brain in your hands. I discovered this recently at a brain-cutting lesson given by Jean-Paul Vonsattel, a neuropathologist at Columbia University. These lessons take place every month in a cold, windowless room deep within the university’s College of Physicians and Surgeons. On the day I visited, there were half a dozen brains sitting on a table. Vonsattel began by passing them around so the medical students could take a closer look. When a brain came my way, I cradled it and found myself puzzling over its mirror symmetry. It was as if someone had glued two smaller brains together to make a bigger one.

Vonsattel then showed us just how weak that glue is. He took back one of the brains and used a knife to divide the hemispheres. He sliced quickly through the corpus callosum, the flat bundle of nerve fibers that connects the halves. The hemispheres flopped away from each other, two identical slabs of fleshy neurons.

Sometimes surgeons must make an even more extreme kind of slice in the brain of a patient. A child may suffer from epilepsy so severe that the only relief doctors can offer is to open up the skull and cut out the entire hemisphere in which the seizures start. After the surgery, the space soon fills with cerebrospinal fluid. It may take a child a year of physical therapy to recover from losing a hemisphere—but the fact that patients recover at all is stunning when you consider that they have only half a brain. It makes you wonder what good two hemispheres are in the first place.

More here.

After the Great Recession

President Obama discusses how his policies on schools, energy and health care might change daily life in America.

David Leonhardt in the New York Times Magazine:

Are there tangible ways that Wall Street has made the average person’s life better in the way that Silicon Valley has?

THE PRESIDENT: Well, I think that some of the democratization of finance is actually beneficial if properly regulated. So the fact that large numbers of people could participate in the equity markets in ways that they could not previously — and for much lower costs than they used to be able to participate — I think is important.

Now, the fact that we had such poor regulation in some of these markets — particularly around the securitized mortgages — means that the pain has been democratized as well. And that’s a problem. But I think that overall there are ways in which people have been able to participate in our stock markets and our financial markets that are potentially healthy. Again, what you have to have, though, is an updating of the regulatory regimes comparable to what we did in the 1930s, when there were rules that were put in place that gave investors a little more assurance that they knew what they were buying.

More here.

Genius: The Modern View

David Brooks in the New York Times:

Some people live in romantic ages. They tend to believe that genius is the product of a divine spark. They believe that there have been, throughout the ages, certain paragons of greatness — Dante, Mozart, Einstein — whose talents far exceeded normal comprehension, who had an other-worldly access to transcendent truth, and who are best approached with reverential awe.

We, of course, live in a scientific age, and modern research pierces hocus-pocus. In the view that is now dominant, even Mozart’s early abilities were not the product of some innate spiritual gift. His early compositions were nothing special. They were pastiches of other people’s work. Mozart was a good musician at an early age, but he would not stand out among today’s top child performers.

What Mozart had, we now believe, was the same thing Tiger Woods had — the ability to focus for long periods of time and a father intent on improving his skills. Mozart played a lot of piano at a very young age, so he got his 10,000 hours of practice in early and then he built from there.

More here.