How to Dispel Your Illusions

Freeman Dyson reviews Thinking, Fast and Slow by Daniel Kahneman, in the New York Review of Books:

In 1955, when Daniel Kahneman was twenty-one years old, he was a lieutenant in the Israel Defense Forces. He was given the job of setting up a new interview system for the entire army. The purpose was to evaluate each freshly drafted recruit and put him or her into the appropriate slot in the war machine. The interviewers were supposed to predict who would do well in the infantry or the artillery or the tank corps or the various other branches of the army. The old interview system, before Kahneman arrived, was informal. The interviewers chatted with the recruit for fifteen minutes and then came to a decision based on the conversation. The system had failed miserably. When the actual performance of the recruit a few months later was compared with the performance predicted by the interviewers, the correlation between actual and predicted performance was zero.

Kahneman had a bachelor’s degree in psychology and had read a book, Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence by Paul Meehl, published only a year earlier. Meehl was an American psychologist who studied the successes and failures of predictions in many different settings. He found overwhelming evidence for a disturbing conclusion. Predictions based on simple statistical scoring were generally more accurate than predictions based on expert judgment.
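
What Meehl meant by "simple statistical scoring" can be shown with a toy model (the traits and weights below are invented for illustration, not Meehl's or Kahneman's actual data): each candidate is rated on a handful of traits, and the ratings are combined by a fixed rule, with no room for expert override.

```python
# Toy illustration of statistical scoring: rate each recruit on a few
# traits (1-5), then combine the ratings with a fixed linear rule.
# Trait names and weights are hypothetical.

WEIGHTS = {"punctuality": 1.0, "sociability": 1.0, "responsibility": 2.0}

def statistical_score(ratings):
    """Combine trait ratings with fixed weights -- no holistic judgment."""
    return sum(WEIGHTS[trait] * value for trait, value in ratings.items())

recruit = {"punctuality": 4, "sociability": 2, "responsibility": 5}
print(statistical_score(recruit))  # 4*1 + 2*1 + 5*2 = 16.0
```

The point of such a rule is not sophistication but consistency: every recruit is scored the same way, which is exactly what an informal fifteen-minute chat cannot guarantee.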

A famous example confirming Meehl’s conclusion is the “Apgar score,” invented by the anesthesiologist Virginia Apgar in 1953 to guide the treatment of newborn babies. The Apgar score is a simple formula based on five vital signs that can be measured quickly: heart rate, breathing, reflexes, muscle tone, and color. It does better than the average doctor in deciding whether the baby needs immediate help. It is now used everywhere and saves the lives of thousands of babies.
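
The Apgar formula is simple enough to write down directly: each of the five signs is rated 0, 1, or 2, and the ratings are summed to a total out of 10. A minimal sketch (the clinical criteria behind each 0/1/2 rating are omitted):

```python
APGAR_SIGNS = ("heart_rate", "breathing", "reflexes", "muscle_tone", "color")

def apgar_score(ratings):
    """Sum the five sign ratings (each 0, 1, or 2) into a 0-10 total."""
    if set(ratings) != set(APGAR_SIGNS):
        raise ValueError("need a rating for each of the five signs")
    if any(r not in (0, 1, 2) for r in ratings.values()):
        raise ValueError("each sign is rated 0, 1, or 2")
    return sum(ratings.values())

# A vigorous newborn scores at or near 10; a low total flags the need
# for immediate help.
print(apgar_score({"heart_rate": 2, "breathing": 2, "reflexes": 2,
                   "muscle_tone": 1, "color": 1}))  # 8
```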

More here.

Jonathan Lethem on Reading, Writing, and Concepts of Originality

Brian Gresko in Agni:

Jonathan Lethem is the author of eight novels, including Motherless Brooklyn (1999), which won the National Book Critics Circle Award, and The Fortress of Solitude (2003), a New York Times Bestseller. His native Brooklyn serves as the setting for both of those acclaimed novels. Recent work has found him exploring Los Angeles in You Don’t Love Me Yet (2007) and New York City’s Upper East Side in Chronic City (2009). That book, along with much of Lethem’s oeuvre, is influenced by the work of Philip K. Dick, whose novels Lethem edited for The Library of America. In addition to fiction, Lethem’s essays on music, literature, and culture have appeared in publications such as Harper’s and Rolling Stone. Last year he published a critical look at John Carpenter’s science fiction film They Live; a study on the Talking Heads album Fear of Music is forthcoming. Lethem teaches creative writing at Pomona College, in California, but I had the opportunity to speak with him at his studio in Brooklyn.

Brian Gresko: Do you have models in mind when you begin a project? By models I mean works that influence your writing.

Jonathan Lethem: I’ve always been a consciously influenced writer. I usually have some models in mind for anything I’m writing, whether it’s other novels, or some films, or sometimes even a comic book. In terms of prose style, I am almost always open to writing some degree of homage, or trying to adopt or import a part of another writer’s style into what I’m doing. Usually it’s more than one author, and/or it’s in combination with some radically different influence on the narrative strategy, or on the kind of motifs, characters, or situations that I’m writing about. I never think that this is going to simply seem like writer X, because I’m always colliding that influence with a number of other elements.

More here.

Amphibians are all but gone, bequeathing us lessons that must not be squandered

Joseph R. Mendelson III in American Scientist:

I learned great and terrible lessons in my first year of graduate work in 1989—more than I expected and more than I realized at the time. My advisor at the University of Texas at Arlington, Jonathan Campbell, arranged for me to spend a field season in Guatemala surveying the amphibians and reptiles in the environs of a coffee plantation near Pueblo Viejo, in the Department of Alta Verapaz. I was going it alone with no Spanish language skills in the midst of a brutal civil war. But there had never been any herpetological work done in this particular part of the country, so I balanced my apprehension with a vision of myself following in the footsteps of the intrepid tropical natural historians and explorers whose monographs I was intensely studying.

During that summer I assembled a good collection of reptiles from Pueblo Viejo, but only a poor collection of amphibians. We all assumed at the time that the paltry representation of amphibian specimens was due to my incomplete skills; where I come from in southern California, reptiles are common and amphibians are scarce. Twenty years later, it is clear to all herpetologists that I had arrived on the scene of a massacre. That field trip launched my research career as an amphibian taxonomist and, indeed, I discovered my first species of frog new to science at Pueblo Viejo. Since then I have had the pleasure of discovering and naming dozens of new species, but some of them I “discovered” on museum shelves and not in the field. My finds had become extinct before they were even named. I chose to become a herpetologist, not a paleontologist, because I enjoy working afield with live animals. Recent reflection has forced me to reconsider my academic title. I am a forensic taxonomist.

More here.

Occupy Wall Street: The Oakland Commune

Aaron Bady in Possible Futures:

As a site of resistance, “Wall Street” is a metonym for a system, a transnational apparatus of capital and political oligarchy. We don’t have to get too specific, because we all know what we mean when we say “Wall Street” (even if we don’t agree on what that thing actually is). And so while that particular part of Lower Manhattan might be a focal point of a gigantic process of accumulation and dispossession, “Wall Street” is still just a concrete symbol for that larger and much less tangible process. The fact that so much financial work is actually done elsewhere is not that important; to “Occupy Wall Street” is to attack and de-legitimize the thing it symbolizes, the ordering structure that builds and rebuilds the world around us, that the rest of us have no choice but to inhabit and endure.

This is why it has meant something very different, from the beginning, to “Occupy Oakland.” In a just world—in the world the occupiers are trying to usher into existence—there might be no such thing as “Wall Street” at all, and certainly not in its current form. But Oakland is not a center of finance and power or a locus of political privilege. There is a “here” here. No one really lives in Wall Street, but those who “Occupy Oakland” do so because they already did. As a result, when we “Occupy Oakland,” we are engaged much less in a symbolic protest against “the banks” or “the 1%”—political actions which are given their shape by the political terrain of protesting abstractions—and much more in a very concrete struggle for a right to the city.

After all, the police who dispersed occupiers with tear gas were only doing the sort of thing they had long been accustomed to doing to the poor, transient, and/or communities of color that make up a great majority of Oakland’s humanity.

More here.

The First Great Paradox of Contemporary Cultural History

Kurt Andersen in Vanity Fair:

The past is a foreign country. Only 20 years ago the World Wide Web was an obscure academic thingamajig. All personal computers were fancy stand-alone typewriters and calculators that showed only text (but no newspapers or magazines), played no video or music, offered no products to buy. E-mail (a new coinage) and cell phones were still novelties. Personal music players required cassettes or CDs. Nobody had seen a computer-animated feature film or computer-generated scenes with live actors, and DVDs didn’t exist. The human genome hadn’t been decoded, genetically modified food didn’t exist, and functional M.R.I. was a brand-new experimental research technique. Al-Qaeda and Osama bin Laden had never been mentioned in The New York Times. China’s economy was less than one-eighth of its current size. CNN was the only general-interest cable news channel. Moderate Republicans occupied the White House and ran the Senate’s G.O.P. caucus.

Since 1992, as the technological miracles and wonders have propagated and the political economy has transformed, the world has become radically and profoundly new. (And then there’s the miraculous drop in violent crime in the United States, by half.) Here is what’s odd: during these same 20 years, the appearance of the world (computers, TVs, telephones, and music players aside) has changed hardly at all, less than it did during any 20-year period for at least a century. The past is a foreign country, but the recent past—the 00s, the 90s, even a lot of the 80s—looks almost identical to the present. This is the First Great Paradox of Contemporary Cultural History.

More here.

‘Vocal Fry’ Creeping Into U.S. Speech

Marissa Fessenden in Science:

A curious vocal pattern has crept into the speech of young adult women who speak American English: low, creaky vibrations, also called vocal fry. Pop singers, such as Britney Spears, slip vocal fry into their music as a way to reach low notes and add style. Now, a new study of young women in New York state shows that the same guttural vibration—once considered a speech disorder—has become a language fad.

Vocal fry, or glottalization, is a low, staccato vibration during speech, produced by a slow fluttering of the vocal cords (listen here). Since the 1960s, vocal fry has been recognized as the lowest of the three vocal registers, which also include falsetto and modal—the usual speaking register. Speakers creak differently according to their gender, although whether it is more common in males or females varies among languages. In American English, anecdotal reports suggest that the behavior is much more common in women. (In British English, the pattern is the opposite.) Historically, continual use of vocal fry was classified as part of a voice disorder that was believed to lead to vocal cord damage. However, in recent years, researchers have noted occasional use of the creak in speakers with normal voice quality.

In the new study, scientists at Long Island University (LIU) in Brookville, New York, investigated the prevalence of vocal fry in college-age women. The team recorded sentences read by 34 female speakers. Two speech-language pathologists trained to identify voice disorders evaluated the speech samples. They marked the presence or absence of vocal fry by listening to each speaker's pitch and two qualities called jitter and shimmer—variation in pitch and volume, respectively.
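
Jitter and shimmer have simple operational definitions: roughly, the average cycle-to-cycle variation in period and in amplitude, each expressed relative to the mean. A sketch of one common ("local") variant, assuming per-cycle periods and peak amplitudes have already been extracted from a recording (the numbers below are illustrative, not from the study):

```python
def local_variation(values):
    """Mean absolute difference between consecutive cycles, relative to
    the mean value -- a common 'local' jitter/shimmer measure."""
    diffs = [abs(a - b) for a, b in zip(values, values[1:])]
    return (sum(diffs) / len(diffs)) / (sum(values) / len(values))

periods = [0.010, 0.011, 0.010, 0.012, 0.011]   # seconds per glottal cycle
amplitudes = [0.80, 0.78, 0.82, 0.79, 0.81]     # peak amplitude per cycle

print(f"jitter:  {local_variation(periods):.1%}")    # period variation
print(f"shimmer: {local_variation(amplitudes):.1%}") # amplitude variation
```

The irregular, slow pulsing of vocal fry shows up as elevated jitter relative to the steady periodicity of modal voice, which is why these two measures are useful for flagging it.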

More here.

3QD Politics & Social Science Prize Semifinalists 2011

Hello,

The voting round of our politics & social science prize (details here) is over. A total of 1,281 votes were cast for the 56 nominees (click here for full list of nominees). Thanks to the nominators and the voters for participating.

Carla Goller has designed a “trophy” logo that our top twenty vote-getters may choose to display on their own blogs. So here they are, in descending order from the most voted-for:

  1. Andy Worthington: Mocking the Law, Judges Rule that Evidence Is Not Necessary to Hold Insignificant Guantánamo Prisoners for the Rest of Their Lives
  2. Corey Robin: Revolutionaries of the Right: The Deep Roots of Conservative Radicalism
  3. 3 Quarks Daily: The Pao of Love
  4. Muhammad Cohen: Overheard at Ali’s Diner on Arab Street
  5. Peter Frase: Anti-Star Trek: A Theory of Posterity
  6. Zunguzungu: “The Grass Is Closed”: What I Have Learned About Power from the Police, Chancellor Birgeneau, and Occupy Cal
  7. The Awl: The Livestream Ended: How I Got Off My Computer And Onto The Street At Occupy Oakland
  8. Pandaemonium: Rethinking the Idea of “Christian Europe”
  9. Jadaliyya: Palestine in Scare Quotes: From the NYT Grammar Book
  10. The Primate Diaries: Freedom to Riot: On the Evolution of Collective Violence
  11. Crikey: Theorising Darwin: US may stockpile and transit cluster munitions
  12. Accidental Blogger: The mideast uprisings: a lesson for strong men, mad men and counterfactual historians
  13. 3 Quarks Daily: There’s Something about the Teeth of Tyrants
  14. Hopeless but not serious: Pokémon gets political
  15. Jadaliyya: The Marriage of Sexism and Islamophobia; Re-Making the News on Egypt
  16. U.S. Intellectual History: “When the Zulus Produce a Tolstoy We Will Read Him”: Charles Taylor and the Politics of Recognition
  17. Tang Dynasty Times: The Persian Prince Pirooz
  18. PH2.1: Polarization?
  19. 3 Quarks Daily: Pakistan: The Narratives Come Home to Roost
  20. David B. Sparks: Isarithmic History of the Two-Party Vote

The editors of 3 Quarks Daily will now pick the top six entries from these, and after possibly adding up to three “wildcard” entries, will send that list of finalists to Stephen M. Walt for final judging. We will post the shortlist of finalists here in the next day or two.

Good luck!

Abbas

The Fatwa Against Women Touching Bananas

Asra Nomani in The Daily Beast:

This past week, an email pinged around the world, claiming that a Muslim cleric “residing in Europe” issued a, well, interesting fatwa, or religious ruling, banning Muslim women from touching bananas or cucumbers: “He said that these fruits and vegetables ‘resemble the male penis’ and hence could arouse women or ‘make them think of sex,'” according to a report on a supposed Egyptian website, BikyaMasr. The Times of India ran the story: “Islamic cleric bans women from touching bananas.” “If women wish to eat these food items, a third party, preferably a male related to them such as their father or husband, should cut the items into small pieces and serve,” the cleric supposedly dictated.

…True or not, the possibility of such a fatwa underscores the long Ridiculist of fatwas, to borrow CNN host Anderson Cooper's nightly feature of news stories of the absurd. “That cleric is an idiot,” one Muslim wrote. “But what am I going to do now? I eat lots of bananas because I am vegetarian,” wrote Farzana Hassan, a progressive Canadian-Muslim leader. In our Muslim community, we've had enough comic fatwas to create our own Fatwa Ridiculist. Some of my nominees:

1. A man can work with a woman to whom he's not a brother, father, uncle, or son, if he drinks her breast milk first.
2. A husband can divorce his wife with a text message, declaring: “I divorce you. I divorce you. I divorce you.”
3. Muslim girls can't be tomboys.
4. Mickey Mouse is a corrupting influence and must die.
5. Emoticons are illegal.
6. You can't wear a Manchester United soccer jersey.
7. A husband and wife can't have sex naked.
8. Pokemon is as bad as Mickey Mouse.
9. Ditch the downward dog. Yoga is forbidden.
10. Girls above the age of 13 can't ride bikes. (See fatwa No. 3.)

To all of this I have only one thing to say: Please pass the banana split.

More here.

If I ruled the world

Ann Widdecombe in Prospect:

All magazines with a young readership would have to use adverbs, adjectives and subordinate clauses in their stories so that the young would be exposed to the beauty rather than just the functionality of language. Latin would be compulsory from 11 till 16 and classical Greek once more widely available. All television companies would be obliged to put on at intervals of no less than once a month an excellent play which had no serious swearing, explicit sex, drunkenness or estuary English. Children’s television presenters would be banned from ending every simple sentence with an interrogative note.

…All those who prefer old-fashioned manners would wear a small badge which would indicate to men that they could safely open doors, give up seats on public transport and offer help with carrying bags without the risk of a feminist tirade. These days I get that sort of treatment but why should I have had to wait until I became a little old lady? Girls who were scantily clothed, drunk and incapable in city centres would be escorted home and curfewed for a month. That would save a lot of casual sex, false rape allegations and wasted time in A&E departments. Men and women who turned up in A&E drunk or drugged would be obliged to pay for their treatment.

More here.

Saturday, December 10, 2011

Noam Chomsky: Occupy The Future

Noam Chomsky in In These Times:

Delivering a Howard Zinn lecture is a bittersweet experience for me. I regret that he’s not here to take part in and invigorate a movement that would have been the dream of his life. Indeed, he laid a lot of the groundwork for it.

If the bonds and associations being established in these remarkable events can be sustained through a long, hard period ahead – victories don’t come quickly – the Occupy protests could mark a significant moment in American history.

I’ve never seen anything quite like the Occupy movement in scale and character, here and worldwide. The Occupy outposts are trying to create cooperative communities that just might be the basis for the kinds of lasting organizations necessary to overcome the barriers ahead and the backlash that’s already coming.

That the Occupy movement is unprecedented seems appropriate because this is an unprecedented era, not just at this moment but since the 1970s.

The 1970s marked a turning point for the United States. Since the country began, it had been a developing society, not always in very pretty ways, but with general progress toward industrialization and wealth.

More here.

Kurt Vonnegut at the Writers’ Workshop

Suzanne McConnell in The Brooklyn Rail:

I was a student of Kurt Vonnegut, Jr.'s at the University of Iowa Writers' Workshop and we remained in touch from those years until his death. Vonnegut was not famous, when he taught there. He’d published four novels; Cat's Cradle had been published two years before but had not yet become a contemporary classic. He was working on Slaughterhouse Five. He was no more or less awesome than other writers teaching at the workshop – Vance Bourjaily, Nelson Algren, Jose Donoso, William Price Fox, Eugene Garber, and Richard Yates. But he was my favorite.

What was Kurt like in class as a teacher?

He was passionate, indignant. He wheezed with laughter. He laughed at his own jokes. He was practical. He was shy. He amused himself, during workshops, by doodling. He was kind. He was entertaining. He was smart.

More here. [Thanks to Elatia Harris.]

A building block for GPS

Peter Reuell in the Harvard Gazette:

Next time your Global Positioning System (GPS) helps you get from point A to B without pulling out a map, thank Norman Ramsey.

Ramsey, a professor emeritus of physics who recently died at 96, laid the foundation for the development of the atomic clock, a device that allows scientists to measure time more precisely than ever, and which is a critical component in global positioning systems (GPS).

Just as a grandfather clock counts the oscillations of a pendulum to keep time, atomic clocks use the movement of atoms — which oscillate at precise frequencies — to measure time. Using the devices, a second is no longer measured as a fraction of the time it takes the Earth to revolve around the sun, but as the time it takes a cesium-133 atom to oscillate 9,192,631,770 times.

The advantage of such clocks is in their previously unheard-of accuracy. Since every cesium-133 atom oscillates at the same frequency, clocks can be built that neither lose nor gain a second in millions of years. Such precision is critical in a number of scientific fields. Atomic clocks are used to track satellites in deep space, to test Einstein’s theory of general relativity, by astronomers seeking to use multiple radio telescopes to capture images of objects light-years away, and by geologists, who use GPS to track the movement of earthquake fault lines.
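
The link between clock precision and GPS accuracy is direct: a receiver turns signal travel time into distance by multiplying by the speed of light, so every nanosecond of clock error is roughly 30 centimeters of ranging error. A back-of-the-envelope check using the cesium frequency quoted above:

```python
C = 299_792_458        # speed of light, m/s
CS_HZ = 9_192_631_770  # cesium-133 oscillations per SI second

period = 1 / CS_HZ                           # duration of one oscillation
print(f"one cesium cycle: {period:.3e} s")   # ~1.088e-10 s

clock_error_s = 1e-9                         # a 1-nanosecond timing error...
print(f"ranging error: {C * clock_error_s:.2f} m")  # ...is ~0.30 m of range
```

This is why a clock that drifts by even microseconds per day would be useless for navigation: a microsecond of error corresponds to about 300 meters of range.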

More here.

The Creative Potential of Occupy Wall Street’s Cooperative Power

Rochelle Gurstein in The New Republic:

The night before the first big Occupy Wall Street rally at Foley Square in early October, I went to my local bookstore to hear Chris Lehmann speak about his new book, Rich People Things, which explores, with penetrating hilarity, the follies of the “one percent.” During the discussion, a number of us were struck by the way the obscenely wealthy few are proud to be an “elite” in contrast to the way the term, along with kindred ideas like taste, discrimination, and distinction, has been completely discredited—vilified—in matters of culture. From there the discussion turned to the poverty of our language of dissent and revolt, the resourcelessness of our political imaginations. The historians among us spoke of the radical labor movement that emerged in the last part of the nineteenth century and offered a number of examples of their compelling visceral imagery—plutocrats as “blood-sucking parasites of property”; the factory labor system as “a prison-house” or “chattel slavery”; the competition between men that underwrites capitalism as “bestial”; industrial cities as “inexpressibly base and ugly.” A young woman in the audience said that the nineteenth-century language did not sit well with her and that her generation did not feel comfortable with the language of 1960s “Up against the wall, mother-f-er” confrontation. She expressed some concern that her ivy-league college education—she had been an American Studies major—had made her so hyper-sensitive to speaking ill of anyone that she no longer had words that felt right to her to condemn those responsible for wrecking our country.

More here.

Anita Desai on Longing and Striving

From The New York Times:

Sometimes a mango is just a mango. This is rarely the case in Indian novels, where mangoes tend to be luminescent orbs dangling in steamy air, glistening with sweetness, sex and Being itself, waiting to be plucked, caressed, birthed. Either that or they’re muddy and rotten and piled high on a dirty road, surrounded by rancid garbage, rank cooking fires, beggar children and grinning, greasy swindlers. In other words, mangoes in India’s literary fiction are much like India in literary fiction: distinguished by pleasing aromas or permanent anarchy, if not some chutneyed combination.

For almost five decades, Anita Desai’s writing has avoided this easy trafficking in the delicious and malicious. She has instead created a body of work distinguished by its sober, often bracing prose, its patient eye for all-telling detail and its humane but penetrating intelligence about middling people faced with middling prospects. Whether in India, Mexico or America, Desai’s characters tend to be easy marks for new possibilities — for something, anything, other than life as it is. This vulnerability leads to promising experiences, which often become fresh disappointments. For a writer so taken with such arrangements, the best results are minor-key masterpieces; the lesser efforts are melancholy suffocations. Both outcomes are evident in the three novellas that make up her new collection, “The Artist of Disappearance.”

More here.

Saturday Poem

The Wishing Tree

I stand neither in the wilderness
nor fairyland

but in the fold
of a green hill

the tilt from one parish
into another.

To look at me
through a smirr of rain

is to taste the iron
in your own blood

because I hoard
the common currency

of longing: each wish
each secret assignation.

My limbs lift, scabbed
with greenish coins

I draw into my slow wood
fleur-de-lys, the enthroned Britannia.

Behind me, the land
reaches towards the Atlantic.

And though I’m poisoned
choking on the small change

of human hope,
daily beaten into me

look: I am still alive—
in fact, in bud.

by Kathleen Jamie
From: The Tree House
Publisher: Picador, London, 2004

In Defence of Difference

Maywa Montenegro and Terry Glavin in Seed:

This past January, at the St. Innocent Russian Orthodox Cathedral in Anchorage, Alaska, friends and relatives gathered to bid their last farewell to Marie Smith Jones, a beloved matriarch of her community. At 89 years old, she was the last fluent speaker of the Eyak language. In May 2007 a cavalry of the Janjaweed — the notorious Sudanese militia responsible for the ongoing genocide of the indigenous people of Darfur — made its way across the border into neighboring Chad. They were hunting for 1.5 tons of confiscated ivory, worth nearly $1.5 million, locked in a storeroom in Zakouma National Park. Around the same time, a wave of mysterious frog disappearances that had been confounding herpetologists worldwide spread to the US Pacific Northwest. It was soon discovered that Batrachochytrium dendrobatidis, a deadly fungus native to southern Africa, had found its way via such routes as the overseas trade in frog’s legs to Central America, South America, Australia, and now the United States. One year later, food riots broke out across the island nation of Haiti, leaving at least five people dead; as food prices soared, similar violence erupted in Mexico, Bangladesh, Egypt, Cameroon, Ivory Coast, Senegal and Ethiopia.

The Zombie Apocalypse of Daniel Defoe

Andrew McConnell Stott in Lapham's Quarterly:

You can barely flee down a city block these days without running smack into the middle of the newest zombie apocalypse, a genre usually traced back to Richard Matheson’s 1954 survivor novel, I Am Legend, but which finds a much more venerable precursor in Daniel Defoe’s A Journal of the Plague Year.

Defoe’s novel, published in 1722, is a mutant factual-fiction that recounts the plague epidemic of 1665, which dispatched almost 100,000 Londoners. Purporting to be the “memorial” of a survivor known only as “H.F.”, it was based on genuine documentary sources, including the diary of Defoe’s uncle.

For something so grounded in fact, A Journal of the Plague Year conforms to the expectations of zombie narratives in almost every way. People look to the skies for the origin of the pestilence, as in George A. Romero’s Day of the Dead; its city of spacious abandonment and grassed-over streets anticipates the empty metropolis of Danny Boyle’s 28 Days Later; and as the King takes flight and the law implodes, the living are faced with the decision to team up or go it alone in the style of Robert Kirkman’s The Walking Dead, where zombie-battling is merely a skull-cleaving interlude between the real battles for resources.

What A Journal of the Plague Year doesn’t have is zombies—at least not explicitly. Still, the numberless, suppurating victims are apt to behave like the undead at every turn, crowding the novel with “walking putrefied carcasses, whose breath was infectious and sweat poison.” These abject and degenerating bodies, disfigured by the “tokens” of disease that look like “small Knobs…of callous or Horn,” can turn on others, even running through the streets actively seeking to infect people impressed “with a kind of Rage, and a hatred against their own Kind,” as if the sickness itself were filled with an “evil Will” determined “to communicate it self.” Thus babies kill their mothers, and men tackle women in the street hoping to infect them with a deadly kiss. Others manage to dodge the disease, only to be disfigured by the weight of madness or grief.