Being old and carefree works on grass

Our own Asad Raza in Tennis magazine:

It’s probably safe to say that no one in the world predicted the Arnaud Clement and Rainer Schuettler quarterfinal at Wimbledon.  The two veterans faced off yesterday in a match suspended, in a nice metaphor, by the lateness of the hour. 30-year-old Clement and 32-year-old Schuettler share the same career-best result: losing finalist, in both cases at the hands of Andre Agassi, at the Australian Open.  Schuettler achieved this in 2003, while Clement’s run occurred all the way back in 2001–he beat a 19-year-old Roger Federer on his way to the final.

The venerable duo who competed in what ESPN dubbed the “lost quarterfinal” were not the only older players to have success at this year’s tournament.  While Schuettler will have one semifinal spot, Marat Safin, at the advanced tennis age of 28, has already booked the other.  Meanwhile, Tamarine Tanasugarn, 31, made waves by advancing to the quarterfinals before losing to Venus Williams, 28.  And 29-year-old Nathalie Dechy, ranked 97, held match points against world number one Ana Ivanovic.

Even more impressive, these elder statesmen and women are winning at the expense of young players and those in their mid-career primes: Maria Sharapova, Novak Djokovic, Andy Roddick, Jelena Jankovic, and David Nalbandian all lost early in the tournament, in many cases to their elders.

Why the sudden onslaught of older players doing so well at Wimbledon?

More here.



My Apologies to Malcolm


I posted a Monday column this week entitled “Down, I say, Down with Malcolm Gladwell.” I was having a little fun with the invective. I started out calling him a fraud and ended with the question of whether he is salvageable as a human being. As one of our readers, Pete Chapman, noted in the comments section, the post was not in my usual style. Chapman mentioned that he appreciated in my essays “trying to balance out your judgements and show your reader that you’re also aware of the counterpoints to whatever position you take.” That is generally my approach. Sometimes I flub it and sometimes it works but mostly I think of criticism as a process of getting inside other positions, cherry picking through the world of infinite subjectivities. I’m a Pyrrhonian pragmatist, or something like that. No position feels entirely satisfying to me and thus I try to keep moving.

All this is a preface to saying that I’ve had an email exchange with Malcolm Gladwell and he’s a decent guy. I like him. And now I feel bad that I went directly for the ad hominem. I think my substantive critique of Blink is basically right, by the way, but there was no need for the gratuitous meanness. I did it, I suppose, to generate a little buzz for the piece and that’s not a particularly honorable way to go about it. My real point about Blink, without all the bells and whistles, would be the following: the relationship between judgment and knowledge is a mysterious and fascinating thing. Specifically, the way that judgments can be seen to precede and to ‘ground’ knowledge is both an important and unsettling thought. Blink is a work in this tradition but one that falls apart and gets tangled up in its desire to provide practical advice, to give people access to the “magic” of judgment.

So, since I am an editor here at 3QD, I’m taking up one posting today to say, publicly, sorry, Malcolm, you’re not a fraud and I was pushing the boundaries of jerkitude to wonder whether you are salvageable as a human being. I look forward to your next book, which I hereby pledge to review in these pages and without all the personal nastiness. I just hope it’s a lot better than Blink 🙂

morgan meis

exciting, modern, and a little vague


A couple years ago, GQ asked John Kerry if he preferred the Beatles or the Rolling Stones. Kerry, never one to let an opportunity to appear human or interesting go unblown, refused to express a preference. “I can tell you the truth,” he said, “and the truth is I love both.” It took him a couple more months to lose the election. But right there, in that interview, he lost the rock-geek vote. Or at least, he ensured that if anybody who actually cared about music voted for him that November, they’d be doing so reluctantly. There’s no wrong answer to the Beatles-vs.-Stones question. And you’re certainly allowed to like both. But you can’t be agnostic. Kerry came off like he’d somehow failed to have a definitive emotional response to the two most important rock bands of his generation–or like he was afraid of articulating one during an election year, which is even worse.

In an interview to be published this Friday in Rolling Stone, Barack Obama doesn’t come right out and declare himself to be a Stones person. But when quizzed about the contents of his iPod by cub reporter Jann Wenner, he references the Stones twice, cites the awesomely apocalyptic “Gimme Shelter” specifically, and doesn’t give the Fab Four so much as a name-check. Also on the oPod: “[A] lot of Coltrane, a lot of Miles Davis, a lot of Charlie Parker”; “everything from Howlin’ Wolf to Yo-Yo Ma to Sheryl Crow to Jay-Z”; and music from Barack’s ’70s youth, including Stevie Wonder, Earth, Wind & Fire, and Elton John.

more from TNR here.

Rauschenberg was Ernie to Jasper Johns’ Bert


I’ve been thinking a lot about Rauschenberg lately. But I’ve always thought a lot about Rauschenberg. For my money (I wish!), he was and remains the unsurpassed master of visual language in the modern era; his seemingly effortless improvisational command of semiotics was exceeded only by the richness, intricacy and originality of his formalist skills. Treating information as material, he translated Dadaist collage into the idiom of painting; painting into sculpture; then flattened the whole menagerie into a dense and simultaneous info-pancake of silk-screened magazine clippings that stripped pictorialism and narrative linearity down to their bare wires.

If that weren’t enough, he was a dyslexic homosexual drunkard —all top-shelf people in my chest of drawers. Rauschenberg was Ernie to Jasper Johns’ Bert — expansive, self-indulgent, mischievous and visionary.

more from the LA Weekly here.

absorbing the new


We are so accustomed to the existence of America that it is hard to think of the challenges that its “discovery” posed to contemporaries. To get some sense of the novelty we would need to conjure up a comparable event nowadays. Let us therefore imagine that New Horizons, the spacecraft headed for Pluto, launched in 2006, mysteriously crashes into an invisible barrier. Subsequent expeditions reveal that the barrier is made of a complex substance that reflects the light of the sun by breaking it up into a myriad shining dots of various sizes and degrees of intensity and then reflects the light of those dots against its own back by breaking them down further to give the impression of an infinite space behind it, seemingly filled with stars and galaxies. Further investigations suggest that the impression of constant expansion beyond the barrier is produced by the movement of the sun, which does in fact rotate around the earth just as Aristotle had assumed, but whose reflection on the complex structure of the barrier produces a false impression of immobility that has deceived astronomers since the time of Copernicus.

more from the TLS here.

THE TRANSCRIPT: TOM WOLFE + MICHAEL GAZZANIGA

From Seed:

Wolfe, who calls himself “the social secretary of neuroscience,” often turns to current research to inform his stories and cultural commentary. His 1996 essay, “Sorry, But Your Soul Just Died,” raised questions about personal responsibility in the age of genetic predeterminism. Similar concerns led Gazzaniga to found the Law and Neuroscience Project. When Gazzaniga, who just published Human: The Science Behind What Makes Us Unique, was last in New York, Seed incited a discussion: on status, free will, and the human condition.

Tom Wolfe: Mike, I don’t want you to think I’m giving up my right to disagree with you down the line — I may not have to — but you’re one of the very few evolutionary thinkers and neuroscientists that I pay attention to, and I’ll tell you why. In the ’90s, when the subject of neuroscience and also genetics started becoming hot, there was a tendency to conflate genetic theory and evolutionary theory with neuroscience, as if the two were locked, which just isn’t true. Remember Jose Delgado, the brain physiologist who was at Yale at one time?

Michael Gazzaniga: Oh yeah. Sure.

TW: The guy stood in a smock in a bullring and put stereotaxic needles in the brain of a bull and just let himself be charged. He had a radio transmitter. The bull is as far away as that wall is from me, and he presses the thing and the bull goes dadadada and comes to a stop.

MG: Right.

TW: He’s still with us; he’s in his 90s. Anyway, his son, also Jose Delgado, and also a neuroscientist, was interviewed recently and he said, “The human brain is complex beyond anybody’s imagining, let alone comprehension.” He said, “We are not a few miles down a long road; we are a few inches down the long road.” Then he said, “All the rest is literature.”

Many of today’s leading theorists, such as E. O. Wilson, Richard Dawkins, and Dan Dennett, probably know about as much on the human brain as a second-year graduate student in neuropsychology. That isn’t their field. Wilson is a great zoologist and a brilliant writer. Dawkins, I’m afraid, is now just a PR man for evolution. He’s kind of like John the Baptist — he goes around announcing the imminent arrival. Dennett, of course, is a philosopher and doesn’t pretend to know anything about the brain. I think it has distorted the whole discussion.

More here.

Could Our Own Proteins Be Used to Help Us Fight Cancer?

From Scientific American:

  • Guardian proteins, found in all forms of life, keep a wide variety of cellular processes running smoothly.
  • Through their diverse interactions, these proteins pick up telltale “fingerprints” of each cell’s contents, which has allowed them to evolve a critical role in immune responses to cancer or pathogens.
  • Therapies that take advantage of these proteins include inhibitors and enhancers of their various natural functions.

In 1962 someone at the Genetics Institute in Pavia, Italy, turned up the temperature in an incubator holding fruit flies. When Ferruccio Ritossa, then a young geneticist, examined the cells of these “heat shocked” flies, he noticed that their chromosomes had puffed up at discrete locations. The puffy appearance was a known sign that genes were being activated in those regions to give rise to their encoded proteins, so those sites of activity became known as the heat shock loci.

The effect was reproducible but initially considered to be unique to the fruit fly. It took another 15 years before the proteins generated when these chromosome puffs appear were detected in mammals and other forms of life. In what is certainly among the most absorbing stories in contemporary biology, heat shock proteins (HSPs) have since been recognized as occupying a central role in all life—not just at the level of cells but of organisms and whole populations.

More here.

Wednesday, July 2, 2008

Gender and Math, Taking the Social for the Natural

Over at Ars Technica, John Timmer summarized findings from a new study in Science, via DeLong:

[A] new study suggests that, when it comes to math, we can forget biology, as social equality seems to play a dominant role in test scores.

The study, which appeared in last week’s edition of Science, relied on a test from the Programme for International Student Assessment (PISA), run by the Organization for Economic Cooperation and Development (OECD). A total of over 275,000 students in 40 countries took the PISA exam as 15-year-olds. On average, girls scored about 2 percent lower than boys on math, but nearly 7 percent higher on reading, consistent with previous test results.

The researchers noted, however, that the math gap wasn’t consistent between countries. For example, it was nearly twice as large as the average in Turkey, while Icelandic girls outscored males by roughly 2 percent. The general pattern of these differences suggested to the authors that the performance differences correlated with the status of women. The authors of the study built a composite score that reflected the gender equality of the countries based on the World Economic Forum’s Gender Gap Index, data extracted from the World Values Surveys, measures of female political participation, and measures of the economic significance of females.

Scandinavian countries such as Norway and Sweden score very high on gender equality measures; in these nations, the gender gap on math performance is extremely small. In contrast, nations at the other end of the spectrum, such as Turkey and Korea, had the largest gender gap. The correlations between gender equality and math scores held up under a statistical test designed to catch spurious associations. The authors even checked out the possibility of genetic effects not linked to the Y chromosome by examining whether genetic similarity between various European populations could account for these differences, but they found that it could not.
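
For readers who want to see what this kind of analysis looks like in practice, here is a minimal sketch of the country-level correlation the authors describe: a gender-equality composite on one axis, the boys-minus-girls math gap on the other. The numbers and the scipy-based calculation below are illustrative assumptions of mine, not data or code from the Science study.

```python
# Minimal sketch of the country-level correlation described above.
# All numbers are illustrative placeholders, NOT data from the Science study.
from scipy import stats

# Hypothetical gender-equality composite (higher = more equal) and
# math gap in test points (boys' mean score minus girls' mean score).
equality = {"Iceland": 0.78, "Norway": 0.80, "Sweden": 0.81,
            "Italy": 0.65, "Korea": 0.62, "Turkey": 0.58}
math_gap = {"Iceland": -2.0, "Norway": 3.0, "Sweden": 3.5,
            "Italy": 11.0, "Korea": 13.0, "Turkey": 23.0}

countries = sorted(equality)
x = [equality[c] for c in countries]
y = [math_gap[c] for c in countries]

# A negative r means the math gap shrinks as gender equality rises.
r, p = stats.pearsonr(x, y)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```

The study’s own analysis goes further, with checks against spurious associations and the genetic comparison mentioned above, which a toy correlation like this obviously cannot reproduce.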

catch


When Susan Sontag wrote “Notes on ‘Camp’ ” back in 1964, she was foregrounding—to use a current catchphrase—something familiar but not yet defined.

“Many things in the world have not been named,” her famous essay began, “and many things, even if they have been named, have never been described. One of these is the sensibility—unmistakably modern, a variant of sophistication but hardly identical with it—that goes by the cult name of ‘Camp.’”

I would choose nearly identical words to describe the phenomenon, the linguistic sensibility, that I’d name “catch”: the way our language has become increasingly dominated by rapidly cycling catchphrases. Rapidly cycling because in blogospheric time, they speed from clever witticism to tired cliché in the virtual blink of an eye.

more from Slate here.

Montesquieu


Montesquieu would make most everyone’s top-ten list of political philosophers, but he is not prominent in the ranks of natural philosophers. Following the lead of the American Founders, who referred to him as “the celebrated Montesquieu,” we associate his name with new discoveries and improvements in the science of politics rather than science proper. However, as a young man in his late twenties, decades before the publication of his masterwork, The Spirit of the Laws (1748), Montesquieu seems to have been interested in a variety of scientific questions.

The young nobleman was elected to the Academy of Bordeaux in 1716. In keeping with that body’s preference for scientific endeavors, Montesquieu shifted away from literary and political explorations. Although his first presentation to the Academy was a “Discourse on the politics of the Romans in religion,” his subsequent offerings owed more to Descartes than Machiavelli.

more from The New Atlantis here.

the end


The post-catastrophic novel began with Mary Shelley’s The Last Man (1826), in which a plague kills most of humanity and provokes incessant warfare. Plague remains the triggering calamity in much post-catastrophe fiction up through the Manhattan Project; even as late as George Stewart’s Earth Abides (1949), plague rather than nuclear war is the problem. But between the invention of James Watt’s coal-fired steam engine in 1784 and the start of the Cold War, the most haunting sci-fi visions were not visions of the end of the world. They were visions—in dystopian novels like We, Brave New World, and 1984—of the consolidation of technological civilization into a system of total social control. Zamyatin, Huxley, and Orwell did not imagine a time when the boots stamping on human faces could no longer be industrially manufactured, so that people would return to smashing one another’s faces the old-fashioned way, with stones. The bombing of Hiroshima revived this notion of a reduced, brutally simplified future; and from Nevil Shute’s On the Beach (1957) to Denis Johnson’s Fiskadoro (1985), through many novels in between, the idea of a future more primitive than the past ran alongside the idea of a future ever more technologically advanced.

more from n+1 here.

the suffering of keats


In July, 1820, John Keats published his third and final book, “Lamia, Isabella, The Eve of St. Agnes and Other Poems.” He had no reason to expect that it would be a success, with either the public or the critics: in his short career, the twenty-four-year-old poet had known nothing but rejection on both fronts. After his first book, “Poems,” appeared, in 1817, his publishers, the brothers Charles and James Ollier, refused to have anything more to do with him. In a letter to the poet’s brother George, they wrote, “We regret that your brother ever requested us to publish his book, or that our opinion of its talent should have led us to acquiesce in undertaking it.” They went on, “By far the greater number of persons who have purchased it from us have found fault with it in such plain terms, that we have in many cases offered to take the book back rather than be annoyed with the ridicule which has, time after time, been showered upon it.”

more from The New Yorker here.

greenaway’s last supper


With a glint of a dagger and a blaze of celestial light, Leonardo da Vinci’s The Last Supper burst into new life on Monday night after Peter Greenaway finally secured permission to reinvent the crumbling, 510-year-old masterpiece as a sound and light show.

In a remarkable coup for the British film director, the Italian authorities allowed Greenaway to wheel a battery of projectors, computers and speakers into the usually hushed and air-sealed refectory of Santa Maria delle Grazie, where the image of Christ telling the apostles one of them will betray him decorates an end wall. Inside, Greenaway unveiled a provocative vision of one of Christianity’s most sacred and fragile paintings, reimagined “for the laptop generation”.

more from The Guardian here.

Wednesday Poem

///
The City That Never Sleeps

Federico García Lorca

In the sky there is nobody asleep.  Nobody, nobody.

Nobody is asleep.

The creatures of the moon sniff and prowl about their cabins.

The living iguanas will come and bite the men who do not dream,

and the man who rushes out with his spirit broken will meet on the
            street corner

the unbelievable alligator quiet beneath the tender protest of the
            stars.


Nobody is asleep on earth.  Nobody, nobody.

Nobody is asleep.

In a graveyard far off there is a corpse

who has moaned for three years

because of a dry countryside on his knee;

and that boy they buried this morning cried so much

it was necessary to call out the dogs to keep him quiet.


Life is not a dream.  Careful!  Careful!  Careful!

We fall down the stairs in order to eat the moist earth

or we climb to the knife edge of the snow with the voices of the dead
            dahlias.

But forgetfulness does not exist, dreams do not exist;

flesh exists.  Kisses tie our mouths

in a thicket of new veins,

and whoever his pain pains will feel that pain forever

and whoever is afraid of death will carry it on his shoulders.


One day

the horses will live in the saloons

and the enraged ants

will throw themselves on the yellow skies that take refuge in the
            eyes of cows.


Another day

we will watch the preserved butterflies rise from the dead

and still walking through a country of gray sponges and silent boats

we will watch our ring flash and roses spring from our tongue.

Careful!  Be careful!  Be careful!

The men who still have marks of the claw and the thunderstorm,

and that boy who cries because he has never heard of the invention
            of the bridge,

or that dead man who possesses now only his head and a shoe,

we must carry them to the wall where the iguanas and the snakes
            are waiting,

where the bear’s teeth are waiting,
where the mummified hand of the boy is waiting,
and the hair of the camel stands on end with a violent blue shudder.


Nobody is sleeping in the sky.  Nobody, nobody.

Nobody is sleeping.

If someone does close his eyes,

a whip, boys, a whip!

Let there be a landscape of open eyes

and bitter wounds on fire.

No one is sleeping in this world. No one, no one.

I have said it before.

No one is sleeping.

But if someone grows too much moss on his temples during the
            night,

open the stage trapdoors so he can see in the moonlight

the lying goblets, and the poison, and the skull of the theaters.

Translation: Robert Bly

///

Revolutions per Minute

From Orion Magazine:

Sex before marriage. Bob and his boyfriend. Madame Speaker. Do those words make your hair stand on end or your eyes widen? Their flatness is the register of successful revolution. Many of the changes are so incremental that you adjust without realizing something has changed until suddenly one day you realize everything is different. I was reading something about food politics recently and thinking it was boring.

Then I realized that these were incredibly exciting ideas—about understanding where your food comes from and who grows it and what its impact on the planet and your body is. Fifteen or twenty years ago, hardly anyone thought about where coffee came from, or milk, or imagined fair-trade coffee. New terms like food miles, fairly new words like organic, sustainable, non-GMO, and reborn phenomena like farmers’ markets are all the result of what it’s fair to call the food revolution, and it has been so successful that ideas that were once startling and subversive have become familiar en route to becoming status quo. So my boredom was one register of victory.

More here.

THE END OF THEORY: Will the Data Deluge Make the Scientific Method Obsolete?

From Edge:

Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition. They are the children of the Petabyte Age.

The Petabyte Age is different because more is different. Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies. According to Chris Anderson, we are at “the end of science,” that is, science as we know it: “The quest for knowledge used to begin with grand theories. Now it begins with massive amounts of data. Welcome to the Petabyte Age.”

More here.

Tuesday, July 1, 2008

wood and the text


It is not unusual, post-Foucault, to observe the decline of God as a source of meaning in the West since the Enlightenment and the subsequent diminishment of the power of the Bible. Nor is it unusual to point out that this occurred side by side with the rise of the novel.

In fact, in The Art of the Novel (1985), Czech novelist Milan Kundera makes an even bolder claim on behalf of the novel, as not only an expression of the 17th century’s dawning humanism, but as one of its chief enabling technologies. By creating stories and characters that require the suspension of authorial judgment in order to come fully alive, Kundera argues, the novel is a form that is per se sceptical, and that makes us think beyond the boundaries of any religious dogma.

But in his reviews and two books of critical essays, The Broken Estate (1999) and The Irresponsible Self (2005), British literary critic James Wood makes an even more extravagant claim for the novel. Wood’s critical practice is based on the idea that fiction at a certain point took over the cachet and power of the sacred. For Wood, the best novelists combine the humane scepticism of the novel form with a quasi-religious drive to improve it.

At the bottom line of Wood’s writing is a conception of the novel that is almost kabbalistic.

more from The Australian here.

nas


To say that Nas’s new album is one of the most anticipated of the summer is true, but it misses the point—every Nas album is highly anticipated, because no rapper is held to as high a standard. When a Nas record is about to drop, hip-hop fans cross their fingers and wonder: Will he change hip-hop forever, again? Will the new record be as good as Illmatic?

Released in 1994, when Nas was 20, Illmatic was a prodigious debut. No other rap debut, and few rap albums at all, are as lauded. It’s hard to quantify the album’s achievement precisely, except to say that rapping is a craft, and Nas was the first to discover how to do it right. Rap has two components—beats and rhymes. On Illmatic, the beats were mostly good, sometimes great; and there had been virtuoso emcees before Nas who’d moved the stylized rhythms of early groups like N.W.A toward the conversational. But no one had ever sounded as natural as Nas. “One Love,” which takes the form of a monologue to an incarcerated friend, exploits poetic devices like enjambment so subtly that it works as prose. Every rapper who hopes to be taken seriously—from Kanye West to the Game—must grapple with Nas’s discovery.

more from New York Magazine here.

the pitiless dahlberg


He grew up in Missouri, the son of a lady barber. And in order to get a flavor of the man one must read lines like these, describing the orphanage where he spent childhood time:

They were a separate race of stunted children who were clad in famine. Swollen heads lay on top of ashy uniformed orphans. Some had oval or oblong skulls; others gigantic watery occiputs that resembled the Cynocephali described by Hesiod and Pliny. The palsied and the lame were cured in the pool of Bethesda, but who had enough human spittle to heal the orphans’ sore eyes and granulated lids.

Dahlberg talked and wrote like this. Unlike Charles Olson, whom he’d met at Harvard and whose work would always return to postwar European philosophy and American politics, the autodidactic Dahlberg had identified with the proletarian underground since the twenties—and with ancient texts; he went into seven years of withdrawal from writing to study these. His first book, Bottom Dogs, had an introduction by D.H. Lawrence. After his withdrawal, he renounced his former self, his politics, everyone he knew, almost all men who aspired to write, and his early works.

more from Poetry here.

darwin and wallace


In early 1858, on Ternate, in the Malay Archipelago, a young specimen collector was tracking the island’s elusive birds of paradise when he was struck by malaria. ‘Every day, during the cold and succeeding hot fits, I had to lie down during which time I had nothing to do but to think over any subjects then particularly interesting me,’ he later recalled.

Thoughts of money or women might have filled lesser heads. Alfred Russel Wallace was made of different stuff, however. He began thinking about disease and famine; about how they kept human populations in check; and about recent discoveries indicating that the earth’s age was vast. How might these waves of death, repeated over aeons, influence the make-up of different species, he wondered?

Then the fever subsided – and inspiration struck. Fittest variations will survive longest and will eventually evolve into new species, he realised. Thus the theory of natural selection appeared, fever-like, in the mind of one of our greatest naturalists. Wallace wrote up his ideas and sent them to Charles Darwin, already a naturalist of some reputation. His paper arrived on 18 June, 1858 – 150 years ago last week – at Darwin’s estate in Downe, in Kent.

Darwin, in his own words, was ‘smashed’.

more from The Guardian here.