MySpace and culture of fear

Danah Boyd writes about how youth culture is treated in the US and examines the connections between Columbine and banning MySpace:

“I’m tired of mass media perpetuating a culture of fear under the scapegoat of informing the public. Nowhere is this more apparent than how they discuss youth culture and use scare tactics to warn parents of the safety risks about the Internet. The choice to perpetually report on the possibility or rare occurrence of kidnapping / stalking / violence because of Internet sociability is not a neutral position – it is a position of power that the media chooses to take because it’s a story that sells. There’s something innately human about rubbernecking, about looking for fears, about reveling in the possibilities of demise. Mainstream media capitalizes on this, manipulating the public and magnifying the culture of fear. It sells horror films and it sells newspapers.

…The effects are devastating. Ever wonder why young people don’t vote? Why should they? They’ve been told for so damn long that their voices don’t matter, have been the victims of an oppressive regime. What is motivating about that? How do you learn to use your voice to change power when you’ve been surveilled and controlled for so long, when you’ve made an art out of subversive engagement with peers? When you’ve been put on drugs like Strattera that control your behavior to the point of utter obedience?”

More here.

Monday Musing: Reexamining Religion

Pervez Hoodbhoy is a well-known physicist who teaches at the Quaid-e-Azam University in Islamabad, Pakistan. He is also known for his frequent and intelligent interventions in politics. In an article entitled Miracles, Wars, and Politics he writes:

On the morning of the first Gulf War (1991), having just heard the news of the US attack on Baghdad, I walked into my office in the physics department in a state of numbness and depression. Mass death and devastation would surely follow. I was dismayed, but not surprised, to discover my PhD student, a militant activist of the Jamaat-i-Islami’s student wing in Islamabad, in a state of euphoria. Islam’s victory, he said, is inevitable because God is on our side and the Americans cannot survive without alcohol and women. He reasoned that neither would be available in Iraq, and happily concluded that the Americans were doomed. Then he reverentially closed his eyes and thrice repeated “Inshallah” (if Allah so wills).

The utter annihilation of Saddam Hussein’s army by the Americans, which soon followed, did little, of course, to attenuate this student’s convictions. (Also, it is mildly interesting that Muslim conceptions of heaven focus so much on precisely the easy availability of alcohol and women.) Constantly confronted by such attitudes, atheists such as myself are often driven to hair-pulling exasperation by the seeming irrationality of religious belief, and specifically its immunity to refutation by experience, logic, argument, or, it seems, anything else. Professor Hoodbhoy goes on to note that:

In Pakistan today – where the bulk of the population has been through the Islamized education initiated by General Zia-ul-Haq in the 1980’s – supernatural intervention is widely held responsible for natural calamities and diseases, car accidents and plane crashes, acquiring or losing personal wealth, success or failure in examinations, or determining matters of love and matrimony. In Pakistan no aircraft – whether of Pakistan International Airlines or a private carrier registered in Pakistan – can take off until appropriate prayers are recited. Wars certainly cannot be won without Allah’s help, but He has also been given the task of winning cricket matches for Pakistan.

And this state of affairs by no means obtains only in Islamic societies. It is more-or-less universal. Consider the following about the born-again-Christian-led United States: all polls about such subjects show that a great majority of Americans believe in miracles, angels, an afterlife where one will be reunited with one’s relatives and friends, and according to one recent poll, 96 percent believe in God. It is only in the rarefied air of elite academic institutions such as the National Academy of Sciences that one finds a majority of atheists and agnostics. And contrary to popular misconception, Europe is not much different. The reaction to this ubiquity of faith-based superstition, on the part of intellectuals, is best epitomized by Richard Dawkins’s frequent and witty expressions of indignant frustration with and attacks on religion. (He is not always choleric on this issue: one of the more tenderly moving things I have read is Dawkins’s letter to his 10-year-old daughter Juliet, published in A Devil’s Chaplain as “Good and Bad Reasons for Believing.” If I ever have children, it will be required reading for them.) And I stand beside him in calling attention not only to the silliness of religious superstition, but to the misguidedly anodyne view repeatedly expressed by Stephen Jay Gould and others that religion and science do not clash and can peacefully coexist. They can do no such thing, and one has only to look at the recent court battles over Intelligent Design in Kansas, Pennsylvania, and Delaware to see that (battles similar to the creationist ones Gould was bravely at the forefront of fighting while alive). But until recently, few scientists have put much effort into explaining the ubiquity of religious beliefs. If it is so irrational, then why is religious conviction so widespread?

Today, I would like to report the fascinating work on this question of two young scientists: Pascal Boyer, an anthropologist, and Paul Bloom, a psychologist. Traditional explanations of religious beliefs have tended to fall roughly into two categories: first, there is what might be called the “opiate of the masses” view. This claims that religion is a way of assuaging the pain and suffering of everyday life. Faced with injustice and an indifferent physical universe, people have invented scenarios which help them imagine rewards and punishments in an afterlife, and other ways of injecting meaning into a seemingly purposeless existence. And second, there is the category of explanation of religion which relies on the social benefits which accrue to a society which shares religious beliefs. In addition to providing group solidarity through ritual, these might include the acceptance of uniform moral codes, for instance. On this theory, religious beliefs are seen as memes that are particularly successful because they provide a survival advantage to the groups that hold them (maybe even simply by making people happier). As Pascal Boyer points out in his excellent book Religion Explained, in both cases it is assumed that reason is somehow corrupted or even suspended by the attractiveness (and benefits) of religious belief.

There are problems with these views, and I will, again, just mention two: first, it is clear that people will not just believe anything that provides meaning or promotes social cohesion. There is a very limited type of belief that people are willing to accept, even in religion, and these explanations do not address this selectivity. For example, it would be very hard to convince people of a God who ceased to exist on Wednesdays [Boyer’s example]. The second problem, which has also been pointed out by Steven Pinker, is that both these types of explanation rely on showing that some advantage comes from believing in religion, but this really puts the cart before the horse. We do not generally just believe a thing because having the belief might help us; we believe things that we think are true. If you are hungry, it may help you to believe that you just ate a huge meal, but you will not. As Bloom says in an article in this month’s Atlantic, “Heaven is a reassuring notion only insofar as people believe such a place exists; it is this belief that an adequate theory of religion has to explain in the first place.”

The new approach to explaining religion that Boyer and Bloom (and Scott Atran and Justin Barrett and Deborah Kelemen and others) represent does not see religious belief as a corruption of rationality, but rather as an over-extension of some of the very mental mechanisms that underlie and make rationality possible. In other words, rather than religion having emerged to serve a social or other purpose, in this view it is seen as an evolutionary accident. In particular, Bloom uses some developments in child psychology to shed light on the issue of religious beliefs, and it is these that I would like to focus on now. I cannot here go into the details of the experiments which demonstrate this, but it turns out that one of the things which seems hardwired (not learned by experience) in young infants (before they can even speak) is the distinction between inanimate and animate objects. Infants are clearly able to distinguish physical things from objects which demonstrate intentionality and have psychological characteristics. In other words, things with minds. In Paul Bloom’s words, children are “natural-born dualists” (in the Cartesian sense). It is quite clear that the mental mechanisms that babies use to understand and predict how physical objects will behave are very distinct from the mechanisms they use to understand and predict how psychological agents will behave. This stark separation of the world into minds and non-minds is what, according to Bloom, makes it eventually possible for us to conceive of minds (or souls) without bodies. This explains beliefs in gods, spirits, an afterlife (we continue without bodies), etc. The other thing that babies are very good at is ascriptions of intentionality. They are very good at reading desires and intentions in animate objects, and this is necessary for them to function socially. Indeed, they are so sensitive to this that they sometimes overshoot and even ascribe goals and desires to inanimate objects. And it is this tendency which eventually makes us animists and creationists.

Notice that while previously most people have proposed that we are dualists because we want to believe in an afterlife, this new approach turns that formulation around: we believe in an afterlife because we are born dualists. And we are born dualists to be able to make sense of a world which has two very different kinds of entities in it (in terms of trying to predict what they will do): physical objects and things with minds. Bloom describes an interesting experiment in which children are told a story (with pictures) in which an alligator eats a mouse. The mouse has clearly died, and the children understand this. Bloom says:

The experimenters [then] asked the children a set of questions about the mouse’s biological functioning–such as “Now that the mouse is no longer alive, will he ever need to go to the bathroom? Do his ears still work? Does his brain still work?”–and about the mouse’s mental functioning, such as “Now that the mouse is no longer alive, is he still hungry? Is he thinking about the alligator? Does he still want to go home?”

As predicted, when asked about biological properties, the children appreciated the effects of death: no need for bathroom breaks; the ears don’t work, and neither does the brain. The mouse’s body is gone. But when asked about the psychological properties, more than half the children said that these would continue: the dead mouse can feel hunger, think thoughts, and have desires. The soul survives. And children believe this more than adults do, suggesting that although we have to learn which specific afterlife people in our culture believe in (heaven, reincarnation, a spirit world, and so on), the notion that life after death is possible is not learned at all. It is a by-product of how we naturally think about the world.

While it is this natural dualism that makes us prone to belief in an afterlife, spirits, gods, and other supernatural entities, it is what Pascal Boyer has called a hypertrophied sense of social cognition which predisposes us to see evidence of purpose and design even when it does not exist. Bloom describes it this way:

…nascent creationist views are found in young children. Four-year olds insist that everything has a purpose, including lions (“to go in the zoo”) and clouds (“for raining”). When asked to explain why a bunch of rocks are pointy, adults prefer a physical explanation, while children use a functional one, such as “so that animals can scratch on them when they get itchy.” And when asked about the origins of animals and people, children prefer explanations that involve an intentional creator, even if the adults raising them do not. Creationism–and belief in God–is bred in the bone.

As another example of attribution of causality to intentional agents where there are none, consider the widespread belief in witches. In an article entitled Why Is Religion Natural?, Pascal Boyer writes:

Witchcraft is important because it seems to provide an “explanation” for all sorts of events: many cases of illness or other misfortune are spontaneously interpreted as evidence for the witches’ actions. Witchcraft beliefs are only one manifestation of a phenomenon that is found in many human groups, the interpretation of misfortune as a consequence of envy. For another such situation, consider the widespread beliefs in an “evil eye,” a spell cast by envious people against whoever enjoys some good fortune or natural advantage. Witchcraft and evil eye notions do not really belong to the domain of religion, but they show that, religious agents or not, there is a tendency to focus on the possible reasons for some agents to cause misfortune, rather than on the processes whereby they could do it.

For these occurrences that largely escape control, people focus on the supernatural agents’ feelings and intentions. The ancestors were angry, the gods demanded a sacrifice, or the god is just cruel and playful. But there is more to that. The way these reasons are expressed is, in a great majority of cases, supported by our social exchange intuitions. People focus on an agent’s reasons for causing them harm, but note that these “reasons” always have to do with people’s interaction with the agents in question. People refused to follow God’s orders; they polluted a house against the ancestors’ prescriptions; they had more wealth or good fortune than their God-decreed fate allocated them; and so on. All this supports what anthropologists have been saying for a long time on the basis of evidence gathered in the most various cultural environments: Misfortune is generally interpreted in social terms. But this familiar conclusion implies that the evolved cognitive resources people bring to the understanding of interaction should be crucial to their construal of misfortune.

To state it one more time, the correct explanation for the ubiquity and stability of religious beliefs lies not in postulating rash abandonments of rationality for the gain of some social or mental benefit, but in the fact that such superstitious beliefs are firmly rooted in our ordinary mechanisms of cognitive functioning. In addition, these beliefs are parasitic upon mental systems which have evolved for non-religious functions, but which have similarities to religious concerns: for example, fear of invisible contaminants (religious rituals of washing), or moral intuitions and norms (religious commandments).

Obviously, to see this sort of naturalistic account of religious and other supernatural beliefs as an endorsement or defense of religion would be to commit a naturalistic fallacy of the worst sort. What Boyer, Bloom, et al. have done is to point out a weakness in our cognitive apparatus, which is a by-product of the way our mental systems have evolved. This is analogous to the well-known systematic weaknesses that people show in thinking about probabilistic phenomena (shamelessly exploited in Las Vegas and Atlantic City, not to mention the highly deplorable state-run lotteries). Having discovered an accidental source of incorrect beliefs within ourselves, we must struggle against it, and be ever-vigilant when thinking about these sorts of issues.

Have a good week!

My other recent Monday Musings:
Posthumously Arrested for Assaulting Myself
Be the New Kinsey
General Relativity, Very Plainly
Regarding Regret
Three Dreams, Three Athletes
Rocket Man
Francis Crick’s Beautiful Mistake
The Man With Qualities
Special Relativity Turns 100
Vladimir Nabokov, Lepidopterist
Stevinus, Galileo, and Thought Experiments
Cake Theory and Sri Lanka’s President

Selected Minor Works: Taxonomy as a Guide to Morals

Justin E. H. Smith

There is a long tradition in philosophy, going back at least to Epicurus, of allowing examples drawn from the domain of sexuality to serve in the analysis of eating, and vice versa. Sometimes this amounts to sloppiness, but often one can gain insight. Consider the photographs of peaches or cherries that make their way onto the covers of books in the erotica genre. These might tromper l’oeil, for an instant, but when we see what the photo is actually of, we are inclined to think: how clever, that peach looks like a naked woman from behind. Yet publishers of erotic literature dare not attempt the same trick with a suitably ambiguous photograph of a goat’s haunches. An erotic experience caused by a cherry is a fundamentally different sort of experience than one caused by a goat. This difference might, on its own, lead one to think that, similarly, a culinary experience with a cherry and one with a goat are two very different things as well. It is also interesting to note that in poetry allusions to fruit work well as erotic metaphors, while mention of ‘meat’ in the same context would be not erotic, but pornographic.

But it is zoology, and not phenomenology, that informs the dietary rules of contemporary ethical eaters. Most vegetarians today seek to index their dietary rules to Linnaean taxonomy. A moment’s reflection will show this to be an odd project. To eat corn and mushrooms, but not beef and mussels, only because, as we inhabitants of the post-Linnaean world know, cows and marine invertebrates are grouped together in the kingdom “animalia,” whereas plantae and fungi are different kingdoms altogether, is, one might think, to put a bit too much faith in the ability of scientific taxonomy to reflect reality, and, what’s more, to serve as a guide to practice. When it comes to dietary decisions of this sort, surely folk taxonomy is a more reliable guide. The Karam of the New Guinea Highlands, to cite one of many examples available from the anthropological literature of kingdom-mixing in folk taxonomy, class certain mushrooms with animals, in virtue of the texture of their ‘meat’. And what folk taxonomy tells us is that cows are more like humans than they are like scallops, and scallops are more like corn-on-the-cob than they are like cows – the intuitive appropriateness of the phrase ‘frutti di mare’ has, after all, survived three centuries of taxonomic precisification.

The relevant likeness, again, has nothing to do with arguments for or against moral status based on neurophysiological evidence. Rather, it has to do with the instruments and methods employed to kill the creature, the amount of blood spilled, and the sense of the relative specialness of the meal that results from this killing. Though the taxonomies are very different, in all cultures, in addition to the class of entities that cannot be killed and eaten under any circumstances – pets and people (at least the friendly ones, again, as we will see below), and usually negatively social creatures such as rats – there appears to be a certain class of entities cordoned off from the rest, distinguished by the fact that members of this class cannot be casually killed and eaten. They can be killed and eaten, but this will require some kind of communal to-do by which their sociocosmic significance – or what we would call their ‘moral status’ – is acknowledged.

We are led astray not just in trying to index ‘moral wrongness’ to the innate cognitive and sensitive capacities of the beings in question, but also in thinking that the question of what we are and are not to eat has much to do with ‘moral wrongness’ in the sense in which philosophers understand it. Rather, rules about what can be eaten, and under what circumstances – never social animals like pets or rats, sometimes large game, fruits and nuts more or less anytime – seem to involve a few basic, evolutionarily ingrained, cross-cultural rules, and on top of these a good deal of culturally variable rules that nonetheless within the culture feel as inexorable as the basic ones. Eating, as the Epicureans suspected, thus parallels sexuality in significant ways – the mother-son incest taboo is universal, but whether sex with your second cousin, or your second wife, or outside of marriage, or during menstruation, is ‘morally wrong’ will differ from place to place. All of these practices are capable of being morally wrong, but only in the etymological sense of ‘moral’: pertaining to the practices of a group.

The classicist and philosopher G. E. R. Lloyd has argued in his Magic, Reason, and Experience (Cambridge, 1979) that often it is not just difficult but impossible to determine when, in ancient texts, some reference to “purification” or “cleansing” is meant in a medical, and when in a moral-religious, context. He notes that the ambiguity arises only because we ourselves are intent on separating the two usages, whereas the Greek writers themselves may not have seen any need to do so. He cites Mary Douglas’s work in a more general anthropological context, which shows convincingly that “notions of the ‘clean’ and the ‘dirty’ usually reflect fundamental assumptions concerning the natural, and the moral, order.” It would be useful to bear in mind the ease with which naturalistically understood rules about ‘what one does’ and moral proscriptions are elided, and not to assume that we are radically different from the ancient Greeks or the Lele of the Congo in this regard. And for us, as for other cultures, there are presuppositions about what one may fitly do with an object that serve to constitute our very concept of the object, and these must precede any explication of our moral commitments vis-à-vis that object. On Douglas’s approach, the moral proscription against eating something would be nothing more than an ad hoc rationalization of the fact that some potential food item belongs to the class of things that are ‘not to be eaten’. Yet the tendency in philosophical discussions of vegetarianism has been to presume that we can meaningfully distinguish between ‘hygienic’ and ‘moral’ considerations that might give form and meaning to a person’s vegetarianism, as though hygiene had nothing to do with morality, as though the pretheoretical perception of an entity’s belonging to the class of edibles or inedibles had nothing to do with the way we subsequently give reasons for why we eat the things we do and not others.

I do not know if meat-eating is something humans ‘ought’ to be doing. I suspect the answer to this question has more to do with primatology than with moral philosophy: are we the sort of primate that eats meat? And with anthropology: are there human cultures that class all of what zoology places under the heading ‘animalia’ under the heading ‘inedibles’? The unwillingness of people on either side of the debate to consider the question in these terms surely is not doing any animals any good.

Poetry and Culture

Australian poet and author Peter Nicholson writes 3QD’s Poetry and Culture column (see other columns here). There is an introduction to his work at the NLA.

Benjamin Britten: Music and poetry, attendant muses at the grinding gears

Leonard Bernstein once said of Benjamin Britten that his music was ‘dark, there are gears grinding and not quite meshing . . . making great pain’. That seems true. Certainly, there is no transcendence in Britten, as there is in Elgar, for example. The Dream of Gerontius is a work of faith and Christian journeying, but Britten was having none of Cardinal Newman’s agenda. Nor is there that charged explosion of sensuality Elgar achieved with In The South, though the composer does let his hair down occasionally, as in the Four Cabaret Songs or the finale of Spring Symphony. Even Elgar’s melancholy seems schooled in hopefulness by comparison with Britten. Britten is simply dark, and anxiety is close to the surface. Neither could Britten luxuriate, as Delius does, or glitter with delight, like Walton. But what Britten does have, something those other composers do not have to the same degree, is the most exquisite ear for poetry and the ability to set it superbly. To a very real degree, it is poetry that gets Britten through his dark nights of the soul.

Britten was lucky that one of his earliest friendships was with Auden, and, naturally enough, being with a poet of this stature couldn’t help but rub off on a sensitive and intelligent personality like Britten’s. There are many early settings of Auden, On This Island being one of the best known. Britten and Auden had a falling out later on, but I don’t think Britten ever forgot what he learnt from Auden about the intimacies possible when music and poetry work in harmony. True, Britten was rather scornful of Auden’s and Kallman’s text for Stravinsky’s The Rake’s Progress, but that scorn was based on a profound working knowledge of how to set dramatic texts for opera. Britten showed he could do it marvellously well in A Midsummer Night’s Dream. The other opera librettos may not always be settings of poetry, but they are certainly poetic. When Peter Grimes sings ‘Now the Great Bear and Pleiades . . .’ it is certainly something poetic we are hearing. The fact that it is dramatic too just goes to show how effective Britten’s settings could be when his imagination was fired by a suitable subject.

I was fortunate enough to once meet Britten at the 1970 Adelaide Festival. He was one of the first composers I started collecting on LP. People of a certain generation remember those Decca recordings with their texts in a print size that made them easy to read, unlike today’s CD equivalents. Well, I was a particularly green student at the time, but I knew Britten had been interested in setting King Lear, and I asked him about that. There was an ominous silence, but I often think that would have been a more suitable final work for a composer of his temperament, rather than Death In Venice with its chilled ecstasies and gamelan playfulness. It’s one of those ‘what if’ questions we ask about artists we like. Fellini’s Mastorna project or Wagner’s proposed final symphonies also come to mind.

One of the first recordings I bought had Les Illuminations on it. I didn’t understand the full ramifications of the work at the time, but could feel Britten’s identification with the text. Somehow, music and text are integrated naturally, instinctively. You could say the same thing of all of Britten’s settings of poetry texts. There are no false notes. There is a real marriage of true minds, the muses of music and poetry meeting equally on Helicon, neither subsuming the other, each requiring the other’s succour.

The War Requiem is a real act of transfigurative creative feeling. There had been a kind of precursor when Mahler, in his Symphony No 8, set the Latin hymn Veni Creator Spiritus and then completed the work with the last part of Goethe’s Faust, but Britten was doing something more adventurous, at least from a literary viewpoint. Since Britten cannot find the transfigurative moment to redeem the deaths memorialised on the dedication page of the War Requiem, or fill Coventry cathedral with ‘Take me away’ chords out of Gerontius, he does something quite original. He inserts the poetry of Wilfred Owen throughout and, just when we might be expecting the summons to a higher cause, what we get is the sheer awfulness of war, the ‘pity of war’, the imagined reconciliation in hushed remonstrance in ‘Strange Meeting’. To think that this work was once regarded by a certain section of the musical avant-garde as the white elephant of British music speaks of their failure to react creatively to poetry in the way that Britten did so effectively in this work.

However, the composer-pacifist still had to deal with his own violent demons, and poetry seems to be one of the ways he accommodated what must have seemed, in the wake of the Second World War with its apocalyptic severances, the failure of art to prevent the facts of the Holocaust and the boundless dead. Britten played with Menuhin at the end of the war for survivors of the concentration camps, and the memories he brought back from that time prompted the song cycle he composed not long after, The Holy Sonnets of John Donne. The muscular confrontation with the fact of suffering brought forth a cycle in which Donne’s verse starkly counterpoises the music. The counterweight to this confrontational style is the calm and lucid settings of Shelley, Tennyson, Coleridge, Middleton, Wordsworth, Owen, Keats and Shakespeare in Nocturne, where Britten finds the kind of equipoise so often missing elsewhere. On the edge of sleep, or in the idea of sleep itself, the composer finds repose. To use Yeats’ words, the ceremony of innocence may be drowned (though not in most of the works written for children such as The Young Person’s Guide to the Orchestra), but the memory that one was once innocent – Britten reaches for that with all his yearning. You still wake to find the blood and pain of the world, but during the cycle one has been enchanted, a little. Some of Puck’s juice has been sprinkled in our eyes too. The moment passes, but the moment was beautiful. And one doesn’t forget that it was real. Britten has made it so. Poetry has helped the composer get there. Perhaps, essentially, Bernstein is wrong. The gears do mesh, because if they didn’t there would be no music, no memorability, no greatness of spirit, which there clearly is in these compositions.

Britten was not a parochial composer, for all the jokiness about ‘Addleborough’ (Aldeburgh). He set texts in French, Russian, German and Italian, as well as the work of American and British poets. His sensitivity encompasses Soutar and Hardy, Michelangelo and Jonson, nervous fibres reaching out for any memorable words to centre what seems, at heart, a certain pessimism. If one takes account of all the poetry settings Britten composed music for, and thinks of the literary input from Crabbe, Melville, James and Mann, and others, then one really is prompted to consider Britten one of poetry’s, and language’s, most eloquent advocates. A composer as subtle and as various in his choice of texts as Britten, with the ability to set them so memorably: the muses were here in agreement, and they bestowed their graces liberally, even though darkness is clearly visible and any joy achieved is hard won.

How singing unlocks the brain

Jane Elliot writes for the BBC:

“As Bill Bundock’s Alzheimer’s progressed he became more and more locked into his own world.

He withdrew into himself and stopped communicating with his wife, Jean.

Jean said Bill lost his motivation, and his desire and ability to hold conversations, but all this changed when the couple started attending a local sing-song group, aimed especially at people with dementia.

Jean said Singing for the Brain had unlocked Bill’s communication block.”

Yes, Virginia

From The New York Times:

In January 1915, when Virginia Woolf was 33, she and her husband, Leonard, resolved to do three things: lease a house outside London; acquire a printing press; and buy a bulldog. As Julia Briggs recounts in her intelligent and well-researched new biography of Woolf, the couple never got the dog, but the creation of the Hogarth Press – named after Hogarth House, their new home – significantly influenced 20th-century literature. Purposely seeking out “work that might not otherwise get into print,” they published T. S. Eliot, Katherine Mansfield and Woolf herself. Freed from commercial pressures, Woolf could now pursue her most “radically experimental” leanings, and in her formal innovation, she became a pioneer of modernism.

Today, some of Woolf’s books seem stylized, at times experimental for the sake of being experimental – “The Waves” comes to mind – but her most widely read and admired works, including “To the Lighthouse” and “Mrs. Dalloway,” are read and admired for a reason. Briggs’s subtitle pays tribute to Woolf’s exploration of the inner life, her ability to capture the nebulousness of the human experience as it plays out second by second and translate it, in thrillingly nuanced ways, into words.

More here.

George W’s nemesis

From The Guardian:

Ever found yourself between a rock and a hard place? You loathe George Bush, for example, yet feel queasy looking to Michael Moore or George Galloway as your lodestar. You want to demonstrate against the war, or just against the handling of its fallout, but aren’t sure you want to march under the same banner as Bolsheviks for the Republic of Palestine.

If this strikes a chord, Al Franken is for you. As a hammer of Bush, Karl Rove and Co, the liberal comedian and nemesis of the right-wing shock-jocks has all of Moore’s wit and audacity and perhaps a touch of his ego, but avoids sounding like a propagandist. His latest book, subtle, laugh-or-cry-out-loud and ultimately devastating, is Michael Moore without the exclamation marks.

More here.

italo for beginners


Italo Calvino never wrote a bad book. Yet an author of such diffusion, without a single, encompassing magnum opus to embrace (some readers will argue for “Invisible Cities,” but that ineffably lovely book shows too narrow a range of Calvino’s effects, too little of his omnivorous exuberance) needs a beginner’s entry point, as well, perhaps, as a compendium to point toward posterity. Does it seem sacrilegious to propose a fat volume called “The Best of Calvino”? Call it “Tales,” then, or “Sixty Stories.” Does it seem to do violence to choose from linked pieces, or from books long since enshrined in readers’ hearts in their present, inviolate state? It isn’t as though the individual volumes need to go out of print to make room for the career-spanning omnibus I have in mind. Perhaps you consider it impossible to choose from within a structure as organically perfect as “Invisible Cities”? Fine, then include the entirety of that short book, just as “The Thurber Carnival” found space for the whole of “My Life and Hard Times.”

more from the NY Times Book Review here.

gombrowicz: the plotlessness thickens


Throughout the book, there are faint resonances of the intellectual preoccupation with language that marked the era when “Cosmos” was written: the structural linguistics of Roman Jakobson, semiotics, the echo of the Sapir-Whorf hypothesis of “linguistic relativity” (which posits not consciousness but language itself as the human capacity that creates and organizes reality).

I don’t know whether Gombrowicz was deliberately playing with the intellectual currents of his day or whether he was one of those seminal artists who give voice to questions scholars will later rationalize. It doesn’t really matter. What’s important is that the insight in these remarkable pages is creatively captivating and intellectually challenging. Perhaps Gombrowicz’s break-out attempt from the Nietzschean “prison house of language,” in which postmodernism so blithely accepts its life sentence, feels a bit quaint today. But it’s also true that in the 40 years since “Cosmos” was published, no one has done any better.

more from the NY Times Book Review here.

Pakistan: $5.4B in Quake Aid Raised

From The Washington Post:

ISLAMABAD, Pakistan — International donors have pledged $5.4 billion in quake aid to Pakistan, surpassing the amount sought by the government, the prime minister said Saturday. The U.S. nearly tripled its pledge to more than half a billion dollars in a show of support for a key ally in the war on terror. The new pledges came at a donors conference attended by about 50 nations. Pakistan had hoped to get $5.2 billion for rebuilding from the Oct. 8 quake, which killed 86,000 people in its territory and another 1,350 in neighboring India. Before the conference, aid pledges totaled $2.4 billion but Pakistan had only received about 10 percent of it.

Musharraf said the calamity provided “an opportunity of a lifetime” for Pakistan and archrival India to improve relations and resolve their dispute over Kashmir.

“If leaders fail to grasp fleeting opportunities, they fail their nations and peoples,” Musharraf told the conference. “Let success and happiness emerge from the ruins of this catastrophe, especially for the people of Kashmir. Let this be the Indian donation to Kashmir.”

More here.

Generation Rx

From The New York Times:

Apocalyptic literature naturally gravitates toward the maudlin, lamenting that the world is going to hell in a handbasket, usually courtesy of someone like Eminem or Tom DeLay. This is what makes Greg Critser’s “Generation Rx” such an unexpected delight. Although his message is unrelievedly depressing – drug companies, with the nation’s physicians and the federal government already on the payroll, have transmogrified a self-reliant nation into a herd of functional drug addicts – there is something so congenial and non-self-righteous about the way he tells his story that few of the scoundrels singled out for public obloquy will take personal offense.

Thus, describing the evolution of Glaxo from a sleeping giant to a juggernaut, Critser says that “in the boggy pharma jungle,” the company “swung on the vine of prior greatness while withering on stultifying British business practices.” Marveling at the liver, he writes, “It is the only organ that can, with time, regenerate itself, a kind of Donald Trump of the human body.” And he identifies Washington as “an unfathomable brothel to all but the Reverends Rove and Cheney.”

More here.



With how many people did people used to sleep? It’s hard to tell. Language changes, and there’s the problem of bragging. Take the French. Stendhal in his treatise on love is expansive on the seduction strategies of his friends (hide under the bed; announce yourself so late in the night that kicking you out would already be a scandal), but in The Red and the Black Julien Sorel sleeps with exactly two women—and for this they cut off his head! A generation later, the dissipated Frédéric Moreau hardly does any better in Sentimental Education. Flaubert himself mostly slept with prostitutes. In Russia, one could always sleep with one’s serfs, as Tolstoy did. (He felt terrible about it.) But peers, acquaintances, members of one’s own class? America was the worst. Henry James in his notebooks wonders if he should write a story about a man, “like W. D. H. [Howells], who all his life has known but one woman.” James had known zero women! Twenty years later, there was Greenwich Village. Edna St. Vincent Millay, riding back and forth all night on the ferry, was the most promiscuous literary woman of her time. But her biographer puts the grand total of her conquests at fourteen, and some of these, according to a rival biographer, are questionable—and three were “well-known homosexuals.” So ten. For the modern college senior, this is a busy but not extravagant Spring Break.

more from n+1 here.



The American painter Richard Pousette-Dart (1916-1992), whose very large late paintings are the subject of an enchanting exhibition at Knoedler & Company, was often described in his lifetime as the youngest of the Abstract Expressionist painters of the New York School. He was indeed younger than Pollock, de Kooning and a few other artists in that group, and his paintings were often exhibited with theirs. Yet in neither his art nor his life did Pousette-Dart have much in common with the artists of that group. For one thing, he was never any sort of Expressionist. The bravura gestural style that we associate with Pollock, de Kooning et al. was entirely alien to Pousette-Dart’s sensibility; so was the hard-drinking bohemian lifestyle of the painters who made the Cedar Tavern a favorite destination of art-world groupies. Pousette-Dart’s interest in the social life of the fashionable art world was practically nil. By temperament and conviction he was a family man, and his was a family of artists: His father was a painter and art writer; his mother a writer; and his children, too, have pursued careers in art and music.

more from Hilton Kramer at The New York Observer here.

we pee on things and call it art


His enemies, and God knows he has a few, often complain that Sewell’s love of art ends with Poussin. Anything later and he just isn’t interested. This is inaccurate. He has a ‘quite unreasonable passion’ for Joseph Beuys and loves the Chapman brothers. On balance, however, it is fair to say that he thinks that modern art is rubbish. ‘We’ve reached the point where Laurence Llewellyn-Bowen might as well be an artist; all he needs is an empty room and some chalk. We pee on things, we pee into things, we pee over things… and call it art.’ He is especially contemptuous of women artists. ‘Women are no good at squeezing cars through spaces. If you have someone who is unable to relate space to volume, they won’t make a good artist. Look at Barbara Hepworth – a one-trick pony. Look at that pile of rubbish in the Tate by Rachel Whiteread.’ I choose not to respond to this. He moves on. ‘This will end in disaster. In another generation, it will be inconceivable that anyone will be taught how to paint. The blind are leading the blind. The head of painting at the Royal College couldn’t paint a Christmas card.’ Does he find this depressing? ‘Not enormously. I’ve looked over the edge at death in the past few years enough times; when you’ve done that, you no longer find anything much very depressing.’

more from The Observer here.

What made us human?

From Science:

Humans and chimpanzees share at least 98% of their DNA, yet chimps are an endangered species while people have used their superior cognition to transform the face of the Earth. What makes the difference? A new study suggests that evolutionary changes in the regulation of a gene implicated in perception, behavior, and memory may be partly responsible.

Thirty years ago, geneticist Mary-Claire King and biochemist Allan Wilson proposed that changes in how genes are regulated, rather than in the proteins they code for, could explain important differences between chimps and humans (Science, 11 April 1975, p. 107). To test this hypothesis, an international team led by evolutionary biologist Gregory Wray of Duke University in Durham, North Carolina, focused on the gene that codes for the protein prodynorphin (PDYN), a precursor to a number of endorphins, opiatelike molecules involved in learning, the experience of pain, and social attachment and bonding. Humans carry one to four copies of a region of DNA that controls the expression of this gene. Human copies had five DNA mutations not seen in other primates. The team concludes that the pattern is a solid example of natural selection acting on the human lineage after it split from the chimp line between 5 million and 7 million years ago.

More here.

The show goes on for Stephen Hawking

From MSNBC News:

Wednesday’s appearance at the Paramount Theatre — presented by the Oregon-based Institute for Science, Engineering and Public Policy, or ISEPP — was the last of three scheduled stops on the Cambridge professor’s U.S. lecture tour. Hawking, who suffers from a progressive neurodegenerative disease that has almost completely paralyzed him, was due to travel to Seattle from San Francisco. But when he was taken off his respirator Monday morning, “he basically flat-lined,” said Terry Bristol, ISEPP’s president and executive director. “They had to resuscitate, and that panicked a few people,” Bristol told the audience. “But he’s been there before.” Once the crisis had passed, Hawking wanted to go ahead with the Seattle leg of the trip, but his medical caretakers — including his wife, Elaine — thought he should stay put awhile longer, Bristol said. So Hawking and his aides worked with Intel, ISEPP and the Paramount to set up a Web-based teleconferencing link from a Bay Area hotel.

“Many scientists were still unhappy with the universe having a beginning, because it seemed to imply that physics broke down,” Hawking said. “One would have to invoke an outside agency, which for convenience one can call God, to determine how the universe began.” Hawking traced how scientists have tried to address that conundrum using quantum theory, inflationary Big Bang theory and observations of the cosmic microwave background radiation — sometimes known as the Big Bang’s “afterglow.”

More here.

Akbar at Yale

Said Hyder Akbar in Slate:

Here at Yale, most students turn to teachers and friends for advice in figuring out what they really want from college. But for me, the person who really helped me understand what I wanted was a guy writing to his wife in 1780. John Adams, in a letter to Abigail Adams, wrote, “I must study politics and war that my sons may have liberty to study mathematics and philosophy. My sons ought to study mathematics and philosophy, geography, natural history, naval architecture, navigation, commerce, and agriculture, in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry, and porcelain.” The quote, which I first read at the library while researching a paper, really resonated.

When my native country, Afghanistan, was turned upside down in the fall of 2001—my senior year in high school—I became involved with a place I had never seen before. (I call it “native” because I was born a refugee in Pakistan, and my parents lived in Afghanistan for most of their lives.)

More here.  [Akbar is on the right in the picture.]

John Updike on George MacDonald Fraser’s New Novel

From The New Yorker:

George MacDonald Fraser’s twelfth book about the Victorian rogue and soldier Flashman finds both the author and the hero in dauntless fettle, the former as keen to invent perils and seducible women as the latter is, respectively, to survive and to seduce them. Fraser, an Englishman schooled in Scotland, served with the Highland Regiment in India, Africa, and the Middle East, before settling on the Isle of Man. He has written other fiction, plus history, autobiography, and film scripts, besides serving as Flashman’s assiduous editor; the series is presented, under the over-all title “The Flashman Papers,” as its protagonist’s memoirs, which need only a few footnotes and spelling corrections to become excellent entertainments. It was a brilliant stroke of Fraser’s, in the first volume, “Flashman” (1969), to retrieve a minor figure in Thomas Hughes’s greatly popular, intensely Christian best-seller “Tom Brown’s School Days” (1857) and reanimate him as a lauded though inadvertent hero in the service of the British Empire.

More here.

Everything comes down to 1 and 0

Elizabeth Svoboda reviews The Lifebox, the Seashell and the Soul: What Gnarly Computation Taught Me About Ultimate Reality, the Meaning of Life and How to Be Happy by Rudy Rucker, in the San Francisco Chronicle:

A human brain, the assumption goes, is far more complex, incisive and unpredictable than any mere rules-governed machine.

Rudy Rucker thinks we’re all missing the point. If the affable computer scientist and sci-fi novelist had a mantra, it would be “Existence is computation.” Part technical treatise, part polemic, with a smattering of philosophy, Rucker’s magnum opus advances a red-hot firecracker of a thesis: Pretty much everything in the universe — Deep Blue, the human brain, the natural world and the way a soda can sprays when it’s cracked open — operates according to the same kinds of basic computational principles. He proposes that computation is everywhere in the same way pantheists assert that God is all around us.

Though Rucker defends this so-called “computational worldview” with all the zeal of a recent convert, his enthusiasm never becomes grating. His written demeanor is much more M. Scott Peck than Pat Robertson, and he is masterful at predicting and dispelling readers’ misgivings.

More here.