Teaser Appetizer: Sleep and Insomnia, A Letter to Shakespeare

Dear Shakespeare,

The other day I counted the word “sleep” in 167 passages of your work and I am sure I did not count them all. You have penned sleep in its own image and as a metaphor. You knew sleep:

“The innocent sleep, sleep that knits up the ravell’d sleeve of care, The death of each day’s life, sore labour’s bath, Balm of hurt minds, great nature’s second course, Chief nourisher in life’s feast.” (Macbeth, Act II)

And you knew sleep’s associated afflictions, especially its deprivation: insomnia.

“Enjoy the honey-heavy dew of slumber: Thou hast neither figures nor no fantasies Which busy care draws in the brains of men; Therefore thou sleep’st so sound.” (Julius Cæsar, Act II)

The purpose of my letter is to share with you new information we have learned about sleep and insomnia since your demise in 1616. In the intervening four hundred years, we have invented complex contraptions to study a sleeping person. We can record brain activity, eye movement, muscle tone and breathing patterns, and measure the levels of various substances (chemicals, molecules, hormones) floating in the blood. All these studies have yielded considerable information.

We now describe the architecture of sleep according to the movement of the sleeping eyes, which when moving rapidly marks the “rapid eye movement” (REM) phase; the rest is the non-rapid eye movement (NREM) phase. We start our sleep with NREM and get into REM before waking up. NREM starts with light sleep, from which arousal is easy (stages 1 and 2), and marches into deep sleep (stages 3 and 4), from which a slumbering person is difficult to arouse. The unpleasant experiences of night terrors, sleepwalking and bed-wetting (the afflictions you are familiar with) occur during deep sleep.

Macbeth had a troubled stage 4 sleep:

“Ere we will eat our meal in fear and sleep In the affliction of these terrible dreams That shake us nightly: better be with the dead, Whom we, to gain our peace, have sent to peace, Than on the torture of the mind to lie In restless ecstasy” (Macbeth, Act III)

And Lady Macbeth too sleepwalked in her deep stage 4 sleep:

“Since his majesty went into the field, I have seen her rise from her bed, throw her night-gown upon her, unlock her closet, take forth paper, fold it, write upon’t, read it, afterwards seal it, and again return to bed; yet all this while in a most fast sleep.” (Macbeth, Act V)

Mr. Shakespeare, did Parolles wet his bed in REM sleep or deep sleep?

“In his sleep he does little harm, save to his bed-clothes about him” (All’s Well That Ends Well, Act IV)

The sleep cycle starts with stage 1, during which we drift in and out of sleep and our brain activity slows down. Then we enter stage 2: our brain activity slows down further; our muscles may suddenly jerk but our eyes do not move. Stage 3 shows periods of slow-moving delta waves on recordings of brain activity, and in stage 4 our muscles are immobile and brain activity reflects only delta waves. REM sleep follows stage 4: the eyes dart in jerky movements, blood pressure rises, breathing becomes shallow and rapid, temporary muscle paralysis ensues, the heart pounds faster, men get penile erections and the brain waves gather speed. We indulge in dreams during this active stage of sleep; the brain is hardly idle. (Most mammals and birds show REM sleep, but cold-blooded animals such as reptiles do not. Do other mammals dream?)

Mercutio got it wrong:

“True, I talk of dreams, which are the children of an idle brain” (Romeo and Juliet, Act I)

We sleep in cycles of NREM and REM, and about 90 to 100 minutes elapse from the beginning of stage 1 to the end of REM. This cycle repeats 3 to 6 times a night. In a cycle of 100 minutes, stage 1 lasts 10 minutes, stage 2 lasts 50 minutes, stages 3 and 4 together last 15 minutes, and finally REM lasts 25 minutes. If we miss our REM sleep, we fall into REM the next night without passing through the other stages, till we catch up with this REM deficit.

We flash signals from the base of the brain which either awaken us or put us to sleep. Chemical signals, or neurotransmitters, like serotonin and norepinephrine exude from the nerve cells (neurons) of the brain stem and keep the brain awake. Other neurons at the base of the brain turn off the awakening process and we fall asleep. The levels of adenosine build up while we are awake and subside during sleep. Caffeine-containing potions like coffee and tea inhibit adenosine.

REM sleep is regulated by a part of the brain we call the pons, which sends signals to other parts of the brain and also inhibits neurons in the spinal cord, causing temporary paralysis.

We have learnt that our bodies function in a cyclic rhythm spread over roughly 24 to 25 hours: we call it the circadian rhythm. A mere 20,000 neurons in the suprachiasmatic nucleus (SCN) of the hypothalamus play the role of the body clock. Sunlight, other bright light, and even external noise trigger the SCN, which signals the pineal gland to shut off the production of melatonin. The pineal gland secretes melatonin (a drowsiness-inducing hormone) at night and in darkness. Some people with blindness suffer from sleeping disorders because they are unable to respond to light. Traveling a long distance in a short period (jet lag) or a change of shift at the workplace can disrupt the circadian rhythm.

The neuro-chemical control of sleep is autonomous and we cannot voluntarily deprive ourselves of sleep. Cleopatra in her raging defiance may have succeeded in starving herself, but to defy sleep was an empty threat.

“Sir, I will eat no meat, I’ll not drink, sir; If idle talk will once be necessary, I’ll not sleep neither: this mortal house I’ll ruin, Do Caesar what he can. Know, sir, that I will not wait pinion’d at your master’s court.” (Antony and Cleopatra, Act V)

Voluntary sleep deprivation may not be possible, yet we have enough reasons to lose sleep: anxiety, depression and body pain. Iago could not sleep because of pain…

“And, being troubled with a raging tooth, I could not sleep.” (Othello, Act III)

And King Richard was anxious.

“For never yet one hour in his bed Have I enjoy’d the golden dew of sleep, But have been waked by his timorous dreams.” (King Richard III, Act IV)

In our “progress” we have added a few more causes of insomnia: jet lag, shift change and stimulant drugs. People who work at night and travel through time zones disturb the sunlight stimulus to circadian regulation.

We have found that sleep is essential for survival – at least for rats. Scientists made rats live on a platform floating in a tub of water. When the rats drifted into REM sleep their muscles became paralyzed, and they slipped off the platform and fell into the water. Poor rats! Wet and drenched, they struggled back onto the platform, drifted into paralyzing REM sleep, and fell into the water again. Deprived repeatedly of REM sleep, they died in 3 weeks instead of living their usual 2 to 3 years.

Mr. Shakespeare, do you accuse us of being sadistic? We defend the progress of science at all costs! This very scoundrel race of rats spread the germs that devastated London with bubonic plague, which must have caused you a few sleepless nights! We people are adept at taking revenge for historical blood feuds; in this case it is against the rats. Well, they are all rats!

You ask what we have achieved in the past 400 years? We have accumulated wealth and information; yet it is true that our restless days still end in sleepless nights. King Henry knew it too: money can buy you a bed but not sleep.

“Not all these, laid in bed majestical, Can sleep so soundly as the wretched slave, And, but for ceremony, such a wretch, Winding up days with toil and nights with sleep,” (King Henry V, Act IV)

You probably think the information gained in the last 400 years has cured our insomnia… not so, Mr. Shakespeare: 30 to 50 percent of our people now suffer from insomnia; we are a sleep-deprived world. Here is some news for you from the Boston Globe:

The Institute of Medicine report said loss of sleep has increased in recent decades due to longer workdays and computer use and television watching taking up more time. Lack of sleep increases the risk of a variety of health problems, the report said, including diabetes, cardiovascular disease, and heart attacks. It also raises the chances of injury or death due to accidents at work, home, or in automobiles. Studies in the 1990s estimated the cost of medical care for sleep disorders at $15.9 billion, the report said. In addition, fatigue is estimated to cost businesses about $150 billion a year in lost productivity and mishaps, and damage from motor vehicle accidents involving tired drivers amounts to at least $48 billion a year. The National Sleep Foundation issued a report indicating only 20 percent of US adolescents get the recommended nine hours of sleep a night.

So, here we are, four centuries after you! Amazing: so much knowledge, yet how little we have learned! We have coaxed but a few ounces of wisdom from the tons of information we have collected and culled. Our data could fill the cavernous base of a medieval church, and all the wisdom will rise but slim as a spire. From you we need to learn to build within the scaffold of wisdom and not on the foundation of data. We need at least as many seers as we have scientists.

“Not poppy, nor mandragora, nor all the drowsy syrups of the world, Shall ever medicine thee to that sweet sleep, Which thou owedst yesterday.” (Othello, Act III)



Sojourns: Bored by the World Cup

Let me confess at the outset that my lack of interest in the World Cup is matched only by my ignorance of the sport itself. Call me what you will. A philistine. A provincial. A vulgarian. An ugly American. But I have not been getting up in the morning to watch the matches. There is a reason for this I think. Sports are an acquired taste and deeply autobiographical. I grew up on the Jewish faculty-brat diet of baseball and basketball. By the time I was in college and self-consciously developing an interest in the arts and literature, televised sports seemed like something from a distant planet. When I returned to watching sports in my thirties, it was with the intense relish of rediscovering forgotten pleasures. I wanted the sweet succor of bygone days and older knowledge. Learning new things was for different regions of my brain and other times of day. Thus soccer fell between the cracks in my life. Too bad for me, I hear you say.

By not developing an interest in the World Cup, or at any rate by not professing one, I am something of a traitor to my own professional class. Even the most sports-averse and tweed-adorned professor these days can be seen taking a break to watch the surprising run of Ghana or the stalwart march of the Germans. (I find no great surprise, for example, that this very website, ordinarily so earnest and sober, so interested in international affairs, science, and medicine, has two separate bloggers reporting from the games.) The World Cup has in other words developed an odd kind of reach. It is both sports and not sports. Clearly billions of people who grew up in countries other than my own feel an intensity of fandom I cannot really understand, but which equally clearly provides the kind of visceral pleasure in viewing I can. I am however not interested here in what motivates soccer fans in the countries where the sport thrives. What I’m interested in, rather, is the acquired situational appreciation of soccer and its elevation into a sport that is more than a sport.

Perhaps I should just phrase this as a simple question. Why do intellectuals or the chattering classes or the intelligentsia care so much about the World Cup? The least generous answer is simple Europhilia. Like smoking Gauloises or eating haggis, watching the tournament expresses a kind of vicarious belonging to a different continent, a sign that you spent your junior year abroad in Florence or Paris or Edinburgh and, when pressed, even know a word or two in a different language. Seen this way, one’s viewing habits provide a form of cultural capital and means of distinction. The sport is not simply a competition like the World Series; it is rather something of an aesthetic artifact, the appreciation of which becomes a badge of sophistication. It is, in the words of the New York Times, a “beautiful game.”

To be a little less cynical, the World Cup is for some clearly less about sports than about international relations and politics. On this account, the games are interesting for their allegorical significance. Teams really do represent nations after all. If say Ghana defeats France then centuries of colonialism and domination are momentarily upended in a great reversal of fortune. Even the uglier dimensions of the tournament—violence, “hooliganism,” racism, and the like—are interesting because they express some underlying sociological or political cause. One is interested in the sport not because it is a “beautiful game” but because of what it reveals about class tensions, race war, the new Europe, etc.

In either case, viewers of the World Cup watch the game from a sort of distance: the distance of aesthetics or of politics. The first translates the game into a mark of distinction and cultural capital; the second translates the game into an allegory or a symptom. The thing about such distance, at least for me, is that it gets in the way of the deeply intuitive and primal enjoyment that accompanies watching a sport with which one is intimately familiar. So I return to autobiography. Suburban kids now seem to be introduced to soccer as a matter of course. (Hence the specter of the American “soccer mom” looming large over pollsters and politicos everywhere.) When I was in elementary school back in the 70s, however, soccer was only beginning to be touted as the next thing to come. Some day soon, we were told, everyone would be kicking checkered balls, right about the same time as we would be measuring things in metric. The great metric conversion never came. And by the time soccer camps and leagues sprang up I was very much into other things. I simply never developed the self-transcending pleasure in watching soccer that I did with other sports.

I tend not to think my own history is that unique, so I doubt that many Americans of my generation did either. While I am interested in the interest in the World Cup, therefore, the tournament itself leaves me bored.

Monday Musing: Susan Sontag, Part 2

The first part of this essay can be found here.

Inevitably, the exaltation and dreams of unity that she harbored during the Sixties were to disappoint Sontag, as they did everyone else. She was going to have to come down from those heights and find her own version of Zagajewski’s soft landing. And that is another thing that makes Susan Sontag so remarkable. At her most exalted, writing in 1968, just after returning from Hanoi, she says:

“I recognized a limited analogy to my present state in Paris in early July when, talking to acquaintances who had been on the barricades in May, I discovered they don’t really accept the failure of their revolution. The reason for their lack of ‘realism’, I think, is that they’re still possessed by the new feelings revealed to them during those weeks—those precious weeks in which vast numbers of ordinarily suspicious, cynical urban people, workers and students, behaved with an unprecedented generosity and warmth and spontaneity toward each other. In a way, then, the young veterans of the barricades are right in not altogether acknowledging their defeat, in being unable fully to believe that things have returned to pre-May normality, if not worse. Actually it is they who are being realistic. Someone who has enjoyed new feelings of that kind—a reprieve, however brief, from the inhibitions on love and trust this society enforces—is never the same again. In him, the ‘revolution’ has just started, and it continues. So I discover that what happened to me in North Vietnam did not end with my return to America, but is still going on.”

The world did return to normalcy, if not worse. But Sontag didn’t indulge in the outright lunacy of the New Left as it spiraled off into fantasyland. (Though she did endorse something of the mood of the New Left in one of her less successful and rather more hysterical essays “What’s Happening in America? (1966).” Still, when the chips were down she didn’t take that path. She kept her head.)

And the hint as to how she kept her cool is already there in the above passage. Her commitment to the integrity of the individual mind was a buttress for her. The solid structure of her mental edifice, built with that sternness of pleasure she never abandoned, allowed her to come in for a soft landing while people like the Situationists or the Yippies or The Weathermen floundered or came apart at the seams.

More than that, she was able to recognize her own missteps and rethink her exaltation. Even as she continued to lament the way in which her new experiences were sullied and her new consciousness never came to pass, she realized that much of its promise, especially in its political variants, had been an illusion. Increasingly in her essays in the Eighties and Nineties she celebrated the writers and artists of Central and Eastern Europe who fought the disaster of the ‘revolution’. In 1997, she was to write, “Intellectuals responsibly taking sides, and putting themselves on the line for what they believe in . . . are a good deal less common than intellectuals taking public positions either in conscious bad faith or in shameless ignorance of what they are pronouncing on: for every Andre Gide or George Orwell or Norberto Bobbio or Andrei Sakharov or Adam Michnik, ten of Romain Rolland or Ilya Ehrenburg or Jean Baudrillard or Peter Handke, et cetera, et cetera.”

She came to see that communism in Vietnam had been a lie and a farce, even as the Vietnamese resistance to the American war machine had been noble and just. She went to Bosnia again and again and never, for even a moment, indulged in the repellant apologies for Serbian nationalism that many of her colleagues on the Left dishonored themselves with. In fact, she always saw Europe and North America’s failure in Bosnia as another manifestation of the shallow interest in material happiness and comfort.

Such a vapid happiness was not what Sontag was referring to in her quest for difficult pleasure.

***

This is not to say that she was happy about politics and culture after the Sixties. Sometimes she was outright despondent. Sometimes she felt she had been tricked. She marveled how her own arguments had come back to haunt her. Things that she had advocated for in the Sixties were realized in ways completely contrary to her original intentions.

For instance in her seminal essay “Against Interpretation” (1962), she argued that criticism had become too Baroque. It was preventing immediate appreciation of things as things. So she made a call for transparence. “Transparence,” she said, “means experiencing the luminousness of the thing in itself, of things being what they are.” And then notoriously, at the end of the essay, she proclaimed, “In place of a hermeneutics, we need an erotics of art.”

Later, she came to realize that history had pulled something of a fast one on her. People did begin to appreciate, even worship, surface and appearance. Camp moved further into the mainstream. But it wasn’t happening in the way that Sontag intended. In a preface to Against Interpretation written in 1995 and entitled “Thirty Years Later . . .” she addressed the issue.

“It is not simply that the Sixties have been repudiated, and the dissident spirit quashed, and made the object of intense nostalgia. The ever more triumphant values of consumer capitalism promote–indeed, impose–the cultural mixes and insolence and defense of pleasure that I was advocating for quite different reasons.”

She won a battle at the expense of the greater victory she was hoping for. There was a revolution in a sense, and a democratizing of culture. But Sontag realized that it wasn’t leading to pleasure, real pleasure. Instead, it led to a devaluation of the seriousness of intellect that Sontag took to be a prerequisite for genuine pleasure. In what she calls her own naiveté, Sontag, in the Sixties, made an appeal for changes that consumer culture was only too ready to provide during the next few decades. But those changes came as an empty package. Talking thirty years later about the essays of Against Interpretation, she says, “The judgments of taste expressed in these essays may have prevailed. The values underlying those judgments did not.”

In response to this cruel trick of history, Sontag did verge dangerously close to nostalgia on occasion. Perhaps that is understandable. Her problem was even more acute than the problem of the Central Europeans for whom she had such sensitivity. Central Europeans might look back with some wistfulness on the intense seriousness of the ‘bad old days’ but they were, still, the bad old days. For all of Sontag’s hesitation in identifying with the Sixties as a movement, it was during those years that she experienced her greatest pleasures in art and understanding. They weren’t bad old days at all for her.

And she felt that as she was getting older she was simultaneously witnessing the disappearance of much of what had given her the greatest pleasure. In 1988, she expressed this as a European elegy. Europe, to Sontag, always represented resistance to the tide of philistinism—she even calls it barbarity—that emanates from America and its consumer culture. She says, “The diversity, seriousness, fastidiousness, density of European culture constitute an Archimedean point from which I can, mentally, move the world.”

By the late Eighties, she believed that that Archimedean point was drifting away as Europe became more homogeneous and “Americanized”. Without naming it directly, her contempt for the idea of European integration (this, again, in 1988) is palpable. What she calls the ‘diversity’ of Europe is predicated, for Sontag, on preserving the differences that come with national and thereby cultural boundaries. But with all the language of preservation and loss, Sontag manages to rescue the essay from outright nostalgia. She recognized the malleability and relativity of the “idea of Europe”. The idea of Europe is at its most potent, she argued, when wielded by the Central and Eastern European intellectuals who used it, implicitly, as a critique of the Soviet domination they were resisting. But Sontag was also aware that the rallying cry of “Europe” was distinctly unpalatable when raised in Western Europe as a warning against the new immigration. This latter point has only become more incisive in recent years. As always, Sontag was ahead of the times.

Indeed, by the end of her lament for Europe, Sontag turns a corner. Having aired her grievances, she begins to move forward. She comes in for another soft landing. She begins to shift onto another battlefield, moving just as quickly as modern experience does. That quickness, that readiness to move at the pace in which new experiences present themselves allows her, in seeming paradox, to find what is solid and lasting in things. “The modern has its own logic,” she writes, “liberating and immensely destructive, by which the United States, no less than Japan and the rich European countries, is being transformed. Meanwhile, the center has shifted.”

Having started “The Idea of Europe (One More Elegy)” by veering into a cultural conservatism that she spoke so eloquently against in her earliest essays, she manages to steer herself back into more Sontag-like territory. She is prepared to become an exile again, as she always was in the first place. Exiled in the sense that every intellect of integrity stands alone in the last instance, as a self. In asking what will happen next, as the greatness of Europe fades and transforms, Sontag refers to Gertrude Stein’s answer to those who wondered how she would deal with a loss of her roots. “Said Gertrude Stein, her answer perhaps even more Jewish than American: ‘But what good are roots if you can’t take them with you’.”

***

Susan Sontag always understood the melancholic personality lingering in the back alleys of modern consciousness. She understood the will to suicide in men like Walter Benjamin. She knew why Benjamin lived under the sign of Saturn and could write:

“The resistance which modernity offers to the natural productive élan of a person is out of proportion to his strength. It is understandable if a person grows tired and takes refuge in death. Modernity must be under the sign of suicide, an act which seals a heroic will . . . . It is the achievement of modernity in the realm of passions.”

Sontag understood the will to death and failure in Artaud. She understood the will to silence in Beckett and John Cage. Not only did she understand these things, she could write about them clearly, put her finger on them. She knew that Nietzsche’s prognostication about the coming nihilism had come to pass in much of the modern, and modernist, aesthetic she cherished so dearly.

She felt the exhaustion of the modern spirit. But she wasn’t exhausted by it. In her essay on Elias Canetti, “Mind as Passion,” she wrote the following:

“‘I want to feel everything in me before I think it’, Canetti wrote in 1943, and for this, he says, he needs a long life. To die prematurely means having not fully engorged himself and, therefore, having not used his mind as he could. It is almost as if Canetti had to keep his consciousness in a permanent state of avidity, to remain unreconciled to death. ‘It is wonderful that nothing is lost in a mind’, he also wrote in his notebook, in what must have been a not infrequent moment of euphoria, ‘and would not this alone be reason enough to live very long or even forever?’ Recurrent images of needing to feel everything inside himself, of unifying everything in one head, illustrate Canetti’s attempts through magical thinking and moral clamorousness to ‘refute’ death.”

Sontag is writing about Canetti but she is writing about Sontag too. As much as she measured and reported the pulse of an era in thought, art, morals, . . . as much as she eulogized its passing, she also stood for the brute continuation of life, of pleasure, and of joy. She’s dead now, but there is nothing that stimulates a desire to live more than reading one of her essays. If it so happens that we’re stumbling into an age of new seriousness and new sincerity we’re doing so partly because Susan Sontag showed us how important the world can be.

Monday, June 19, 2006

Selected Minor Works: The Opposite of Sports

Justin E. H. Smith

[An extensive archive of Justin Smith’s writing can be found at www.jehsmith.com]

Hamburg, Germany

I have just given a talk at the University of Hamburg in the Ernst Cassirer Memorial Lecture Series. What my hosts did not tell me when I was invited was that the event would coincide with some of the early matches of the World Cup, being hosted this year in (just my luck) Germany. I had come to Hamburg to discuss the legacy of Aristotle in the Protestant Reformation, but ended up being practically shut out by the nationwide roar accompanying the Poland-Germany match, with borderline hooligans – and as far as I’m concerned they’re all borderline hooligans – roving the streets shouting ‘Tor!’ as I droned on to an empty auditorium in poorly accented German about Martin Luther’s commentary on Genesis. Today I would like to explain why not just the victory of Germany over Poland, but also the victory in Europe of the drunken rabble over traditional piety, of the circus over the temple, is a thing to be bemoaned. I recognize that some in the 3QD community are ‘pumped’ about this year’s competition, and I do so hate to be a prick. But the fans among you may take comfort in the knowledge that I am grossly outnumbered, and that my quaint plaidoyer will assuredly come to nothing.

How, I want to know, does it manage to command such passions, this month of noise in which the outcome is known in advance? The winning team is of course not known in advance; what is known is that one or the other of them will win, according to inflexible rules that bind all of them equally. The only way the affair can amount to anything more than a mere stochastic process – as when atoms are bound to decay, even if we can’t say which ones – is if something goes terribly wrong, if some non-sportive interference breaks into the flow of events, some interference from the truly interesting domain of politics, a domain from which, the fans sanctimoniously claim, sports give us a needed reprieve.

Thank God for the irruptions of the political, I say. Without them, there really would be no reason to pay attention to international athletic competitions. With them, we get bizarre reports of Iran’s soccer team being made to watch Sally Field in Not Without My Daughter during their visit to France for the 1998 World Cup; we get a Cold War sublimated through doped-up East German shot-putters and 75-pound, slave-driven Chinese gymnasts. And we get Kim Jong Il at his zany best. In 1972, the now-Dear Leader gave a speech to the North Korean national soccer team: “It is very important to develop sports,” he declared. “Pointing out that physical culture is one of the means to strengthen the friendly relations with foreign countries, the great leader [Kim’s father, Kim Il Sung] said that physical culture should be developed. At present, however, the instructions of the great leader are not carried out to the letter in the sphere of physical culture, and sports exchange is not conducted properly, as required by the Party. A common example is the fact that our soccer players were defeated in the recent preliminaries for the Olympic Games.” Kim takes defeat as itself proof that the athletes are lapsing into counterrevolutionary laziness. Should this unsubtle hint that the higher-ups expect to start seeing some victories cause a bit of stress, the Dear Leader has a cure: “As for those whose nerves are on edge,” he assures the team, “they will get better if they live in tents on Rungna Islet.”

(Speaking of geopolitics, why is it that every time American football is brought up by, or in the company of, a European, an awkward apology has to be made for the sport’s name? There are millions of Americans who believe that what they watch on Sunday mornings is football, and as far as I understand language that is enough to make it so. Nor shall I make any apologies for calling soccer ‘soccer’. That is just what it is called where I am from, and to do any differently is nothing but an affectation, like spelling ‘color’ with a ‘u’, or saying ‘bloody’.)

With the end of the Cold War, for the most part sporting events are no longer about nationalism, but about transnational corporations. Nations either prove they can behave themselves, like the perpetually prostrate Germans, or they are not invited, like the North Koreans. An important part of good behavior is to allow yourself, your team, your stadium, and your country to be covered in advertisements.

The corporations have proved better than the old nation-states at pretending that what they are doing is not political. Rather, it is at its dreariest just business, and at its best downright fun. Nike and the other image-makers have convinced us that competitive sports are something that one gets involved in not for love of Fatherland, but for the sheer, personal enjoyment of it. To be Beckham, the suggestion is meant to be, would be to know a life of unmodulated, unadulterated joy.

I am a runner, and so I have no choice but to go into some of the same stores, and think some of the same thoughts, as soccer fans. I buy Nikes and knee-braces, and I run round municipal lakes and parks with athletes and fans of competitive sports. This is the closest I ever come to community with them. When I run in Germany I am often asked by these over-earnest folk, “Ah so, you like to make sport?” Yes, I want to say. Gern. It gives me great joy. It makes me feel like Beckham. With the right shoes on, and the right sport-beverage, I almost feel as though I’m in a commercial, as though I’m moving in slow-motion, as though the promise of the masses chanting ‘We will, we will rock you,’ is at long last coming true.

Bullshit. I run because I’m afraid of letting myself go, as they say, since to do so would be to allow the inevitable unraveling of this mortal coil to define the terms from here on out. It has nothing to do, in other words, with fun. It has to do with sheer terror. In a bygone era, there were many Germans who wrote with conviction and power about terror in the face of death. Now they ‘make sport’.

I do not spend all that much time in Germany, but somehow have managed to be here just in time for the past three World Cups. If it weren’t for these coincidences between my research and lecture schedule on the one hand, and the world’s athletic schedule on the other, I would have no idea of soccer beyond my youth-league misadventures. My first lessons about the World Cup came in 1998, when a gang of German hooligans rampaging outside a stadium in France beat a French policeman nearly to death. Helmut Kohl went on TV with utmost contrition. He noted, rightly and obviously, that this event brought back distressing memories of a still living history, and declared, wrongly, that this attack had nothing to do with the true spirit of competitive sports. The German people rushed to agree with him on both of these points. Might it be worth considering, though, that far from being the cancer cells of fandom, the hooligans are in fact the true fans, making explicit what all this shouting and side-taking is really about? To wit, violence, in its mild form only vocal and gestural, but not for that reason categorically different from a kick in the shins or a brick in the head.

Heidegger liked to say that the ‘forgetfulness of Being’ that came with the rise of Western rational thought had its ultimate issue in Gerede or pointless small-talk. Heidegger was of course pumped up in his own idiotic way, but still it seems to me that there is something we may justly call ‘forgetful’ about sports fandom. There are, in marked contrast, many other things human beings do in order to come to terms with their fate, rather than to avoid it. These are, in their own way, the opposite of sports. I for my part would rather go to church, I would rather go to a funeral, than to invest one second of my attention in a staged contest between men, in the aftermath of which everything, but everything, is destined to remain the same.

Rx: Ian McEwan, medical ethics and plagiarism

I recently read the novel Saturday by Ian McEwan. I have to admit that, despite all the excellent reviews this book has received since its release last year, it really is very good. Near the end, Mr. McEwan’s plot started me thinking about issues related to medical ethics, patient-doctor relationships and the difference between cure and healing. The book tells the story of a Saturday in the life of a neurosurgeon. Right from the start, this weekend is quite extraordinary, starting with the doctor chancing upon the sight of an exploding airplane from his bedroom window in the early morning hours and evolving into an increasingly macabre day, finally ending with the doctor and his family held hostage in their own home. Without wishing to divulge any crucial aspect of the plot in this exceptionally well-researched and equally well-written book, I would like to get to the part that I found thought-provoking. In the course of the evening, the criminal, who has singularly invoked the wrath of the good doctor by abusing his beloved wife and their twenty-something daughter, sustains a head injury. The police and ambulances arrive and the abuser is rushed to the hospital under police custody. No sooner has the family finally soothed their frayed nerves enough to have a drink and think about the dinner they never had than the phone rings and a medical colleague on the line requests the doctor’s expert services urgently at the hospital. The criminal who was just brought in with the head injury is turning out to be a complicated case and therefore in need of greater expertise than the inexperienced registrar on call could offer. In fact, the neurosurgeon is badly needed. The doctor who called of course has no idea of any connection between the neurosurgeon and the injured patient. The neurosurgeon decides to go and lend his expertise to save that miserable lowlife who only hours ago had desecrated all that was precious to him. Here is the problem.
If he does not go, the man is likely to die. If he does go and operate, can he be trusted to save the life of a man who just a short time ago had inspired the deepest hatred and anger that the doctor was capable of feeling towards another human? Can the doctor be impartial?


I faced an interesting dilemma myself some years ago which made me question my own ability to be impartial. I had received a scientific grant to review from a government agency. I read it, took detailed notes throughout, and formed a generally negative opinion of the grant. However, before I wrote the formal review, I decided to read it yet another time since something somewhere in the grant had left me feeling uneasy. Sure enough, I realized the source of my discomfort rather quickly upon the next read. There were several passages in one of the published papers sent as supplemental material with the proposal which were much too familiar. In fact, I recognized them as having been written by one of my postdocs in a paper we had submitted for publication, and which had been turned down by the reviewers. Now at least we knew who the reviewer was. Apparently he disliked our paper enough to recommend rejection, but liked it enough to plagiarize parts of it. More importantly, we had subsequently submitted our paper to another journal where it was already accepted for publication and was in press. This meant that when our paper finally came out, it would contain passages that had already been published by another author and, ironically enough, we were the ones who would appear to be the plagiarizers. Plagiarism is a serious offense that needed to be reported, and of course I felt angry and violated. The question was whether I should send in my review of the grant anyway. Eventually, I decided to excuse myself from the review process, even though my negative opinion had been formed prior to the discovery of the plagiarized piece, because I did not wish to give even the appearance of a conflict of interest.

It is worthwhile at this point to recall the following Principles adopted by the American Medical Association. These are not laws, but standards of conduct which define the essentials of honorable behavior for the physician.

I. A physician shall be dedicated to providing competent medical care, with compassion and respect for human dignity and rights.

II. A physician shall uphold the standards of professionalism, be honest in all professional interactions, and strive to report physicians deficient in character or competence, or engaging in fraud or deception, to appropriate entities.

III. A physician shall respect the law and also recognize a responsibility to seek changes in those requirements which are contrary to the best interests of the patient.

IV. A physician shall respect the rights of patients, colleagues, and other health professionals, and shall safeguard patient confidences and privacy within the constraints of the law.

V. A physician shall continue to study, apply, and advance scientific knowledge, maintain a commitment to medical education, make relevant information available to patients, colleagues, and the public, obtain consultation, and use the talents of other health professionals when indicated.

VI. A physician shall, in the provision of appropriate patient care, except in emergencies, be free to choose whom to serve, with whom to associate, and the environment in which to provide medical care.

VII. A physician shall recognize a responsibility to participate in activities contributing to the improvement of the community and the betterment of public health.

VIII. A physician shall, while caring for a patient, regard responsibility to the patient as paramount.

IX. A physician shall support access to medical care for all people.

Adopted by the AMA’s House of Delegates June 17, 2001.

Obviously, writing a negative grant review is not the same as saving a life, but the point is how each of our actions would be perceived, given that they were performed while we functioned under a spell of psychic violation. Two tenets of medical ethics are especially relevant in this context. In the words of Eric Issacs, M.D., Director of Quality Improvement, San Francisco General Hospital:

Beneficence refers to acting in the best interests of the patient. This concept is often confused with nonmaleficence, or “do no harm.”

Veracity is truth-telling and honesty.

Clearly, as the only neurosurgeon on hand who could save the patient’s life, the doctor had no choice but to operate and therefore practiced beneficence. However, given the extraordinary circumstances under which the doctor was operating, it was all the more important for him to practice veracity. The patient was unconscious and there was no immediate family to whom he could have spoken, but there were other members of the medical staff in the operating theatre, as well as the police escort. I think that under the extraordinary circumstances, the honorable thing to do would have been to practice full disclosure and then proceed with the operation to save the life of a fellow human.

Since the criminal had been caught, one can imagine that now the doctor could overcome his anger and afford to treat him only as a patient. The problem here is the paradox of cure and healing; the doctor may have been cured because the criminal was caught, but not healed. The doctor’s wife asks:

“You’re not thinking about doing something, about some kind of revenge, are you? I want you to tell me.”

“Of course not.”

The distinction between cure and healing is best summarized in this excellent passage by a real life good doctor, Abraham Verghese (“The Healing Paradox” The New York Times, 12-8-02): “If you were robbed one day, and if by the next day the robber was caught and all your goods returned to you, you would only feel partially restored; you would be cured but not healed, your sense of psychic violation would remain. Similarly, with illness, a cure is good, but we want the healing as well, we want the magic that good physicians provide with their personality, their empathy and their reassurance. Perhaps these were qualities that existed in the pre-penicillin days when there was little else to do. But in these days of gene therapy, increasing specialization, managed care and major time constraints, there is a tendency to focus on the illness, the cure, the magic of saving a life. Science needs to be more cognizant of the other magic, the healing if you will, even if we reach for the proven cures. We need to develop and refine that magic of the physician-patient relationship that complements the precise pharmacologic interventions we may prescribe; we need to ensure the wholeness of our encounter with patients; we need to not lose sight of the word “caring” in our care of the patient. And doggedly, in this fashion, one patient at a time, we can restore faith in the fantastic advance of Science we are privileged to witness”.

Monday, June 12, 2006

Dispatches: Ones and Zeroes

For a while now, sculpture and painting’s preeminence among the plastic arts has seemed a little anachronistic.  Painters such as Richter or Freud who stick to using paint to do two-dimensional figurations, or sculptors like Serra who stick to ‘raw’ materials like steel over the found objects that decorate so much installation art, feel classic or even old-fashioned.  After Pop art, art and photography that mix media became much more common: Gilbert and George, Lee Bontecou, landscape art, Bruce Nauman.  More recently, the materials of plastic art keep getting more worldly.  Witness the Young British Artists:  Rachel Whiteread (plaster casts of negative space), Chris Ofili (dung), Marc Quinn (blood), the Chapmans (figurines), Tracey Emin (household materials and furniture).

While the world has been intruding into art’s materials, and art has been escaping the gallery (as with street art), I’ve been thinking about another development lately, one which leaves plasticity behind altogether: the use of computers, not just to create art, but as the subject of art as well.  For two or three years this field has been gathering momentum, and it feels like a generational shift.  There’s now a group of people approaching thirty who have grown up in an entirely novel social condition, that of having used computers all their lives, and for whom navigating the programmed landscapes of operating systems and icons is as natural as Wordsworth rambling the Lake District.  This is neither a good nor bad development, it’s history.  Anyway, I don’t believe in being too technologically determinist about kinds of art, but looking at the work of this group is incredibly exciting because the kinds of inquiries they make denaturalize and probe their environment, which in their case happens to be the space of computing.  They add computing to the world, and add the world to computing.

Let’s start with the celebrated Cory Arcangel.  Cory’s work usually uses obsolete game systems, computers, file formats, and other computing detritus as the basis for experiments and invasions.  His most famous work is “Super Mario Clouds,” in which he hacked a Super Mario Brothers cartridge to display only the blue sky and floating clouds, a work shown at the Whitney Biennial.  Other stuff includes a shooting game hacked to make Andy Warhol the target, with Flavor Flav and Col. Sanders the decoys;  matching Kurt Cobain’s suicide letter with ads from Google AdSense; rearranging the DVD chapter markers on ‘Simon and Garfunkel Live at Central Park’ to notate all the moments where they look like they hate each other; and so on.  Are you thinking this stuff is juvenile?  You’d be wrong, but in a way, you’d be right: Cory conserves the open-source ethos of young hackers, to the point of supplying instructions for how to replicate his most famous works. 

Cory’s instructions to “Super Mario Clouds” are a very strange and very fascinating kind of aesthetic document.  (They’re also funny.)  As it turns out, his manic methods are refreshingly low-tech, born of a taste for the ground floor of computing.  He writes a new set of instructions that uses the game’s existing graphics, compiles it (translates it into the 1s and 0s of machine code), and burns it onto the same chip that the original Nintendo uses (with a chip burner of the sort Nissan hotrodders use to hack their engines).  Then things get even more basic: he takes the game cartridge, desolders and removes the program chip, and solders in his newly burned chip, cutting a hole in the plastic casing to fit it.  The result is a slightly haunting image of a glowing blue sky and those iconic Super Mario Clouds, floating right out of the collective imaginary.

You might wonder why Arcangel doesn’t just make the image on Photoshop; it would be a heckuva lot easier.   Here are his own learned and excitable words:

“A typical NES Cartridge has two chips. One is a graphics chip, and the other is a program chip. Basically the program chip tells the graphics chip where to put the graphics, and thus if you do this in a interesting manner, you have a video game. When making a “Super Mario Clouds” cartridge, I only modify the program chip, and I leave the graphic chip from the original game intact. Therefore since I do not touch the graphics from the original cartridge, the clouds you see are the actual factory soldered clouds that come on the Mario cartridge. There is no generation loss, and no “copying” because I did not even have to make a copy. Wasss up.”
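The two-chip split Arcangel describes survives, as it happens, in the iNES “.nes” file format that emulators use, where a single file carries a PRG (program) region and a CHR (graphics) region side by side. As an illustration only (this is the file-format analogue of his desoldering, not his actual chip-burning workflow, and the helper names are my own), a minimal sketch in Python:

```python
# Minimal sketch (my own illustration, not Arcangel's tooling): split an
# iNES ROM image into its program (PRG) and graphics (CHR) sections, then
# swap in a new program while leaving the graphics bytes untouched.
# iNES layout: a 16-byte header ("NES\x1a"...); header byte 4 counts
# 16 KB PRG banks, header byte 5 counts 8 KB CHR banks; PRG data follows
# the header, CHR data follows the PRG data.

HEADER_SIZE = 16
PRG_BANK = 16 * 1024   # 16 KB per PRG bank
CHR_BANK = 8 * 1024    # 8 KB per CHR bank

def split_ines(rom: bytes):
    """Return the (header, prg, chr) sections of an iNES ROM image."""
    if rom[:4] != b"NES\x1a":
        raise ValueError("not an iNES file")
    prg_size = rom[4] * PRG_BANK
    chr_size = rom[5] * CHR_BANK
    header = rom[:HEADER_SIZE]
    prg = rom[HEADER_SIZE:HEADER_SIZE + prg_size]
    chr_ = rom[HEADER_SIZE + prg_size:HEADER_SIZE + prg_size + chr_size]
    return header, prg, chr_

def replace_prg(rom: bytes, new_prg: bytes) -> bytes:
    """The file-format analogue of desoldering only the program chip:
    new program in, same factory graphics out."""
    header, prg, chr_ = split_ines(rom)
    if len(new_prg) != len(prg):
        raise ValueError("new PRG must match the original PRG size")
    return header + new_prg + chr_
```

Swapping only the PRG section leaves the graphics bytes byte-for-byte as the factory shipped them, which is the point of his “no copying” claim: the clouds still come from the original CHR data.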

See, these are the real “factory soldered” clouds, chief.  This is surely one of the more bizarre yet convincing determinations of authenticity I’ve seen in a while, and Cory expresses perfectly the thrill of using the actual relevant materials to create an artwork.  Simply Photoshopping the image would be fake, clearly, yet it’s hard to explain exactly why, a mark of art that applies itself to present conundrums.  Maybe the physical Nintendo cartridge matters because it’s the real physical object that inflected our world, and it’s important to use it, understand it, and work with it.

The first Palm Pilot was exciting to lots of computer geeks because its tiny memory meant ingenious little games were invented and shared, as they were in the initial stages of personal computing.  (I remember writing simple BASIC programs for my Atari 400 and saving them on cassette tapes, which I excitedly played to hear the “sound” of my code, not that I was ever a dork or anything.)  Arcangel returns to the obsolete technologies of his childhood, not as nostalgic fixations (he claims never to have liked playing Mario), but as an aesthetic embrace of the real.  Surrounded by these things, he developed an entirely artistic fixation with changing them, interfering with them, transforming them.  His work, over time, keeps getting simpler, showing how little it takes to get into the cracks of things that appear seamless, like hardware and operating systems.  He takes on challenges for the sake of curiosity: he recently calculated where the exact Manhattan center of Starbucks gravity is, and explained how.  What’s implicit is how just paying the right kind of attention keeps the world interesting, fully alive and of the moment.

The official art world has already begun to sanction this type of work, as the Whitney Biennial makes pretty clear, as well as a recent show at the slightly old-school Pace Wildenstein gallery, curated by Patricia Hughes and featuring Arcangel, Brody Condon, the collective Paper Rad, and others.  Another feature they seem to share is an eclecticism with respect to materials and genres: many of them make music as well as art, and all seem to be feeding off of the whole range of waste-products of consumer obsolescence, rotting eighties junk that begs to be categorized and indexed.  Underneath a lot of this work is a desire for mastery, for lost comprehension, that’s so hard to satisfy in the present condition of unprecedented epistemological overload and confusion.  Cory again:

“We [BEIGE, Cory’s art/music collective] started using fixed architecture machines, computers which are no longer being developed, at this time because it is impossible to keep up with commercial software and hardware. Imagine trying to play Bach on the piano if they switched keys around every few years … and charged you for it! Plus the limited capabilities of these computers allows us to understand every aspect of the machine.”

Hence the attraction to precisely the limitations of older systems.  To take another recent example, have a look at this short animation from Michael Bell-Smith, entitled “Keep On Moving (Don’t Stop).”  The use of the squarish graphical template of early role-playing games has a similarly aesthetic, as opposed to nostalgic, motivation to that of Arcangel’s work.  It’s got the immersion in music too: yellow is the color of sunrays.  The cheery quest followed by the recursive, fractal surprise at the end – trapped! – suggests the computer game as a new locale of the culture industry, appropriating all attempts to escape.  Adorno would have liked to have predicted it, but here the feeling isn’t as dour, it’s more pragmatic: we’re stuck with this world, so let’s transform it.  This topos isn’t chosen because of fondness, this is part of the air, part of the world and how we represent ourselves now.  And because computer avatarship is inescapable, it’s all the more important to subject computing to an aesthetic investigation.  I’ve seen works by the large-format photographer Andreas Gursky and the miniaturist painter Shahzia Sikander that use computer animation, but this current group goes further.  They make art, not just using computers as an engineering tool, but out of and delving into computing as a cultural form.


Old Bev: So Dark the Con of Men?

Professor Robert Langdon, hero of Dan Brown’s The Da Vinci Code, is a well-known symbologist, American, and bachelor.  In the space of a few days in Paris and London, Langdon is accused of murder, seeks the Holy Grail, and gets the first date he’s had in years.  He’s a classic bumbling hero, roused from a hotel slumber to meet Bezu Fache, the captain of the Central Directorate Judicial Police, who is hell-bent on arresting him; Sophie Neveu, a cryptologist who believes Langdon can help solve the mystery of her grandfather’s murder; and Sir Leigh Teabing, the world’s preeminent Holy Grail scholar.  Dan Brown doesn’t give us too much personal information about our hero, preferring instead to let chatty Langdon provide most of the historical exposition, but we do know that he has had but one love in his life, a woman named Vittoria who drifted away a year before The Da Vinci Code picks up.  Langdon, like most every other character in the book, is looking for a lady.

The Da Vinci Code is a competent thriller, but it takes more than that to sell over 40 million copies and nearly $200 million worth of movie tickets.  Dan Brown’s genre elements – murder, escape, conspiracy, romance – exist first in the primary plot starring Langdon and company, and finally in a more famous plot starring Jesus Christ and his disciples.  Murder, escape, and conspiracy are all familiar Biblical elements, but romance?  That’s where the 40 million copies come into play.  Robert Langdon’s newest manuscript asserts that the Holy Grail is the sarcophagus of Mary Magdalene and documents that trace her bloodline into the present day.  Why Mary?  She allegedly married and bore the child of Jesus Christ.  This is more than Biblical gossip though, because Dan Brown’s Catholic Church has suppressed Mary’s claim to the church (i.e., the sacred feminine) and seeks to destroy the Grail and murder Christ’s living descendants.  The Da Vinci Code is ostensibly a book about restoration of (or failing that, reverence for) female power – so how come there’s only one female character in the book?

If Dan Brown doesn’t tell us much about Robert Langdon, he tells us even less about Sophie Neveu.  At least Langdon’s ramblings and manuscripts are evidence of his own passion and intellectual life; Sophie’s interest in cryptology is attributed directly to the design of the grandfather who raised her. If Sophie had a love at one point, he doesn’t get a name – we know just that she is lonely.  She’s useful to the search for the Grail only because of her hazy childhood memories.  Sophie is a human code, and she needs Langdon to help her read herself.  There’s nothing inherently wrong with a lonely repressed lady cryptologist, but Brown isolates her in a world where female power has been lost, and encourages only the men to reclaim it.

When Langdon informs his lecture hall of the “mind boggling” “concept of sex as a pathway to God” held by the early church, he fields a question from the crowd:

“Professor Langdon?” A male student in back raised his hand, sounding hopeful.  “Are you saying that instead of going to chapel, we should have more sex?”  Langdon chuckled, not about to take the bait.  From what he’d heard about Harvard parties, these kids were having more than enough sex.  “Gentlemen,” he said, knowing he was on tender ground, “might I offer a suggestion for all of you.  Without being so bold as to condone pre-marital sex, and without being so naïve as to think you’re all chaste angels, I will give you this bit of advice about your sex lives.”  All the men in the audience leaned forward, listening intently.  “The next time  you find  yourself with a woman, look in your heart and see if you cannot approach sex as a mystical, spiritual act.  Challenge yourself to find that spark of divinity that man can only achieve through union with the sacred feminine.”  (310)

What’s interesting here is that Langdon responds only to the “gentlemen” in his class.  He perceives the question – a misunderstanding of his lecture – to be fundamentally male, and assumes either that women already have an appropriate attitude toward sex or that perhaps they don’t need that appropriate attitude if the man has it.  There’s no space in his treatment of the sex act for female sexuality except as a conduit for a male-experienced “spark of divinity.”  Sophie Neveu is positioned similarly in the narrative – there’s no space for her experience of the murder or Grail quest outside of what it means for her late grandfather and her male companions.  She is the portal through which the academics finally make tangible their theory.  Her agency is only as great as the extent to which Langdon and Teabing exploit her.  Dan Brown offers few clues, in a 454-page bestseller about the suppression and celebration of the ‘sacred feminine,’ as to how a woman might negotiate her own intrinsic divinity.

Sophie, though she serves as an object of desire for the bulk of The Da Vinci Code, only behaves sexually after she learns herself to be the direct descendant of Mary Magdalene and Jesus Christ.  It’s as if Brown can’t ask his heroine to reevaluate her sex life in the same manner that he asks men, through Langdon, to; instead he fashions Sophie as a blank slate.  She doesn’t endure a reawakening at the conclusion of the novel, but an awakening:

The stars were just appearing, but to the west, a single point of light glowed brighter than any other.  Langdon smiled when he saw it.  It was Venus.  The ancient Goddess shining down with her steady and patient light…Langdon looked over at Sophie.  Her eyes were closed, her lips relaxed in a contented smile…Reluctantly, he squeezed her hand…Langdon felt an unexpected sadness to realize he would be returning to Paris without her.  “I’m sorry, I’m not very good at—”  Sophie reached out and placed her soft hand on the side of his face.  Then, leaning forward, she kissed him tenderly on the cheek.  [Langdon asks Sophie to meet him in Florence the following month, and she agrees as long as they avoid museums etc.]  “In Florence?  For a week?  There’s nothing else to do.”  Sophie leaned forward and kissed him again, now on the lips.  Their bodies came together, softly at first, and then completely.  When she pulled away, her eyes were full of promise.  “Right,” Langdon managed.  “It’s a date.”  (448-449)

This is an unremarkable conclusion to an unremarkable romance plot, except for the fact that Brown offers no representations of female desire not explicitly allied with a Goddess.  If her enlightened sex with Langdon will in fact help her explore her spirituality, why does this prohibit Brown from acknowledging her prior sexual impulses?  Jane Schaberg and Melanie Johnson-DeBaufre point out in their article “There’s Something About Mary Magdalene” that female sexuality in The Da Vinci Code exclusively “helps men achieve their full spiritual potential,” but they also posit that Brown’s Christianity is one that “appeals to those looking for a spirituality not based in creed or authority, but on knowledge, personal reflection and an embodied life in the world.”  (Ms. Magazine, Spring 2006)  In The Da Vinci Code, however, no woman who is not literally the descendant of a goddess negotiates such a spirituality.  Individual women – like Vittoria, who made choices illegible to Langdon – are absent.  Brown keeps women abstract by referring to them only in groups: Langdon’s female students “smiled knowingly, nodding” (but don’t speak to each other as the men do), no female participant in a traditional sex ritual is identified (as Sophie’s grandfather is), an unnamed Parisian academic reminds Langdon of all the simpering women back home, a nun is emblematic of a body of believers, and Sophie’s grandmother represents the entire bloodline of Mary Magdalene.  For the individual woman in The Da Vinci Code, there is no “embodied life in the world” if it does not involve a male body – or if she is a part of the literal bloodline of Jesus and Mary.

One scene in The Da Vinci Code stands out from the rest.  It’s not included in the screen adaptation by Akiva Goldsman.  In it, Langdon and Sophie take a cab ride through the Bois de Boulogne:

Langdon was having trouble concentrating as a scattering of the park’s nocturnal residents were already emerging from the shadows and flaunting their wares in the glare of the headlights.  Ahead, two topless teenage girls shot smoldering gazes into the taxi.  Beyond them, a well-oiled black man in a G-string turned and flexed his buttocks.  Beside him, a gorgeous blond woman lifted her miniskirt to reveal that she was not, in fact, a woman…Langdon nodded, unable to imagine a less congruous backdrop for the legend he was about to tell.  (157)

Sophie doesn’t have trouble paying attention; her eyes are “riveted” on Langdon. It’s also worth noting that the women appear in groups here as well.  But the scene’s explicit depiction of a secular shadowy sexual marketplace is unique within the novel. Here a more complex web of connections between individual and temporal sexualities, lifestyles, and belief systems is glimpsed. Langdon himself seems to congratulate his author on the decision.  Unfortunately, the cab-driver’s radio begins to crackle with news of the fugitives, and Langdon and Sophie have to hightail it out of the Bois de Boulogne, and it’s on to a Swiss Bank to search for the distillation of the sacred feminine.

Lunar Refractions: Longing for Perfect Porn Aristocrats and Other Delights

Leonard Cohen’s music first came to me in my early teens. I fell deeply in love, and thought, this will pass, this is an adolescent thing, a phase, an infatuation; time or luck will have me grow out of this.

His words came to me—as many great things come to me, in pathetic or even hideous masks, to test whether or not I am easily fooled by the disguises woven to hide their wondrous nature—in Oliver Stone’s 1994 movie Natural Born Killers. It is a rather dismissible movie, though the soundtrack is amazing (thank you, Trent Reznor et al.), and it did its job of delivering the unexpected, unforeseeable goods.

But by then I already, albeit unwittingly, knew these two introductory songs, “The Future” and “Waiting for the Miracle,” from one of my friend Vanessa’s mixed tapes. I just didn’t know who was behind the suave voice. A few years and several album acquisitions later, an acquaintance in Rome asked me what I was listening to at the moment, and it was Cohen’s 1973 album Live Songs. The response so impressed me that I bring it to you verbatim: “Leonard Cohen? Nobody listens to him anymore. We were all listening to him in the late seventies, when we were young and radical and left.” Yeah, I left. I’m fine being told that my tastes are quite yesterday, and I knew this guy probably didn’t get it because he was, well, who he was. He was also definitely one of the numerous Europeans who helped make Cohen more popular over there than in North America by not understanding his lyrics.

Well, to echo the rampant name-calling that follows him everywhere, the Ladies’ Man, the Grocer of Despair, grandson of the Prince of Grammarians, has just published a new old book, titled Book of Longing, and was on the radio three weeks ago chatting with Terry Gross. She did a fairly good job, considering that the usual sort of questions, many of which she tried, really didn’t fit here, and Cohen seems to be no comforter.

Lie to me, Leonard

Firstly, he lied his way through the entire hour. Okay, perhaps they weren’t all lies, and the ones that were lies were committed with some definite * intentions (*I’m at a loss for the appropriate adjective: honest? Low? Lofty? Sick? Sweet? Romantic? All of the above?). The truth is that he can’t help how charming he is, and frankly it’s a miracle he’s done what he has to melt deeply frozen hearts. He had tea on April 21 with his Zen master in celebration of the latter’s ninety-ninth birthday, but immediately backtracks to point out that it wasn’t tea—it was liquor. In his poem “Titles” he reads that “I hated everyone / but I acted generously / and no one found me out.” He valiantly assures the listener that this is true, and equally valiantly contradicts it in song and in print. Plus, I can’t help but suspect that many people have found him out. Is it possible to feign this man’s passion? Probably, but I just don’t want to think so.

Alright, that’s not so many lies. But a lot of interesting things came up. When discussing the idea of composing a poem versus composing a song, Gross asks him about the early-seventies song “Famous Blue Raincoat” and which of those two it originally was, to which he replies, “It’s all the same to me.” [Aside: forgive me for sticking to the script here and bringing up the blockbuster songs, when I’d rather fawn over the lesser-known songs like “Teachers,” “Passing Through,” “Who by Fire,” “If It be Your Will,” “Here It Is,” etc.]

A lot of what one might call romantic creation is touted here. Ignoring the famous traits of “despair, romantic longing, and cynicism” alongside the idea that “at the same time, there’s a spiritual quality to many of his songs” mentioned as an introductory nothingness on the radio show, when asked where “Famous Blue Raincoat” came from, he replies, “I don’t know, I don’t remember how it arose—I don’t remember how any of them get written.” When asked why he left the Zen center after five or six years of work there, he replies, “I don’t know… I’m never sure why I do anything, to tell you the truth.” About the creation of “Everybody Knows,” “I don’t really remember…. You see, if I really remembered the conditions which produce good songs, I’d try to establish them,” going on to mention the use of napkins, notebooks, etc.

Then there’s the sheer hard labor of it:

You get it but you get it after sweating…. I can’t discard anything unless I finish it, so I have to finish the verses that I discard. So it takes a long time; I have to finish it to know whether it deserves to survive in the song, so in that sense all the songs take a long time. And although the good lines come unbidden, they’re anticipated, and the anticipation involves a patient application to the enterprise.

Of the early-nineties song “Always,” Gross points out that he’s taken a song by Irving Berlin and added a few lines, making it “suddenly very dark and sour.” His quick reply: “Well, you can depend on me for that…”. His is “a kind of drunken version of it.” He’d like to do a song in the vein of those great American songbook lyricists he doesn’t feel equal to:

I have a very limited kind of expression, but I’ve done the best that I can with it, and I’ve worked it as diligently as I can, but I don’t really—except for songs like “Hallelujah,” or “If It be Your Will,” I think those are two of my best songs—I don’t live up to… those great songwriters….

There’s a lot of things I’d like to do, but when you’re actually in the trenches, and, you know, you’re in front of the page or… the guitar or the keyboard under your hands, you know you have to deal with where the energy is, what arises, what presents itself with a certain kind of urgency. So, in those final moments, you really don’t choose, you just go where the smoke is, and the flames and the glow or the fire, you just go there.


The Ponies Run, the Girls are Young

But enough about composition. My favorite bits are where Cohen plays the role of the [not exactly dirty] old man. Page 56 of his new book carries a poem for a certain Sandy, and what girl doesn’t occasionally want to be the Sandy sung to here? “I know you had to lie to me / I know you had to cheat / To pose all hot and high behind / The veils of sheer deceit / Our perfect porn aristocrat / So elegant and cheap / I’m old but I’m still into that / A thousand kisses deep.” Age is very present here, and while he’s sung of so many other mortal weaknesses over the past forty-plus years, it seems he had to wait for this particular one to sink into the bones before it began to permeate his work. In four short lines on page 171 you learn the sorrows of the elderly. Go to page 14 to read my favorite tidbit written to a young nun, speaking of staggered births, time disposing of two people whose generations separate them, and whose turn it is to die for love, whose to resurrect. This one is too beautiful to steal from page to pixel.

Betrayal also comes up. In the end the letter writer who sings about that famous blue raincoat has his woman stolen by the letter recipient. In speaking about such games, his age now seems to save him:

Fortunately I’ve been expelled from that particular dangerous garden, you know, by my age… so I’m not participating in these maneuvers with the frequency that I once did. But I think that when one is in that world, even if the situation does not result in any catastrophic splits as it does in “Famous Blue Raincoat,” one is always, you know, edging, one is always protecting one’s lover, one is always on the edge of a jealous disposition.

Later he specifies that one does not become exempt from that garden, but is just not as welcome. So what are the trade-offs for no longer being welcome? If nothing else, there’s a special voice, which in Cohen’s case is undeniably alluring. He apparently acquired it through, “well, about 500 tons of whiskey and a million cigarettes—fifty, sixty years of smoking…”. I didn’t know tar could be turned to gold.


The Fall

Then comes the most terrifying subject of all, beauty—physical beauty, superficial beauty. We are either enslaved by it, embody it, or attach ourselves to someone who does. He is still oppressed by the figures of beauty, just as he was thirty-two years ago. And here he’s at his most graceful:

I still stagger and fall…. Of course it just happens to me all the time, you just have to get very careful about it, because it’s inappropriate for an elderly chap to register authentically his feelings, you know, because they really could be interpreted, so you really have to get quite covert as you get older… or you have to find some avuncular way of responding, but still, you just, really are just, you’re wounded, you stagger, and you fall.

One feels deeply in love, and thinks this will pass, this is a phase, an infatuation; time or luck will have me grow out of this.

A Monday Musing by Morgan Meis about Cohen is here, and previous Lunar Refractions can be seen here.

Negotiations 8: On Watching the Iranian Soccer Team Crumble Before Their Mexican Counterparts on German Soil

What is the legacy of two thousand years of Christianity? What are the specific qualities that the Christian tradition has instilled and cultivated in the minds of men? They appear to be twofold, and dangerously allied: on the one hand, a more refined sense of truth than any other human civilization has known, an almost uncontrollable drive for absolute spiritual and intellectual certainties. We are speaking of a theology that through St. Thomas Aquinas assimilated into its grand system the genius of Aristotle and whose Inquisitors in the Church bequeathed to modern science its arsenal of weapons for the interrogation of truth. The will to truth in the Christian tradition is overwhelming. On the other hand, we have also inherited the ever-present suspicion that life on this earth is not in itself a supreme value, but is in need of a higher, a transcendental redemption and justification. We feel that there is something wrong with us, or that the world itself needs salvation. Alas, this unholy alliance is bound finally to corrode the very beliefs on which it rests. For the Christian mind, exercised and guided in its search for knowledge by one of the most sophisticated and comprehensive theologies the world has ever seen, has, at the same time, been fashioned and directed by the indelible Christian distrust of the ways of the world. Such a mind will eventually, in a frenzy of intellectual honesty, unmask as humbug what it began by regarding as its highest values. The boundless faith in truth, a joint legacy of Christ and Greek, will in the end dislodge every possible belief in the truth of any faith. For the Christian, belief in God becomes—unbelievable. Ergo Nietzsche:

Have you not heard of that madman who lit a lantern in broad daylight, ran to the marketplace and cried incessantly: “I seek God! I seek God!” As many of those who did not believe in God were standing around just then, he provoked much laughter. Has he got lost? asked one. Did he lose his way like a child? asked another. Or is he hiding? Is he afraid of us? Has he gone on a voyage? Emigrated? Thus they yelled and laughed.

The madman jumped into their midst and pierced them with his eyes. “Whither is God? I will tell you. We have killed him—you and I. All of us are his murderers… God is dead. God remains dead. And we have killed him.”

His listeners fell silent and stared at him in astonishment. At last he threw his lantern on the ground, and it broke into pieces and went out. “I have come too early,” he said then; “my time is not yet. This tremendous event is still on its way, still wandering; it has not yet reached the ears of men… This deed is still more distant from men than the most distant stars—and yet they have done it themselves.”

God, as Nietzsche puts it, is dead; and you and I, with the relentless little knives of our own intellect—psychology, history, and science—we have killed him. God is dead. Note well the paradox contained in those words. Nietzsche never says that there was no God, but that the Eternal has been vanquished by time, the Immortal has suffered death at the hands of mortals. God is dead. It is a cry mingled of despair and triumph, reducing, by comparison, the whole story of atheism and agnosticism before and after to the level of respectable mediocrity and making it sound like a collection of announcements by bankers who regret that they are unable to invest in an unsafe proposition.

Nietzsche brings to its perverse conclusion a line of religious thought and experience linked to the names of St. Paul, St. Augustine, Pascal, Kierkegaard, and Dostoevsky, minds for whom God was not simply the creator of an order of nature within which man has his clearly defined place, but to whom He came in order to challenge their natural being, making demands which appeared absurd in the light of natural reason.

Nietzsche is the madman, breaking with his sinister news into the marketplace complacency of the pharisees of unbelief. We moderns have done away with God, and yet the report of our deed has not reached us. We know not what we have done, but He who could forgive us is no more. No wonder Nietzsche considers the death of God the greatest event in modern history and the cause of extreme danger. “The story I have to tell is the history of the next two centuries,” he writes. “Where we live, soon nobody will be able to exist.” Men will become enemies, and each his own enemy. From now on, with their sense of faith raging within, frustrated and impotent, men will hate, however many comforts they lavish upon themselves; and they will hate themselves with a new hatred, unconsciously at work in the depths of their souls. True, there will be ever-better reformers of society, ever-better socialists and artists, ever-better hospitals, an ever-increasing intolerance of pain and poverty and suffering and death, and an ever more fanatical craving for the greatest happiness of the greatest numbers. Yet the deepest impulse informing their striving will not be love, and it will not be compassion. Its true source will be the panic-stricken determination not to have to ask the questions that arise in the shadow of God’s death: “What now is the meaning of life? Is there nothing more to our existence than mere passage?” For these are the questions that remind us most painfully that we have done away with the only answers we had.

The time, Nietzsche predicts, is fast approaching when secular crusaders, tools of man’s collective suicide, will devastate the world with their rival claims to compensate for the lost kingdom of Heaven by setting up on earth the ideological economies of democracy and justice, economies which, by the very force of the spiritual derangement involved, will lead to the values of cruelty, exploitation, and slavery. “There will be wars such as have never been waged on Earth. I foresee something terrible, chaos everywhere. Nothing left which is of any value, nothing which commands, ‘Thou shalt!’” Ecce homo; behold the man, homo modernus, homo nihilismus. Nihilism—the state of human beings and societies faced with a total eclipse of all values—thrives in the shadow of God’s death. We have vanquished God, but we have yet to vanquish the nihilism that has risen up within us to take God’s place. There is a profound nihilism at work in this world. How are we to deal with this, the legacy of our greatest deed? There is no going back; there can be no going back. We are perched atop a juggernaut; the reins of that sad cart have been passed to us by the four Horsemen of modernity—Nietzsche, Freud, Marx and Darwin. Do we heave back on them now? I think not—we must drive them ever faster, until the juggernaut topples and we nimble ones, we free spirits, have the opportunity to leap forward and beyond our time. Play more soccer.

Monday, June 5, 2006

Below the Fold: Forget the Sheepskin, and Follow the Money, or Please Don’t Ask What a University is For…

Garbed in cap and gown and subjected probably for the first time in their lives to quaint Latin orations, three quarters of a million students, sheepskin in hand, will bound forth into the national economy, hungry for jobs, economic security, and social advancement. They exit a higher education economy that looks and works more and more like the national economy they now enter. The ivory tower has become the office block, and its professors highly paid workers in a $317 billion business.

Some of this is, of course, old news. From the 1964 Berkeley Free Speech Movement onward, the corporate vision of American universities as factories of knowledge production and consumption, bureaucratically organized as the late Clark Kerr’s “multiversities,” has been contested, but has largely come to pass.

But even to this insider (confession of interest: I am now completing my 20th year before the university masthead), there are new lows to which my world is sinking. They amount to the transformation of American universities into entrepreneurial firms, and in some cases, multinational corporations.

Most of you by now are used to the fact that universities are big business. The press never stops talking about the $26 billion Harvard endowment, or how the rest of the Ivy League and Stanford are scheming to be nipping at old John Harvard’s much-touched toes. But many non-elite schools are joining the race for big money and to become big businesses. Twenty-two universities now have billion-dollar fund-raising campaigns underway. After talking with a colleague from the University of Iowa on another matter, I went to the university web page to discover that Iowa has raised over a billion dollars in a major campaign since 1999 – not bad when you recall that the state itself only has 3 million residents. Even my university, the City University of New York, the ur-urban ladder to social mobility for generations of immigrants and poor, has announced that it is embarking on a billion-dollar crusade.

In addition to billion-dollar endowments, there is revenue to consider. You might be surprised at all of the billion-dollar universities in neighborhoods near you. All it really takes to put a university over the billion-dollar revenue mark is a hospital. Iowa, for instance, is a half-billion-a-year all-purpose education shop; add its medical school and hospital system, and its revenue quadruples. A big-enrollment urban school like Temple does a billion dollars of health care business in Philadelphia, easily surpassing its educational budget of $660 million. These university budgets often depend as much on the rates of Medicare and Medicaid reimbursement as they do on tuition from their various educational activities.

Tuitions are no small matter, of course, for those who pay them. The elite schools have recently crossed the $40,000-a-year threshold, but the perhaps more important and less noticed change in higher education finances is that states are passing more of the burden for public college and university education directly onto the students themselves. The publics enroll three quarters of the nation’s students. As late as the 1980s, according to Katharine Lyall in a January 2006 article in Change, states paid about half of the cost of their education; now the proportion has dropped to 30%. For instance, only 18% of the University of Michigan’s bills are paid by the state; for the University of Virginia, state support drops to 8%. Baby-boomers on that six-year plan at “old state,” where they paid in the hundreds for their semester drinking and drug privileges, find themselves now paying an average tuition of $5,500 a year for their kids. When you add in room and board, a year at “old state” now costs an average of $15,500, a figure that is 35% of the median income for a U.S. family of four.

So under-funded are important state universities that they are resorting to tax-like surcharges to make up for chronic state neglect. The University of Illinois, for example, is adding an annual $500 “assessment” on student bills for what the university president Joseph White, as quoted by Dave Newbart in the April 7 Chicago Sun-Times, describes as deferred maintenance. “The roofs are leaking and the caulking is crumbling and the concrete is disintegrating,” President White says. Next year it will cost $17,650 to go to Champaign-Urbana. The state will cover only 25% of Illinois’ costs.

Illinois’ President White may be a bit old-school, and perhaps has lagged behind the pack of higher education industrial leaders. He should get smart. Instead of milking the kids on a per capita basis and incurring undying consumer wrath (after all, the plaster was cracked way before I got there, I can hear a student voice or two saying), White should join his peers in a little financial manipulation. What do big firms with billions in assets and large revenue flows do? They sell bonds! So much money, so little interest. And with principal due after a succession of presidents has become so many oil portraits in the board room, so little personal and professional exposure. With the increasingly short tenure of university presidents, even Groucho Marx’s Quincy Adams Wagstaff could get out in time.

American universities have made a very big bet on their future prosperity. They have issued over $33 billion in bonds, according to the May 18 Economist. For the multinationals like Harvard, this is sound money management. To raise working capital, rather than sell some of its stock portfolio at a less-than-optimal moment or sell the 265-acre arboretum near my house, which would diminish the university endowment, Harvard can use its assets as guarantees. The university’s credit is AAA, interest rates are still historically fairly low, and their tax-exempt status makes them attractive investment choices. Harvard can deploy the money in new projects, or re-invest it in higher-yielding instruments and pocket the difference tax-free.

The entrepreneurial universities, that is, those not internationally branded and not elite, are trying to gain a competitive edge. They borrow through bonds to build dormitories and student unions, and to beautify their campuses. Many are borrowing money they don’t have or can’t easily repay. As the saying goes, they are becoming “highly leveraged.” A turn around a town with more than a few universities will likely reveal how it’s raining dorm rooms locally. Here in Boston, it has afflicted universities on both sides of the Charles. Even an avowedly commuter campus like the University of Massachusetts-Boston is building dorms to create that market-defined “campus” feel. Bonds pay for the dorms, and the students, through higher rents, pay them off.

The educative value of dorm living, smart remarks aside, is rather problematic. Talking with an old friend who heads an academic department at a Boston university, I have begun to understand, however, the business logic at work. His bosses have explained the situation thus: the last of the baby boomer progeny are passing through the system, and a trough lies behind them. The children of baby-boomers, alas, prefer the reproductive freedoms of their parents, and are having children late as well. International students, full-tuition payers and once the source of great profit, are increasingly choosing non-American universities, for a variety of reasons, some related to our closed-door policy after 9/11. Add income difficulties among the American middle class, and the entrepreneurial universities calculate that they must improve their marketability and take business from others. Expand market share, create new markets (new diplomas, new student populations), or fight to keep even, they reason. Or face decline, now perhaps even a bit steeper since they are in hock for millions of dollars in bond repayments. The “high yield” customer is the traditional customer, a late adolescent of parents with deepish pockets. So dorms, fitness gyms, and student unions it is, and the faculty is mum.

In the great expand-or-die moment occurring among America’s entrepreneurial universities, you would think faculty would be making out, but they aren’t. Let us set aside for another time comment on the elite schools’ highly limited, American Idol-style star searches and the entrepreneurs’ somewhat desperate casting about for rainmakers and high-profile individuals who can help in creating a distinctive brand for their paymasters. College and university faculty salaries as a whole since 1970 have stagnated, the U.S. Department of Education reports. Part of the reason is that although the number of faculty has risen 88% since 1975, the actual number of tenured faculty has increased by only 24%, and their proportion of the total has dropped from 37% in 1975 to 24% in 2003. Full-time, non-tenure track and part-time faculty are being used to meet increased demand. Universities are succeeding in gradually eliminating tenure as a condition of future faculty employment.

Forty-three years after Kerr presented his concept of the “multiversity,” the facts conform in many respects to his vision. American universities are massive producers of knowledge commanded by technocrats who guide their experts toward new domains of experiment and scientific discovery. They possess a virtual monopoly on postsecondary education, having adapted over the past half century to provide even the majority share of the nation’s technical and applied professional training.

But swimming with instead of against the stream of American capitalism over the past half century has cost American universities what few degrees of freedom they possessed. They have become captives of corporate capitalism and have adopted its business model. They are reducing faculty to itinerant instructors. Bloated with marketeers, fund-raisers, finance experts, and layers of customer service representatives, they are complicated and expensive to run, and risky to maintain when the demographic clock winds down or competition intensifies. Moreover, as Harry Lewis, a Harvard College dean pushed out by the outgoing President Larry Summers, put it rather archly in the May 27 Boston Globe, students whose parents are paying more than $40,000 a year “expect the university to treat them like customers, not like acolytes in some temple they are privileged to enter.”

As a priest in the temple, it hurts to note how much further down the road we have gone in reducing teaching and learning to a simple commodity. However, in demanding to be treated as customers, students and their parents are simply revealing the huckster we have put behind the veil. Their demands cannot change the course of American universities for the better, but they tell those of us still inside where we stand, and where we must begin anew our struggle.

Random Walks: Band of Brothers

While a large part of mainstream America was blissfully enjoying their long Memorial Day weekend, fans of the Ultimate Fighting Championship franchise were glued to their Pay-Per-View TV sets, watching the end of an era. In the pinnacle event of UFC-60, the reigning welterweight champion, Matt Hughes, faced off against UFC legend Royce Gracie — and won, by technical knockout, when the referee stopped the fight about 4 minutes and 30 seconds into the very first round.

To fully appreciate the significance of Hughes’ achievement, one must know a bit about the UFC’s 12-1/2-year history. The enterprise was founded in 1993 by Royce’s older brother, Rorion Gracie, as a means of proving the brutal effectiveness of his family’s signature style of jujitsu. The concept was simple, yet brilliant: invite fighters from every conceivable style of martial art to compete against each other in a full-contact, no-holds-barred martial arts tournament, with no weight classes, no time limits, and very few taboos. No biting, no fish-hooks to the nostrils or mouth, no eye gouging, and no throat strikes. Everything else was fair game, including groin strikes.

(Admittedly, the fighters tended to honor an unspoken “gentlemen’s agreement” not to make use of groin strikes. That’s why karate master Keith Hackney stirred up such a controversy in UFC-III when he broke that agreement in his match against sumo wrestler Emmanuel Yarbrough and repeatedly pounded on Yarbrough’s groin to escape a hold. I personally never had a problem with Hackney’s decision. He was seriously out-sized, and if you’re going to enter a no-holds-barred tournament, you should expect your opponent to be a little ruthless in a pinch. But the universe meted out its own form of justice: Hackney beat Yarbrough but broke his hand and had to drop out of the tournament.)

The first UFC was an eight-man, single-elimination tournament, with the winner needing to fight three times in one night — defeating each opponent while still remaining healthy enough to continue — to take the final round. Since no state athletic commission would ever consider sanctioning such a brutal event, the UFC was semi-underground, finding its home in places like Denver, Colorado, which had very few regulations in place to monitor full-contact sports. Think Bloodsport, without the deaths, but plenty of blood and broken bones, and a generous sampling of testosterone-induced cheese. (Bikini-clad ring girls, anyone?)

Rorion chose his younger brother, Royce, to defend the family honor because Royce was tall and slim (6’1″, 180 pounds) and not very intimidating in demeanor. He didn’t look like a fighter, not in the least, and with no weight classes, frequently found himself paired against powerful opponents with bulging pecs and biceps who outweighed him by a good 50 pounds or more. And Royce kicked ass, time and again, winning three of the first four UFC events. (In UFC-III, he won his first match against the much-larger Kimo, but the injuries he sustained in the process were sufficient to force him to drop out of the tournament.)

He beat shootfighter Ken Shamrock (who later moved to the more lucrative pro-wrestling circuit) not once, but twice, despite his size disadvantage. His technique was just too damned good. Among other things, he knew how to maximize leverage so that he didn’t need to exert nearly as much force to defeat his opponents. Shamrock (pictured at right) has said that Gracie might be lacking in strength, “but he’s very hard to get a hold of, and the way he moves his legs and arms, he always is in a position to sweep or go for a submission.”

UFC fans soon got used to the familiar sight of the pre-fight “Gracie Train”: When his name was announced, Royce would walk to the Octagon, accompanied by a long line of all his brothers, cousins, hell, probably a few uncles and distant cousins just for good measure, each with his hands on the shoulders of the man in front of him as a show of family solidarity and strength. And of course, looking on and beaming with pride, was his revered father, Helio Gracie (now 93), who founded the style as a young man — and then made sure he sired enough sons to carry on the dynasty.

Royce’s crowning achievement arguably occurred in 1994, when, in UFC-IV’s final match, he defeated champion wrestler Dan “The Beast” Severn. Many fight fans consider the fight among the greatest in sports history, and not just because Severn, at 6’2″ and 262 pounds, outweighed Royce by nearly 100 pounds. Technique-wise, the two men were very well-matched, and for over 20 minutes, Severn actually had Royce pinned on his shoulders against the mesh wall of the Octagon. Nobody expected Royce to get out of that predicament, but instead, he pulled off a completely unexpected triangle choke with his legs, forcing Severn to tap out.

For all his swaggering machismo, Royce was one of my heroes in those early days, mostly because I had just started training in a different style of jujitsu (strongly oriented toward self-defense), at a tiny storefront school in Brooklyn called Bay Ridge Dojo. True, it was a much more humble, amateur environment than the world of the UFC, but Royce gave me hope. I trained in a heavy-contact, predominantly male dojo, and at 5’7″ and 125 pounds, was frequently outsized by my classmates. My favorite quote by Royce: “I never worry about the size of a man, or his strength. You can’t pick your opponents. If you’re 180 pounds and a guy 250 pounds comes up to you on the street, you can’t tell him you need a weight class and a time limit. You have to defend yourself. If you know the technique, you can defend yourself against anyone, of any size.” And he proved it, time and again.

For smaller mere mortals like me, with less developed technique, size definitely mattered. The stark reality of this was burned into my memory the first time one of the guys kicked me so hard, he knocked me into the wall. Needless to say, there was a heavy physical toll: the occasional bloody nose, odd sprain, broken bone, a dislocated wrist, and a spectacular head injury resulting from a missed block that required 14 stitches. (I still proudly bear a faint, jagged two-inch scar across my forehead. And I never made that mistake again.) I didn’t let any of it faze me. I worked doggedly on improving my technique and hired a personal trainer, packing on an extra 30 pounds of muscle over the course of two years. Not very feminine, I admit: I looked like a beefier version of Xena, Warrior Princess. At least I could take the abuse a little better. In October 2000, I became only the second woman in my system’s history to earn a black belt.

I learned a lot over that seven-year journey. Most importantly, I learned that Royce was right: good technique can compensate for a size and strength disadvantage. It’s just that the greater the size differential, the better your technique has to be, because there is that much less margin for error. And if your opponent is equally skilled — well, that’s when the trouble can start, even for a champion like Royce.

After those early, spectacular victories, Royce faded from the UFC spotlight for a while, focusing his efforts on the burgeoning Gracie industry: there is now a Gracie jujitsu school in almost every major US city. He’d proved his point, repeatedly, and it’s always wise to quit while you’re at the top. But every now and then he’d re-emerge, just to prove he still had the chops to be a contender. As recently as December 2004, he defeated the 6’8″, 483-pound (!) Chad Rowan in two minutes, 13 seconds, with a simple wrist lock. (“Either submit, or have it broken,” he supposedly said. Rowan wisely submitted.)

The very fact of Royce’s success inevitably caused the sport to change. Fighters were forced to learn groundfighting skills. Back when the UFC was all about martial arts style versus style, many fighters in more traditional disciplines — karate, tae kwon do, kickboxing — had never really learned how to fight effectively on the ground. The moniker changed from No-Holds-Barred to Mixed Martial Arts — a far more accurate designation these days. Today, the UFC has time limits (with occasional restarts to please the fans, who get bored watching a lengthy stalemate between two world-class grapplers), and even more rules: no hair-pulling, and no breaking fingers and toes. The formula is commercially successful — UFC events typically garner Nielsen ratings on a par with NBA and NHL games on cable television — but these are not conditions that favor the Gracie style. Eventual defeat was practically inevitable.

And so it came to pass over Memorial Day weekend. The UFC torch has passed to Hughes. But Royce’s legacy is incontrovertible. He changed the face of the sport forever by dominating so completely that he forced everyone else to adapt to him. That’s why he was one of the first three fighters to be inducted into the UFC Hall of Fame (along with Shamrock and Severn). Royce Gracie will always be a legend.

When not taking random walks at 3 Quarks Daily, Jennifer Ouellette muses about physics and culture at her own blog, Cocktail Party Physics.

Talking Pints: 1896, 1932, 1980 and 2008–What Kenny Rogers Can Teach the Democrats

by Mark Blyth

“You got to know when to hold ‘em, know when to fold ‘em, know when to walk away, and know when to run.”

Kenny Rogers may seem an unlikely choice for the post of Democratic party strategist, but the advice of ‘the Gambler’ may in fact be the single best strategy that the Democrats can embrace when considering how, and whom, to run in 2008. Although we are still a long way from the next US Presidential election, the wheels seem to have truly come off the Republicans’ electoral wagon. The ‘political capital’ Bush claimed after his reelection was used up in the failed attempt to privatize Social Security and in the continuing failure to stabilize Iraq. Sensing this, Congressional Republicans (and fellow travelers) increasingly distance themselves from Bush, claiming, in the manner of small furry passengers who have decided that the cruise was not to their liking after all, that the Bushies (and/or the Congressional Republicans) have betrayed the Reagan legacy, that Iraq was a really bad idea all along, and that when it’s all going to pot you might as well grab what you can in tax cuts for yourselves and head for the exits.

Such an uncharacteristic implosion from the usually well-oiled Republican machine might lead one to expect the Democrats to make real political inroads for the first time in years. Yet, as the line attributed to Abba Eban about the Palestinians goes, the Democrats “never miss an opportunity to miss an opportunity.” This lack of Democratic political bite, when seen against the backdrop of an already lame-duck second-term President, is remarkable. Leading Democrats, for example, cannot get a break. Joe Biden makes a ‘major’ policy speech on Iraq, and outside of the New York Times-reading ‘chattering classes’ it is roundly ignored. While some Democrats argue for a troop pull-out in Iraq, others in the same party urge ‘stay the course’, thereby ‘mixing the message’ ever further. Even populist rabble-rouser Howard Dean, now head of the Democratic National Committee, has all but disappeared from view.

Yet should we be surprised by this? Perhaps the Democrats are a party so used to offering ‘GOP-lite’ that they really have no independent identity. Just as Canada has no identity without reference to the USA (sorry Canada, but you know it’s true), so the Democrats have no identity without defining themselves against the GOP. But to be against something is not to be for anything. Given that the Republicans are clearly for something, namely the ‘fact-free politics’ of ‘freedom’, ‘prosperity’, ‘lower taxes’, ‘individual initiative’, and other feel-good buzzwords, the Democrats seem to have no one, and no big ideas, to take them forward, except perhaps one person – Hillary Clinton.

It’s pretty obvious that she wants the job. Much of the media has decided that she already has the Democratic nomination in the bag but is split on whether she can actually win. To resolve this issue, we need the help of an expert, and this is where I would like to call in Kenny Rogers. Mr. Rogers’ advice is that you have to know when to hold, fold, walk, or run. I would like to suggest that the best thing the Democratic Party can do is to realize that this next Presidential election is exactly the time to do some serious running, as far away from the White House as possible. Specifically, I propose the following electoral strategy for the Democrats:

  1. Hillary Clinton must run in 2008. She will lose. This is a good thing.

  2. If the Democrats lose in 2008, they might well win the following three elections.

  3. If the Democrats nominate anyone other than Hillary they might actually win in 2008, and this would be a disaster.

OK, how can losing the next election be a good thing for the Democrats? The answer lies in how some elections act as ‘critical junctures’, moments of singular political importance where, because an election went in one direction rather than the other, the next several elections went that way too. 1896 was such an election for the Republicans, as was 1932 for the Democrats, when they overturned Republican control and began their own long period of political dominance into the 1970s. Indeed, it is worth remembering that the Democratic party used to be the majority party in the US, and that the institutions and policies they set up in the 1930s and 1940s, from Social Security to Fannie Mae, are as popular as ever. One might add that only one of nine post-WW2 recessions occurred when the Democrats were in power. How then did the Democrats become the weak and voiceless party that they are now? The answer was Ronald Reagan and the critical election of 1980.

Reagan did something that no Democratic politician had ever done before: he (or at least those around him) really didn’t give a damn about the federal budget. Reagan managed to combine tax cuts, huge defense expenditure increases, super-high interest rates, and welfare cuts into a single policy package. Despite the supposed ability of voters to see through such scams and recognize that tax cuts now mean tax increases later, Reagan managed to blow a huge hole in federal finances and still be rewarded for it at the ballot box. Despite their fiscal effects, this tax-cutting ‘thing’ became extremely popular, and the Democrats had to find an issue of their own to argue against it. That new issue was the so-called ‘twin deficits’ that Reagan’s policies created, and the policy response was deficit reduction.

Under Reagan (and Bush the elder) the US ran increasingly large deficits both in the federal budget and in the current account. The Democrats of the day seized on these facts and banged on and on about them for a decade as if the very lives of America’s children depended on resolving them. The problem, however, was that as the world’s largest economy with the deepest capital markets, so long as foreigners were willing to hold US-dollar-denominated assets, no one had to pay for these deficits with a consumption loss. The US economy became the equivalent of a giant visa card where the monthly bill was only ever the minimum payment due. Take the fact that no one ever refused US bonds, and add in that most voters would have a hard time explaining what the budget deficit was, let alone why it was this terrible thing that had to be corrected with tax increases, and you have a political weapon as sharp as a doughnut. By arguing for a decade that the twin deficits were real and dangerous, and that tax increases and consumption losses (pain) were the only way forward, the Democrats went from being the party of ‘tax and spend’ to being the party of tax increases and balanced budgets, which simply played into Republican hands.

Which brings us to why the election of the other Clinton (Bill) in 1992 was not a critical turning point away from Republican politics in the way that 1932 was. Having banged on about how terrible the deficits were, once in power the Democrats had to do something about them. With the party boxed into a fiscal corner, Bill Clinton’s proposals for a budget stimulus and universal healthcare collapsed, and all that was left was (the very worthy) EITC and (the very painful) deficit reduction. Cleaning up the fiscal mess that the Republicans had made became Clinton’s main job, and this helped ensure that by 1996 Clinton was seen as a lame duck President who hadn’t really done anything. His win in 1996 confirmed this insofar as it resulted in no significant policy initiatives except the act of a Democrat ‘ending welfare as we know it.’ The asset bubble of the late 1990s may have made the economy roar, and Clinton’s reduction of the deficit may have helped in this regard, but the bottom line was that the Democrats were now playing Herbert Hoover to the Republicans’ Daddy Warbucks.

So Bush the younger was elected, and he continued the same tax-cutting agenda, but coupled this to huge post-9/11 military expenditures and the Iraqi adventure. As a result of these policies the US carries current account and federal deficits that would make Reagan blush, the Republicans have a splintering party and support base, and the country as a whole is mired in Iraq with a very unpopular President at the helm. Surely then 2008 can be a new 1932 in a way that 1992 wasn’t? The inauguration of a new era of Democratic dominance? Possibly…but only if the Democrats lose the 2008 election rather than win it. To see why, let us examine what might happen if the Democrats did win the next election with Hillary Clinton at the helm.

In terms of security politics, it’s far more likely that Iraq will go from bad to worse than from worse to better over the next few years. It’s a mess regardless of who is in charge, and the choices at this point seem to be ‘stay forever’ or ‘get out now.’ If the Republicans sense that they are going to lose in 2008, the smart thing to do would be to keep the troops in Iraq so that the Democrats would be the ones who would have to withdraw US forces. When that happens, Iraq goes to hell in a hand-basket, and the Democrats get blamed for ‘losing Iraq’ and ‘worsening US security.’ If on the other hand Bush pulls US forces out before 2008 and the Democrats win, the local and global security situation worsens, and the probability that ‘something really bad’ happens on the Democrats’ watch rises, which they then get the blame for.

In terms of economic policy, the structural problems of the US economy are only going to get worse over time. Since Bush came into office the dollar has lost over a third of its value against the Euro and around 20 percent against other currencies. This means higher import costs, which, along with higher oil prices, suggests future inflation and larger deficits. Given that the US relies on foreigners holding US debt, any future inflation and larger deficits would have to be offset with higher interest rates. This would negatively impact the already over-inflated US housing market, perhaps bursting the bubble and causing a deep recession. So, regardless of who is in office in 2008, the economy is likely to be in worse shape then than it is now. If the Democrats are in power and the economy tanks, they will get the blame for these outcomes regardless of the policies that actually brought the recession about.

In terms of cultural politics, social issues are likely to come to a head with the new Roberts Court finding its feet. It is probably safe to say that there will be an abortion case before the Court during the 2008-2012 cycle, if not before. This is usually treated as the clinching argument for why the Democrats must win the next election rather than lose it. Again, I disagree. Precisely because the only people who still think Bush is doing a good job are conservatives with strong social policy concerns, you can bet they will mobilize to get this policy through even if the rest of the world is crashing about their ears. I say let them have it. The sad truth is that if Roe v. Wade is overturned rich white women will still get abortions if they need them, and poor women will not be much worse off since they don’t get access to abortion in most of the country as it is. But more positively, if the Republicans go for this, anyone who says “I’m a moderate Republican” or “I’m socially liberal but believe in low taxes” etc., has to confront an awkward fact: that they self-identify with an extremely conservative social agenda, one that treats women’s bodies in particular, and sexual issues in general, as objects of government regulation. If this comes to pass then the Democrats have a chance to split the Republican base in two, isolate moderates in the party, and turn the Republicans into a permanent far-right minority party.

Finally, in terms of electoral politics, the Democrats have to face up to an internal problem – Hillary Clinton really is unelectable. While she may be smart, experienced, popular in the party, and have a shit-load of money behind her, the very appearance of her on television seems to result in the instant emasculation of around 30 million American men. Indeed, 33 percent of the public polled today say that they would definitely vote against her, and this at a time when Bush’s numbers are the worst of any President in two generations. It may be easy to forget how much of a hate figure Hillary Clinton was in the 1990s. One way to remember is to simply search amazon.com for books about Hillary Clinton and see how the ‘hate book’ industry that dogged the Clintons all through the 1990s is moving back into production with a new slew of ‘why she is evil incarnate’ titles. But Hillary Clinton is not just a hate figure for the extreme right. After a decade of mud slinging (that is about to go into high gear again) she is simply too damaged to win. There is a bright side to all this. Hillary Clinton is a huge figure in the Democratic party in terms of fundraising, profile, and ambition. The only way she will get out of the way and allow new figures in the party to come forward who might actually win is by her losing; so let her lose.

In sum, ‘knowing when to walk away, and when to run’ is a lesson the Democrats need to learn, and losing in 2008 would be ‘the Gambler’s’ recommendation. First, making the Republicans clean up their own mess would not only be pleasing to the eye; it would also be electorally advantageous. Forcing the Republicans to accept ownership of the mess that they have made makes their ability to ‘pass the buck’ onto the Democrats, as happened to Bill Clinton, null and void. Clearly, from the point of view of Democratic voters the probable consequences of a third Republican victory have serious short-term costs associated with them, but it is also the case that the possible long-term benefits of delegitimating their policies, watching their base shatter, and not having to clean up their mess and get blamed for it, could be greater still. Second, if the Democrats do win, then all the problems of Iraq, the declining dollar, the federal and trade deficits, higher interest rates, a popping of the housing bubble, a possible deep recession, and being blamed for the end of ‘the visa card economy’, become identified with the Democrats. They come in, get blamed for ending the party, clean up the mess, and get punished for it at the next election. Seriously, why do this? Third, if it is the case that Hillary Clinton will indeed get the nomination, then let her have it. She cannot win, so why not kill two birds with one stone? Nominate Hillary, run hard, and lose. That way Hillary cannot get nominated again, new blood comes into the party, and the Republicans have to clean up their own mess. Do this, and 2012 really might be 1932 all over again.

Mark Blyth’s other Talking Pints columns can be seen here.

NOTE: This essay is posted by Abbas Raza due to a problem with Mark’s computer.

Richard Wagner: Orpheus Ascending

Australian poet and author Peter Nicholson writes 3QD’s Poetry and Culture column (see other columns here). There is an introduction to his work at peternicholson.com.au and at the NLA.

A reassessment of Wagner and Wagnerism

The following (Part 1 June, Part 2 July, Part 3 August) is excerpted from a talk originally given at the Goethe Institut in Sydney on April 18, 1999 and subsequently published in London by the Wagner Society of the United Kingdom in 2001.

Part 1:

There are several versions of the ancient Greek myth of Orpheus. In the best known of these Orpheus goes down to the Underworld to seek the return of his wife Eurydice who had been killed by the bite of a snake. The lord of the Underworld agrees on the condition that Orpheus should not turn round and look at Eurydice until they reach the Upper World. The great singer and musician who could charm trees, animals and even stones could not survive this final and most perilous of temptations. He turns to look on his beloved wife and she is lost to him forever. Another version of the myth tells of Orpheus being torn to pieces by the Thracian women or Maenads; his severed head floated, singing, to Lesbos.

Wagner’s dismembered head continues to sing, unheard by many and misunderstood by most. That beautiful yet volatile singing head with its Janus face, enigmatically poised between black holes and galaxies; that is the head we still find puzzling. And because our civilisation does not like puzzles, and wishes to rationalise whatever has provoked it to think or feel, our best critical efforts have reduced one of the greatest creative and cultural phenomena of Western culture to manageable proportions.

Well, Wagner continues to ascend, leaving Wagnerism behind to do battle on any number of fronts, whether at Bayreuth with its interminable family squabbling, or in the raft of prose that has followed in the wake of the German gigantomane, or through those clichés that Wagnerian ideology has left us with, as the Valkyries ride their helicopters across a Vietnamese apocalypse or another wedding is inaugurated to the strains of the bridal chorus from Lohengrin.

That is how we now manage the Wagnerian cosmos. Cliché helps us to feel comfortable near this unquiet grave with its all-too-human disturbing element. Humour helps us too, and it’s necessary—we haven’t the fortitude, the talent, the persistence, or, indeed, the genius, to bring into being imperishable works of art. We like to laugh when Anna Russell tells us, ‘I’m not making this up, you know’. But of course that is exactly what Wagner did do; he made up an entire aesthetic and cultural world that we still have not been able to come to terms with. We try from time to time to make sense of the life and the work, but never without those attendant twins, partisanship and antagonism.

So Wagner is ascending, like Orpheus, to his place in the cultural imperium, alienated from the world’s embrace, a lonely figure, so lonely in his own life, and lonely still. Liszt saw all too clearly what Wagner would have to accustom himself to, and in a letter to his friend, when Wagner intimated thoughts of ending it all, advised, ‘You want to go into the wide world to live, to enjoy, to luxuriate. I should be only too glad if you could, but do you not feel that the sting and the wound you have in your own heart will leave you nowhere and can never be cured? Your greatness is your misery; both are inseparably connected, and must pain and torture you until you kneel down and let both be merged in faith!’ [Hueffer. Correspondence of Wagner and Liszt, Vienna House, 1973, Vol One, p 273]

The world may celebrate his work, technology may bring his music to every part of the planet, yet another monograph may be published; Wagner turns to look for his audience, and at once he loses that audience. The bloodlust that can be unleashed by a good Wagner performance, the obsessions notoriously associated with Wagnerism, the strident approbation and denunciation—these are not the cultural signifiers of classicism freely given to Shakespeare, Mozart or Goethe. Wagner evades classicism still; yet that is his predestined end. We are still too close to the psychic firestorm of his imagination, and we are still too disturbed by the misuse of his art, for that classicism to show any signs of emerging. Even as passionate a Wagnerian as Michael Tanner cannot bring himself to fully equate Shakespeare and Wagner, an equation that cultural history proposes but which we are not yet up to accepting.

Recently a German said to me, ‘You English have your Shakespeare; we have our Wagner.’ Granted my passion for Wagner and my lack of cultural chauvinism, it was perhaps odd that I was so shocked by her remark. I didn’t want to say, ‘But Shakespeare is greater than Wagner.’ But I did feel a strong urge to protest about Wagner’s being seen as part of the cultural landscape in the same way that Shakespeare is (even for Germans, thanks to Tieck and Schlegel). [Michael Tanner, Wagner. HarperCollins, 1996, p 211]

Wagner’s uncertain cultural status reaches beyond our historical moment. Perhaps a Hegelian analogy is best: thesis, antithesis, synthesis. The life, 1813-1883, represents the thesis—and what a proposition it is. The twentieth century represents the antithesis replete with reductionism, antagonism, equally disreputable fanaticism and hatred. It remains for the future to offer the synthesis. And when that synthesis occurs, then Wagner will have ascended to the Upper World; his audience will not flinch from looking at him directly.

Shakespeare was lucky not to have left much biographical debris behind. When the biographers and critics got to work, the focus of their studies was necessarily on the plays and poems themselves. The lacerated spirit that gave birth to the murderous rampage of a Macbeth, the suicidal melancholy of a Hamlet or the self-hatred and disgust of a Lear was easily accommodated to textual analysis and theorising because biographical motive was missing. The hunt for the Dark Lady of the Sonnets was a pastime for some but, on the whole, scholars were prepared to indulge Shakespeare’s evident greatness. Only recently have they come around to asking why Shakespeare’s younger daughter couldn’t write.

No such luck for Wagner. There is enough biographical material laid on the line to keep critics in clover until the end of time—letters, autobiographies, diaries, pamphlets, theoretical writings. And that’s just the primary material. Has any scholar yet read all of it? Then there is the secondary material and we know that it is now beyond the ability of anyone to read it, let alone make sense of it. This deluge of material shows no sign of abating. Are we now any closer to understanding the phenomenon of Wagner? Wagnerism seems to be one of the chief ways with which we seek to cope with what is now considered to be the ‘problem’ of Wagner.

Thus two mutually antagonistic modes of thinking fail to reach any accommodation with one another. It seems that Wagnerian historiography must advance, not by the slow accumulation of historical and cultural detail, but always explosively, so that an apparent understanding of events is wrenched apart by either previously unknown factual details or fresh polishing of a facet of the Wagnerian rough diamond.

[Parts 2 and 3 of Orpheus Ascending can be read here and here.]

Monday Musing: Susan Sontag, Part I

In an essay about the Polish writer Adam Zagajewski, Sontag writes that as Zagajewski matured he managed to find “the right openness, the right calmness, the right inwardness (he says he can only write when he feels happy, peaceful.) Exaltation—and who can gainsay this judgment from a member of the generation of ’68—is viewed with a skeptical eye.” She’s writing about what Zagajewski was able to achieve but she is also, of course, writing about herself.

Sontag was also a member of the generation of ’68, if a slightly older one. She too achieved an openness, calm, and inwardness as she matured, though it came with regrets and the sense that the pleasure of a literary life is an ongoing battle against a world that is predisposed to betray that pleasure.

Writing about Zagajewski again, she explains that his temperament was forged in the fires of an age of heroism, an ethical rigor made sharp by the demands of history. These men and women spent decades trying to write themselves out of totalitarianism, or they were trying to salvage something of their selves from what Sontag does not hesitate to call a “flagrantly evil public world”. And then suddenly, in 1989, it was all over. The balloon popped, the Wall came down. Wonderful events, no doubt, but with the end of that era came the end of the literary heroism made possible by its constraints. Sontag says, “how to negotiate a soft landing onto the new lowland of diminished moral expectations and shabby artistic standards is the problem of all the Central European writers whose tenacities were forged in the bad old days.”

Sontag also managed to come in softly after scaling the heights of a more exuberant time. In Sontag’s case, she wasn’t returning to earth after the struggle against a failing totalitarianism, she was coming down from the Sixties. But that is one of the most remarkable things about her. Not everyone was able to achieve such a soft landing after the turbulence and utopian yearnings of those years.

Sontag’s early writings are shot through with a sense of utopian exaltation, an exaltation so often associated with the Sixties. In her most ostensibly political work, “Trip to Hanoi”, she talks specifically about her mood in those days. As always, she is careful not to overstate things. “I came back from Hanoi considerably chastened,” she says. But then she goes on, heating up. “To describe what is promising, it’s perhaps imprudent to invoke the promiscuous ideal of revolution. Still, it would be a mistake to underestimate the amount of diffuse yearning for radical change pulsing through this society. Increasing numbers of people do realize that we must have a more generous, more humane way of being with each other; and great, probably convulsive social changes are needed to create these psychic changes.”

You won’t find Sontag in a more exalted state than that. Rarely, indeed, does she allow herself to become so agitated and unguarded, especially in the realm of the outwardly political. But that is exactly where one must interpret Sontag’s politics, and exaltation, extremely carefully.

Sontag’s political instincts gravitate toward the individual, in exactly the same way that she reverses the standard quasi-Marxian directions of causality in the above quote. Marxists generally want to transform consciousness as the necessary first step toward changing the world. In contrast, Sontag wants the world to change so that we can get a little more pleasure out of consciousness. Convulsive social changes, for Sontag, are but extreme measures for effecting a transformation that terminates in psychic changes. Politics means nothing if it obscures the solid core of the individual self. Her commitment to this idea gives all of her writing a Stoic ring even though she never puts forward a theory of the self or a formal ethics. It is the focus on her particular brand of pleasure that provides the key. Pleasure and the Self are so deeply intertwined in Sontag’s writing that one cannot even be conceived without the other.

Writing years later, in 1982, about Roland Barthes, Sontag spoke again of pleasure and the individual self. Barthes’ great freedom as a writer was, for Sontag, tied up with his ability to assert himself in individual acts of understanding. Continuing a French tradition that goes back at least to Montaigne (a man not unaware of the Stoics), she argues that Barthes’ writing “construes the self as the locus of all possibilities, avid, unafraid of contradiction (nothing need be lost, everything may be gained), and the exercise of consciousness as a life’s highest aim, because only through becoming fully conscious may one be free.” She speaks about the life of the mind as a “life of desire, of full intelligence and pleasure.”

A human mind, i.e., an individual mind, will, at its best, be ‘more generous’ and ‘more humane’. But for Sontag, it is what humans have access to in the world of ideas, as individual thinking agents, that marks out the highest arena of accomplishment.

“Of course, I could live in Vietnam,” she writes in “Trip to Hanoi”, “or an ethical society like this one—but not without the loss of a big part of myself. Though I believe incorporation into such a society will greatly improve the lives of most people in the world (and therefore support the advent of such societies), I imagine it will in many ways impoverish mine. I live in an unethical society that coarsens the sensibilities and thwarts the capacities for goodness of most people but makes available for minority consumption an astonishing array of intellectual and aesthetic pleasures. Those who don’t enjoy (in both senses) my pleasures have every right, from their side, to regard my consciousness as spoiled, corrupt, decadent. I, from my side, can’t deny the immense richness of these pleasures, or my addiction to them.”

Sontag’s political thinking is driven by the idea that what is otherwise ethical, is often thereby sequestered from what is great, and what is otherwise great, is often mired in the unethical. She never stopped worrying about this problem and she ended her life as conflicted about it as ever. It was a complication that, in the end, she embraced as one of the interesting, if troubling, things about the world.

But for a few brief moments, as the Sixties ratcheted themselves up year after year, she indulged herself in considering the possibility that the conflict between ethics and greatness could be resolved into a greater unity. She thought a little bit about revolution and totality. She got excited, exalted. Summing up thoughts about one of her favorite essays, Kleist’s “On the Puppet Theater,” Sontag leaves the door open for a quasi-Hegelian form of historical transcendence. She says, “We have no choice but to go to the end of thought, there (perhaps), in total self-consciousness, to recover grace and innocence.” Notice the parenthesis on ‘perhaps’. She’s aware that she (and Kleist) are stretching things by saying so, but she can’t help allowing for the possibility of ‘total self-consciousness’. Often, when Sontag uses parentheses she is allowing us a glimpse into her speculative, utopian side.

In “The Aesthetics of Silence” (1967), for instance, she equates the modern function of art with spirituality. She defines this spirituality, putting the entire sentence in parentheses: “(Spirituality = plans, terminologies, ideas of deportment aimed at resolving the painful structural contradictions inherent in the human situation, at the completion of human consciousness, at transcendence.)”

***

In the amazing, brilliant essays that make up the volume Against Interpretation it is possible to discover more about the utopian side of Sontag’s thinking. Drawing inspiration from Walter Benjamin, whose own ideas on art explored its radically transformative, even messianic potential, Sontag muses that, “What we are witnessing is not so much a conflict of cultures as the creation of a new (potentially unitary) kind of sensibility. This new sensibility is rooted, as it must be, in our experience, experiences which are new in the history of humanity…”

Again with the parenthesis. It is as if, like Socrates, she always had a daimon on her shoulder warning her about pushing her speculations too far. But the talk of unity is an indication of the degree to which she was inspired by the events of the time, or perhaps more than the specific events of the time, by the mood and feel of the time. Her sense that there was an “opening up” of experience, sensibility, and consciousness drove Sontag to attack certain distinctions and dichotomies she saw as moribund. Again following closely in the footsteps of Walter Benjamin and his influential “The Work of Art in the Age of Mechanical Reproduction,” she writes, “Art, which arose in human society as a magical-religious operation, and passed over into a technique for depicting and commenting on secular reality, has in our own time arrogated to itself a new function…. Art today is a new kind of instrument, an instrument for modifying consciousness and organizing new modes of sensibility.” This led her to a central thesis, a thesis that drove her thinking throughout the Sixties, a thesis that is nestled into every essay that makes up Against Interpretation. She sums it up thusly:

“All kinds of conventionally accepted boundaries have thereby been challenged: not just the one between the ‘scientific’ and the ‘literary-artistic’ cultures, or the one between ‘art’ and ‘non-art’; but also many established distinctions within the world of culture itself—that between form and content, the frivolous and the serious, and (a favorite of literary intellectuals) ‘high’ and ‘low’ culture.”

Sontag’s famous “Notes on ‘Camp’” is simply a sustained attempt to follow that thesis through. Her defense of camp is a defense of the idea that worth can be found in areas normally, at least back in the Sixties, relegated to the realm of the unserious. The new unity was going to raise everything into the realm of the intellectually interesting, and pleasurable.

Yet Sontag is not trying to abolish all distinctions. Hers isn’t a leveling instinct. Even in her earliest days, Sontag was suspicious of the radically democratic impulses that would, say, collapse art and entertainment. Sontag is doing something different. She is trying to show that the arena for aesthetic pleasure should be vastly expanded, but never diluted. She wants the new critical eye to stay sharp and hard. Sontag’s version of pleasure is an exacting one. It is relentless and crystalline. It is an effort.

“Another way of characterizing the present cultural situation, in its most creative aspects, would be to speak of a new attitude toward pleasure. . . . Having one’s sensorium challenged or stretched hurts. . . . And the new languages which the interesting art of our time speaks are frustrating to the sensibilities of most educated people.”

In this, there was always an element of the pedagogue in Sontag. She was trying to teach a generation how to tackle that frustration in the name of aesthetic pleasure. She was driven by her amazing, insatiable greed for greater pleasure. She wanted us to be able to see how many interesting and challenging things there are in her world of art, a world vaster and richer than the one surveyed by the standard critical eye of her time. And at least in the Sixties, her passion for greatness and its pleasures spilled over into a yearning for a societal transformation that would make that passion and pleasure universal…

to be continued…

Monday, May 29, 2006

Teaser Appetizer: The Definition of Health

The World Health Organization (WHO) defines health as “a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.” This definition entered the books between 1946 and 1948 and has remained unchanged since.

Current medical knowledge is struggling desperately — and only with partial success — just to “merely” control “disease or infirmity,” while “complete well-being” is unlikely to sprout out of our incomplete knowledge. If your politicians were to legislate health by this definition, they would be in default forever, for one obvious reason: no nation — I repeat — no nation has the knowledge or the resources to deliver care that matches this definition. We all learnt in kindergarten — well, except the politicians — not to promise what we cannot fulfill.

This definition is a lofty, laudable, visionary statement that may reflect a distant aspiration, but its realization is elusive in current practice. In all humility, we should concede that “complete well-being” is probably an unquantifiable metaphysical state, unattainable without taming nature’s evolutionary laws of life and death. And to presume that we have the ability to do so is a whiff of arrogance — an aromatic trait our species emits in abundance.

The realization of this dream probably seemed feasible in 1948, when we had made a quantum leap in understanding infectious diseases and, for the first time in human history, were exuberant in our demonstrated ability to extend longevity by about twenty years in some countries. But that was well before we could predict the explosion of health technology or understand its consequential individual, societal and economic effects.

Isn’t it time we sought a second opinion on the health of this definition and evolved a flexible one — a definition that encompasses current reality and is malleable enough to accommodate future developments?

While the WHO definition stays seemingly immutable, a new framework linked to human rights has evolved. The human-right-to-health paradigm reiterates that the enjoyment of the highest attainable standard of health is a fundamental right of every human being. This linkage has provided an inspirational tool with which to demand “health.” The tenor of this discourse takes a cue from the rhetoric of Kofi Annan: “It is my aspiration that health will finally be seen not as a blessing to be wished for, but as a human right to be fought for.”

This paradigm recognizes that violation of human rights has serious health consequences and promoting equitable health is a prerequisite to development of the society. The discourse rightly demands abolition of slavery, torture, abuse of children, harmful traditional practices and also seeks access to adequate health care without discrimination, safe drinking water and sanitation, safe work environment, equitable distribution of food, adequate housing, access to health information and gender sensitivity.

All nations are now signatories to at least one human rights treaty that includes health rights. One hundred and nine countries had guaranteed the right to health in their constitutions by the year 2001, which qualifies it as an effective instrument for policy change; but it also raises some difficult questions.

Human rights discourse uses the words health and health care interchangeably. Rony Brauman, past president of Médecins Sans Frontières, comments: “WHO’s definition of a ‘right to health’ is hopelessly ambiguous. I have never seen any real analysis of what is meant by the concept of ‘health’ and ‘health for all,’ nor do I understand how anyone could seriously defend this notion.” The notion would be more defensible if a demand for health care replaced the demand for health.

Yet no country in the world can afford to give all health care to all its citizens all the time. Nations conduct a triage of priorities according to their prejudices, and large swaths of populations are not caught in the health care net. Even nations that have the right to health embedded in their constitutions face a gap between aspirations and resources.

The human rights debate skirts round the issue by invoking the “principle of progressive realization,” which allows resource-strapped countries to promise increments in health care delivery in the future. This effectively gives governments a tool to ration and allocate resources, even when doing so conflicts with individual rights.

The following example illustrates the problem: the post-apartheid government of South Africa enshrined the right to health in its constitution, yet the courts decided against a petitioner who demanded the dialysis he needed for chronic kidney failure. The court ruled that the government did not have an obligation to provide the treatment. In essence, the court transferred some responsibility to the individual.

Gandhi, too, expressed his concern that rights without responsibility are a blunder. A responsibility paradigm could supplement the rights movement; a pound of responsibility could prove heavier than a ton of rights. But the current noise for rights has muzzled the speech for responsibility, and “complete health” is becoming an entitlement to be ensured by the state, without any demand that the family and the individual be equal stakeholders. Hippocrates said, “A wise man ought to realize that health is his most valuable possession and learn to treat his illnesses by his own judgment.”

This conflict will escalate further with the impact of biotechnology. A quote from Craig Venter gives the feel: “It will inevitably be revealed that there are strong genetic components associated with most aspects of what we attribute to human existence — the danger rests with what we already know: that we are not all created equal. —- revealing the genetic basis of personality and behavior will create societal conflicts.”

Derek Yach, a respected public health expert and professor at Yale University says “With advances in technology, particularly in the fields of imaging and genetic screening, we now recognize that almost all of the population either has an actual or potential predisposition to some future disease.”

We can’t help but rethink health itself before we promise health care. An alternative definition can be derived from the “health field” concept of Marc Lalonde, who was the health minister of Canada in 1974. He surmised that the interplay of four elements determined health, namely: genetic makeup, environment (including social factors), individual behavior and the organization of health care. The health field model holds many stakeholders accountable.

Each stakeholder approaches health with a seemingly different goal (even though the goals complement each other). A healthy person wishes not to fall sick; a sick person demands quick relief; a health care provider attempts to cure and prevent disease; a molecular biologist envisions control of molecular dysfunction; a public health person allocates resources to benefit the maximum number of people; a health economist juggles finances within the budget; the government facilitates or hampers the delivery of care according to its priorities; and the activist demands that every person has the right to the “highest attainable standard of physical and mental health.”

Many stakeholders mean more questions than answers. Who decides the limits of health a society should attain? Should the boundary stop at basic primary care, or extend to genetic manipulation to deliver well-being? Who decides the mechanism of attaining that limit? Who decides what positive mental well-being is? And who pays for it?

It is apparent that “complete well-being” is as much an oxymoron as “airline food”! We urgently need a new definition as a starting point for debate: a definition that is quantifiable for outcomes, accommodative of stakeholders, absorbent of future advances, accountable for delivery of care and cognizant of limitations. The new definition has to be both correct and politically correct. Dr. Brundtland, former director-general of the WHO, wrote in the World Health Report that “The objective of good health is twofold — goodness and fairness; goodness being the best attainable average level; and fairness, the smallest feasible differences among individuals and groups.” We should match our expectations to reality.

These elements, compressed and enveloped into a workable statement, may read as follows:

Health is a state of freedom from physical and mental consequences of molecular and psychological derangements caused by the interaction of individual biology and the environment; health care is an attempt to reverse such derangement by providing equitable access to all without discrimination within the constraints of available resources and knowledge.

You may call this, if you please: the 3QD definition of health — you read it here first!

Dispatches: Affronter Rafael Nadal

Roland Garros, or tennis’ French Open, started yesterday.  Perhaps you’ve noticed; articles ran in most Sunday papers about it, quite extensive ones too, considering that the French has often been viewed as a third-rate (after Wimbledon and the U.S. Open) Grand Slam tournament, largely because it is usually won by a cadre of specialists instead of the best-known players.  Not only is this perception unfair, but, this year, Roland Garros will be the most important men’s tennis tournament of the year.  Here’s why.

The increasing specialization of tennis has meant that this tournament, the only Grand Slam played on clay, has a set of contenders that is quite distinct from those at the grass courts of Wimbledon and the hardcourts of Flushing Meadows, Queens.  Not only has it been won by players who have not been dominant on the other surfaces, but it has been very difficult for anyone to enjoy repeat success sur la terre battue.  Ten of the last twelve Wimbledons were won by Pete Sampras and Roger Federer; the last five winners of Roland Garros are Gustavo Kuerten, Albert Costa, Juan Carlos Ferrero, Gaston Gaudio, and Rafael Nadal.  I’m going to try to explain both phenomena (specialized success and lack of repeat dominance) below.

Why does it make a difference what surface the game is played on, and what difference does it make?  Basically, the surface affects three things: the speed of the ball after it bounces, the height of the ball’s bounce, and the player’s level of traction on court.  In terms of the speed of the ball and height of its bounce, clay is the slowest and highest, and grass is the fastest and lowest, with hardcourt in the middle.  This results in differing strategies for success on each surface, with grass rewarding aggressive quick strikes – with the speed of the ball and the low bounce, you can ‘hit through’ the court and past the other player with relative ease.  For this reason, the great grass-court players have mostly been offensive players, who use serve-and-volley tactics (i.e., serving and coming to net to take the next ball out of the air).  Clay, on the other hand, reverses this in favor of the defensive player: the slow, high bounce means it is very tough to hit past an opponent, and points must be won by attrition, after long rallies in which slight positional advantages are constantly being negotiated before a killing stroke.  Clay-court tennis is exhausting, brutal work.

Clay and grass, then, are opposed, slow and fast, when it comes to the ball.  How then did Bjorn Borg, perhaps the greatest modern player (he accomplished more before his premature retirement at twenty-five than anyone other than Sampras) manage to win Roland Garros (clay) six times and Wimbledon (grass) five but never a major tournament on the medium paced surface, hardcourt?  The third variable comes into play here: traction.  Clay, and, to a lesser extent, grass, provide negative traction.  That is, you slip when you plant your foot and push off.  Hardcourt provides positive traction – your foot sticks.  Consequently, entirely different styles of quickness are needed.  Borg didn’t like positive traction.  On clay, particularly, players slide balletically into the ball, the timing for which skill is developed during childhood by the most talented players, most of whom grew up in countries where clay courts are the rule: Spain, Italy, Argentina, Chile, Brazil.  Grass is not as slidey, but offers less traction than the sticky hardcourts, and like clay, grass’ uneven natural surface produces unpredictable hops and bounces, frustrating the expectations of the more lab-conditioned hardcourt players.

So, clay slows the ball and provides poor footing, both of which mean that it’s ruled by an armada of players who grow up playing on it and mastering the movement and strategic ploys it favors.  Perhaps foremost among these is the dropshot, which works because the high bounce of the clay court drives players way back and sets them up for the dropper.  This explains the dominance of the clay specialists, but why has the title switched off among so many players lately?  For the most part, this is because of the grinding nature of clay.  So much effort must be expended to win a match (five sets on clay can take five hours of grueling back-and-forth; in contrast, bang-bang tennis on grass can be practically anaerobic), that players tire over the course of the tournament, and so much depends upon perseverance that a superhuman effort will often overcome a greater talent.  It just so happens that last year there emerged a player who combines the greatest clay talent with the greatest amount of effort, but more on him below.  For now, let me return to my claim that this edition of the French is the most important men’s tennis event this year.

Historically, the greatest offensive players (meaning players who try to dictate play and win points outright, rather than counterpunchers, who wait for their opening, or retrievers, who wait for you to mess up), have been unsuccessful at Roland Garros, while the defensive fiends who win in Paris have been unsuccessful on grass.  (Borg, a counterpunching genius, is the great exception.)  The best attackers, namely John McEnroe, Boris Becker, Stefan Edberg, and of course Pete Sampras, have won zero French Opens, while Ivan Lendl, a three-time Roland Garros winner, narrowly failed in his endearing late-career quest to win Wimbledon (all of these players won major titles on hardcourts as well).  The only man since 1970, in fact, to win all four major titles (known as the Grand Slam tournaments), on the three disparate surfaces, is one Andre Agassi, a hybrid offensive baseliner.  This has made the dream of winning all four Slams in a single year, a feat also known, confusingly, as winning the Grand Slam–last accomplished by Rod Laver in 1969–seem pretty quixotic nowadays.  Until now.  The game’s best current offensive player is also an excellent defensive player, and an extremely competent mover and slider on clay.  Roger Federer has the best chance of anyone since Agassi to win the career Grand Slam, and, as the holder of the last Wimbledon, U.S. Open, and Australian titles, could win his fourth straight major this month (a feat he is calling, with a little Swiss hubris, the “Roger Slam”).  If he succeeds this year at Roland Garros, he’ll accomplish something Sampras couldn’t, and if he does I think it’s almost inevitable that he’ll sweep London and Flushing and complete the calendar Grand Slam as well. 

Standing in the way of Federer’s c.v.-building efforts is the aforementioned combination of talent and drive, the nineteen-year-old Mallorcan prodigy Rafael Nadal.  He had one of the finest seasons I’ve ever seen last year, absolutely destroying the field on clay, winning Roland Garros, winning over Agassi in Montreal and over Ljubicic in Madrid.  He’s now won a record 54 matches on clay without a loss.  Not only does Nadal’s astonishing effort level intimidate opponents, but he is surprisingly skilled, a bulldog with the delicacy of a fox.  You can see him break opponents’ spirits over the course of matches, endlessly prolonging rallies with amazing ‘gets,’ or retrievals, which he somehow manages to flick into offensive shots rather than desperate lobs.  When behind, he plays even better until he catches up.  His rippling physique and indefatigable, undying intensity make him literally scary to face on clay.  And yet, when off the court, he is a personable and kind presence at this stage of his young life.  All in all, a player this brutal has no business being this likable, but there it, and he, is.

Nadal and Federer have played six times: Nadal has won five, and held a huge lead in the other before wilting on a hardcourt.  Let me underline here just how anomalous this state of affairs is: here we have the world number one on a historic run of victories, and yet he cannot beat number two.  Federer has lost his last three matches with Nadal; with all other players, he has lost three of his last one hundred and nineteen matches.  Rafa is the only player on whom Federer cannot impose his will; indeed, Federer must try and quickly end points against Nadal to avoid being imposed upon.  In the final at Rome two weeks ago, Federer unveiled a new strategy, coming in to net whenever the opportunity arose, though not directly following his serve.  Federer’s flexibility, his ability to adopt new tactics, made for a delicious and breathtaking final, which he led 4-1 in the fifth and final set, and held two match points at 5-4.  Here Nadal’s hypnotic retrieving unnerved him once again, and two errors led the match to a fifth-set tiebreaker.  In a microcosmic repetition, Federer again led (5-3 and serving) and again let the lead slip away.  Nadal, after a full five hours, took the title and reconfirmed his psychological edge, even over the most dominant player of the last twenty years.  His confidence will be nearly unimpeachable, where Federer’s will be shaken by losing a match in which he played the best clay-court tennis of his life.  If, as expected, they play again in the final of Roland Garros, for all the marbles, you’re going to see the most anticipated tennis match in several years.

(Note: I have gone on for way too long without handicapping the women’s field, for which I apologize.  I’ll just say here that I am hopeful that France’s glorious all-court player, Amelie Mauresmo, will win.)

See All Dispatches.

Selected Minor Works: Why We Do Not Eat Our Dead

Justin E. H. Smith

[An extensive archive of Justin Smith’s writing is now online at www.jehsmith.com]

Now that an “extreme” cookbook has hit the shelves offering, among other things, recipes for human flesh (Gastronaut, Stefan Gates, Harcourt, 257 pages; paperback, $14), perhaps our gross-out, jack-ass culture has reached the point where it is necessary to explain why these must remain untried.

I will take it for granted that we all agree murder is wrong. But this alone is no argument against anthropophagy, for people die all the time, and current practice is to let their remains simply go to waste. Why not take advantage of the protein-rich corpses of our fallen comrades or our beloved elderly relatives who have, as they say, “passed”? Surely this would not be to harm them or to violate their integrity, since the morally relevant being has already departed or (depending on your view of things) vanished, and what’s left will have its integrity stolen soon enough by flame or earth. Our dearly departed clearly have no objections to such a fate: they are dead, after all. Could we not then imagine a culture in which cannibalizing our dead were perfectly acceptable, perhaps even a way of honoring those we loved?

The fact that we do not eat our dead, in spite of their manifest indifference, has been duly noted by some participants in the animal-rights debate. They think this reveals that whatever moral reasoning goes into our decisions about what sort of creature may be eaten and what must be left alone, it simply is not, for most of us, the potential suffering of the creature that makes the moral difference. Whereas Peter Singer believes that we should stop eating animals because they are capable of suffering, others have responded that this is beside the point, since we also make humans suffer in multifarious ways. We just don’t eat them.

But again, why not? Some moral philosophers have argued that the prohibition has to do with respect for the memory of the deceased, but this can’t get to the heart of it, since there’s no obvious reason why eating a creature is disrespectful to its memory.

It may be that the answer is simply this: as a species, we are carrion-avoiders. After all, it is not just the vegetarian who will not eat a cow struck by lightning, but the carnivore as well. Put another way: we do not eat fallen humans, but we also do not eat fallen animals; we eat slaughtered animals. It is then perhaps not so much the fact that dead humans are (or were) human that prevents us from eating them, but the fact that they are carrion, and that we, as a species, are not scavengers.

Consider in this connection the Islamic Shariah laws that one must follow if one wishes to eat a camel that has fallen down a well (I turn here to the version of the rules as stated by the Grand Ayatollah Sistani): “[If the camel] falls down into a well and one feels that it will die there and it will not be possible to slaughter it according to Shariah, one should inflict a severe wound on any part of its body, so that it dies as a result of that wound. Then it becomes… halal to eat.”

Now, why is it considered so important to inflict a fatal wound before the camel dies as a result of its fall? Though this is but one culture’s rule, it seems to be the expression of a widespread prohibition on eating accidentally dead animals. In the case of the camel, an animal is about to die from an accident, and the instruction is: if you want to eat it, you had better hurry up and kill it before it dies! This suggests that people do not slaughter simply so that a creature will be dead, but rather so that it will be dead in a certain way. Relatedly, in the southern United States, roadkill cookbooks are sold in souvenir shops as novelty items, and the novelty consists precisely in the fact that tourists are revolted and amused by the thought of the locals scavenging like vultures.

Of course, human beings do in fact eat other human beings, just not those dead of natural or accidental causes. Some decades ago, the reality of cannibalism was a matter of controversy. In his influential 1980 book, The Man-Eating Myth: Anthropology and Anthropophagy, the social anthropologist William Arens argued that stories of cannibal tribes were nothing more than racist, imperialist fantasies. Recently, though, substantial empirical evidence has accumulated for the relative frequency of cannibalism in premodern societies. Notable among this work is Tim White’s archaeological study of anthropophagy among the Anasazi of southwestern Colorado in the twelfth century. More recently, Simon Mead and a team of researchers have made the case, on the basis of genetic analysis, that epidemics of prion diseases plagued prehistoric humans and were spread through cannibalistic feasting, in much the same way that BSE spreads among cattle.

In the modern era, frequent reports of cannibalism connected with both warfare and traditional medicine come from both natives and visitors in sub-Saharan Africa. Daniel Bergner reported in the New York Times that “in May [2003], two United Nations military observers stationed in northeastern Congo at an outpost near Bunia, a town not far from Beni, were killed by a local tribal militia. The peacekeepers’ bodies were split open and their hearts, livers and testicles taken – common signs of cannibalism.” One of Bergner’s informants, a Nande tribesman, recounts what happened when he was taken prisoner by soldiers from the Movement for the Liberation of Congo:

“One of his squad hacked up the body. The commander gave Kakule [the informant] his knife, told him to pare the skin from an arm, a leg. He told Kakule and his other assistant to build a fire. From their satchels, the soldiers brought cassava bread. They sat in a circle. The commander placed the dead man’s head at the center. He forced the two loggers to sit with them, to eat with them the pieces of boiled limb. The grilled liver, tongue and genitals had already been parceled out among the commander and his troops.”

Bergner notes that it is a widespread, and commonly acknowledged belief in the region that eating the flesh, and especially the organs, of one’s enemy is a way to enhance one’s own power. This practice is sufficiently documented to have been accepted as fact by both the U. N. high commissioner for human rights as well as Amnesty International.

Cannibalism has been observed in over seventy mammal species, including chimpanzees. The hypothesis that cannibalism is common to all carnivorous species, or that this is something of which all carnivores are capable under certain circumstances, does not seem implausible. If one were to argue that these recent reports are fabrications, and that its modern disappearance in our own species has something to do with ethical progress, surely sufficient counterevidence could be produced from other, even better documented practices to quickly convince all concerned that no progress has been made.

The evidence suggests that, when cannibalism does happen, it is never the result of the fortuitous death of a comrade and the simple need among his survivors for protein. Rather, it follows upon the slaughtering of humans, which is exactly what we would expect, given the human preference for slaughtered pigs and cows over lightning-struck ones. Where eating animals is permitted, there is slaughter. And where slaughtering humans is permitted, the general prohibition on eating them does not necessarily hold.

In short, eating human beings is wrong because murder is wrong, and there’s no way to get edible human meat but by slaughter. I suppose Stefan Gates could look for a “donor,” who would in case of an untimely death — a car accident, say — dedicate his body to pushing the limits of experimental gastronomy. But if the cook fails to find any willing diners, this may have much more to do with our distaste for roadkill than with respect for the memory of a fellow human.

Monday Musing: Frederica Krueger, Killing Machine

It is a warm, languorous, late-spring day here in New York, and I don’t feel like thinking about anything complicated. So, I’m just going to tell you a cat story today.

A couple of months ago, my wife Margit’s friend Bailey asked us to look after her cat (really just a kitten) while she was going to be out of town for about ten days. It was decided that the cat would just stay with us during that time. Bailey had only recently found the cat cowering in her basement, half-starved and probably living on the occasional mouse or whatever insects or other small creatures she could find. Bailey hadn’t got around to naming the cat yet, and not wishing to prematurely thrust a real name upon her, we just called her Catty while she stayed with us. We thought she must be about six months old at that time, but she was quite tiny. Catty, to put it kindly, turned out to be a more ferociously mischievous cat than I had ever seen before. She did not like to be petted, and shunned all forms of affection. This, however, should by no means lead you to infer that our interactions with Catty were limited or sparse. Not at all: we were continuously stalked and hunted by her. I may not know what it is like to be a bat, but thanks to Catty, I have a pretty good idea what it is like to be an antelope in the Serengeti! [Photo shows Catty when she first came to stay with us.]

Catty wanted to do nothing but eat and hunt. Any movement or sound would send her into a crouching tiger position, ears pinned back, tail twitching. Though she is very fast, her real weapon is stealth. (Yes, she is quite the hidden dragon, as well.) I’ll be watching TV or reading, and quite suddenly I am barely aware of a grayish blur flying through the air toward me from the most unexpected place, with just enough time to instinctively close my eyes protectively before she swats me with a claw. After various attacks on Margit and me which we were completely helpless to prevent, and which left us mauled with scratches everywhere (and I had been worried about cat hair on my clothes making me look bad!), Margit took her to a vet to have her very sharp nails trimmed (we did not have her declawed, which seemed too cruel and irreversible). The vet asked Margit for a name to register her under, and Catty immediately tried to kill him for his impertinence. While he bandaged his injuries, Margit decided to officially name the little slasher Frederica Krueger, thereby openly acknowledging and perhaps even honoring her ineluctably murderous nature. We started calling her Freddy.

Here’s the funny thing: despite her fiercely feral, violent tendencies, Freddy was just so beautiful that I fell in love with her. To echo Nabokov’s Humbert Humbert speaking about another famous pubescent nymphet: Ladies and Gentlemen of the Jury, it was she who seduced me! As Freddy got more used to us, it was as if she could not decide whether to try and eat us, or be nice. She started oscillating between the two modes, attacking and then affectionately licking my hand, then attacking again… But it was precisely the graceful, lean, single-minded perfection of her design as a killing machine that I could not resist. Like a Ferrari (only much more impressive), she was clearly built for one thing only, and therein lay her seductive power. (Okay, I admit it, I’ve always liked cats. The photo here shows me sitting on a chimney on the roof of our house in Islamabad in the late 60s with my cat Lord Jim.)

We mostly read whatever psychological intentions we want (and can) into our pets, imputing all sorts of beliefs and desires from our own psychological economies to them, and this works particularly well to the advantage of cats. They are just intelligent enough to get our attention as intentional agents (unlike say, a goldfish, or even a hamster, which seem barely more than the automatons Descartes imagined all animals except humans to be), but the fact that they are very mentally rigid and cannot learn too much makes them seem imperious, haughty, independent, and noble to us, unlike dogs, who are much more flexible in intelligence and can learn to obey commands and do many tricks to please us. Let’s be blunt: cats are quite stupid. But to be fair, maybe much of the nobility we read into some humans is also the result of their rigidity. Who knows. In any case, cats are such monomaniacally hardwired hunters that it is impossible not to admire their relentless pursuit of prey, even if (in my case!) that prey is us. Since like many gods of the ancients, cats are mostly oblivious to human wishes and impossible to control, it is no surprise that some ancient peoples held them to be gods.

In ancient Egypt cats were considered deities as early as 3000 BCE and later there existed the cult of the goddess Bast, who was originally depicted as a woman with the head of a lioness, but soon changed to an unmistakably domestic cat. Since cats were considered sacred, they were also mummified. Herodotus reports that when a cat died, the members of the household that owned it would shave their eyebrows in mourning. Killing a cat, even accidentally, was a capital crime. The cult of Bast was officially banned in 390 CE, but reverence for cats continued. Another Greek historian, Diodorus Siculus, relates an incident from about 60 BCE when the wheels of a Roman chariot accidentally crushed an Egyptian cat. An outraged mob immediately killed the soldier driving the chariot.

The domestic cat was named Felis catus by Linnaeus, and like dogs, belongs to the order Carnivora. Not all carnivores are in this order (even some spiders are carnivores, after all) and not all members of the Carnivora are carnivores, such as the panda. Other members of this order are bears, weasels, hyenas, seals, walruses, etc. Like our own, the ancestors of the modern domestic cat came from East Africa. Cats were probably initially allowed or encouraged to live near human settlements because they are great for pest control, especially in agricultural settings with grain storage, etc. This arrangement also afforded cats protection from larger predators, who stayed away from humans for the most part. Even now, cats will hunt more than a thousand species of small animals. Domestic cats, if left in the wild, will form colonies, and by the way, a group of cats is known as a clowder. (Be sure to throw that into your next cocktail party conversation.)

It took even physicists a while to figure out how a cat always lands on its feet, which is known as its “righting reflex.” The problem is that in mid-air, there is nothing to push off against to change your orientation (imagine being suspended in space outside a rocket, and trying to rotate). So how do they do it? The answer is actually quite technical and has to do with something physicists call a geometric phase. (Like a spinning figure skater being able to speed up or slow down her rate of rotation by drawing her arms in or holding them out.) What the cat does is first put its arms out and rotate the front half of its body in one direction and the back half in the opposite direction (a twisting motion), then it draws its arms in and twists in the opposite direction. But because angular momentum must be conserved, and angular momentum depends on the radial distance of mass from its axis of rotation, it will rotate back less this time, thereby achieving a net rotation in the direction of the first twist. If you don’t get it, don’t worry about it!
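If arithmetic helps where words fail, here is a deliberately crude sketch of the conservation argument above. It models the cat as two halves joined at the waist, each twist of one half forcing a counter-rotation of the other in proportion to their moments of inertia. The moments of inertia and twist angle are made-up illustrative numbers, not measured cat physiology:

```python
# Toy model of the cat righting reflex ("falling cat problem").
# Two body halves joined at the waist; within each move, angular momentum
# about the spine is conserved: I_moving * d(theta_moving) = -I_reacting * d(theta_reacting).
# All numbers below are illustrative assumptions, not real cat values.

def counter_rotation(i_moving, i_reacting, twist):
    """Angle the reacting half rotates when the other half twists."""
    return -i_moving / i_reacting * twist

I_SMALL, I_LARGE = 1.0, 5.0   # legs tucked vs. legs extended
TWIST = 90.0                  # degrees the active half twists in each phase

front = back = 0.0

# Phase 1: front legs tucked (small I), rear legs extended (large I).
# The front twists 90 degrees; the rear barely counter-rotates.
front += TWIST
back += counter_rotation(I_SMALL, I_LARGE, TWIST)

# Phase 2: roles swap -- front legs extended, rear legs tucked.
# The rear twists 90 degrees; the front barely counter-rotates back.
back += TWIST
front += counter_rotation(I_SMALL, I_LARGE, TWIST)

print(front, back)  # both halves end up rotated 72 degrees the same way
```

Each half loses only 18 degrees to counter-rotation, so the cat nets a 72-degree turn with zero net angular momentum, and repeating the cycle turns it the rest of the way over.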

Cats appear frequently in fiction and writers seem to have a particular predilection for them. Ernest Hemingway and Mark Twain were serial cat-owners. Hemingway at various times had cats named Alley Cat, Boise, Crazy Christian, Dillinger, Ecstasy, F. Puss, Fats, Friendless Brother, Furhouse, Pilar, Skunk, Thruster, Whitehead, and Willy. Twain’s cats were Apollinaris, Beelzebub, Blatherskite, Buffalo Bill, Satan, Sin, Sour Mash, Tammany, and Zoroaster. Meanwhile, Theodore Roosevelt’s cat Tom Quartz was named for a cat in Mark Twain’s Roughing It. T.S. Eliot owned cats named Tantomile, Noilly Prat, Wiscus, Pettipaws, and George Pushdragon. William and Williamina both belonged to Charles Dickens.

Lord Byron and Jorge Luis Borges both had cats named Beppo. (Byron travelled accompanied by five cats.) Edgar Allan Poe had Catarina; Raymond Chandler, Taki. Kingsley Amis’s cat was Sara Snow. Some cats were, of course, named for famous people as well as owned by them, such as Gloria Steinem’s Magritte and Anatole France’s Pascal. John Lennon was the proud owner of Elvis. John Kenneth Galbraith was forced to change his cat’s name from Ahmedabad to Gujarat after he became the U.S. ambassador to India because Muslims were offended by “Ahmed” (one of Mohammad’s names) being associated with a cat. Mohammad himself, according to a report (hadith) attributed to Abu Huraira, owned a cat named Muezza, about whom it is said that one day while she was asleep on the sleeve of Mohammad’s robe, the call to prayer was sounded. Rather than awaken the cat, Mohammad quietly cut his sleeve off and left. When he returned, the cat bowed to him and thanked him, after which she was guaranteed a place in heaven.

Isaac Newton not only loved cats, but is also said (probably apocryphally) to be the inventor of the “cat flap,” allowing his cats to come and go as they pleased. (Wonder how long a break he had to take from inventing, say, calculus, to do that.) And by the way, among famous cat haters can be counted such luminaries as Genghis Khan, Alexander the Great, Julius Caesar, Napoleon Bonaparte, Benito Mussolini, and last but not least, Adolf Hitler. What is it about cat-hating that basically turns one into a Dr. Evil? But wait, Dr. Evil likes cats!

Okay, enough random blather. Back to Ms. Frederica Krueger’s story: as the moment of Bailey’s return from her trip and the time for Freddy to leave us approached, I grew more and more agitated, finally threatening Margit that I would kidnap the cat and run away with her unless she did something to stop Bailey from coming to pick up her cat. At first Margit tried to tell me that we could get another cat, which only made me regress further and throw a tantrum yelling, “I don’t want another cat! I only want this cat!” At this point, Margit told me I had finally cracked up completely and advised me to call a shrink. Bailey was coming to get the cat early the next morning. I went to bed late, as I often do, and was still asleep when Margit awakened me to say that Bailey had agreed to let us have the cat as it seemed very happy here, and Bailey’s apartment was really too small anyway. Thus Frederica became ours, and we remain her willing and ever-anxious prey.

Freddy’s Photo Gallery

Here are some glamour and action shots of Ms. Frederica Krueger, which you can click to enlarge. Captions are below the photos:


TOP ROW:

  1. I catch Freddy suddenly pouncing on an unsuspecting Margit’s hand from behind our living room sofa (a favorite place of hers from which to launch her demonic attacks). Her eyes reflect the light from the camera flash because of a mirror-like layer behind her retinas called the tapetum. Nocturnal animals have this reflective surface there to bounce photons back toward the photosensitive cells of the retina, thereby almost doubling the chance that they will be registered, and greatly improving the animal’s night vision. The daytime vision of cats, however, is not as good as that of humans.
  2. She is striking a deceptively demure pose. Don’t let it fool you. I have paid dearly for that mistake. In blood.
  3. Freddy loves this incredibly silly toy, which is basically just a little felt mouse that goes around and around, driven by a battery-powered motor. She spends inordinate amounts of time and energy trying to slay this patently fake rodent.

BOTTOM ROW:

  1. Freddy has a habit of sitting on various bookshelves in the apartment, usually at a greater height than in this picture, surveying the scene below, much like a vulture.
  2. Margit too-bravely holds Freddy in her lap, who is only milliseconds away from trying to shred Margit’s hands with the claws of her powerful rear legs.
  3. If you didn’t believe me when I said that often all I see is a grayish blur flying at me, have a good look at this picture (enlarge it by clicking on it) taken at 1/8th of a second shutter speed. Freddy is jumping from a lower bookshelf to the shelf above the stereo on the right, so she can climb to even higher shelves along that wall.

Have a good week!  My other Monday Musing columns can be seen here.

Monday, May 22, 2006

Monday Musing: Modern Myths

I’ve spent the last two months binge-watching nearly every season of Buffy the Vampire Slayer. Season 6 sadly got sent to a different address, forcing me to wait and watch the series out of order. I’ve seen every episode at least a few times, so no surprise is ruined. (Since buying my DVD sets, I’ve rewatched a few episodes several times.) There is a certain satisfaction to watching the series out of sync, something akin to looking through a photo album and remembering your life out of order.

Watching these episodes, I find myself ever more caught up in the world of BtVS. I certainly need more than the 7 seasons. I’ve found myself reading through the whedonwiki (named after Joss Whedon, the creator of BtVS and the series Angel) and hyperlinked episode guides, but mostly a lot of fan fiction.

Fan fiction as a genre is fairly well examined, although there are plenty of debates about what counts as fan fiction. Satire or works such as Wide Sargasso Sea don’t really seem to cut it. The earliest clear instance of fan fiction may be the Sherlock Holmes-related stories. Apparently after Conan Doyle killed Holmes off in 1893, fans of the detective wrote tales of the Baker Street Irregulars, the street urchins whom Holmes and Watson would turn to for information. So the genre is at least a century old. Contemporary fan fiction seems to have really taken off with Star Trek.

The effect of a work of fan fiction is simple. The fan of a television show, comic book, movie, etc. becomes a producer of the stories set in these worlds and not merely a consumer of them. That’s pretty straightforward. Another effect is that the storylines spin out of control, series become inconsistent and characters’ personalities follow arcs that seem at odds with that of the originals.

Fan fiction is not the only genre to suffer from inconsistencies. The other genre with similar problems, and perhaps similar virtues, is comics. Apart from a few foundational moments, it’s impossible to tell the history of Batman, for example. Part of this stems from the fact that many stories have been written by many writers over the decades since the Batman character first appeared.

Verification becomes a problem in these universes. Being works of fiction, we can only look to the texts themselves, and inconsistencies become contradictions when we try to make sense of what happened in a story universe. Movies, cartoons, and video games only compound the problem.

In the case of comic books, there are attempts every so often to re-write the history of the hero’s universe. The contradictions are faced head-on, but with a multi-universe caveat, and some authoritative “smoothing” is carried out. The results often seem more confusing than the problem. With fan fiction, the studio or the author declares a canon, with everything outside it being non-canonical. Of course, the “canon” is not a legal category, and ultimately it’s left to the community of readers to “decide”, as it were.

Fan fiction and comic books point to two opposing tendencies, one associated with antiquity and the other with modern narratives. At least that was my impression when I started plowing through some of the Buffy fan fiction and was hit with the lists of story synopses that were incompatible with each other. I got that sense largely because of a passage from The Marriage of Cadmus and Harmony, which I’d been reading recently.

Mythical figures live many lives, die many deaths, and in this they differ from the characters we find in novels, who can never go beyond the single gesture. But in each of these lives and deaths all the others are present, and we can hear their echo. Only when we become aware of a sudden consistency between incompatibles can we say we have crossed the threshold of myth. Abandoned in Naxos, Ariadne was shot dead by Artemis’s arrow; Dionysus ordered the killing and stood watching, motionless. Or: Ariadne hanged herself in Naxos, after being left by Theseus. Or: pregnant by Theseus and shipwrecked in Cyprus, she died in childbirth. Or: Dionysus came to Ariadne in Naxos, together with his band of followers; they celebrated a divine marriage, after which she rose into the sky, where we still see her today amid the northern constellations. Or: Dionysus came to Ariadne in Naxos, after which she followed him around on his adventures, sharing his bed and fighting with his soldiers; when Dionysus attacked Perseus in the country near Argos, Ariadne went with him, armed to fight amid the ranks of the crazed Bacchants, until Perseus shook the face of Medusa in front of her and Ariadne was turned to stone. And there she stayed, a stone in a field.

Only when we become aware of a sudden consistency between incompatibles can we say we have crossed the threshold of myth. And if there’s a sign that Holmes, Batman, Kirk, Picard, Dax, Faith, Spike, and the others have all crossed the threshold of myth, it may be this: the structure of their narratives produces that feeling of a consistency between incompatibles which you find reading fan fiction.

Monday, May 15, 2006

Lunar Refractions: in it for the Long Run

Last week I had the fortune to see the Hokusai exhibit at the Sackler Gallery in Washington, DC. Hokusai lived to be eighty-nine (or ninety, depending on your calendar) and died 157 years ago. The show addressed his entire time on earth, from 1760 to 1849, and the work spanned from just after his apprenticeship to his terrestrial end. I cannot say much here about the exhibit, because the work just needs to be seen, but within it were embedded a lot of very timely ideas.

By any Other Name it’s not the Same

“With each major shift in direction of his life and art, Hokusai changed his artistic name….” – introductory panel in the Sackler Gallery exhibit

Some of Hokusai’s names:
1779–1794 Shunro (age nineteen to thirty-four)
1795–1798 Sori (age thirty-five to thirty-eight)
1798–1809 Hokusai, “North [star] studio” (age thirty-eight to forty-nine)
1810–1819 Taito (age fifty to fifty-nine)
1820–1833 Iitsu, “one again,” referring to an auspicious sixty-year cycle (age sixty to seventy-three)
1834–1849 Manji, “10,000” or “eternity” (age seventy-four to eighty-nine or ninety)

This is an approach I think Madonna would agree with (her latest album is fantastic in that it sounds precisely like her, which is to say she never sounds the same), even if her particular name is too emblematic to be easily replaced. Every artist, of every sort, works in phases, and marking them, even honoring them, with a special name seems to make perfect sense. This is frequently done for political, whimsical, or other reasons, but usually one name is replaced with one other; rarely does anyone attempt the incessant name-shifting that Hokusai did.

This relates to the idea of taking a pseudonym, several times over. Amantine Lucile Aurore Dupin became George Sand. Marie Henri Beyle became Stendhal. Samuel Langhorne Clemens became Mark Twain. Herbert Ernst Karl Frahm became Willy Brandt. Marion Morrison became John Wayne. Charles Édouard Jeanneret became Le Corbusier. Kurt Erich Suckert became Curzio Malaparte. Charles Lutwidge Dodgson became Lewis Carroll. Benjamin Franklin became (on occasion, and delightfully) Silence Dogood. William Michael Albert Broad became Billy Idol. Stephen Demetre Georgiou became Cat Stevens became Yusuf Islam. Norma Jeane Mortenson became Norma Jeane Baker became Marilyn Monroe. And who are you?

But I don’t mean to get too sidetracked; Hokusai’s names were often adopted for their significance. I sure hope to see myself as one again if I turn sixty, and at seventy-four I wouldn’t mind if people were to invoke eternity when calling me. What the exhibition didn’t make clear to me was whether people followed Hokusai’s works as his despite the changing names; he’d become quite famous by the name of Hokusai in his late thirties, and I’m unclear as to whether his fans bought works by Taito, Iitsu, and Manji knowing that they were his or not. History has a way of distorting these things. Our own contemporary J. T. LeRoy, or whoever, rose to fame by the age of twenty or so, only to have everyone who had previously fawned over him/her/it lose track of the writings amid the identity debate. I enjoyed watching the whole thing, as it proved how important the identity behind a work is to contemporary audiences, to the point of dismissing the work itself if the identity comes into question. Perhaps LeRoy wouldn’t have had such trouble if people were more focused on the writing from the start, as opposed to marveling at questions of age, sex, and other eminently consumable trivia.

Moving on, Painting on

In his postscript to One Hundred Views of Mount Fuji, Hokusai gives us a brief sketch of his view of life: “From the time I was six, I was in the habit of sketching things I saw around me, and around the age of fifty, I began to work in earnest, producing numerous designs. It was not until after my seventieth year, however, that I produced anything of significance. At the age of seventy-three, I began to grasp the underlying structure of birds and animals, insects and fish, and the way trees and plants grow. Thus, if I keep up my efforts, I will have an even better understanding when I am eighty, and by ninety will have penetrated to the heart of things. At one hundred, I may reach a level of divine understanding, and if I live a decade beyond that, everything I paint—every dot and line—will be alive. I ask the god of longevity to grant me a life long enough to prove this true.” [translation by Carol Morland]

What I find remarkable here is that he skips straight from the age of six to fifty. There would be little space for him in today’s art world. But he went ahead anyway. In his incessant work he conversed with any- and everything around him: people, animals, rocks, poems, seasons, trades. Amid the dozens of mass-market illustrated books (manga) he published were titles such as Various Moral Teachings for all Time (at age twenty-four) and Women’s Precepts (at age sixty-eight). All that before he produced anything of significance.

Thinking of all the things Hokusai conversed with in his work, it occurred to me that I care about things born before me because they provide such good conversation. I have great difficulty, not to mention a sense of futility, starting anything of my own without checking previous references and precedents—such context provides meaning. If I do entertain the delusion of working outside of all previously trodden paths, I inevitably (and thankfully) come across something that has already achieved (many years ago, and better) what I had in mind. I was discussing this with a neighbor of mine who paints, critiques, and writes about art, and it all came down to meaning and conversation, in all senses, across medium and time.

Which brings me to one of my favorite pieces ever, a table top made between 400 and 600 in Byzantium, now at the Metropolitan Museum of Art. It was probably used to celebrate feasts held at the grave in honor of the dead. Why do I bring this up? Because for me it is an object that visually embodies the very place of conversation—where people gather, meet, often eat or drink, and listen and talk. Next door to this are a bunch of terracotta pieces I’d always grouped with the famous red- and black-figure vessels, but had preferred over the others solely for their white ground. Strolling by them last year with a friend from Greece, I mentioned my favorites, and she replied, “Oh, of course, the funerary lekythos.” The “of course” threw me off, since my ignorance had placed them on the wine- and water-bearing Dionysian level of all the others, but I was quickly told they held oil, and were always found in tombs. Looking closer, they all feature scenes of parting or visitation between mourners and the dead. No wonder I found their serene beauty enchanting.

Ancient Greek culture popped up again last weekend, in the most unexpected place. I was at Doug Aitken’s Broken Screen Happening at 80 Essex Street, sponsored by Hermès and Creative Time, where I was somehow admitted despite not being nearly cool enough, judging from the crowd. The highlight of the evening was when musician Adam Green thanked Hermes, aptly pronouncing it like the Greek god of boundaries and travelers who cross them, as well as orators, literature, poets, commerce, and a bunch of other things—as opposed to the French god of handbags. So many people were talking over the performer that it was difficult to hear. Hermes also acts as translator and messenger between the gods and humans. All of it was just too perfect. Though Hokusai wasn’t granted all the time he wished for, he certainly made the most of what he was given, and I’m sure he and Hermes are having a grand old time giving us hints about ideas we think are our own.

The Hokusai exhibit closed yesterday. All things come to an end eventually.

[In memory of STR and JMD].