Lives of the Cannibals: Isolation

The word derives from the Latin insulatus–made into an island–and it has a nasty sense to it, or so goes conventional thinking. Among its associations: disease, betrayal, failure, separation. It is the fate of the disgraced ruler (Napoleon’s sentence, true to the word’s root), the madman (isolated even from his own limbs by the fastening of straps) and the infected (the soon-to-be-dead, obscured by thick sheets of plastic and extensive breathing apparatus). It is what mothers fear for their children (who must be socially integrated, who must play well with others in order to get along and ahead in the world) and what children fear for their doddering parents (who must be reminded that they still belong to this world). It is a word without much positive association, at least in the minds of most people. We are taught to value plurality, consensus and feedback, and to regard the defiantly singular as suspect.

Such a shame, these negative connotations, especially considering that the word itself is quite properly defined and sourced. But we do understand its associations well–disgrace, insanity, imminent death–and we New Yorkers embrace them. We are (colorfully, proudly) isolation’s wealthy priests, a brotherhood of rejected, contagious madmen–and don’t you shake your head, Cowboy, in your disingenuous shame….I know pride when I see it!–clambering together on several rocks at the edge of the Atlantic. Of course, difference has always been a source of pride, a desirable feature in moderation, something to distinguish (but not to separate). You don’t need a Metro Card to appreciate what’s unique. But it helps. And in fact New York was built for isolation–exquisite machine, and complex, designed to exacerbate difference by density.

Isolation is subjective. There is no observable measurement that guides our estimation of it (its trite signals–social ineptitude, substance abuse, pallor–are too broad, suggesting a host of primary mental and physical disorders to which isolation has been unjustly attached as symptom or result), and yet it is experienced always and only in relation to others. Perhaps that’s why New York is its perfect vehicle. You cannot be isolated from others if there are no others to be isolated from, and fortunately, in this city, there are many, many others (all of them occupying, it seems on some nights, the apartment directly above yours). This is New York’s genius: to pack and load until all around are the bodies and voices of other people, most of whom you will never meet, whose thoughts may or may not coincide with your own, and whose gestures and posture and vocal tone may remind you in some insignificant way of someone you once knew, enough at least to confuse for a moment, to part your lips with the beginning of recognition. The multitude is New York’s special power. Here you will walk the streets and see the face of your best friend, how it was contorted with laughter, and the hands of the man who taught you piano, whose knuckles were enormous; you’ll hear your uncle’s voice, the way it thins its vowels down to string. These recognitions keep you dizzy in the beginning. Then they make you wary and wise. This is how you earn your eyes in New York, the ones that look right past beggars and roll in the wide-open faces of tourists. Things are not as they appear.

Concrete, too, plays a role. The hard surface, a broad palette, does not lend itself to the formation of meaningful human connections. With appropriate irony, we live and work on top of this manmade carapace, choosing to expose rather than protect ourselves, favoring the benefit of an impenetrable surface on which to construct our ambition. It’s better that way–reliable, safe, efficient–and if we imitate its principal characteristic, if we are a touch impervious, then such is the sacrifice we make. We are not here to join hands in fellow feeling.

And there is the anonymity of sophistication, because who would champion fraternité in the thick of such wit and fancy poise? New York City, weary from its better knowledge, is no place to clasp hands and sing songs. Isolation is inherently sophisticated, an exclusive state, and highly transmissible, so it flourishes here, without the annoyance of a lot of mutual identity. When New Yorkers run into each other outside the city, there is acknowledgement, yes, and respect, and even some sense of pleasure at the recognition, but we do not then go out to dinner together. We don’t become friends, no more than we would were we to bump into each other on Seventh Avenue. Such things are for people from Wisconsin. No, sophistication demands restraint, and the city trains us well in that discipline.

It is almost ridiculous to add that the city’s architectural realities reinforce our sense of isolation, so obvious does it seem. The five boroughs offer a wide selection of slots in which we may exist calmly, in compact stacks of residential habitats. We transform warehouses and churches and single-family brownstones into hives of homes, with drywall and wainscoting and original details, and we sit in our rooms and listen to our neighbors, who themselves are listening to their neighbors, who just returned from Elizabeth, New Jersey, with new throw rugs for the kids’ room and a drop-leaf dining table. We covet these small comforts, the better to insulate our tiny segment of space, the better to fashion attractive surroundings, to distract from the stranger who sleeps just inches away, just through that wall, whose obstructed breathing you can hear in the middle of the night. The fabulous terror of isolation is felt best when pressed up against the bodies of millions.

Whatever its ingredients and the means of its formation, isolation, New York’s modus operandi and principal issue, fuels ambition–professional, creative, romantic. In every moment of individual desperation lies the seed of an artistic triumph, an industrial revolution, an unholy feat of seduction. It is New York’s most appealing paradox–that the greatest of cities maintains its power not by bringing its people together, but by inspiring their isolation.




Monday, October 17, 2005

Monday Musing: Leonard Cohen


I heard Leonard Cohen’s music for the first time driving from Skagen down to Copenhagen some years ago. Skagen is at the top of the world, at least as far as mainland Europe is concerned. We walked along the beach on a grayish day. You can walk until the sand tapers into a point and then vanishes beneath the North Sea.


A group of painters became fascinated with the light and the landscapes there during the late nineteenth century. They became known as the Skagen painters. It’s true that the light is special there. It’s diffuse and it’s sharp at the same time, which doesn’t really make any sense, but I guess that’s why it is special.

Driving back south again through marshes, dry marshes, you pass miles and miles of little trees. My Danish friend told me that the trees are all so small because of the make-up of the soil and all the sand. I have no idea if that’s true, or even if my memory is entirely accurate about the trees or the soil. But I have an image of vast stretches of the tiny bare branches of thousands of little trees filtering the strange dying light late in the evening.

We watched the scene from the car window and listened to an album of collected Leonard Cohen songs. We listened to Chelsea Hotel #2:

I remember you well in the Chelsea Hotel,
you were talking so brave and so sweet,
giving me head on the unmade bed,
while the limousines wait in the street.
Those were the reasons and that was New York,
we were running for the money and the flesh.
And that was called love for the workers in song
probably still is for those of them left.

And I remember that we listened to Hallelujah in several different versions.

Now I’ve heard there was a secret chord
That David played, and it pleased the Lord
But you don’t really care for music, do you?
It goes like this
The fourth, the fifth
The minor fall, the major lift
The baffled king composing Hallelujah

Hallelujah
Hallelujah
Hallelujah
Hallelujah

Anyway, with that weird northern light up at the end of the world and those forests of tiny little trees and the mournful throaty singing and those Cohen lyrics that always manage to surprise you with some funny-tragic turn of image, it was quite a ride. Not something you’d want to do every day, but affecting, memorable.

Well, he has a new album coming out in a week and a half called Dear Heather. I don’t know what to expect, really. A lot of his more recent music found him using some rather strange arrangements, cheesy synthesizers and drum machines. I’d say it was all an elaborate con or something, but you never can tell with Leonard. He spent the late ’90s at the Zen Center of Mount Baldy, where he’d become a monk and taken to calling himself Jikan.

He started making drawings and writing poetry like this:

Seisen has a long body. Her shaved head threatens the skylight and her feet go down into the vegetable cellar. When she dances for us at one of our infrequent celebrations, the dining hall, with its cargo of weightless monks and nuns, bounces around her hips like a hula-hoop.

This from a man who was said by one journalist to have penned “the most revolting book ever written in Canada.” So, you never know with Leonard. Of course, he seems to have spent much of his time at the Zen retreat drinking, smoking cigarettes, and taking sex breaks. He calls himself a “bad monk, a sloppy monk.” It also turns out that while he was practicing Buddhism up on the mountain, his lawyers and accountants stole all his money. When he found out that he’d been fleeced of all but a fraction of his cash, he said, “You know, God gave me a strong inner core, so I wasn’t shattered. But I was deeply concerned.” For better or worse, he’s been stimulated to write a lot of music again and to go back to touring.

The lyrics for the song Dear Heather seem promising, though; they sound like the Cohen of old, just having gotten a lot older:

Dear Heather
Please walk by me again
With a drink in your hand
And your legs all white
From the winter

His best music clearly came from a time when he was a miserable wreck. But it will be interesting to hear what comes out of a Leonard Cohen who’s reached 70. Recently, he said of his old age:

There was just a certain sweetness to daily life that began asserting itself. I remember sitting in the corner of my kitchen, which has a window overlooking the street. I saw the sunlight that shines on the chrome fenders of the cars, and thought, “Gee, that’s pretty.”

Fair enough, Jikan. I’ll buy the album.

Monday, October 10, 2005

Critical Digressions: Literary Pugilists, Underground Men

Ladies and gentlemen, boys and girls,

After being attacked for a number of years by a new generation of literary critics – indeed, sucker-punched, phantom-punched, even body-slammed – “contemporary” (or “postmodern”) prose has hit back: in this month’s Harper’s, one of our favorite publications (less than $10.99 for an annual subscription), one Ben Marcus has donned his fighting gloves – which seem a little big for his hands, his pasty, bony frame – and climbed into the ring, earnestly, knock-kneed, sweating from the hot lights, the camera flashes, the hoarse roar of the audience, the sense of anticipation, broken noses, blood…

Like Dostoevsky’s Underground Man, Ben announces, “I am writing this essay from…a hole…” He continues:

“…it’s my view that…the elitists are not supposedly demanding writers such as myself but rather those who caution the culture away from literary developments, who insist that the narrative achievements of the past be ossified, lacquered, and rehearsed by younger generations. In this climate…writers are encouraged to behave like cover bands, embellishing the oldies, maybe, while ensuring that buried in the song is an old familiar melody to make us smile in recognition, so that we may read more from memory than by active attention.”

Fighting words, ladies and gentlemen! We’d like to tell you that Ben fought a good fight; that he came out swinging; that he staged an upset; that an underdog took on the emerging consensus on contemporary prose, shaped by the likes of James Wood, Dale Peck and B.R. Myers, and, according to Ben, Jonathan Franzen, Tom Wolfe and Jonathan Yardley. But criticism is no fairy-tale world, and Ben is no hero. A welterweight in a heavyweight fight, he doesn’t have enough behind his punch.

The ambitiously titled Why Experimental Fiction Threatens to Destroy Publishing, Jonathan Franzen, and Life As We Know It begins with a peculiar digression on the anatomy of the brain, including a quick explanation of Heschl’s gyri, Broca’s area and Wernicke’s area (“think of Wernicke’s area as the reader’s muscle”), which may be novel, experimental, but has no business in a literary critique. Perhaps had Ben fused literary theory with neuroscience in a more serious, symbiotic, technically rigorous way, he might have achieved something. But just as Ben gets us thinking about the neural implications of literature, he gets wishy-washy, namby-pamby:  “If we [writers] are successful, we touch or break readers’ hearts. But the heart cannot be trained to understand language…”

This introduction might have been overlooked had Ben knocked the reigning heavyweight champions down by the second or third round. But he doesn’t. He quarrels with the prevailing neo-realist sensibilities of critics – that is, “The notion that reality can be represented only through a certain kind of narrative attention” – and with those who argue against “literature as an art form, against the entire concept of artistic ambition.” He then has beef with Franzen: “Even while popular writing has quietly glided into the realm of the culturally elite, doling out its severe judgment of fiction that has not sold well, we have entered a time when book sales and artistic merit can be neatly equated without much of a fuss, Franzen has argued that complex writing, as practiced by…Joyce…Beckett and their descendents, is being forced upon readers by powerful cultural institutions…and that this less approachable literature…is doing serious damage to the commercial prospects for the literature industry.” Fair enough but not hard enough.

But though we want Ben to win this fight because we champion underdogs and such contrarian projects on principle, Ben is quite unable to summon the fierce intelligence and evangelical zeal of, say, James Wood or the flamboyance and shock value of Dale Peck. He may pretend to be the Underground Man but he’s not “a sick man…a spiteful man…” In 2001, however, B.R. Myers, more non-entity than underdog, managed the sort of upset Ben aspires to. Writing in the Atlantic, he began his thorough, articulate attack: 

“Nothing gives me the feeling of having been born several decades too late quite like the modern ‘literary’ best seller. Give me a time-tested masterpiece or what critics patronizingly call a fun read – Sister Carrie or just plain Carrie. Give me anything, in fact, as long as it doesn’t have a recent prize jury’s seal of approval on the front and a clutch of raves on the back. In the bookstore I’ll sometimes sample what all the fuss is about, but one glance at the affected prose – “furious dabs of tulips stuttering,” say, or “in the dark before the day yet was” – and I’m hightailing it to the friendly spines of the Penguin Classics.”

A Reader’s Manifesto: An Attack on the Growing Pretentiousness in American Literary Prose caused a commotion as the Wall Street Journal, the New Yorker, Harper’s, the NYT, the Washington Post and the New York Review of Books joined the fray, a real battle royal. And Myers came out on top: presently, he’s a senior editor at the Atlantic. When you hit hard, it doesn’t really matter what you say. So what’s Myers’s beef? “What we are getting today is a remarkably crude form of affectation: a prose so repetitive, so elementary in syntax, and so numbing in its overuse of wordplay that it often demands less concentration than the average ‘genre’ novel.” And what is his methodology? He proceeds to categorize contemporary prose in five broad groups – “evocative,” “muscular,” “edgy,” “spare” and “generic ‘literary prose’” – citing weak passages from the writers whom he finds representative of each group: Proulx (The Shipping News), McCarthy (All the Pretty Horses), DeLillo (White Noise), Auster (City of Glass) and Guterson (Snow Falling on Cedars). Manifestly, Myers packs a formidable punch.

Of course, even back in 2001, Myers may have been a non-entity but he was no underdog. Literary fashion had been changing well before him, with Wood in the pages of the Guardian, and subsequently in the New Republic, where Wood was joined by Dale “The Hatchet Man” Peck. Peck, you may remember, famously proclaimed, “I will say it once and for all, straight out: it all went wrong with James Joyce…Ulysses is nothing but a hoax upon literature.” Like Tyson, Peck writes, “Sometimes even I am overwhelmed by the extent of the revaluation I’m calling for, the sheer f***ing presumptuousness of it.” In one critique, in one sentence in fact, Peck excises “most of Joyce, half of Faulkner and Nabokov, nearly all of Gaddis, Pynchon and DeLillo, not to mention the contemporary heirs.” This assertion makes for interesting if idle exercises: we mull, for example, which half of The Sound and the Fury Peck would excise if given the opportunity – the first two books, of course, Benjy’s and Quentin’s – and what effect his reductive, retrograde editing would have on the novel as a whole. Peck, like Mike Tyson before him, bites ears off, and often punches below the belt, smack in the crotch. Tyson once said, “I wish that you guys had children so I could kick them in the f***ing head or stomp on their testicles so you could feel my pain because that’s the pain I have waking up every day.”

The New York Review of Books noted that “Like his colleague at the New Republic, the estimable and excellent James Wood, Peck seems to want more novels like the great [19th] century social novels: serious, impassioned, fat.” Were we to step into the ring, brandishing our shiny brown muscles, we would simply but forcefully argue that the world, that civilization, and literature with it, has moved a hundred years forward since the 19th century. Looking fondly back towards realism is quite literally retrograde, like those other Underground Men, Wahhabi Islamists urging Muslims to return to the 7th century. The novel, like these critics and the critical canon (that includes the Russian Formalists, the New Critics, the Structuralists, the Post-Structuralists, whatever), is grounded in a certain context. It is a palimpsest, distilling and processing the anxieties, sensibilities, the diction, the colloquial, news, popular culture, of a particular time and place and people.

Dreiser and Dos Passos, for example – two different writers, the former considered traditional, the latter experimental – were unable to write novels that are relevant today except as history, as part of the evolution of the modern novel. On the other hand, and off the top of our head, we just finished Roth’s Goodbye, Columbus – his first book – which features a Jewish protagonist, a class divide, a sectarian divide, specific references and allusions to the fifties in America – including, incidentally, the title itself – but we were charmed by the sweet, straightforward adolescent love story (and the voice). Unlike Manhattan Transfer, Goodbye, Columbus remains relevant. Some novels transcend their cultural and temporal trappings.

We dig Roth for different reasons than, say, Melville, Dostoevsky, Dickens. We dig 20th century writers for different reasons than their antecedents: the lyrical and frenetic Marquez and Rushdie, the postcolonial and serious Naipaul and Coetzee, the very contemporary Franzen and Wallace.

Sure, from Dostoevsky to Wallace, the conventions of storytelling have changed and prose has become more self-conscious, but don’t let the Underground Men lecture you that change is good or bad; change is. And we’ll tell you this much: anybody advocating cutting Nabokov down to size should be paraded naked in the ring, weak chest, hairy buttocks, spindly legs exposed, wearing his own novel as a fig leaf. Sure, some contemporary prose has become gimmicky, adjective-laden, rife with metaphor (which, in a way, is arguably Nabokov’s legacy); and sure, silly alliteration needs to be caught, condemned. Myers will rightly beat you up for it. That’s his job, and Wood’s and Peck’s. Ben nobly got into the ring but he needs to train harder if he’s going to go twelve rounds with them. Somebody, however, needs to hit back, to keep it real.

As Eddie “Scrap Iron” Dupris once said (somewhat heavy-handedly), “Boxing is an unnatural act…everything in boxing is backwards: sometimes the best way to deliver a punch is to step back…But step too far and you ain’t fighting at all.” We’re not entirely sure if this is relevant but it sure sounds good.

Other Critical Digressions:

Gangbanging and Notions of the Self

Dispatch from Cambridge (or Notes on Deconstructing Chicken)

The Three-Step Program for Historical Inquiry

The Naipaulian Imperative and the Phenomenon of the Post-National

The Media Generation and Nazia Hassan

Dispatch from Karachi

Selected Minor Works: Early Modern Primitives

Justin E. H. Smith

I have recently come across a delightfully obscure 1658 treatise by the very pious John Bulwer, entitled Anthropometamorphosis: or, the Artificial Changling. This may very well be the first study in Western history of piercing, tattooing, scarification, and other forms of bodily modification. It is thus a distant ancestor of such contemporary classics as the 1989 RE/Search volume, Modern Primitives.

But if the Voice Literary Supplement once praised RE/Search for its dispassionateness, today a hallmark of respectable ethnography, Bulwer’s science is at once a moral crusade. In each chapter, Bulwer bemoans a different deplorable practice, including “Nationall monstrosities appearing in the Neck,” “Strange inventive contradictions against Nature, practically maintained by diverse Nations, in the ordering of their Privie parts,” and (my favorite) “Pap-Fashions.”

If Bulwer hates nipple rings and dick bars, he is no less concerned about the now rather innocent habit of shaving. He rails in one chapter against “Beard haters, or the opinion and practice of diverse Nations, concerning the naturall ensigne of Manhood, appearing about the mouth.” For him any bodily modification is but a “Cruell and fantasticall invention of men, practised… in a supposed way of bravery… to alter and deforme the Humane Fabrique.”

Bulwer believes that morally degenerate practices can over time lead to actual physical degeneration within a human population. Thus, for him, phenotypic variation in the species is a consequence of cultural bad habits, rather than teleologically driven progress from lower to higher forms, let alone adaptation by way of natural selection. The ugliness of non-Europeans may be attributed to the rottenness of their souls and consequent savage lifestyles. Indian pinheads and Chinese blockheads, whose skulls are sculpted from birth by malevolent adults, are cited as cases of degeneration in action.

200 years before Darwin, then, there was widespread acceptance of the idea that species could change over time. But for moralists such as Bulwer, change could only ever be change for the worse. In this connection, Bulwer denounces the view of a (regrettably unnamed) libertine philosopher that human beings evolved from other primates: “[I]n discourse,” he writes, “I have heard to fall, somewhat in earnest, from the mouth of a Philosopher that man was a meer Artificial creature, and was at first but a kind of Ape or Baboon, who through his industry by degrees in time had improved his Figure & his Reason up to the perfection of man.”

Bulwer believes that the ‘Philosopher’s’ opinion constitutes a symptom of the moral decline of the modern period. For, he thinks, if mutation of humanity over time can occur, it will not, as the Philosopher thinks, take the character of an ascent from beast to man, but rather the reverse, a descent into ape-likeness: “But by this new History of abused Nature it will appeare a sad truth, that mans indeavours have run so farr from raising himselfe above the pitch of his Originall endowments, that he is much fallen below himselfe; and in many parts of the world is practically degenerated into the similitude of a Beast.”

Evolutionary thinking, then, opens up the possibility not just of progress out of animality, but of degeneration into it, and this was a possibility that the pious, such as Bulwer, were beginning to fear.

If we move forward a few hundred years, we find that the human species still has technology that beats the reed dipped into the anthole, and that we still exercise our freedom to mate outside of estrus. Indeed, not much of anything has changed since the 17th century, either through degeneration or evolutionary progress. One thing that has remained entirely the same is the art of moralistic ranting: we find that, now as then, precisely those who are most concerned about the moral stain of body piercing and tattoos, who are most active in the movement to make visible thongs in suburban Virginia malls a misdemeanor, are the same people who would have us believe that humans were instantaneously and supernaturally created with no kinship relation to other animal species.

It is worth reflecting on why these two crusades, which prima facie have nothing in common, have proven such a durable pair throughout the centuries. I suspect that human thought is constrained (as a result of the way our minds evolved) to move dialectically between two opposite conceptions of animal kinds: that of the book of Genesis on the one hand, positing eternally fixed and rigid kinds with no overlap, and that of Ovid’s Metamorphoses on the other. In spite of the relatively recent ascent of evolutionism to accepted scientific orthodoxy, there has always been available a conception of species as fluid and dynamic. This conception easily captures the imaginations of social progressives and utopians, that is, of those who believe that change for the better is possible and indeed desirable. The numerous monuments to Darwin throughout the Soviet Union (which I hope have not been scrapped along with those to Lenin) were once a testament to this.

Social conservatives on the other hand see fixity as desirable, and tend to conceive of change in terms of degeneration. A bestiary of eternal, non-overlapping animal species would provide for them a paradigm of stability that could easily be carried over from the natural to the social world, while the loss of this fixed taxonomy of natural kinds would seem equally to threaten the social stasis the conservative seeks.

The prospect of change in species over time, then – including the human species – will be a more useful way of conceptualizing the natural world in times of heady social upheaval; in political climates such as the current one, it is not surprising to see public figures shying away from the chaotic instability of the Metamorphoses in favor of the clear boundaries of the Old Testament.

I am not saying that evolution is just ideology. I believe it is true. I believe that creationism, in turn, is false, and that it is an ideology. And precisely because it is one, it is a waste of time to do intellectual battle with creationists as though they had a respectable scientific theory. Instead, what we should focus on is the rather remarkable way in which folk cosmology – whether that of the Azande, the ancient Hebrews, or Pat Buchanan – may be seen to embody social values, and indeed may be read as an expression on a grand scale of rather small human concerns.

The small human concerns at the heart of the creationist movement are really just these: that everything is going to hell, that the kids don’t listen to their folks anymore, that those low-cut jeans show far too much. Creationism is but the folk cosmology of a frightened tribe. This is also an ancient tribe, counting the authors of Genesis, Bulwer, and Buchanan among its members, and one that need be shown no tolerance by those of us who recognize that change reigns supreme in nature, and that fairy tales are powerless to stop it.

Monday Musing: Be the New Kinsey

Last week my wife and I saw the biopic Kinsey in which Liam Neeson plays the entomologist turned pioneering sex researcher, Dr. Alfred Charles Kinsey. It’s a pretty good movie. Rent the DVD. Kinsey spent the early part of his career as a zoologist studying gall wasps, on which he became the world’s foremost expert. He collected over one million different specimens and published two books on the subject. Then, at some point in the early 30s, while pondering the variety of sexual behavior in the wasps he was studying, he started wondering about the range and variety of sexual behavior in humans. When he looked into it, he was dismayed by the prevalent scientific ignorance about even very basic physiological sexual function in humans, much less complex sexual behaviors. Remember, this was a time when even college-age men and women often had very little information about sex.

But in this vacuum of knowledge where the angels of science feared to tread, as usual, the ever-confident fools of myth and dogma had rushed in with loud proclamations such as: masturbation causes blindness; oral sex leads to infertility; the loss of one ounce of precious, life-giving semen is equivalent to the (rather more obviously enervating) loss of 40 ounces of blood; and so on and on. We’ve all heard these sorts of things. In addition there was very little information about more real dangers and risks of sexual behavior, such as venereal disease transmission. When Kinsey taught a “marriage” class at Indiana University in Bloomington, a type of early sex-ed, his students often asked him questions that he could not answer because the answers simply were not known.

Embarrassment at this state of affairs prompted Kinsey to action. As an accomplished ethologist (one who studies animal behavior in the natural environment) he realized that in addition to studying the physiological sexual equipment of humans and the mechanics of sexual response, it was important to compile data on human sexual behavior “in the wild”, and he undertook the prodigious task of conducting detailed surveys of men and women across the 48 states of America to compile statistics about their sexual behavior. He didn’t reach his goal of interviewing more than a hundred thousand people, but he did make it to the tens of thousands. I cannot go into the details of his methodology here but it has been fairly well defended. In 1948, Kinsey published the first of two books based on his exhaustive sex research that would eventually alter the course of this country in significant ways: Sexual Behavior in the Human Male. Five years later he published Sexual Behavior in the Human Female.

What Kinsey found is well-known now but was absolutely scandalous at the time: the prevalence of homosexuality and bisexuality; the ubiquity of masturbation, especially in males; the common practice (at least much more than anyone had previously thought) of oral sex, anal sex, premarital sex, infidelity, and other forms of “deviant” behavior. While Kinsey simply reported the raw data, never advocating social policy or drawing moral conclusions, the open airing of these previously taboo subjects had far-reaching effects, not only contributing significantly to the sexual revolution of the 60s, but, importantly, resulting in the eventual massive decriminalization of various sexual practices such as (but not limited to) sodomy and oral sex across the states. There were other results as well: it took until 1973 for the American Psychiatric Association to remove homosexuality from its list of mental illnesses, but it happened at least partly because of Kinsey’s work. Most significant of all, however, Kinsey’s reports went a long way toward lifting the clouds of ignorance and fear that had long whelmed sex.

Now it occurred to me after I saw the movie that there is an area of human practice (and, yes, it is more-or-less universal) which is today covered in the same clouds of ignorance and fear; which has distorted the well-intentioned aims of the criminal justice system and is filling up our jails; and which is dominated by myth and dogma in much the same way sex was before Kinsey had the courage to defy the taboos surrounding it and clear that fog with his bright beam of information: it is drug use.

What is drug use? I shall define it here for my purposes as the consumption (whether by ingestion, inhalation, injection, absorption or any other means) of a substance with little or no nutritional benefit, simply out of a desire for its effect on the nervous system. This then includes substances ranging from caffeine and nicotine, to alcohol, marijuana, LSD, PCP, ecstasy, cocaine, crystal meth, heroin, and the thousands of other substances that people use to enhance, or at least alter, their subjective experience of the world. And I won’t even get into prescription drug (ab)use from Valium to Viagra. Extremely large portions of all the cultures that I know of use at least some of these and other substances. Of course, most people enjoying a friendly chat over tea are not explicitly aware that they are taking a drug, but they are, and it makes them feel better, more energetic, and more awake. Just as some sexual practices are relatively harmless, while others pose real dangers to those who practice them, certain drugs and related practices are more dangerous than others.

Here’s the problem: there is very little useful information available about drugs. The reason is that there is a reluctance bordering on taboo on the part of government and non-government agencies alike to actually spell out the risks of taking drugs. In the case of illegal or other “bad” drugs, there is an absolute refusal to accept the fact that a large part of the population uses and finds pleasure in these substances, and an attempt to marginalize all drug users as criminals, addicts, and degenerates; just as in Kinsey’s time, absolute abstinence is at present the only acceptable message for the public. “Just say no!” That’s it. Where there should be informed messages of the exact risks of various substances, there is fear-mongering instead: the “this is your brain on drugs” ad on TV showing a frying egg, for instance. The implication is, just as an egg cannot be unfried, once you have used drugs (which drugs? how much? for how long? –these questions are not to be asked, and cannot be answered), your brain is permanently fried, whatever that means. After all, a fried brain is just a metaphor, it does not say anything scientific about exactly what sort of damage may be done to one’s brain by how much of what drug over what period of time. “Cocaine Kills!” is a common billboard ad. Have you noticed that Kate Moss hasn’t died? Why? I happen to know a bunch of Wall Street types who have been snorting lines off little mirrors in the bathrooms of fancy downtown clubs pretty much every weekend for at least a decade, and so, probably, do you. None of the ones I know have died so far. I also know a man who tried cocaine for the first time and ended up in an emergency room. So what is the real risk?

The problem with just telling people “Cocaine Kills!” and nothing more is that because they may see many people doing cocaine who are not dropping like flies, they completely dismiss the message as crying wolf. Or, they may think, “Yeah, sure cocaine kills, but so does skiing. Think Sonny Bono. Think Michael Kennedy. Just say no to skiing? I don’t think so. The pleasures outweigh the risks for me.” Why not tell them something useful about what the real statistical risks are? What percentage of the people who do it die from cocaine? What are the chances of dying for every ten times, or a hundred, or a thousand that you take cocaine? Even in the almost religious zeal to curb smoking, the information about the actual risks is endlessly confusing. I have repeatedly read that nine out of ten people who have lung cancer were smokers, but this tells me nothing about what risk I am taking of getting lung cancer by smoking. It could be that only a small percentage of the population gets lung cancer, and of those, the smokers are at disproportionately higher risk. There are hugely inflated figures of the number of deaths caused by smoking which are routinely thrown around. I have even seen a poster claiming that 3 million people are killed every year in the U.S. by smoking. That’s more than the total number of deaths every year in the U.S.! What I would really like to know is this: on average, how much am I shortening my life with every pack-year of cigarettes I smoke? I just looked at various websites such as those of the American Heart Association, the Centers for Disease Control, the World Health Organization, and countless others, and I cannot find this simple information. Why? Just as Kinsey could not answer many of his students’ questions about sex, if a young person today asks just how risky it is to use ecstasy at a rave, for example, we have nothing to say.
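To see why the “nine out of ten” figure tells you so little, try a quick back-of-the-envelope calculation with made-up numbers (purely for illustration, not real epidemiology): suppose 1 person in 100 eventually gets lung cancer, and a quarter of the population smokes. If 90% of the cancer cases are smokers, then the chance that a given smoker gets lung cancer works out to 0.9 × 0.01 / 0.25 = 3.6%, while for a non-smoker it is 0.1 × 0.01 / 0.75, or about 0.13%. Under these invented figures a smoker runs nearly thirty times the risk of a non-smoker, yet his absolute risk is still only a few percent, and it is that absolute number, not the “nine out of ten,” that someone deciding whether to smoke actually needs.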

Another problem with this “just say no” approach is that just as the total abstinence approach masks the differences in danger to one’s health of different sexual practices (having frequent unprotected sex with prostitutes is obviously not the same risk as having protected sex with a regular partner) because they are equally discouraged, this approach also masks the differences between the various practices of using drugs. Smoking a joint is not the same as mainlining heroin, but there is no room for even such crude distinctions in this simplified message. There is only the stark line drawn between legal and illegal drugs. Go ahead and have your fourth martini, but, hey, that hash brownie will kill you!

The same unrealistic refusal to acknowledge that drug use is very common (yes, there are always a few polls of high school sophomores and college freshmen, but nothing serious and comprehensive) across all strata of society results in a distorted blanket approach to all drug use, and the same ignorant fear-mongering that used to exist about sex. The first thing to do is to compile, like Kinsey, detailed information on all drug use (or at least the top twenty most used drugs) by employing the best polling techniques and statistical methods we have. Let’s find out who is using what drugs, legal or illegal. Break it down by age, gender, race, income, geographical location, education, and every demographic category you can think of. Ask how often the drug is taken, how much, in what situations. Ask why the drug is taken. What are the pleasures? Poll emergency rooms. Research the physiological effects of drugs on the human body. Write a very fat book called Drug Taking Behavior in the American Human. I am not advocating any policy at all. I am just advocating replacing ignorance and confusion with irrefutable facts. New directions will suggest themselves once this is done. Maybe just as people who engage in oral sex are no longer seen as perverts and degenerates, maybe one day Bill Clintons won’t have to say “But I didn’t inhale,” and George W. Bushes won’t have to lie about their cocaine use. On the other hand, maybe we will decide as a society that Muslims were right all along, and ban alcohol. Go ahead, be the new Kinsey.

Have a good week!

My other recent Monday Musings:
General Relativity, Very Plainly
Regarding Regret
Three Dreams, Three Athletes
Rocket Man
Francis Crick’s Beautiful Mistake
The Man With Qualities
Special Relativity Turns 100
Vladimir Nabokov, Lepidopterist
Stevinus, Galileo, and Thought Experiments
Cake Theory and Sri Lanka’s President

Monday, October 3, 2005

From the Tail: Betting on Uncertainty

I think I know where you stand on the ongoing federal court case in Pennsylvania, where parents have sued to block the teaching of intelligent design in their schools. Your position notwithstanding, only 13% of the respondents to a November 2004 Gallup poll believed that God has no part to play in the evolution or creation of human beings. Fully 45% said they believe that God created humans in their present form less than 10,000 years ago!

What’s going on here? Many (perhaps even a majority) of these respondents were taught evolution in school. Did they choose to disregard it merely because it contradicted their religion? They do seem to accept a whole host of other things during the course of their education which may contradict it as well. For example, there appears to be far less skepticism about the assertion that humans occupy a vanishingly small fraction of the universe. I’ll throw out three other explanations that are often advanced, but which I believe to be inadequate as well:

  1. Natural selection is not a good enough explanation for the facts: Clearly, it is.
  2. Natural selection has not been properly explained to the general public: Sure, there are common misconceptions, but proponents have had enough school time, air time and mindshare in book sales to make their points many times over.
  3. Religious zealots have successfully mounted a campaign of lies that has distorted the true meaning of natural selection: This has conspiracy-theory overtones. There are too many people who do not believe in natural selection — have they all been brainwashed?

My explanation is simply this: human beings have a strong visceral urge to disbelieve any theory which injects uncertainty or chance into their world view. They will cling to some other “explanation” of the facts which does not depend on chance until provided with absolutely incontrovertible proof to the contrary.

Part of the problem is that we all deal with uncertainty in our daily lives, but it is, at best, an uncomfortable co-existence. Think of all the stress we go through because of uncertainty. Or how it destabilizes us and makes us miserable (what fraction of the time are you worrying about things that are certain?). In addition to hating it, we confuse uncertainty with ignorance (which is just a special case), and believe that eliminating uncertainty is merely a matter of knowing more. Given this view, most people have no room for chance in the basic laws of nature. My hunch is that that is what many proponents of Intelligent Design dislike about natural selection. Actually, it’s more than a hunch. The Discovery Institute, a think tank whose mission is to make “a positive vision of the future practical” (but which appears to devote the bulk of its resources to promoting intelligent design), has gotten 400 scientists to sign up to the following “Scientific Dissent from Darwinism”:

We are skeptical of the claims for the ability of random mutation and natural selection to account for the complexity of life. Careful examination of the evidence for Darwinian theory should be encouraged.

In this world of sophisticated polling and sound bites, I think that the folks at the Discovery Institute have gotten their message down pat. To be sure, natural selection is not a theory of mere chance. But without uncertainty it cannot proceed. In other words, natural selection is a theory that is not of chance, but one that requires it. The advocates of Intelligent Design are objecting to the “purposeless” nature of natural selection and replacing it with the will of a creator. It doesn’t really help matters for Darwinians to claim that chance plays a marginal role, and that the appeal to chance is a proxy for some other insidious agenda. Chance is the true bone of contention. In fact, as Jacques Monod put it over thirty years ago:

Even today a good many distinguished minds seem unable to accept or even to understand that from a source of noise, natural selection could quite unaided have drawn all music of the biosphere. Indeed, natural selection operates upon the products of chance and knows no other nourishment; but it operates in a domain of very demanding conditions, from which chance is banned. It is not to chance but to these conditions that evolution owes its generally progressive course.

The inability of otherwise reasonable people to accept a fundamental role for randomness is not restricted to religious people — scientists are hardly immune to it. We know that even Einstein had issues with God and dice in the context of Quantum Mechanics. Earlier, in 1877, when Ludwig Boltzmann explained the Second Law of Thermodynamics by introducing, for the first time, probability in a fundamental law, he was met with extreme skepticism and hostility. He had broken with the classical Newtonian imperative of determinism, and so could not be right. After much heartache over answering his many critics, Boltzmann (who had been struggling with other problems as well) hanged himself while on holiday.

Of course one reason we hate to deal with uncertainty is that we are so ill-equipped to do so. Even when the facts are clearly laid out, the cleverest people (probabilists included) make mistakes. I can’t resist providing the following example:

William is a short, shy man. He has a passion for poetry and loves strolling through art museums. As a child he was often bullied by his classmates. Do you suppose that William is (a) a farmer, or (b) a classics scholar?

Everyone I ask this question chooses (b). But that isn’t right. There are vastly more farmers than classics scholars, and even if only a small fraction of farmers match William’s description, their number is still likely to be larger than the entire population of classics scholars. (Did you just get burned by your meager probabilistic reasoning faculties?) The psychologists Kahneman and Tversky pioneered the field of behavioral economics, which establishes, among other things, that our heuristics for reasoning about uncertainty are quite bad. You can probably think of many patently dumb things that people have done with their money and with their lives when a simple evaluation of the uncertainties would have resulted in better outcomes.
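To put rough numbers on it (invented purely for illustration): suppose the country has 2,000,000 farmers and 10,000 classics scholars, and suppose William’s profile fits only 1% of farmers but a full 50% of classics scholars. That still leaves about 20,000 William-like farmers against 5,000 William-like scholars, so the odds are roughly 4 to 1 that William is a farmer. The base rate swamps the stereotype, which is exactly the kind of mistake Kahneman and Tversky documented.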

So back to getting people to accept uncertainty as an inherent part of the world. As you can probably tell, I am not holding my breath. On evolution, the timescales are too long to be able to provide the incontrovertible proof to change most people’s minds. Maybe a better approach is to reason by analogy. There is an absolutely staggering amount of purposeless evolution unfolding at breakneck speed before our very eyes. I am talking about the Web, the very medium through which you are reading this. In only about ten years a significant portion of the world’s knowledge has become available, is almost instantaneously accessible, and it’s free. Consider these figures from a recent article by Kevin Kelly. The thing we call the Web has

  • more than 600 billion web pages available, which are accessible by about 1 billion people.
  • 50 million simultaneous auctions running on eBay, adding up to 1.5 billion a year.
  • 2 billion searches a month being done on Google alone.

Think back to what you were doing ten years ago. Did you ever really think that any of this would happen? The scale at which the internet operates was envisioned by none of the engineers and computer scientists who collaboratively attempted to design the basic substrate of protocols upon which it runs. In truth, the innovations and designs of the web come from the collective energies of its users, and not according to an intelligent design or a blueprint. Here the purposelessness of evolution is much easier to see. One day in the future some theory will reveal, as a simple consequence, why all of a sudden in the years 2004-05 there sprang up 50 million blogs, with a new one coming online every 2 seconds. This theory of evolution will be framed by a law, and this law will have at its core an indelible, irreducible kernel of chance. And chances are, most people will have a hard time believing it.

Monday Musing: Enchantment and pluralism, some thoughts while reading Jonathan Strange & Mr. Norrell

Throughout much of the writings of the German sociologist Max Weber, you can find the claim that modernity and its rational control over the natural demanded the disenchantment of the world; that is, the exit of the sacramental from material things and the end of sacrament as a means (or rather as an appeal to the world) to fulfill our roles and ends. The roles of the religious and the spiritual dwindle. Science and technology displace magic. But specifically, they displace magic in the realm of means.

Weber saw this mostly in the rise of capitalism and the modern bureaucracy and in the Protestantism that has, or had, an “elective affinity” to modernity itself.

Only ascetic Protestantism completely eliminated magic and the supernatural quest for salvation, of which the highest form was intellectualist, contemplative illumination. It alone created the religious motivations for seeking salvation primarily through immersion in one’s worldly vocation. . . For the various popular religions of Asia, in contrast to ascetic Protestantism, the world remained a great enchanted garden, in which the practical way to orient oneself, or to find security in this world or the next, was to revere or coerce the spirits and seek salvation through ritualistic, idolatrous, or sacramental procedures. No path led from the magical religiosity of the non-intellectual strata of Asia to a rational, methodical control of life. (The Great Religions of the World)

And that pinnacle expression of, and institution for, the methodical control of the world, the bureaucracy, was notable, according to Weber, precisely for its irreligion.

A bureaucracy is usually characterized by a profound disesteem of all irrational religion . . .(Religious Groups)

Reading Susanna Clarke’s Jonathan Strange & Mr. Norrell, which admittedly I’m only half-way through, I was reminded of Weber (which is not so uncommon). The novel, set in the early 19th century, concerns the reappearance of magic in the modern world. In the novel, magic existed once upon a time, but had disappeared three centuries earlier, at the end of the Middle Ages. Against the backdrop of the Napoleonic Wars, two practicing magicians appear in England—a re-enchantment, of sorts.

Prior to the appearance of the two practical magicians, magic is purely theoretical, the occupation of historians and scholars, but not practitioners. Interestingly enough, these historians and scholars in the novel are also called “magicians.” The magic societies resemble philosophy circles and salons. And the idea of magic in the novel as a metaphor for philosophy is an obvious one, if only because the line between magic and philosophy seems so blurry in the Middle Ages. Merlin certainly appears a philosopher magician, a sage.

The two, Jonathan Strange and his teacher Mr. Norrell, lend their services to the war effort, and we are given an image of magic interacting with the specialized, but also distant and abstract, knowledge of bureaucracy. And it’s a funny image: two separate relationships to means in conflict, with neither depicted in a flattering way.

Enchanted (or mysterious) means don’t seem any more sensible or effective than dis-enchanted (rational, methodical) ones. (At least so far.)

(And I was also disappointed to learn that the connection between “wizard” and “vizier” is accidental.)

I was thinking of these issues in the context of a larger one: namely, why does so much fantasy appear to be conservative? The Lord of the Rings seems clearly to be conservative in its politics, not just Tolkien. And by conservative, I don’t mean that it simplifies politics but rather that it harkens back to a time before a monistic conception of the good—as given by religion, usually—collapsed in favor of the pluralism of ends that we enjoy and which defines the freedom of the moderns. To follow John Holbo and invoke Isaiah Berlin, people disagree about the ends of life and not just the means. And the modern world has been set up to allow people to disagree and live their lives in the way they like without too much conflict, at least ideally.

There are exceptions to my claim that fantasy seems to go with conservatism, to be sure: Buffy the Vampire Slayer, for one. But it does seem that the practical representation of magic often takes place against the backdrop of at least a locally all-embracing purpose, most commonly war. It’s almost as if the absence of a methodical control of life and the world requires that the ends of life be controlled thoroughly. Conversely, the rationalization of the world appears to go hand in hand with the pluralism of ends. (Of course, Weber, and some of those he inspired, including the Marxist Frankfurt School, were terrified that values—monistic or plural—would exit altogether from the modern world under its rationalization, and that means would become ends in themselves. Although it seems that no one can give an example other than the accumulation of money or commodities.)

At least so far, Clarke seems to avoid the conundrum, or appears to make fun of the genre’s political naiveté. (It apparently gets even better, in terms of political richness.)  And it seems to me that to the extent that the backdrop of fantasy can shift from the Wagnerian saga into the quotidian, magic can find a place in the modern world.

Lives of the Cannibals: Redemption

On May 29, 1983, Steve Howe, a 25-year-old relief pitcher for the Los Angeles Dodgers, checked himself into a drug rehabilitation center to treat an addiction to cocaine. Howe was a promising young star, 1980’s Rookie of the Year, and endowed with the hyperactive, pugnacious demeanor of a natural-born “closer,” the pitcher charged with saving tight games in treacherous late-inning situations. He completed his rehab in late June, but was sent away again in September after missing a team flight and refusing to submit to urinalysis. He tested positive for cocaine three times that November, and was suspended from baseball for the 1984 season, one of several players caught up in the decade’s snorty zeitgeist. Howe returned to the mound in ’85 and over the next six years pitched sporadically for the Dodgers, the Minnesota Twins and the Texas Rangers, as well as a Mexican League team and a couple of independent minor-league clubs in the States. But June of ’92 found Howe busted again, and Fay Vincent, then the commissioner of baseball, banned him for life. An arbitrator later vacated Vincent’s decision, reinstating Howe, and the New York Yankees signed him to pitch in the Bronx. After Yankee relievers suffered a mid-season collapse in 1994, Howe stepped into the breach and, notwithstanding his caged pacing and myriad facial tics, recorded 15 clutch saves and a 1.80 earned run average, winning the enduring affection and respect of Yankee fans, who have a proud history of adopting the troubled and eccentric, just so long as they win.

Welcome to New York, perhaps the most prolifically redemptive island in human history. Granted, islands are built for redemption. Their isolation and exclusivity require new beginnings from their inhabitants, and they tend in general (and New York’s islands tend in particular) to transact life on terms different from other places. In the City, where the hybrid system runs on aggression, aplomb and sex appeal, fatuous Wall Street wizards and Upper-East Side tastemakers serve prison sentences and emerge hotter than ever, redeemed not by God or humanism but by the very fact of their fall from grace. It’s exotica, a matter of salacious interest and a perfect bluff for the social scene, where a big rep is all it takes, and the smart ones ride theirs all the way to a clubby write-up in Talk of the Town. Sure, a prison term is a nuisance, but it’s also useful (if bush-league) preparation for the more exigent realities of life in Manhattan. So it’s no surprise that we should admire the same things in our more middle-class heroes–our athletes and actors, and our politicians too. We want contrition, of course, and we must remember the children, but a little imperfection makes for a compelling character, and we won’t have that sacrificed.

The New York Yankees opened their 2005 season 11-19. It was the worst start anyone could remember, and it came on the heels of the greatest collapse (or comeback, depending on your regional perspective) in baseball history, when, in the second round of the 2004 playoffs, the Yankees were eliminated by the Red Sox despite winning the first three games of a best-of-seven series. In every one of the last nine years, they had made it to the playoffs, and in every one of the last seven, they had won the American League’s Eastern Division title, but 2005 seemed different. They were paying fifteen, ten and seven million dollars to three starting pitchers of dubious value–Brown, Wright and Pavano–and they had purchased the super-rich contract of Randy Johnson, once inarguably the finest pitcher in the major leagues, but now, at 41, a cranky and unreliable prima donna, whose 6’7” frame and acne-scarred face looked pained and out of place in Yankee pinstripes. Their beloved veteran center fielder Bernie Williams couldn’t throw anymore, and their traditionally solid bullpen hemorrhaged runs nightly. It was a difficult reality for fans who had been treated to a decade of near-constant success, but it was manna for the millions of Yankee haters, whose unfailing passion evinces the team’s historical greatness and cultural significance. In the wake of their ignominious 2004 defeat at the hands of the Red Sox, and finding themselves in last place in the American League East, the Yankees and their fans despaired. It was over.

Enter Jason Gilbert Giambi and Aaron James Small, high school classmates from California and unlikely Yankee teammates, whose personal redemptions spurred the 2005 Yankees to their eighth consecutive division title on Saturday. Giambi, a longtime star slugger, is one of the few known quantities in the recent steroid controversy (and Capitol Hill comedy, where the workout regimens of professional athletes have curiously attained massive political profile), whose leaked grand jury testimony marks him as a confirmed (though not explicitly stated) user. Giambi spent most of 2004 on the Yankees’ disabled list, recovering from mysterious fatigue and a suspicious tumor, both of which, it seemed likely to pretty much everyone who gave it any thought, might just be the rightful wages of sticking a hypodermic needle in your ass and suffering nascent breast development, in exchange for increased strength and the ability to heal faster (a superhero’s tradeoff). But if nothing else came clear in 2005, at least Jason Giambi wasn’t on the juice. Never did a hitter look more helpless at the plate than poor Jason. He flailed and whiffed, and the earnest cheerfulness that once endeared him to fans and teammates curdled into delusive optimism. He was done.

But he wasn’t. Through the first two months of the season, Giambi claimed to be on the right track. He still had his good eye, he pointed out, referring to all the walks he earned, and it was just a matter of timing and bat speed after that. Fans and the media were indulgent but skeptical. The Yankees are a known rest-home for aging, overpriced talent, and Giambi’s story, though more dramatic than the trajectory of your average baseball player’s decline, did fit the profile. But, much to everyone’s surprise, he started hitting again, and what he started hitting were home runs–tall flies that took ages to land, and missiles that slammed into the bleachers moments after cracking off his bat. Giambi began driving in runs at a faster pace than anyone else on a team full of standout run-producers, and he continued reaching base on the walks that served as his crutch in those first miserable months, all of which amounted to league-leading slugging and on-base percentages. Jason was redeemed, and his legend is assured now as the star who wanted more, who lost everything to greed and arrogance, and who recovered his glory, which is now vastly more appealing for the fact that it’s tarnished. It’s a real New York kind of story.

As for Aaron Small, his is a story of redemption too, but one more suitable for middle America, which might not take so kindly to the resurrected likes of Steve Howe and Jason Giambi. Like Giambi, Small is a 34-year-old baseball veteran, but a veteran of the minor leagues, whose only pro success has been the several “cups of coffee” (as baseball cant has it) he’s enjoyed in the majors in 16 years of playing–short stints in the bigs, followed by interminable bus rides back to the minors. This year, Small was called up to plug the holes left by the Yankees’ multimillion-dollar washouts, Brown, Wright and Pavano. Small, it should be noted, is the type of guy who thanks God for minor successes, a tendency not uncommon in local basketball and football players, but one that seems exceedingly peculiar in a glamorous Bronx Bomber. Nevertheless, he has been embraced by New York fans, and their acceptance has everything to do with the ten victories he compiled (against no defeats) in his partial 2005 season. This modest, Southern country boy outpitched every high-priced arm the Yankee millions could buy, and after every game he shucksed his way through interviews, praising his patient wife, praising his remarkably attentive savior, and just generally expressing his shock and pleasure at finding himself in the heat of a big-league pennant race after more than a decade-and-a-half of slogging his way from minor-league town to minor-league town. Small’s story is relevant here because his time is short. His 16-year patience, his redemption, will not remain in the minds of New Yorkers very long, not unless he does something colossally self-destructive–and he better do it quick. We like a little dirt on our heroes, a little vulgarity, because otherwise it’s all hearts and flowers and straight-laced (and -faced) fortitude, and what could be more dull? New York takes pride in its corruptions, and a hero isn’t a New York hero until he’s been dragged down and beaten (preferably by his own hand).

And this is why the 2005 Yankees have a shot at being the most memorable team to come out of the City in years. They’ve seized every opportunity to make things hard this season. Every potential run-scoring at bat, every pitching change and every difficult fielding chance has come with the sour taste of unavoidable failure, the sense that we’re almost out of gas now after a decade at the top. Our trusty veterans have lost their vigor and our big-name stars are compromised–by their egos, their paychecks and their tendency to choke. The obstreperous owner is lapsing into dementia, and even Yankee Stadium itself has entered its dotage. Indeed, what we’re confronted with is the last, limping formation of a great baseball team, occasionally disgraced by its swollen personalities and bottomless, ignorant pockets, trying to fashion for itself a true New York kind of glory–one that climbs out of the depths, battered and ugly. This is our redemption.

Lives of the Cannibals: The Spell of the Sexual
Lives of the Cannibals: Rage

Poison in the Ink: The Life and Times of Fridtjof Nansen

In 1906 Santiago Ramón y Cajal and Camillo Golgi shared the Nobel Prize in Physiology or Medicine for their contributions to neuroscience: Cajal for work that helped lay the foundation for the Neuron Doctrine, and Golgi for the development of the Golgi stain, which was crucial to the work of so many neuroscientists, including Cajal. What is unknown to most people, however, is that a Norwegian zoologist named Fridtjof Nansen had declared the independence of the cellular nerve unit a year and a half earlier than Cajal, using the same Golgi stain employed by the Spanish histologist. When Cajal was just beginning to learn about the staining technique from a colleague, Nansen had already published a paper stressing the point.

On October 26, 1892, a crowd gathered for the christening of the Fram, a custom-built ship designed to take Fridtjof Nansen and his crew to the roof of the world. Four years had passed since Nansen had become the first European to cross the interior of Greenland, and he now hoped to win the race to become the first to reach the North Pole.

Among the guests present at the event was Gustaf Retzius, a colleague from Nansen’s early days as a neuroscientist. During a speech made at dinner that night, Nansen turned to Retzius and said that the field of neurobiology, like polar exploration, involved “penetrating unknown regions,” and that he hoped one day to return to it.

For all of his good intentions, Nansen never did return, and it would be something he would express regret over many times throughout his life. As he put it, after “…having once really set foot on the Arctic trail, and heard the ‘call of the wild’, the call of the ‘unknown regions’, [I] could not return to the microscope and the histology of the nervous system, much as I longed to do so.”

Those familiar with Nansen probably know him as an arctic explorer and as a world-famous diplomat who was awarded the Nobel Peace Prize in 1922 for his efforts to repatriate refugees after World War I.

But before the arctic expeditions and the humanitarian work, Nansen was a young zoologist interested in biology and the nervous system. He was one of the world’s first modern neuroscientists and one of the original defenders of the idea that the nervous system was not one large interconnected web, but instead was made up of individual cells that Wilhelm Waldeyer would later call “neurons” in his famous 1891 Neuron Doctrine.

From a young age, Nansen was fascinated with nature; he loved its “wildness” and its “heavy melancholy,” and he was happiest when he was outdoors. When it came time for Nansen to enter the University of Christiania (now known as the University of Oslo), he chose to major in zoology.

During his first year, Nansen answered a call from his department for someone to visit the arctic and collect specimens of marine life. In 1882, he set off for the east coast of Greenland aboard the sealing vessel Viking on a voyage that would last four and a half months.

The trip was a unique turning point in Nansen’s life. It provided him with his first glimpse of the Arctic and instilled in him the desire to cross Greenland’s icy interior.

“I saw mountains and glaciers, and a longing awoke in me, and vague plans revolved in my mind of exploring the unknown interior of that mysterious, ice-covered land,” Nansen wrote.

Upon his return, the 20-year-old Nansen was offered a post as curator of the zoological department at the Bergen Museum. Nansen gladly accepted the position. His arctic dreams were put aside, and the next six years were spent studying the invertebrate nervous system through a microscope.

One of the greatest difficulties Nansen faced in his research involved staining sections of nerve tissue. With the methods available at the time, the most that could be revealed of a neuron was its cell body, the proximal—and sometimes secondary—branch-like extensions of its dendrites and the initial segments of its thread-like axon.

At around this time, word was circulating that an Italian physician named Camillo Golgi had developed a new staining technique, one that stained only a few nerve cells in a section at a time, but which stained them so thoroughly that they were visible in their entirety.

After catching wind of Golgi’s technique from a colleague, Nansen decided to pay the Italian doctor a visit. Despite arriving unannounced at Golgi’s lab in Pavia, Nansen was surprisingly well received, and under the doctor’s careful tutelage he mastered what would become known as the Golgi stain in only a matter of days.

Upon his return, Nansen applied the Golgi stain to the nerve cells of a primitive fish-like animal called the lancelet. For the first time, Nansen could see clearly all the intricate branches of a neuron’s dendrites and could follow the entire length of an axon before it made contact with another neuron.

Armed with this new tool, Nansen began seeing things that couldn’t be explained by the reticular network theory, the reigning theory at the time for how nervous systems were organized. According to this theory, the nervous system was like a giant mesh net, with nerve impulses—whatever they might be—traveling unimpeded from one area to another.

One of Nansen’s objections to this view was based on a simple anatomical observation. The existence of unipolar neurons, or unipolar “ganglion” cells as they were known at the time, puzzled Nansen and led him to ask a very logical question: how could unipolar neurons exist if nerve cells fused into one another, as was commonly believed? “How could direct combination between the cells be present where there are no processes to produce the combination?”

As their name suggests, unipolar neurons have a single primary trunk that divides into dendrites and an axon once away from the cell body. This is different from the image of neurons that most people are accustomed to, which show numerous dendrites branching off the cell body at one end and a long thread-like axon, terminating in tiny knobs at the other.

Other scientists attempted to explain away unipolar neurons by arguing that they were not very common. The closer the nervous system was examined, they said, the fewer unipolar neurons were found, especially in vertebrates like mammals and humans. Nansen remained unconvinced and pointed to invertebrates like lobsters, whose nervous systems are made up almost entirely of unipolar neurons. To Nansen, this was strong evidence that the reticular network theory couldn’t be correct, and in an 1887 paper Nansen made the statement–bold at the time–that “a direct combination between the ganglion cells is…not acceptable.”

Nansen had his own theory about how nerve cells might combine. He proposed that it was in the ‘dotted-substance’ (what modern neuroscience calls “neuropil” in invertebrates and “gray matter” in vertebrates) that nerve cells communicated with one another. Nansen went even further, prophetically declaring that this ‘dotted-substance’ was the “principal seat of nervous activity” and “the true seat of the psyche.”

In the concluding paragraph of his 1887 paper, Nansen insisted that the dotted-substance would no doubt prove to play an essential role in whatever the final function of the nerve cells was determined to be. Unable to resist making one last speculation, Nansen also wrote the following:

“It is not impossible that [ganglion cells] may be the seat of memory. A small part of each irritation producing a reflex action, may on its way through the dotted substance be absorbed by some branches of the nervous processes of the ganglion cells, and can possibly in one way or another be stored up in the latter.”

In this, Nansen was especially farsighted, touching upon what modern neuroscience calls “neuroplasticity,” currently one of the most promising explanations for how simple reflexes can undergo modifications that last for minutes at a time and how learning can lead to behavioral changes that last a lifetime.

In the spring of 1888, Nansen presented a shortened version of his paper for PhD consideration to a review board in the ceremonial auditorium of Christiania University. In what was described as a heated presentation, Nansen reiterated his firm belief that nerve cells were not fused into a reticular network, that they were instead independent cellular units. Nansen’s conclusions were met with hostility by the review board’s members and he was accused of jumping the gun and getting ahead of his evidence.

Nansen was awarded his degree in the end, but not before one panel member expressed his firm conviction that Nansen’s hypothesis was destined to be forgotten like so many others.

The experience was a taxing one for Nansen. “In the end, there was such a confusion of one thing on top of another…that I believe that had it continued any longer I would have had a nervous breakdown,” he later wrote to a friend. “There was hardly a second to spare; we finished precisely as calculated, but no more.”

In this, Nansen wasn’t exaggerating. He was running out of time. Nansen was scheduled to depart four days after his PhD defense on a trek across the unexplored interior of Greenland. A long-time dream was finally coming true.

Nansen personally saw to every aspect of the trip. In a plan that critics called dangerous and foolhardy, Nansen proposed to cross Greenland from east to west. It would be a one-way ticket for himself and his team, with no chance of turning back.

“In this way one would burn one’s boats behind one,” Nansen wrote. “There would be no need to urge one’s men on, as the east coast would attract no one back, while in front would lie the colonies on the west coast with the allurements and amenities of civilization.”

It took nearly two months, but in the end Nansen proved his critics wrong and his company of six became the first Europeans to cross the frozen island’s expansive interior.

The Greenland expedition gave Nansen his first taste of international fame. It sealed his reputation as an explorer and ended his career as a zoologist. Wanderlust had found its perfect victim, and soon Nansen was making plans to embark on another first.

For his next adventure, Nansen set his sights on becoming the first to reach the North Pole. In a highly criticized plan, Nansen proposed to freeze the Fram in an ice floe and let it drift along a current that flowed from east to west across the Polar Sea.

But things didn’t turn out quite as Nansen had hoped, and the Fram did not drift close enough to the Pole. In a last-ditch effort to salvage the mission, Nansen left the ship, determined to complete the journey on foot. He took with him only one other companion, Hjalmar Johansen, some sled dogs and enough supplies to last three months.

But the harsh conditions and uneven terrain proved to be more than the pair expected, and the two watched helplessly as their original three months stretched on for much longer.

“We built a stone hut, we shot bears and walrus, and for ten months we tasted nothing but bear meat,” Nansen wrote in his journal. “The hides of the walrus we used for the roof of our hut, and the blubber for fuel.”

In the end, a lack of supplies forced the two to turn back before they could reach the North Pole, but they held the record for Farthest North for five years, until it was broken in 1900.

The Fram voyage was Nansen’s last major expedition. As he grew older, Nansen became increasingly involved in politics, first becoming the Norwegian ambassador to London and then a high commissioner for the newly formed League of Nations. From 1919 until his death in 1930, Nansen was a devoted global humanitarian. In 1920, when nations were still trying to rebuild Europe after the devastation of World War I, Nansen was dispatched by the international organization to direct the repatriation of half a million prisoners of war who had not yet been exchanged. Afterwards, Nansen successfully raised funds for famine relief efforts in Russia on behalf of the Red Cross.

For his success in these two tasks, Nansen was awarded the Nobel Peace Prize in 1922. When presenting him with the award, the Chairman of the Nobel Committee had these words to say about Nansen: “Perhaps what has most impressed all of us is his ability to stake his life time and time again on a single idea, on one thought, and to inspire others to follow him.”

The reference was to Nansen’s humanitarian work, but the same sentiment could have just as easily been applied to his numerous other undertakings. Whether he was navigating uncharted landscapes of ice, introducing compassion to the realm of politics, or defending an unpopular view of the nervous system, Nansen readily staked his reputation and often his life on his beliefs. Any of these tasks could have easily occupied a person for a lifetime, but Nansen tackled each unknown with fresh enthusiasm, and was rewarded in many cases with success.

Poison in the Ink: The Makings of a Manifesto
Poison in the Ink: Visiting Trinity

Monday, September 26, 2005

Atelier: Hurricanes, Race, and Risk

Ray Nagin, mayor of New Orleans, has, as of yesterday, officially invited the residents of Algiers to return to their homes. Algiers, a neighborhood of 57,000 people, is situated on the other side of the Mississippi River, away from the main part of New Orleans; consequently, it was largely untouched by the massive flooding from Hurricanes Katrina and Rita, and, unlike much of the rest of the city, it has clean water and electricity. The Ninth Ward, on the other hand, perhaps the hardest hit of all the New Orleans neighborhoods, was the site of a second round of crumbling levees and massive flooding, this time courtesy of Hurricane Rita. The differences between these two neighborhoods – one predominantly white and middle class, the other impoverished and overwhelmingly black – are, of course, largely over-determined. It seems, however, that of all the differences between Algiers and the Ninth Ward, the most nettlesome one continues to be the fact of their racial difference, a difference, for sure, that New Orleans, especially given its racially contentious history, is keenly aware of. It is this racial distinction, and the host of inequities that this distinction serves to cover up, that has been so ruthlessly exposed and indicted by the arrival of Hurricane Katrina.

What is unusual about New Orleans is that, historically speaking, racial segregation in U.S. cities has generally followed a purely horizontally-oriented spatial logic. Detroit provides perhaps the best example of this movement: generous FHA housing subsidies encouraged whites to migrate to the outlying suburbs, while those residents who remained in the inner city, an overwhelming number of whom were black, were left to grapple with the difficulties of an inner city that was increasingly dilapidated, de-industrialized, and under-serviced. The topography of New Orleans is a spatial manifestation of these same generous and highly racist Federal Housing Administration loans; we see a similar urban sprawl effected by the movements of whites out of the center of the city. The history of New Orleans, when seen through its longue durée, adds a particular Cajun piquancy to the normal ways in which space is meted out in relation to race. Of all the amenities available to white New Orleans, its most easily forgotten has always been its relative safety from the contingent forces of nature. While New Orleans had its last colossal flood in 1927, the Ninth Ward has suffered several smaller ones: the Industrial Canal, which cuts through the Ninth Ward, and which failed during Katrina’s onslaught, has failed three times before, during Hurricane Flossy in 1956, Hurricane Hilda in 1964, and Hurricane Betsy in 1965.

Because of its peculiar geography, New Orleans has always been a disaster waiting to happen; consequently, real estate values in New Orleans reflect and anticipate this impending danger: how badly a given flood or hurricane affects you is determined primarily by where in New Orleans you live. The Ninth Ward is both the poorest and the lowest part of New Orleans. This confluence of qualities particular to the Ninth Ward makes sense once we examine those gross inequities that Hurricane Katrina served to expose. In New Orleans, differing abilities to insure against the contingencies of the future are not only bifurcated along an axis of race, but are also physically materialized within the built environment. The spatial features of New Orleans only truly make sense, however, if we take into consideration the ways in which race serves to cover over these economic and spatial inequalities.

In the context of the United States, the construction of race has historically manifested as a black and white binary; such a forced construction needs perpetual maintenance, of course: the case of Plessy v. Ferguson (1896) – brilliantly analyzed in Saidiya V. Hartman’s Scenes of Subjection – is only the most dramatic instantiation of a wide variety of legal and social mechanisms employed to maintain the fiction of racial difference. This fiction – to be cynical for a moment – continues to provide a necessary service: blackness has always functioned in this country, albeit in historically varied ways, as an alibi for those economic inequalities that are an inherent feature of capitalism.

Blackness serves to simultaneously elide the cause and corporeally represent the effect of capitalism’s inherent inequality. By reifying the equivalence between blackness and, say, poverty, there is a retroactive, normative ‘logic’ that comes to the fore which precludes blackness from being seen as a bodily attribute that has been constructed to always already represent a category of people positioned as poor and unequal. This reified equivalence insists upon a direct causal relation that equates blackness in its ontological essence with the existential fact of being poor and unequal. Blackness, to function as it does, requires an illogical confluence between the realm of appearances and the realm of “essences”. To the extent that supposed cultural or economic inequalities come to be represented and representable to society by the appearance of black skin, the structural inequalities of our economic system are, to a certain extent, made to disappear.

As the flood waters recede, so too will the media coverage; perhaps it would do us all well to continue attending to a disaster whose aftermath has exposed more acutely and more incisively than perhaps any other event of the last decade the insidious function of race within the United States. Forced into the national spotlight by a gross and perhaps even criminally negligent mishandling of those worst-off residents of New Orleans, the vast majority of them both poor and black, race (accompanied as always by its steadfast companion, racism) has finally come clean: the sorry truth is that racial distinctions (and the inevitable hierarchy and exploitation that attends such distinctions) are as American as apple pie; race was violently stitched into our nation’s very fabric, right from its beginning. It is high time we looked carefully, without flinching, at that warped and misshapen pattern that our nation has woven.

Planks from the Lumberyard

Spider Holes & Spider Goats: Tuning in the (White) Noises from the Margins

As if anyone needed further reason to become even more paranoid (or is it “perceptive”?), a Marine involved in the capture of Saddam Hussein tells us that the “spider-hole” scenario was a carefully contrived spectacle directed by a “military production” unit; apparently Hussein was actually found in a “modest home in a small village” the day before. And, although by now it’s been long forgotten, the same seems to be true of the classic image of our troops assisting Iraqis with the toppling of the Saddam statue in Firdos Square, which was almost certainly a thoroughly staged spectacle meant to shift public opinion (which, by and large, it did). These stories made only the tiniest disturbances in the mainstream media’s coverage, and were quickly lost in the endless maelstrom of disastrous news pouring in from both home and abroad.

Despite the surfeit of available information and the variety of news sources, it’s hard to know who or what to believe these days: when we are made to rely on questionable evidence to understand events we have not witnessed, the whole epistemological structure crumbles. Connections are indeed everywhere, but it takes a certain degree of sense and maybe even taste to follow them; it seems doubtful, for instance, that the resemblance of the satellite image of Hurricane Katrina to an unborn fetus bears any theodical significance.

The curious paranoid, however, will find his strangest and worst fears confirmed (and plenty of fuel to feed his fire) in reading some choice bits from the FBI files on “unusual phenomena,” now readily available on the FBI’s website thanks to the Freedom of Information Act. There you will find extraordinary accounts of the FBI’s investigation into the rampant cattle mutilations in the Midwest in the 70s and 80s; or the remarkable story/hoax of the “Majestic 12,” a secret ensemble of the country’s top scientific minds gathered to collaborate on decoding top secret alien codes found at Roswell. It’s unsettling to read through these files and consider how many man-hours and tax dollars must have been spent investigating each of these cases, which, to the contemporary field officer, must have seemed even more strange. And as any viewer of the X-Files could tell you, what better way to expose an event as a hoax than to offer the files freely to the public? If you are wondering, by the way, the source of the ongoing cattle mutilations has never been determined.

Skeptical? Let’s turn for a moment to “nature.” Have you heard about the spider-goats of Plattsburgh, New York? These two genetically-modified goats are surrounded by razor wire and guarded by armed security for the spider silk protein they produce in their milk, which is three times stronger than Kevlar and intended for use in weaving bulletproof clothing, medical sutures, or superstrong ligaments. Perhaps you’ve heard of the parasitic hairworm, the zombifying parasite that rewires the grasshopper’s brain, inducing it to take a suicidal leap into open water? Or the wasp that injects its larva into the orb-weaver spider, where it grows until, the day before killing its host, it rewires the spider’s brain so that the spider weaves a web custom-made for the larva to pupate in? And I’m sure things get far weirder still.

Although these strange noises issuing from the margins are rarely picked up by the major media, they resonate on more than just the paranoid registers. At the very least, they make for interesting reading; at best they provide non-normative perspectives and logics through which to receive and evaluate the larger flows of information. Allowing for the possibility that things are darker and more strange than they appear need not necessarily be considered the first step down the fun-house hall of paranoia. Between spider holes and spider goats lies a great swath of unprocessed and poorly understood phenomena, things that only the figures on the margins seem to take seriously, be they (mad?) scientists or independent media hounds. And who knows but that, on the lower frequencies, the noise generated by the alt. zines and indy media of the world speaks to you?

monday musing: 7 train

It’s a rather long story as to why, but Stefany, my wife my love, and I have just spent roughly twelve hours riding back and forth on the 7 train in Queens, New York. For those who don’t know, the 7 train runs from Times Square in Manhattan out to Main Street in Flushing, Queens. It’s a trip from one world into another. No, it’s a trip through several worlds and a number of levels of experience on top of that.

A few years ago John Rocker, a pitcher for the Atlanta Braves, hurled some attention the 7 train’s way with notable comments to Sports Illustrated. He said of New York, “It’s the most hectic, nerve-racking city. Imagine having to take the [Number] 7 train to the ballpark, looking like you’re [riding through] Beirut next to some kid with purple hair next to some queer with AIDS right next to some dude who just got out of jail for the fourth time right next to some 20-year-old mom with four kids. It’s depressing.”

Imagine.

Of course, it’s easy to make fun of a silly hick like John Rocker. It’s rather more difficult to explain just how wonderful and sublime it is to meander across the Queens landscape on an early Fall afternoon dipping down into the street life of particular neighborhoods and then ascending again to the platforms of the friendly old train. It straddles whole blocks. It dominates an entire chunk of Queens Blvd., and then Jackson and before that it weaves through the massive warehouses of Long Island City like an ancient snake that has its own well-worn paths.

The 7 train is great in a million ways but it really shows off after Hunter’s Point when it gets to burst out of the tunnel and go above ground. It’s a cocky train. That probably comes from sitting around at Times Square and 5th Ave. and Grand Central. The 7 train knows the bright lights and the glamour. But that’s not where it stays or where it spends its time. The 7 train heads out to Queens and it has its heart there.

Not able to get its head straight after the blast across the East River, the 7 train bounces around the lost blocks of LIC like it’s trying to find someone it knows or shake someone it doesn’t. The developers in LIC spent so long concentrating on the neighborhood as naught but a residential appendage of Manhattan that that is what it feels like. It’s a vampire living for someone else.

But the rest of Queens was made for people to live in and they do. When the 7 train settles down on Queens Blvd. and then works its way into Woodside and Jackson Heights you can feel its gentle chugging on the old wooden tracks and you can hear a kind of metallic, mechanical confidence. If you have an ear for such things.

The train is filled with everybody, by the way. There is no greater definition of everybody than the 7 train. There is no more powerful statement of ‘everybody’ than scanning your eye across a stretch of 7 train at anytime, any day, whenever you like it and some struggling fuck-up from Mexico or Bangladesh or Korea or Uzbekistan or Ecuador or Kenya or Romania has a kid on his lap and is stroking the kid’s hair and saying in whatever mumbly-talk incomprehensible language “go to sleep my sweet child go to sleep my lovely child I’m doing everything for us that I possibly can.”

If you ride the 7 train enough like I do you see that kind of shit all the time. You might think you get blasé about it but you don’t. It makes some part of your chest cavity swell up dumb and sputtering and overdrawn. That might be one response to John Rocker types if it could be expressed more clearly.

You’ve got tons of Indians and other flavors of South Asian around the 60’s and 70’s and then it’s dominated by Latinos in the blocks after that. It’s not the same New York as other places here and that doesn’t mean it’s either better or worse. What is remarkable is the sense of transference that occurs. Manhattan is an international place but it brings all the world into its orbit. Queens reverses that.

I’m sure that walking around certain blocks in Corona Park more closely approximates walking down a specific street in some little po-dunk town in the countryside of Colombia or Peru than anything else in the world. It’s like the fucking Incan empire had a second wind on a few of those streets.

And then boom, you’re in the frickin far East. Korea, China, Korea, China, a touch of the Philippines and throw in whatever else. Main Street is just simply Asia, which kind of delights the mind to consider for a moment.

Anyway, I hope this is how all things get. I hope all cities get smooshed up and tossed around like Queens did. I hope the 7 train isn’t something just accidental that happened and will go away, will get lost somehow or forgotten. When the sun is sliding away to the West and you’re standing on one of the sparse platforms of the 7 train you see Queens like a weird urban plain at the foot of the Manhattan skyline. It’s sweet and quiet but for the slow rumble of the subway cars doing their rounds day and night. This is a new kind of nature that I love and I’ll speak for as best I can. It’s as good as anything.

Dispatches: Optimism of the Will

The first time I saw Edward Said, in 1993, I was an undergraduate studying literature at Johns Hopkins, where he had come to give a lecture.  An extremely pretentious young person, I arrived in the large hall (much larger than the halls in which other visiting literature professors spoke) with a mixture of awe and, I’m afraid to say, condescension.  This was born of the immature idea that the author of Orientalism had ceased to occupy the leading edge of the field, postcolonial studies, which his work had called into being.  At that time, the deconstructionist Homi Bhabha and the Marxist Aijaz Ahmad were publishing revisions of (and, in the case of Ahmad, ad hominem attacks on) Said’s work, and Said himself seemed to be retreating from “theory” back to some vaguely unfashionable (so it seemed to me) version of humanism. 

There are interludes in which a thinker’s work, no matter how enabling or revolutionizing, is liable to attack, to labels such as “dated” or “conservative,” from more insecure minds. In this case, the actual presence of Said destroyed those illusions utterly. Seated on a dais at a baby-grand piano, he delivered an early version of his reading of “lateness,” on the late work of master figures such as Beethoven and Adorno. In a typical stroke, Said’s use of Beethoven’s late work as one example, and then Adorno’s late work on Beethoven as a second example, highlighted the mutual relationship between artist and critic, each dialectically enabling the other’s practice. The further implication, of course, that Said himself was a master critic entering such a late period (he had recently been diagnosed with cancer) was as palpably obvious as the idea that Said would say such a thing aloud was preposterous. And on top of it all, he played the extracts from Beethoven he discussed for us, with the grace of a concert pianist (which he was). I left the auditorium enthralled.

Within the next year, I was lucky enough to be invited to dinner with Said by my aunt Azra, who was one of the doctors treating him for leukemia.  Seated next to him, I challenged him on several subjects, with the insufferable intellectual arrogance of youth.  His responses were sometimes pithy and generous, sometimes irritated and indignant.  On Aijaz Ahmad, who had been attacking him mercilessly and unfairly, he simply muttered, “What an asshole.”  How refreshing!  When I asked him why we continued to read nineteenth-century English novels, if they repressed the great human suffering that underwrote European colonial wealth, he gave the eminently sensible answer, “Because they’re great books.”  At another dinner, at a Manhattan temple of haute cuisine where he addressed the waiters in French, I complained that the restaurant’s aspirations to a kind of gastronomic modernism were at odds with their old-fashioned, country club-ish “jacket required” policy.  He raised an eyebrow at me and dryly remarked, “I hadn’t even noticed the internal contradiction.”  Score one for the kid.

In 2003, as a graduate student in English at NYU, I rode the subway up to Columbia each week for a seminar with Said, which turned out to be the last one he taught. Wan and bearded, Said would walk in late with a bottle of San Pellegrino in hand and proceed to hold forth, off the cuff, about an oceanic array of subjects relating to the European novel (Don Quixote, Gulliver’s Travels, Sentimental Education, Great Expectations, Lord Jim, etc.), alternately edifying and terrifying his audience. He had an exasperation about him that demanded that one know more, speak more clearly, and learn more deeply in order to please him. Some found the constant harangues too traumatic for their delicate sensibilities; I loved having found a teacher who simply did not accept less than excellence. It was a supremely motivating, frightening, vitalizing experience. In a class on Robinson Crusoe, a fellow student became confused about the various strands of eighteenth-century non-conformist Protestantism, prompting Said to irritatedly draw a complex chart of the relations between Dissenters, Puritans, Anglicans, etc. Similar demonstrations of the sheer reserves of his knowledge occurred on the subjects of the revolutions of 1848, the history of Spanish, and the tortuous philosophical subtleties of Georg Lukács’ Theory of the Novel, among other things. Said had a whole theory of the place of nephews in literature (not the real son, but the true inheritor), and he made himself his students’ challenging, aggressive, truth-telling, loving uncle.

What was most impressive, perhaps, was that in a discipline in which rewarding fawning acolytes is the norm, Said never once allowed a student to make facile, moralizing remarks about ‘imperialism’ or ‘Orientalism.’  His belief was that in order to mount any kind of critique of these works, one had to master them first in their own context, on their own terms.  He wasn’t interested in hearing denunciations of colonialism; rather, he was obsessed with getting across the sheer formal complexity, the deep symmetries and ironic gaps, of great artworks.  He demanded that we memorize information of all kinds, from the relevant facts about historical events to the birth and death dates of authors.  He screamed at a student for not knowing what a nosegay was, the key to a climactic scene in Flaubert.  His passion, a powerful negative vaccine, to paraphrase his comments on Adorno, infected those of us who weren’t frightened into disengagement.  I began to realize that here was a unique resource, an irreplaceable historical repository of culture and information, personified in the form of this dapper, indignant man.  Said represented not only a set of unmatched comparatist knowledges, but a collection of rigorous reading practices and an unequaled example of thorny, courageous commitment to difficulty.  And around this time, I found myself realizing that irreplaceable or not, he wouldn’t be around much longer.

One day, Said yelled at me publicly for mispronouncing my own name. I had Americanized its pronunciation for the benefit of a visiting professor, John Richetti, whom I was questioning. “Your name is Us-udth!” Said cried, “It means lion in Arabic! Never mispronounce it for their benefit!” (Richetti was an old friend of Said’s and found being characterized as one of “them” highly amusing; I ran into him a year ago and we laughed about it.) Afterwards, Said walked over and put a hand on my shoulder. “Sorry about that. We can’t change ourselves for anyone. Opposition,” he intoned, quoting Blake, “is true friendship.” That encounter marked a turn for me. I began to visit Said in his office, waiting while he took phone calls from friends like Joan Didion, and telling him about my work. He had the special ability to make one feel that one could achieve anything – maybe it helped that he set the bar so high himself. Despite his heavy criticism of my use of certain theoretical vocabularies he had moved past (“the merest decoration,” he called them), the last word of his handwritten comments on my paper inspired me and continues to inspire me: “Bravo.” As a favorite aphorism of his from Gramsci goes, pessimism of the intellect, optimism of the will. Yet, during this period, he would intimate to me that things were not optimistic with his health at all. One day he declined my request that he read a chapter I had written, saying only, “I don’t have time. You know, Asad, I’m not well.” The unsentimental, factual tone of resignation told me everything I didn’t want to know.

I remember September 25, 2003 vividly, the phone call I received from my aunt Azra with the inevitable news, the sick feeling with which I arrived at the class I was teaching, and then this: a strange, powerful feeling of indignation came over me, and I found myself needling my students, growing irritated when they didn’t know something, and applauding zealously when they made a breakthrough. I was, I realized, channeling or imitating the ornery yet loving spirit of the old lion. And since that time, I’ve suffered more losses, of people I love to illness and absence, and I have thought of him, bravely refusing to stop expecting more. In his last decade, as the situation in Palestine and Israel worsened and beloved friends such as Eqbal Ahmed and Ibrahim Abu-Lughod died, I know Edward felt more alone in the world. With his passing, though we try to forget it, the world has equally become an emptier, lonelier place. Without the superbly contradictory, fearfully charismatic, bravely heartfelt Edward Said, it is also a far less cosmopolitan place. And in the last two years, despite events that have made my world much emptier, much lonelier, I have remembered how to transform irritation with this fallen world into action, how to keep, in the face of all, indignantly hoping for better.

Two years ago yesterday, Edward W. Said died at the age of 67, having achieved eminence in criticism, literature, music, and politics, having served as an exemplar of the one doctrine that perhaps he in his uncompromising way would have accepted uncompromisingly, humanism.  Bravo, Edward.

Dispatches:

Everything You Always Wanted to Know About Vince Vaughan…
The Other Sweet Science
Rain in November
Disaster!
On Ethnic Food and People of Color
Aesthetics of Impermanence

Poetry and Culture

Australian poet and author Peter Nicholson writes 3QD’s Poetry and Culture column (see other columns here). There is an introduction to his work at peternicholson.com.au and at the NLA. All postings at 3QD (September 26, 2005–March 3, 2008) are copyright, the author. All rights reserved. Apart from fair use provisions under the Commonwealth of Australia Copyright Act 1968, and its amendments, subsequent copying in any other manner requires written permission of the author.

The following essay can also be read online at Blesok in an English/Macedonian bilingual edition (Blesok No. 56, Volume X, September–October 2007, ed. Igor Isakovski). It is also available in print in Prosopisia, Vol. 1, 2008, Jayanta Mahapatra, Anuraag Sharma, Pradeep Trikha, eds., Ajmer, India, 44–56.

To Seek and Find: Poetry and Limitations of the Ironic Mode in the New Millennium

The catastrophic events of September 11, 2001 were, obviously enough, an epic moment in world history. And in cultural history too. Here, with Dantesque finality, was a brutal confrontation between annihilating fundamentalism and capitalist pluralism. Art is political, and the implications for art arising out of this attack have complex resonances. Artistic periods never end with punctuation marks of such cataclysmic force, and doubtless, in years to come, there will still be people bringing their lack of seriousness onto us in the name of some tail-end of the expected modernist nirvana. September 11 should have brought us to a political and artistic reckoning. After the outrages in Bali on October 12, 2002, Australian artists have every reason to similarly confront the tradition within which they work and create. What have modernism and postmodernism given us, and what might be the limitations of their aesthetic cultural agendas? And where will we go from this point on?

One might have read Nietzsche’s Beyond Good and Evil as a brilliant intellectual exercise without thereby adhering to his scorn for what he termed ‘slave morality’. Perhaps one could claim Nietzsche as the ‘godfather’ of the present loss of nerve amongst poets, though that would be unfair—Duchamp’s Fountain looks to be a more likely progenitor. In fact there was no getting beyond good or evil, even as love, just as there were clear limitations to the philosophical and artistic liberation proposed by modern and postmodern sensibilities. If Voltaire was the instigator of the Enlightenment, then, surely, an act of terror was the symbolic black end of one loop of cultural experimentation which not all the references to Joyce, Eliot and Beckett, Mallarmé, Kafka or Sartre, Schopenhauer, Heidegger or Foucault, could summon back from entropy. Could the imagery be any starker: the contrast between art that indulged itself, increasingly in the ironic mode, at the cost of any semblance of responsibility to its increasingly unimpressed and diminishing readership; and there, in countless desperate acts, the brittle certainties of the funded fun future made seemingly redundant. In poetry, amenable sensibilities had a propaganda effort made on their behalf worthy of Goebbels, but the prospective audience was never convinced, either by the art itself or the slurry of theory surrounding it. The gentility principle might have driven many to the desperate shores of a verse technique where a confessional mode almost became a therapeutic cry for help; much contemporary verse politicking espoused a similarly-perceived principle. There, the understood ground rules were based on an a priori assumption that what had passed before during the entire history of poetry was no longer adequate to meet the expressive demands of the brave new world. Since stem cell research and silicon chips could only preserve a sense of well-being to a certain extent, some were now going to open up a newly-evolved and superior verse technique that would conquer the deconstructed past and lead us into a freshly-felt and apprehended poetics. Or not. It was all very well to get enthusiastic about the Modernist ethic as espoused by a Le Corbusier. Actually living in the buildings put up proved to be another matter altogether. And living in, or with, the poems put out by the critical establishment as similarly worthy of merit often found readers abandoned in a maze that could lead them up a desolate cultural garden path.

A large part of the critical ethos of our culture, with its net of conservative and avant-garde sensibilities, now seems an inadequate systematisation of the complexities within important works of art. In the sudden and unexpectedly given act of courage, grace or death, or the long slog toward some human dignity—the aid worker getting down in the dust and the blood, the teacher supplanting ignorance with learning—there was an alternative poetic act that had no need of accommodating aesthetics. As artists, we had learned to corral art into convenient and limiting holding pens; the animals inside were then sold off to the highest bidder. Some gave good money for New York expressionism; others paid handsomely for suicide chic; over at the ISCM they put down a fortune for the Boulez electronic extravaganza. But something strange happened along the way because it looked increasingly as if apparently outmoded nineteenth-century art had got beyond whatever forefront was being temporarily talked up. Tristan und Isolde sounded the depth of our skinful, but there was a Verklärung waiting in the wings, and the contemporary had no time for transfigurations. Emily Dickinson’s poetry startled with its savage joy; Goya dragged revolutionary tumult to the edge of the canvas, seething with the imagery of disaster. But art aficionados, safe in their Western enclaves, mute herds trading their tame emails, had entirely missed the point. One was never in advance of the immediate historical moment, however seductive it seemed to want to have it otherwise. There has been some not-very-logical wish-fulfilment in the poetry world based on a futile desire to appropriate a time-traveller’s gold points reward scheme for being ahead of the rest. Certainly, talk about art, and theorising about art, reached the point where secondary considerations—the talk about art—were in danger of supplanting the primary consideration—the art. If Australians grabbed at people who ran or swam fast, or bashed balls of various kinds skilfully, as a desperate remedy for a failure to confront their destiny—Aboriginality, salinity, harsh political reality—the rest of the Western world showed that it was no less given to avoidance of reality too. Foreign policy had failed the poverty of millions; political imperialism had given itself over to triumphalism; fanatical hatred made suddenly clear the terrible cost of the partial, self-congratulatory view. Mandarin encyclicals sent forth from politically-correct, or incorrect, clearing houses had nothing to do with the creation of genuine works of art; to see so many poets set up house in them was just one more sign that the poetry world was diminishing in strength, diversity and vitality despite the fact that more poetry was now published than at any time in history. But how do you employ a poet, since any reasonably-good poet is going to be a Cassandra given to psychic keening? That never went down well in the staff room.

It is easy to make accusations of parochialism and to portray the reaction I have just outlined as retroactive. But if artists are claiming to do something important and worthy of our time, it is essential that we also remain worthy of the inheritance of freedom and democracy that gave us the option to write, compose or paint—digitise, download or deconstruct—as we wished. Aesthetic freedom never meant anarchy; being an individual in art did not mean you could indulge your sensibility, because the resulting artistic ivory towers were just as certainly going to go down in flames. Fortunately, this could occur without violence (though perhaps not Stevens’ ‘violence from within that protects us from a violence without’). Despite a century of despotism from various factions in the poetry world, from which one might have thought people had learned some principles of democratisation, and putting all the stylish folderol of HTML to one side, we were just getting the old Stalinist power plays all over again. Nothing had really changed in the minds of these people; they were dully intent on repeating their one-note aesthetic agendas. Thus when it came time to anthologise, what we usually got was a series of fetishised poems meant to underline the editor’s subjective aesthetics—we hardly ever got the best poems by the best poets. And these people could not resist utilising the means of production. You could draw a comparison between the appalling collectivist farms at the height of the Communist period in Russia and many a poetic enclave. Just as the collectivist farm failed and millions starved, so the self-enclosed poetry collective saw off any untoward intellectual or poetic disturbing element. The purifying flame of excommunication hovered in the background. The end result was always the same—death of the system and the extinguishment of its hopes. Only the seeding ground of the house of all nations could breed the soil from which a civilised sensibility could emerge. Strangely enough, it was the Russian poets who seemed to get beyond the usual politics. What a roll-call of talent and individuality they managed amongst all the turbulence.

When considering the history of poetry during the twentieth century, it looked increasingly clear that the poets who mattered enough to become part of the culture that was going to last were going to be those instinctive poets who wrote because they had to, not spruik lines at the behest of a grant. There was Auden’s cosmopolitan insouciance, Frost’s dark pastoral, Stevens’ marmoreal aesthetic grandeur. Limestone cliffs, glittering birches, dazzling Key West reefs: the aesthetic was personal, political. It never made the mistake of romanticising itself through adoption of theory or of using language as a game, because poets of their stature knew that art was far too serious not to take language seriously too. Though clusters of theory gathered around their poetry, they had no need of it. If it was merely funny to hear a teenager refer to the ‘genius’ of the latest ersatz pop star, it was truly terrifying to read the German composer Stockhausen referring to the September 11 events as a ‘größte Kunstwerk’. Here was the aesthetic response gone completely awry. A century of aestheticising and ironising experience had reached beyond the protecting field of common sense.

[Part 2 of this essay continues here.]

                                                                       *

                   A Definition

It lives
When the gold myrtle wreath
From an uncertain tomb
Is put on display under glass,
Is sex and death
In each of their various fashions,
Tastes of salt,
Smells of hot bitumen
Or a handful of crushed leaves.
It rids the boredom of known stuff
And gossip that doesn’t amaze
In a shiver scalping our skin.
It can’t be polite—
Mucus, scar tissue, fluids
Best not mentioned
Rush to its page
That we sometimes write,
Sometimes sleep with,
Sometimes kill with.
Our depression won’t exhaust it.
Think of a cleaver stuck in your thigh,
Skin made mortal,
Or the crimp on the face
When we stand on the edge of large things—
A hard birth, the end of the affair,
That loved thing whose name makes us sweat.
It isn’t money,
Though money might buy
Something of it
(Cézannes on the wall,
The rights to Fellini’s next film).
It might come
Just as you’ve ironed the ninth shirt
And feel like throwing the kid
Who hasn’t shut up for three hours
Out the window along with the bills
(But the child
Is made wholly of this thing—
It can shred as years intervene).
Then, for each expert
Who sets down its plan,
The real thing goes off at tangents.
It won’t fit in troughs,
Glinting, flittering over books,
Breaking Olympic records.
Try to put a sack over it,
Hold it under water—just like Johnny,
It’ll be back, grinning.
So, whatever you might think
About its demise, it will be around,
The warmth behind our monotony,
That passion in the slipstream,
For it lives and keeps on:
That’s what poetry is.

Written 1989. Published 1997 in A Dwelling Place, 20–21.

               

Monday, September 19, 2005

Selected Minor Works: Replacing William Safire

Justin E. H. Smith

William Safire’s recent retreat to half-time duties at the New York Times may no doubt be taken as an indication that he is not long for this world. I confess I cannot help but fantasize about the position this will open up, not of course that of right-wing bloviator at the heart of the liberal media establishment, but that of our nation’s leading language maven. Give his op-ed column to some cocky veteran of the Harvard Crimson. I want ‘On Language’.

This title, under Safire’s reign, has been something of a misnomer. He purports to write on language, but for the most part writes on a particular language. The language he writes on is also the language he writes in, and it is, to be sure, a historically significant and widely spoken one. But language itself is one thing, languages are quite another. Safire would know this if he were willing to venture out a bit and consider a language, such as French, that captures, in distinct terms, the distinct concepts of language per se, on the one hand, and this or that language on the other.

Even if he were to concede that it’s not langage but this or that langue
that interests him, surely there are others besides English that would
warrant attention.  The Uralic family, for example, consisting of the Finno-
Ugric and Samoyed branches, has some interesting features.  Yurak, one of
its lesser children, has ten distinct moods for its verbs: indicative,
narrative, potential, auditive, subjunctive, imperative, optative,
precative, obligative, and interrogative.  Yurak’s cousin Selkup attaches
conjugational suffixes to verbs to express different modes of action,
including the continuative suffix, the breviative, the frequentative, the
plurative, and the usitative.

It’s just a hunch, but I’m pretty sure Safire wouldn’t have a thing to say
about the usitative suffix.  And yet this is assuredly a bit of language,
employed competently by hunter-gatherers out in the tundra, and described
beautifully, with breathtakingly foreign extracts of written Selkup too
dense with diacritical marks to reproduce here, in Björn Collinder’s
magisterial Survey of the Uralic Languages (Stockholm, 1957).

But let us return to the Indo-European family.  If I were allowed to write
‘On Language’, I would devote much space to negation and to definite articles,
drawing rich examples for comparison from the Slavic, Romance, and Germanic
branches of this distinguished dynasty.

I would meditate on a curious parallel between the French split negation,
“ne… pas” or “ne… rien”, and a certain vulgar means of denying in
English.  Consider the French for “I saw nothing”:  “Je n’ai rien vu.”
Consider, now, the structural similarity to this of the colloquial “I didn’t
see shit.”  I have no developed theory to offer, but it seems to me that
this counts as a split negation in English, and that ‘shit’ is doing exactly
the same work as the French ‘rien’.  That shit and nothing are substitutable
is a fact perhaps of interest to psychoanalysts as well as linguists.  Here
I’m only pointing it out.

I have more developed ideas about definite articles.  One thing that has
long troubled me is the existence of languages, such as Russian and Latin,
that can do entirely without them. I have seen some of Bertrand Russell’s
work on definite descriptions translated into Russian, and there the
translator was forced to simply retain the English article.  But one wonders
if the problem that concerned Russell would have come up at all if he had
been a monolingual Russophone.

The absence of ‘the’ in Russian recently troubled me greatly as I struggled
to translate Aleksandr Blok’s melancholy and Nietzschean poem about
Leningrad, the one that begins “Noch’, ulitsa, fonar’, apteka.”  Is he
writing about a night, a street, a lamp, and a pharmacy, or the night, the
street, the lamp, and the pharmacy?  Can this question even be answered?

In order to preserve the original Russian’s meter, I decided to leave out
the definite articles in the first stanza, and put them in in the repetition
of the same terms in the second stanza, thereby yielding the extra syllables
needed to make the English rendition flow.  Here is the result:

Night. Street. Lamp. Pharmacy.
Meaningless and murky light.
Live another quarter century.
It will be as now. No hope of flight.

You’ll die, you’ll begin again from the start.
Just as before, it will all repeat.
The night.  The canal’s icy ripple.
The pharmacy. The lamp. The street.

Now the question I’ve been unable to answer is whether the repeated use of
‘the’ at the end is poetic license on my part, or whether the original
Russian nouns entitled me to insert whatever articles I felt were needed,
and for whatever reason.  Again, they are there at the end, and not in the
beginning, only to preserve meter, and not because the meaning of the
Russian seems to require them more in the second stanza.  But are they truly
not there in the Russian, are they equally there and not there, or is there
simply no fact of the matter?

If Arthur O. Sulzberger is interested, I will be happy to meditate on this
further, as on related questions, in the Sunday Times.  It is much more
likely, of course, that the same young cock from the Crimson who got the op-
ed column, or perhaps his roommate, will get the language column as well,
and he will expatiate on the origins of words like ‘synergy’ and approvingly
rehash the witticisms of Winston Churchill.

Having come to terms with this harsh reality, I look forward to offering my
thoughts on language, as well as art and culture, to you, the good readers
of 3 Quarks Daily, every third Monday in my new column, ‘Selected Minor
Works’.

Grab Bag: Bite Your Tongue, Movies Turn Dumb

In “Summer Fading, Hollywood Sees Fizzle,” published in the New York Times on August 24, Sharon Waxman discussed the decline of ticket sales in the context not of a shifting economy or social landscape but of the declining quality of mainstream movies. The piece was a helpful reminder to discontented viewers that they are not alone. This may seem silly: many of us hear our friends bitching and moaning about movies all the time, but we also hear our friends bitching and moaning about the current state of the government, about obesity in the US, about cultural appropriation, about any number of liberal topics met only with clichéd observations and statements of the obvious. But while our government continues its spiral downward with increasing momentum, while obesity climbs, and while kabbalah water is sold at Wal-Mart, Waxman’s presentation of a panicked film industry provides some hope that perhaps America is finally taking a stand against the monolithic empire of Hollywood. The decline is no longer the elephant in the room but an issue to which the industry must respond, for money is the thing that whispers throughout its collective home with deadly quiet and unnerving interminability.

That a slowdown in the flow of money has worried the industry of course comes as no surprise, but what seems to be happening is that the excuses are wearing thin. No longer are studios entirely attributing dropping ticket sales to DVDs, TV, home entertainment, bad weather, good weather, higher gas prices, et cetera. Instead, the industry appears to be finally looking inward to reassess its product.

New Hollywood movies seem to suffer from several problems, some superficial and some more fundamental. The superficial problems lie in the changing conventions of the industry. While conventional formulas still shape movies, they are increasingly muddled and—oddly—simultaneously too specific. The crossbreeding of genres has spread conventions so thin that meanings can be confused or even contradictory. For example: Mr. and Mrs. Smith as an action movie sets up, plays out, and resolves glossy and unemotional violence in a typically Hollywood way, with predictable style and visual effects. As a romantic comedy, too, the film uses a well-worn and well-known vehicle: a couple living a static and cold life that is re-impassioned through some kind of hardship or trauma. The blending of these two systems, however, resulted in sloppiness all around: the movie didn’t have to have well-choreographed or stimulating action sequences because of the romantic subplot, while the romance didn’t have to be explained or even make sense because of the action.

Similarly, a movie like the Fantastic Four was enough of a comedy that its cartoonish CGI wasn’t as glaringly offensive, and yet it was still at its core an action movie, so its comic simplicity was permissible. These constant justifications leave most mainstream movies that try to blend genres more a hodge-podge than an interweaving, and we are constantly distracted from our questioning by the inclusion of more disconnected plot material.

My first and most constant complaint on leaving new blockbusters recently has been their questionable logic. I am perfectly happy watching movies that are silly, ridiculous, fantastical, and minimal as long as there is some internal logic and consistency. New movies, though, break with conventions of background information and causality in ways that fifty years ago would have bordered on the avant-garde. But in today’s movies these choices serve no end. There is no refutation or exploration of narrative convention, no tongue-in-cheek homage or implicit criticism in the oftentimes bizarre flow of plot information. Even in a pseudo-documentary like March of the Penguins, so many details of the story were left out, so many questions went unanswered, and so much of the film relied on picturesque imagery that I left the theater more confused about the subject than when I arrived. Of a documentary.

Film has successfully, since its beginnings, built a language around itself through which it expresses a kind of reality that the spectator not only observes, but engages with as well. We are asked to accept non-realistic sound and image as realistic, and we do so because we are so used to it. Editing is a typical example: while we don’t see the world through a series of edits, we never question this basic mechanism when watching a movie. Space is disrupted, it is extended and contracted, but we are never disoriented while watching it because we are a part of a larger cinematic reality, which we perceive differently. There are countless other examples of this same operation in film—from framing to sound use to camera angle and color manipulation—that all together build a basic vocabulary of the medium.

Genre takes the notion of vocabulary and hones it so precisely that it eventually functions as an equation into which each film provides variables, essentially introducing to the vocabulary a syntax that organizes the smaller formula-parts. Within this tight-knit code meaning is created through small changes. Take a standard horror plot but instead of the blob make the bad guy a space-monster, and suddenly the allegory shifts from fear of consumption to one of xenophobia. This same plug-in method exists in most genres, for example the feminist—albeit old-fashioned—western 40 Guns or the homosexual twist of the melodrama Far From Heaven, both of which heavily rely on the spectator to know the formula and thus understand the significant changes.

Over the course of the last century, then, the silver medium has developed specific genres and specific stylistic conventions that have, through their repetition, crafted the ideal audience. New blockbusters, though, are losing this consistency by changing this vocabulary, partially by tying it more closely to television. The difference between the two can be difficult to isolate, but it certainly has something to do with the cadences of dialogue and comic delivery, the “situational” humor. Likewise, though, TV has become more cinematic, as seen especially in HBO series such as the Sopranos and Entourage. Whatever the difference, going to the movies has begun to feel—to those who have experienced it—like watching television for the first time in months and months: you are utterly disoriented in a world that you know you should “get,” in which every allusion and every reference soars above your head. Unlike television, however, film is not serial (sequels excluded), and the same modes of storytelling cannot be adapted to both.

Ultimately, it seems as though the language of cinema has finally begun to get ahead of itself, that the very formulas it established have been abstracted to a point so basic that the spectator is either bored by it or, in my case, dumbfounded by it. Film has almost followed the same, albeit more condensed, trajectory as the other visual arts: from a period of documentary realism to increased experimentation and, ultimately, a visual language built upon a foundation of conventional symbols. This language, however, has begun to grow incomprehensible to the very audience whose continued acceptance of it allowed its conception in the first place. The industry then has to take a more theoretical approach to the problem and figure out a way to reconnect to the audience not through the stories it tells but through the way in which it expresses those stories. Generic and visual conventions need to return to an earlier state in which their very clarity generated interest, in which stories were easier to understand and in which the language was not always foreign.

Monday Musing: General Relativity, Very Plainly

[NOTE: Since I wrote and published this essay last night, I have received a private email from Sean Carroll, who is the author of an excellent book on general relativity, as well as a comment on this post from Daryl McCullough, both pointing out the same error I made: I had said, as do many physics textbooks, that special relativity applies only to unaccelerated inertial frames, while general relativity applies to accelerated frames as well. This is not really true, and I am very grateful to both of them for pointing this out. With his permission, I have added Sean’s email to me as a comment to this post, and I have corrected the error by removing the offending sentences.]

In June of this year, to commemorate the 100th anniversary of the publication of Einstein’s original paper on special relativity, I wrote a Monday Musing column in which I attempted to explain some of the more salient aspects of that theory. In a comment on that post, Andrew wrote: “I loved the explanation. I hope you don’t wait until the anniversary of general relativity to write a short essay that will plainly explain that theory.” Thanks, Andrew. The rest of you must now pay the price for Andrew’s flattery: I will attempt a brief, intuitive explanation of some of the well-known results of general relativity today. Before I do that, however, a caveat: the mathematics of general relativity is very advanced and well beyond my own rather basic knowledge. Indeed, Einstein himself needed help from professional mathematicians in formulating some of it, and well after general relativity was published (in 1915) some of the greatest mathematicians of the twentieth century (such as Kurt Gödel) continued to work on its mathematics, clarifying and providing stronger foundations for it. What this means is, my explication here will essentially not be mathematical, which it was in the case of special relativity. Instead, I want to use some of the concepts I introduced in explaining special relativity, and extend some of the intuitions gathered there, just as Einstein himself did in coming up with the general theory. Though my aims are more modest this time, I strongly urge you to read and understand the column on special relativity before you read the rest of this column. The SR column can be found here.

Before anything else, I would like to just make clear some basics like what acceleration is: it is a change in velocity. What is velocity? Velocity is a vector, which means that it is a quantity that has a direction associated with it. The other thing (besides direction) that specifies a velocity is speed. I hope we all know what speed is. So, there are two ways that the velocity of an object can change: 1) change in the object’s speed, and 2) change in the object’s direction of motion. These are the two ways that an object can accelerate. (In math, deceleration is just negative acceleration.) This means that an object whose speed is increasing or decreasing is said to be accelerating, but so is an object traveling in a circle with constant speed, for example, because its direction (the other aspect of velocity) is changing at any given instant.
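
To make the circular-motion case concrete, here is a small numerical sketch in Python (the radius and speed are invented values, used only for illustration): the speed stays fixed while the velocity vector keeps changing direction.

    # Illustrative sketch only; the radius and speed are assumed values.
    import math

    r = 10.0            # radius of the circular path, in metres (assumed)
    speed = 5.0         # constant speed, in metres per second (assumed)
    omega = speed / r   # angular rate, in radians per second

    def velocity(t):
        # velocity vector of a point moving counterclockwise around the circle
        return (-speed * math.sin(omega * t), speed * math.cos(omega * t))

    v_now, v_later = velocity(0.0), velocity(1.0)
    print(math.hypot(*v_now), math.hypot(*v_later))  # both print 5.0: the speed never changes
    print(v_now, v_later)   # but the direction has changed, so the object has accelerated
    # The size of that acceleration is speed**2 / r = 2.5 m/s^2, directed toward the centre.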

Get ready because I’m just going to give it to you straight: the fundamental insight of GR is that acceleration is indistinguishable from gravity. (Technically, this is only true locally, as physicists would say, but we won’t get into that here.) Out of this amazing notion come various aspects of GR that most of us have probably heard about: that gravity bends light; that the stronger gravity is, the more time slows down; that space is curved. The rest of this essay will give somewhat simplified explanations of how this is so.

THE PRINCIPLE OF EQUIVALENCE

Just as in special relativity no experiment that we could perform inside a uniformly moving spaceship (with no windows) could possibly tell us whether we were moving or at rest, in general relativity, no experiment we can possibly perform inside the spaceship can ever tell us whether we are 1) accelerating, or 2) in a gravitational field. In other words, the effects of gravity in a spaceship sitting still on the surface of the Earth are exactly the same as those of being in an accelerating spaceship far from any gravitational forces. Yet another, more technical, way of saying this would be that observations made in an accelerating reference frame are indistinguishable from observations made in a classical Newtonian gravitational field. This is the principle of equivalence, and it is the heart of general relativity. While this may seem unintuitive at first, it is not so hard to imagine and get a grip on. Look at the spaceship shown in Fig. 1 (in the next section, below) and imagine that you are standing on its floor while it is standing upright on the surface of Earth, on the launch pad at Cape Kennedy, say. You would be pressed against the floor by gravity, just as you are when standing anywhere else, like on the street. If you stood on a weighing scale, it would register your weight. Now imagine that you are in deep space in the same ship, far from any planets, stars, or other masses, so that there is no gravity acting on you or the spaceship. If the spaceship were accelerating forward (the direction which is up in Fig. 1), you would be pressed against the floor, just as when an airplane accelerates quite fast down the runway on its takeoff roll, you are pressed in the opposite direction against the back of your seat. If the acceleration were at exactly the right rate, you would be pressed against the floor of the spaceship with the same force as your weight, and at this rate of acceleration, if you stood on a weighing scale, it would again register your weight. You would be unable to tell whether you were accelerating in deep space or standing still on the surface of Earth. (You could perform all of Galileo’s experiments inside the spaceship, dropping objects, rolling them down inclined planes, etc., and they would give the same results as here on Earth.) Are you with me? What I am saying is, for a gravitational field of a given strength in a given direction (like that at Earth’s surface toward its center), there is a corresponding rate of acceleration in the opposite direction which is indistinguishable from it.
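
A minimal sketch of the weighing-scale point, with assumed numbers: the scale reads the same force in both situations, so it cannot tell them apart.

    # Illustrative sketch only; the mass and field strength are assumed values.
    mass = 70.0   # kilograms
    g = 9.81      # strength of Earth's surface gravity, in m/s^2

    reading_on_earth = mass * g     # force the scale supplies while you stand still on Earth
    reading_in_space = mass * 9.81  # force it supplies while the ship accelerates at 9.81 m/s^2
    print(reading_on_earth, reading_in_space)  # 686.7 N in both cases: no scale experiment distinguishes them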

I am afraid of losing some people here, so let me pump your intuition with a few examples. Have you ever been on one of those rides in an amusement park (the one I went to was called the Devil’s Hole) where you stand in a circular room against the wall, then after the room starts spinning quite rapidly, you are pressed strongly against the wall and then the floor drops away? It can be scary, but is safe because you are accelerating (moving in a circle) and this presses you to the wall just as gravity would if you turned the whole circular room on its side (like a Ferris wheel) and lay on the side of it which is touching the ground. Most gravity defying stunts, like motorcyclists riding inside a wire cage in the shape of a sphere, rely on the effects of acceleration to cancel gravity. You’ve probably seen astronauts on TV training for weightless environments inside aircraft where they are floating about. This also exploits the principle of equivalence: if the plane accelerates downwards at the same rate as a freely falling object would, this will produce what could be described as an upward gravitational force inside the plane, and this cancels gravity. Of course, from an outside perspective, looking through the plane windows, it just seems that the plane and the people in it are both falling at the same rate, which is why they seem to be floating inside it. But inside the plane, if you have no windows, there is no way to tell whether you are far away from any gravitational field, or simply accelerating in its direction. All this really should become quite clear if you think about it for a bit. Reread the last couple of paragraphs if you have to.

BENDING OF LIGHT BY GRAVITY

Consider the leftmost three drawings of the spaceship in Fig. 1. They show the spaceship accelerating upward. Remember, this does not mean that it is moving upward with a steady speed. It means that it is getting faster and faster each instant. In other words, its speed is increasing. Now we have an object, say a ball, which is moving at a steady (fixed) speed across the path of the spaceship from left to right in a straight-line path perpendicular to the direction the spaceship is moving and accelerating in (up). Suppose, further, that there is a little hole in the spaceship just where the ball would strike the exterior left wall of the spaceship, which allows the ball to enter the spaceship without ever touching any part of it. Imagine that the spaceship is made of glass and is transparent, so you can see what happens inside. If you are standing outside the spaceship, what you will see is what is shown in the leftmost three drawings of the spaceship in Fig. 1, i.e., the ball will continue in a straight line on its previous path (shown in the figure as a dotted line), while the spaceship accelerates up around it (while the ball is inside the ship). Here’s the weird part: now imagine yourself standing still on the floor of the spaceship as it accelerates upward. You experience gravity which presses you to the floor, as described above. Now, you see the ball enter from the window in the left wall, and what you see is that it follows a parabolic arc down and hits the opposite wall much lower than the height at which it entered (shown in the rightmost drawing of Fig. 1) just as it would because of gravity if the spaceship were standing on the launchpad at Cape Kennedy and someone threw a ball in horizontally through the window. Do you see? One man’s acceleration is another man’s gravity!

You can probably guess what’s coming next: now imagine the ball is replaced with a ray of light. Exactly the same thing will happen to it. The light will follow a parabolic arc downward and hit the opposite wall below the height at which it entered the spaceship, when seen by the person inside. The reason that you normally don’t see light bending in any spaceships is that light travels so fast. In the billionths of a second that light takes to get from one wall to the other, the spaceship doesn’t move up much (maybe billionths of an inch) because it is moving much slower than light moves. This small a deflection is impossible to measure. (This is just as you don’t see a bullet fired horizontally bending down much over a short distance, even though it is following a downward parabolic path to the ground. And light is a lot faster than bullets.) This bending of light must be true as long as we assume the principle of equivalence to be true, because if it weren’t, we could then perform optical experiments on the ship to decide whether we are in an accelerating frame or a gravitational field. This is forbidden by the principle of equivalence. And since we now see that light will bend in an accelerating spaceship (seen by someone in the ship) and since we also know that the person in the ship by definition has no way of knowing whether she is accelerating or in a gravitational field, light must also bend in a gravitational field. (Otherwise the person would know she is accelerating.) It’s really that simple!
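
To get a feel for how small the bending is over a cabin-sized distance, here is a rough back-of-the-envelope sketch; the cabin width and the acceleration are assumed values, chosen only for illustration.

    # Illustrative sketch only; the cabin width and acceleration are assumed values.
    c = 3.0e8      # speed of light, in metres per second
    g = 9.81       # acceleration of the ship (or strength of gravity), in m/s^2
    width = 10.0   # width of the cabin the light crosses, in metres

    t = width / c            # time the light spends crossing the cabin: about 33 nanoseconds
    drop = 0.5 * g * t**2    # how far the beam appears to fall in that time
    print(drop)              # roughly 5e-15 metres, far too small to notice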

The most famous experiment, which confirmed the correctness of GR and made Einstein world famous overnight, was the observation of the bending of starlight by the Sun’s gravity in 1919, which I mentioned briefly in my June SR column. Also, in case you are wondering why light is bent by gravity even though photons have no mass at rest, it is because light is a form of energy, and as we know from special relativity, energy is equivalent to inertial mass according to E = mc². All energy gravitates.

GRAVITATIONAL TIME DILATION

This time, let’s consider what happens with a rotational motion. Look at Fig. 2. It shows a huge disk. Imagine that we put two clocks on the disk: one at the center at point A, and one at the edge at point B. Also put a clock at point C, which is on still ground some distance from the disk. Now imagine that the disk starts rotating very fast as shown by the arrow. Now we know that the clocks at points A and C are not moving with respect to each other, so they will read the same time. But we also know that the clock at B is moving with respect to the ground, and by the principles of special relativity must be running slower than C. And since C must be running at the same rate as A (they are not in motion relative to one another), B must also be slower than A. This will be true for an observer at C on the ground as well as at A on the disk, but their interpretations of why the clock on the edge at B is slower will be different: for the ground observer at C, the clock at B is in motion, which is what slows it down. For the observer at A, however, there is no motion, only a centripetal acceleration toward the center of the disk, and it is this acceleration which accounts for the slowing down of the clock. The further A moves toward the edge, the stronger the centrifugal force (and the centripetal acceleration), and the slower the clock he carries runs. Since acceleration is indistinguishable from gravity (A has no idea whether he experiences a force toward the outside of the disk because the disk is rotating, or whether the disk is still and he is in a gravitational field), clocks must also slow down in gravitational fields. This slowing down of time by gravity has been confirmed by experiments to a very high precision.
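
Here is a small sketch of how much the clock at B lags, using only the special-relativity factor already mentioned; the radius and spin rate are invented, and deliberately extreme, so that the effect shows up at all.

    # Illustrative sketch only; the radius and spin rate are assumed (and extreme) values.
    import math

    c = 3.0e8                  # speed of light, in metres per second
    radius = 100.0             # distance of clock B from the centre, in metres
    revs_per_second = 1000.0   # how fast the disk spins

    v = 2 * math.pi * radius * revs_per_second   # speed of clock B, about 6.3e5 m/s
    rate = math.sqrt(1 - (v / c) ** 2)           # rate of clock B relative to clocks A and C
    print(rate)   # about 0.9999978: B loses roughly two microseconds every second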

THE CURVATURE OF SPACE

We just looked at time. Let’s see what happens with space. Take the same disk from Fig. 2 and replace clock B with a ruler. Place the ruler at B so that it is tangent to the disk. For the same special relativistic reasons that the clock at B will run slower, the ruler at B will be contracted in length. Use another ruler to measure the radius from A to B. Since the rotational motion on the disk is always perpendicular to the radius, this will be unaffected by motion. Since only the ruler at B is affected, if that ruler is used to measure the circumference of the disk, the ratio of the measured circumference to the measured diameter will not be Pi (3.1415926…), but a larger number (the contracted ruler must be laid down more times to get all the way around the rim), with the excess depending on the rate of rotation of the disk. This is a property and an indication of a curved surface. But the rotation (accelerated motion) is equivalent to a gravitational field as we have already seen, so we can say that gravity causes space to become curved.
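
A tiny sketch of that measurement, again with invented numbers and a deliberately extreme rim speed: the rim ruler is contracted, so more ruler-lengths fit around the circumference, while the radial ruler is untouched.

    # Illustrative sketch only; the radius and rim speed are assumed values.
    import math

    c = 3.0e8        # speed of light, in metres per second
    radius = 100.0   # radius of the disk, in metres
    v = 0.5 * c      # speed of the rim (deliberately extreme)

    contraction = math.sqrt(1 - (v / c) ** 2)                     # about 0.866
    measured_circumference = 2 * math.pi * radius / contraction   # more contracted rulers fit around the rim
    measured_diameter = 2 * radius                                # radial rulers move sideways, so no contraction
    print(measured_circumference / measured_diameter)             # about 3.63, noticeably larger than pi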

There is much, much, much more to this grand theory, and I have but drawn a crude cartoon of it here in the small hope that I might impart a bit of its flavor, and indicate the direction in which Einstein moved after publishing special relativity in 1905. Andrew, this is the best I can do.

Thanks to Margit Oberrauch for doing the illustrations.

Have a good week!

My other recent Monday Musings:
Regarding Regret
Three Dreams, Three Athletes
Rocket Man
Francis Crick’s Beautiful Mistake
The Man With Qualities
Special Relativity Turns 100
Vladimir Nabokov, Lepidopterist
Stevinus, Galileo, and Thought Experiments
Cake Theory and Sri Lanka’s President

Monday, September 12, 2005

Planks from the Lumberyard: Bathroom Pastoralism, or, The Anecdote of the Can

First, a note to the reader about wood. “Lumber,” a word that we now associate with the Home Depot and deforestation, once denoted the contents or printed products of the mind, which, in turn, was sometimes known as the “lumber-room” (see, for instance, page 54 of Tristram Shandy). The title of my column, then, is meant to serve as a modest attempt to resuscitate the lost sense of this sadly degraded word, and to suggest something of the ungainly mental labor required of me to salvage and sculpt the unhewn thought-timbers piled up in my mind’s lumberyard.

Not so very long ago I had occasion to spend an afternoon sipping from the green mouths of a series of Rolling Rocks at a party in Williamsburg, amongst a congeries of artists, writers, poets, and other plucky, earnest (and unemployed?) persons. Knowing no one but the cousin who brought and promptly abandoned me, I stayed close to the walls, sipping beer and hovering at the periphery of several groups engaged in various and strange conversations about lives, friends, and relationships about which I knew nothing. In spite of the pleasures of drinking free beer in the early afternoon, I felt little connection to or interest in the people or the talk until one of the conversations turned to the old typewriter one of the hipsters had famously placed on a milk crate before the commode in the bathroom of her apartment. Recorded on the scroll of that writing machine tucked away in that most private of spheres was a long, peculiarly thoughtful, and digressive conversation, perpetuated and sustained by the excretory ruminations of the apartment’s occupants and anyone else who might have spent time asquat before the typewriter in their ceramic and tile salon.

The unexpected beauty and pertinence of this image, so Jack Kerouac-y in its way, has stayed with me, and I’ve often thought about what it is that makes it so compelling to me, this woman’s transformation of her bathroom into a kind of ad hoc public sphere, a place where ideas are expressed, exchanged, and contested. In part, I think, it has to do with the successful integration of two sets of seemingly opposed desires and spaces that are rarely brought together harmoniously: the desire to express oneself publicly from within the safety and quietude of the private sphere, to fuse solitude with sociability, to link the personal and the public.

The pastoral ideal has long been an important way of ordering meaning and value in American culture; the desire to move from the sophistication of our urban centers to the simplicity of country life (to “light out for the territories”) has provided a cardinal metaphor and powerful symbol for organizing the contradictions of American life. In the 1960s, literary critic Leo Marx offered his notion of the “middle landscape” to explain how the peculiarly American desire to retreat from civilization is often reconciled with the opposing desire to benefit from industrial, technological, and urban developments. The middle landscape, symbolized by his image of the “machine in the garden” (and visually encapsulated by George Inness in his 1855 painting The Lackawanna Valley), contains the possibility of balancing the opposing forces of technology and nature, sophistication and simplicity, city and country. Many have considered the emergence of suburbia as the modern manifestation of the middle landscape, which makes sense, but I’d like to propose an alternative candidate for middle landscape: the bathroom.

Although generally neglected as a fundamental space of everyday life, the bathroom, like the middle landscape, is a space where social and psychological contradictions are made most manifest. It is the scene of what Julia Kristeva called “abjection,” a site where the intricacies of contemporary plumbing technology save us from having to confront or acknowledge the material symbol of our own carnality (“This thing of darkness I acknowledge mine,” as Prospero once said in a different context); where our social and natural selves commingle, where the dream of pastoral peace, simplicity, and contentment coexists with the technological infrastructures of industrial design.

(A strong case could be made, by the way, for the bathrooms at the DIA Beacon as the nearest physically existent approximations of the Platonic ideal. Situated in hard-to-find and rarely visited nooks in the sprawling exhibition space of this lovely museum, which is itself encircled by the still-fresh green breast of the Hudson River valley, their pristine loos offer a perfect zen-like enclave for the pensive, art-addled museum-goer. Unlike, say, the cans at the Met, these are quiet, secluded, clean, and peaceful….)

The Japanese seem to understand and accept the centrality of the bathroom to psychic, social, and biological life, as evidenced by their awesomely superior toilet design. (There are, for instance, toilets equipped with delicate temperature-controlled jet sprays, glow-in-the-dark seats, and sensors that automatically open them in the presence of an approaching human subject.) The Canadian poet of the Yukon, Robert Service, author of the magnificent “Cremation of Sam McGee,” and a man well-acquainted with the exigencies of the body, is one of the few poets to have taken up the subject of the bathroom. His poem “Toilet Seats” is not particularly good, but it is mildly amusing and well worth a read as a singular instance of bathroom poetry. Aside from that, there is, to my knowledge, scant acknowledgment or representation of this most basic of mental spaces in literature. This seems to me both strange and unaccountable.

The complex contemporary social landscape, irradiated by the importunate forces of our vibrant consumer culture, exerts a thousand pressures on the individual who might wish to maintain an independent existence. The bathroom exists as a privileged site of quiet contemplation and thought, a space where much of our best thinking takes place, a space even of occasional revelation. It is with a modicum of embarrassment that I awkwardly express my own fondness for this most neglected of spaces, this bastion of mental life. So I continue to scan the literary horizon for a writer bold enough to take up this most marginalized of middle landscapes. In fact, I think I’ll grab a book from the shelf and resume my search presently….Nature calls.

Critical Digressions: Dispatch from Cambridge (or Notes on Deconstructing Chicken)

Ladies and gentlemen, boys and girls,

After sojourning in Tuscany and Karachi for the summer, we have returned to the East Coast, to Cambridge, for the fall. Upon arrival, we spent the afternoon under the pigeon-infested trees outside Au Bon Pain, leafing through the Boston Phoenix and the Weekly Dig. We overheard a woman in a white summer hat remark, “If the weather were always like this, Boston would be the most popular city in the world.” Although her premise is tenuous, on days like these, there’s a sense of occasion here, an almost pagan celebration of nature. Lucid, incandescent skies had brought the denizens of Cambridge out in their Sunday best. We observed pale, lanky-limbed academics in revealing skirts; teenage punks in torn leather and grimy boots; and fresh-off-the-boat families sporting tight pants and fanny packs, gawking at the spectacle: old men playing chess for money, bold panhandlers soliciting funds, the jazz band strumming “Take Five” in the Pit.

Although we participated in the festivity, come evening our vigor waned and we felt hungry. We realized, however, that our options were limited: Harvard Square may be a melting pot but it offers lackluster ethnic dining, whether Chinese, Indonesian, Italian, Indian or Arab. (To be fair, there are two exceptions: Smile Café’s chicken larb is excellent and the menu of the Tibetan place in Central Square features this delicious minced meat and turnip dish.) And suddenly, we felt pangs of nostalgia – nostalgia for nihari, for Karachi.

In Ha Jin’s next novel, the protagonist is a poet, a Chinese immigrant to America. In one of his poems, he writes about the handful of dirt from his backyard that he carries around with him in his portmanteau. In a way, the poem and its sentiment are a response to Cavafy’s “The City”:

You said: “I’ll go to another country, go to another shore;
Find another city better than this one.
Whatever I try to do is fated to turn out wrong
And my heart lies buried like something dead.
How long can I let my mind moulder in this place?
Wherever I turn, wherever I look, I see the black ruins of my life, here,
Where I’ve spent so many years, wasted them, destroyed them totally.”

You won’t find a new country, won’t find another shore.
The city will always pursue you.
You’ll walk the same streets, grow old
In the same neighborhoods, turn gray in these same houses.
You’ll end up in the city.
Don’t hope for things elsewhere:
There’s no ship for you, there’s no road.
Now that you’ve wasted your life here, in this small corner,
You’ve destroyed it everywhere in the world.

Home might mean a few hundred circumscribed square yards to many; dirt. It might mean a bed to others – a threadbare chair, a red wheelbarrow; it might have to do with family and nation and tradition, with shared history, collective memory; it might be an idea; or it just might be a filet. Indeed, there is substance to the adage, “You are what you eat.” We may not carry dirt around but when traveling from Pakistan, we do carry carefully wrapped cellophane packets of various powdered spices in our suitcase. Wherever we are in the world, then, we can feed and nourish our self; wherever we are in the world, we can feel at home.

In It Must’ve Been Something I Ate, Vogue’s food critic Jeffrey Steingarten maintains that “In all of Nature’s Kingdom, only mammals, female mammals, nourish their young by giving up part of their bodies. For us, food is not just dinner. Our attitude toward food mirrors our feelings about mothers and nurturing, about giving and sharing, about tradition and community…” We agree. Being Pakistani, we associate savian with Eid, korma with weddings, mangoes with summer. Furthermore, those who fancy themselves cosmopolitan, boulevardiers, men of the world, associate dining with culture, even civilization. They have sushi at Nobu, truffles at Da Silvano, lamb chops at the Gramercy Tavern.

Simply put, food defines us as we define food. The Guardian’s Lisa Hamilton avers, “Frankly, I’ve never had good sex with a vegetarian. I like men who eat properly, who like their steak bloody, their eggs Benedict runny. Fastidiousness is as unappealing in the kitchen as it is in the bedroom; there’s something emasculated about a man who lets himself be faced down by escargot. Logically, someone as obsessed by the food/sex correlation as I am would select lovers accordingly; but as with crème brûlée, I never quite had the discipline to resist what I knew would turn out badly (hence the vegetarian. He had little round glasses and did yoga. Really.) However, experience did prove that whether or not a man knows his artichoke from his elbow, when it comes to cooking, if not to sex, the clichés of national stereotypes hold true.” We’re not sure if Ms. Hamilton ever got it on with a Pakistani. Rest assured, we are carnivores. We make meat.

The following is a proprietary recipe for a dish we call (and have presently coined) Killer Karahi Masala:

Ingredients (and other materials)
1 chicken (or a packet of drumsticks and filets)
2 large onions, chopped
1/3 cup of vegetable oil
10 dried red chili peppers
1 teaspoon of salt
1 teaspoon of red chili powder
1 teaspoon of garam masala (available at any Pakistani grocery store)
1 teaspoon of coriander powder
2 large tomatoes, diced
1/2 teaspoon of garlic paste
1/2 teaspoon of ginger paste
1 clove of garlic, chopped
1 thing of ginger, chopped
One of those plastic lemon things with lemon juice inside it
1 Corona
1 Dunhill
“Carlito’s Way” Soundtrack (not the original score)

Instructions:
Close your eyes. Summon primal hunger. (You cook better when hungry.) Play the first track on the CD, Rozalla’s “I Love Music.” Pour oil into a casserole with the chopped onions and dried red chili peppers and turn up the heat. Strip and wash the chicken. When the onions become translucent, add the chicken. (Wash your hands.) This should be around the time of KC and the Sunshine Band’s “That’s the Way I Like It.” Add salt, red chili powder, coriander powder, garam masala, garlic and ginger paste. Throw in a diced tomato. Stir together and cook for half an hour on medium heat. Keep stirring. Then add the chopped garlic and ginger. Have the Dunhill, drink the Corona; celebrate, you’re almost done. Fifteen minutes later, add the second diced tomato and squeeze the lemon thing over the dish as a sort of garnish. Serve hot (with tortillas as chapati proxies). “You Are So Beautiful” should be winding down in the background.

In our depleted state, however, we couldn’t venture to Broadway Market for groceries. We didn’t have it in us to make Killer Karahi Masala, or even a runny eggs Benedict. We somnambulated to Pinocchio’s for a steak-and-cheese and then, in this small corner, slept, full but incomplete.

Other Critical Digressions:
Gangbanging and Notions of the Self
The Media Generation and Nazia Hassan

The Naipaulian Imperative and the Phenomenon of the Post-National
Dispatch from Karachi
Live 8 at Sandspit
Chianti and History

Lives of the Cannibals: The Spell of the Sexual

They say that spring is the season of love, and they may be right. April’s days may be soaked in the hormones and pheromones of a renewed reproductive cycle, and in this we are undoubtedly biology’s sensual puppets. But let us firmly agree that we will have no truck with biology here. On the streets of New York, biology is perhaps useful as metaphor, but quite beside the point and ridiculously unsophisticated for our purposes. So we will dispense with eggheaded myopia, and in that spirit we will also reject the modern world’s most egregious conceit–love. This cloying trope may be appropriate for the foil covers of paperback novels and the illustrated pages of children’s books, but for a teeming metropolis it is a quaint notion, a rumor to sustain the lonely and unattractive.

They say that spring is the season of love, and they may be right, but on the streets of New York, spring is summer’s ragged doormat and love is a faintly Midwestern excuse for the gluttonous satisfaction of lust. 

Cruelty
Suggested by such details as the curl of a lip and the severe line of a jaw, cruelty is at the heart of sex, though you’d never know it to hear people talk. That cruelty is inextricably woven into the sexual experience disturbs many, largely because it doesn’t jibe with the civilized construct of love. But of course the mechanics of sex demand a certain precise violence, and often the quality of our sexual experience is indicated by kabuki-like displays of pain and heartlessness. Despotic New York is, in its concrete miles and Darwinian efficiency, cruel’s storied hometown. Just watch the summertime girls on Wall Street and in Times Square as they distribute their weight on nail-thin four-inch heels, and applaud the steel in the eyes (and the pants) of the conquering businessman, whose rigorous assessment of the physical merits of these very same girls is a feat of conscienceless objectification unmatched since stalwart American traders arrived on Africa’s Gold Coast. Even the City’s visual parameters signal cruelty’s central role: Who will be kind in a city without a sky? It speaks to New York’s defiant self-sufficiency–to see nothing but what we made with our own hands, to observe the struggle that plays out daily in the rippling heat of our walled streets. It prepares us to fuck.

Flesh & Heat
For a city where sunbathing is essentially limited to fire escapes and rooftops and the odd patch of park, skin is surprisingly ubiquitous. In spite of or in concert with the pronouncements of New York’s own fashion vanguard, flesh is the order of the summer season, and, for better or worse, there is a democratic quality to this ritual of exposure. Imperfections of the body are courageously displayed, plumped and framed in elastic and cotton, and to hell with sensible shame for the homely. Our women fear nothing, and no matter your wishes you will witness the shiny rhythms of their flesh. Many visitors make the mistake of assuming that New York’s men will not bare their chests, that urban settings don’t lend themselves to traditional demonstrations of virility. These naifs are shocked in high summer, when the articulated pecs and corrugating abs hit Seventh Avenue in force. (That a substantial number of these men are gay is of little moment here, and only serves as further testament to the city’s simmering carnality.)

Density
There are no secrets in this city, no privacy in the stacked lives of our high-rises and tenements, where we gain sonic (if not visual) access to the perversions and loneliness of our neighbors. New York can reasonably claim the concentrated libidinal force of six million–a deeply conservative estimate, given the total commuting population of the metropolitan area, the relatively low birth rate, and our seniors’ migratory tendencies. This vast sensual endowment transforms the city’s public space into a hothouse for various strains of latent sex. On subway platforms, where skin candies in seconds, we pace and seethe, preparing to press our glistening flesh against the glistening flesh of strangers. We are accustomed to the tight quarters of this vaginal system, which provides us a durable metaphor for the sublimation of violent desire. Although the train’s arrival is a small ecstasy, discreet relief from the pressure of restraint, it’s not nearly enough. Anyway, etiquette is a precious gloss here, an exotic curiosity for the foyers of Park Avenue apartments. By the humid height of July we have all but forgotten our modesty, and no longer do we draw the traditional distinctions between bodies. Instead, the perceived intrusions of eyes and limbs subside, and we become a single damp mass. Millions writhing as one.

Ambition
Gassy fumes of ambition stifle the breath of this city. Without some method of burn-off, some practice to spend its propulsive energy, New York would unmoor itself from the continent and take flight like a lost balloon. Thankfully, the city’s sexual black hole siphons ambition from our lungs daily, granting us peace enough to sleep some nights. The same vigor with which we pursue innovation and growth is just as easily blown on the swollen implication of sex, and so in the summertime, on radiant streets and when business is slow, ambition finds its satisfaction in exhibitionistic displays of power and availability. Mincing and posing, we pout for one another, we straighten our backs and expand our chests, and the sport becomes an end in itself. Ambition’s casualties–those who are stepped on and surpassed in their attempts to live up to this vertical city–find relief on the same streets, obliterating their professional disappointment with the easy dominance of sexual reduction.

Finally, it must be said that the spell of the sexual finds fuel in death. Confronted with incomprehensible violence, we revert to our simpler selves, manufacturing comfort from the appeasement of our bodies’ appetites. There is nothing wrong with this. But we would be wise to monitor our devolution closely. One day we may find ourselves so taken with the reductive ecstasy of the sensual that we have forgotten the creative dynamism it displaced. Considering New York’s global stature, this would be a tragic loss, even for those insensible to the city’s genius. This tiny patch of islands and shorelines on the East Coast of the United States is a vital asset, worth far more than the product of its sensual economy. It would be a terrible shame to suffocate under the oppressive cover of our lavish fantasy life.

Lives of the Cannibals: Rage