Vollmann, Crane, and Adventure Journalism

The only real surprise about William T. Vollmann winning the 2005 National Book Award for Europe Central was that merit was rewarded. In literature as in life this is not always the case. I have been reading Vollmann since my college days in the mid-1990s, when a love affair with a Canadian caused me to pick up Fathers and Crows, the second of the Seven Dreams series, about Jesuits in Quebec. (This might be mere sentimentality, but I still believe it’s his best book.) The Vollmann award means that serious novels are still being taken seriously, despite Norman Mailer’s comments to the contrary at the ceremony, during his depressing Lifetime Achievement speech. (“It’s a shame in the literary world today that passion has withered, producing fiction that is all too forgettable,” said Mailer. “I’m watching the disappearance of my trade. The serious novel may be in serious decline.”) Does Vollmann publish too much? I leave the question open – it’s not rhetorical. Vollmann’s style is perhaps overly mannered and has not developed much over the years (he started in the stratosphere but has stayed at the same relative altitude), although in his best writing the mannerism works to his advantage. But his seemingly monomaniacal prolixity is more likely a sign of compulsive brilliance than anything else, so the complaint is almost meaningless – roughly the same could be said of Dickens, for example. This is genius in more than one sense: you get the feeling Vollmann has an actual daemon sitting on his shoulder dictating book after book.

The writer that Vollmann brings to mind most strongly is not Dickens, however, but Stephen Crane. At first this may seem like an odd comparison, given that Crane’s devotion to literary realism is very far from being Vollmann’s first priority. Like Crane, Vollmann writes both adventure journalism and novels. Like Crane, Vollmann is drawn to wars and conflict zones. Vollmann’s series of books about prostitution surely has a classic literary source in Crane’s Maggie: A Girl of the Streets. Like The Red Badge of Courage, Vollmann’s historical novels are strongly flavored with reality and research. Vollmann’s tremendous output matches or perhaps even exceeds that of the famously productive Crane, who by the time he died at age twenty-eight had already published several novels, a multitude of short stories and poems, as well as an immense body of journalism. (The authoritative edition of Crane’s work, published by the University of Virginia, apparently runs to ten volumes.) It’s almost as if Crane knew that time would be short, a sense you get reading Vollmann as well, who, you sometimes feel, has lived longer than he thought he might. Even Vollmann’s short chapters, with their antiquated newspaper-dispatch style headings, call to mind works like “Stephen Crane’s Own Story” (1897).

Crane made his name as a war correspondent, covering, for example, the sinking of the Commodore, a ship laden with arms bound for Cuba. This happened in 1897, just prior to the sinking of the Maine and the entry of America into the Spanish-American War. Journalists were more than observers in the conflict. The representation of the coverage in Citizen Kane isn’t far off the mark regarding the pro-war Yellow Journalism of the Hearst papers of the day. Phyllis Frus and Stanley Corkin, the editors of the excellent Riverside Crane volume, write that

The existence of newsreels, filmed reproductions of events, and even enactments that were clearly remote from the action in Cuba provided people in the United States with images of warfare that made it a kind of spectator sport in which most viewers had a clear rooting interest. With his writing, Crane helped create the new public sphere, and as a celebrity journalist, he participated in it.

Iraq was not the first time that reporters were embedded, and the problems of bias embedding creates are nothing new. Crane reported under fire with the marines direct from a very different Guantanamo. Vollmann’s An Afghanistan Picture Show, published in 1992, ten years after he flung himself into the middle of the struggle against the Soviets, has the self-mocking subtitle “How I Saved the World.” In it, Vollmann dissects himself as much as the conflict, creating a ruthless (and very timely) examination of the entire concept of American altruism when it is combined with an emphasis on military solutions. (Not everything Vollmann wrote about Afghanistan was perfect – when the New Yorker sent him back to check up on the country during the 1990s, Vollmann was at times too soft on the Taliban, acknowledging their crimes but passing along received ideas about how they had brought stability to the country.)

Here’s the problem with adventure journalism more generally: it’s not written by experts or beat reporters, and therefore only infrequently rises above the usual combination of local color, exoticism, florid prose, and received opinion from back home. (Good adventure writers, among whom I count friends as well as some of our best writers, are to be admired all the more for rising above this level.) The adventure writer is essentially a proxy for the reader, an American dropped into a strange – and, ideally, somewhat dangerously atmospheric, hopefully more atmospheric than dangerous – locale. It’s understandable, but no less peculiar, that we would rather read what American magazine writers think about the Taliban, for example, than someone like, say, Ahmed Rashid, a Pakistani who writes in English and has covered the region’s politics for something close to two decades. But the adventure journalist gives us something we desperately need – an exotic fix.

This thirst for far-flung locations and the current craze for dispatches are surely not bad in themselves; I tend to enjoy the genre, as a kind of literary stamp collecting. It’s probably also another “since September 11” type of trend, and hopefully a sign of renewed American interest in the outside world. The only downside is that the entire genre excludes those legions of literary types who are retiring homebodies and prefer to stay in bed all day crafting sentences. Not everyone should be forced to be a reporter: that’s my thesis. Writing shouldn’t be a form of reality TV in which one auditions for a part in the national conversation by exposing oneself to mud and murder.

I think it was Schopenhauer who once wrote that there are two kinds of good books: those which introduce the reader to an experience they couldn’t have themselves, and those which use language in a remarkable way. Probably all good writing combines something of both, but the rise of adventure journalism involves a lopsided emphasis on one aspect over the other. It also represents another chapter in the American tradition of anti-intellectualism, for it is against “thought” and for “experience.” Our magazines are full of direct experience – like the kind that comes mediated through a translator on a two-week junket. Of course, the best writers in this field manage to combine thought and observation in a kind of genre-bending tag-team wrestle and, in doing so, are creating a fine new genre – don’t get me wrong. Vollmann is an ideal example.

Even though much of Crane’s and Vollmann’s fiction is based on research, interviews, and reportage, it is the more enduring stuff. “The Open Boat” is a classic, whereas “Stephen Crane’s Own Story” is more ephemeral. Maggie was based on Crane’s real experiences with prostitutes, but the imaginative work outlives the adventurism, just as Vollmann’s The Royal Family feels superior to his Butterfly Stories. But it is the historical fiction of both writers – The Red Badge of Courage and Europe Central, respectively (plus, for my money, Fathers and Crows) – that critics have celebrated as their greatest accomplishments. Historical fiction might be the least fictional of fictions, the most closely related to facts, a genre that works actuality directly into the alchemy as a magical element. But these two novels have more in common than an obsession with, or addiction to, the atmosphere of violent conflict. They are both novels documenting real events that their authors never could have experienced. It’s almost as if their thirst for experience was so overwhelming that, when they ran out of the reality directly available to them, they had to fabricate other worlds to inhabit as well.



Talking Pints: Happy Birthday, Political Science

Towards the end of this year, The American Political Science Review will publish its 100th anniversary issue. While researching a submission to this centennial issue, I examined what political scientists have been saying for the past 100 years, and in doing so something very odd struck me: the arguments that I have been having for a decade with my colleagues about whether a science of politics is at all possible are the same arguments that have been going on in the pages of The American Political Science Review since its inception.

Then and now, political scientists tend to fall into two camps. In the first camp are those who wear the badge of ‘scientist’ and see their field as a predictive enterprise whose job it is to uncover those general laws of politics that ‘must’ be out there. The second camp contains those who think the former project logically untenable. For years now I have tried (largely in vain) to convince my colleagues in the first camp that the idea of a political ‘science’ is inherently problematic. I have marshaled various arguments to make this case, and each of these has been met by some variant of: ‘political science is a young science’; ‘what we face are problems of method’; ‘more basic research is required’. Then, with ‘more and better methods’ we will make ‘sufficient’ progress and ‘become’ a science. I remain unconvinced by this line of argument, but it was enlightening to see it played out again and again over a century.

Discovering that these same arguments have been going on for 100 years was both heartening (I was in good company) and depressing (‘round and round we go’). But in the process I discovered something else. If political science is a ‘science’ by virtue of its ability to predict, as many of its ‘scientific’ brethren maintain, then it really should have been abandoned years ago, since the prediction rate of my field over the past 100 years is lower than what would be achieved by throwing darts at a dartboard while wearing a blindfold. To see why this is the case, consider the following potted history of political science.

From its inception in 1906 until World War One, American political science took ‘public administration’ as its object and the Prussian state as the model of good governance. Sampling on this particular datum proved costly, however, when the model (Germany) became the enemy during World War One and the guiding frameworks of the field collapsed. Following this debacle, political science retreated inwards during the 1920s and 1930s. One can scan The American Political Science Review throughout these tumultuous decades for any sustained examination of the great events of the day and come up empty. What I did find, however, were reports on constitutional change in Estonia, committee reform in Nebraska, and predictions that the German administrative structure would not allow Hitler to become a dictator.

After World War Two this lack of ‘relevance’ haunted the discipline, and its post-war re-founders sought to build a predictive science upon the process notions of functionalism, pluralism, and modernization. These new theories saw societies as homeostatic systems arrayed along a developmental telos, with the United States as everyone’s historical end. Paradoxically, however, just as the field was united under these common theories, they were suddenly, and completely, invalidated by the facts of the day. At the height of these theories’ popularity, the United States was, contrary to theory, tearing itself apart over civil rights, Vietnam, and sexual politics, while ‘developing’ countries were ‘sliding back’ along the ‘developmental telos’ into dictatorships. Despite these events being the world’s first televised falsification of theory, once again political science turned inward and ignored the lesson waiting to be learned – that prediction in the social world is far more difficult than we imagine, and that the call for more ‘rigor’ and ‘more and better methods’ will never solve that problem. Our continuing prediction failures bear this out. Since its ‘third re-founding’ in the 1980s, political science has predicted the decline of the US (just as it achieved ‘hyper-power’ status); completely missed the decade-long economic stagnation of Japan (just as it was supposed to eclipse the US); and missed the end of the Cold War, the growth of international terrorism, and the rebirth of religion in politics.

After reviewing this catalog of consistently wrong calls, I was left with a very simple question. If political science is a science by virtue of its ability to predict, and its prediction rate is so awful, can it be a science even in its own terms? I would say that it cannot. But this answer itself raised another, and I think more interesting, question: why is my field’s ability to predict so bad? The answer is not found in the pages of The American Political Science Review. Rather, it is found in how political science as a discipline, through its training, thinks about probability in the social world. To see why, I ask the reader to follow me through three ‘possible worlds’ with three different probability distributions, and then to decide which world it is that political science studies – and which one it thinks it studies.

Our first (type-one) world is the world of the dice roll, where the generator of outcomes is directly observable. Here we live in a world of risk. We know when throwing a die (the generator) that there are six possible outcomes. Given the ability to observe the generator directly and a few dozen throws of the die, the expected and actual means converge rapidly via sampling, and this is sufficient to derive the higher moments of the distribution. The distribution of sample means, given the known values of the generator, is reliably ‘normal’, and sampling the past is a good guide to the future. One is not going to throw a ‘300’ – there are only six sides on the die – and skew the distribution. This type-one world is reliably Gaussian and is, within a few standard deviations, predictable. Political science thinks it operates in this world. This is the familiar world of the bell-curve.
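
If an illustration helps, here is a minimal sketch in Python (standard library only, my own toy example rather than anything from the Review). The die’s expected mean is 3.5, and because the generator is fully observable, the running average of actual throws converges on it almost immediately:

import random

random.seed(1)  # fixed seed so the illustration is reproducible

def mean_of_throws(num_throws):
    # average of num_throws rolls of a fair six-sided die (a fully observable generator)
    return sum(random.randint(1, 6) for _ in range(num_throws)) / num_throws

# the expected mean is (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
for n in (10, 100, 10000):
    print(n, round(mean_of_throws(n), 3))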

Our second (type-two) world is a world with fat tails (Gauss plus Poisson), where uncertainty rather than risk prevails. An example of the generator here would be a stock market. Although one can sample past data exhaustively, one does not observe the generator of reality directly. Consequently, one can ‘throw a 300’, since large events not seen in the sample may skew the results and become known only after the fact. For example, stock market returns may seem normal by sampling, but a ‘Russian Default’ or a ‘Tequila Crisis’ may be just around the corner that will radically alter the distribution in ways that agents cannot calculate before the fact. This is a world of uncertainty as much as it is one of risk. Agents simply cannot know what may hit them, though they may think that the probability of being hit is small.
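
To make the type-two point concrete, here is a hypothetical sketch along the same lines (the jump size and frequency are invented numbers, not estimates of any real market). Returns that are Gaussian on all but roughly one day in a thousand look reassuringly normal in most samples, because the crash is usually not in the sample:

import random

random.seed(2)

def daily_return():
    # mostly ordinary Gaussian noise...
    r = random.gauss(0.0005, 0.01)
    # ...but roughly one day in a thousand, a 'Russian Default' lands in the tail
    if random.random() < 0.001:
        r -= 0.25
    return r

sample = [daily_return() for _ in range(500)]
# a 500-day sample will usually contain no crash at all, so the empirical
# distribution looks Gaussian; the tail event becomes known only after the fact
print(min(sample), max(sample))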

Our third possible world (type-three) is even more unsettling. Imagine a generator such as the global economy. In this case, not only can one not see the generator directly, but agents can sample the past till doomsday and actually become steadily more wrong about the future in doing so. As two probabilists, Nassim Taleb and Avital Pilpel, put it, with such complex generators “it is not that it takes time for the experimental moments…to converge to the ‘true’ [moments]. In this case, these moments simply do not exist. This means…that no amount of observation whatsoever will give us E(Xn) [expected mean], Var(Xn) [expected variance], or higher-level moments that are close to the “true” values…since no true values exist.”
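
Taleb and Pilpel’s point can be simulated directly. The standard Cauchy distribution is the textbook case of a generator whose mean and variance simply do not exist; in the sketch below (a deliberate stylization, since the global economy is obviously not a Cauchy draw), the running average refuses to settle no matter how long one samples:

import math
import random

random.seed(3)

def cauchy_draw():
    # a standard Cauchy draw via the inverse CDF; E(X) and Var(X) do not exist
    return math.tan(math.pi * (random.random() - 0.5))

total = 0.0
for n in range(1, 1000001):
    total += cauchy_draw()
    if n in (10, 1000, 100000, 1000000):
        # unlike the die, the running mean keeps lurching around
        print(n, total / n)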

To see what this means, consider the following example. Macroeconomics, like political science, has had at least four general theories of inflation over the past fifty or so years, which suggests two things. First, these theories cannot be general theories, since they change every decade or so. Second, such theories might have seemed general at the time they were constructed, given the samples they were derived from, but they must become redundant as the actual sources of inflation change over time.

For example, if the agreed-upon causes of inflation in one period (monetary expansion) are dealt with by building institutions to cope with those causes (independent central banks), this does not mean that inflation becomes impossible. Rather, it means that the conditions of possibility change such that the theory itself becomes redundant. In such a world outcomes are fundamentally uncertain, since the causes of phenomena in one period are not the same causes in a later period. Given this, when we assume that outcomes in the social world conform to a Gaussian distribution we assume far too much. Any sample of past events can confirm the past, but it cannot be projected into the future with the confidence we typically assume. Take away that prior assumption of ‘normality’ in the distribution and standard expectations regarding prediction fall apart.

Given this, which world is the one most likely studied by political scientists? Our type-one world can be ruled out, since if the world were so predictable our theories should be able to predict accurately. Given the record in this regard, it is safe to conclude that the world we occupy is not this one. Our type-two world seems suspiciously normal most of the time, but our theories ‘blow up’ much more than they should, since most of the action occurs in the tails and we cannot see the generator of outcomes. This sounds more like the world where people actually live.

A type-three world is even worse, however, since in a type-three world all bets are off as to what the future may bring. Humans do not deal particularly well with such uncertainty, and try to insulate themselves from it. Whether through the promulgation of social norms, the construction of institutions, or the evolution of ideologies, the result is the same: human agents create the stability that they take for granted. In taking it for granted, however, they assume the world to be much more stable than it actually is. Consequently, our theories about the world we live in tend to assume much more stability, and thus predictability, than is warranted.

In short, we cannot live in a type-three world, so we build institutions, cultures, and societies to cope with uncertainty. But when we are successful at doing so we assume we live in a type-one world of predictability and develop theories to navigate such a world. Unfortunately, we actually have succeeded only in constructing our type-two world of fat tails, and this is why we are constantly surprised. We think (and model) type-one while living type-two. Meanwhile, as a discipline, we refuse to admit the possibility of a type-three world generating both the others.

The result is that the action is in the tails, and we, given our type-one assumptions and models, are blind to what is going on there. So we focus, like the proverbial drunk under the lamp-post, on the middle of the distribution, since that is where the (theoretical) light is; and like the proverbial drunk, we are constantly surprised that our keys are actually to be found somewhere else entirely. Political science may have reached the ripe old age of 100, and I congratulate it for doing so. It did so, however, by imagining the world to be quite different from what it is, and by completely ignoring its predictive failures. If, however, political science wants to be around for another 100 years, it may want to think a bit more about what those failures are trying to tell us.


Lunar Refractions 1: Cacciari: Politician, Professor, Philosopher, or Don Juan?

Massimo Cacciari’s writing fell on me a few days ago. He first came to my attention a few years ago when a colleague waved a book called Architecture and Nihilism in my face. I didn’t read this book, but its almost violent passage before my eyes opened an entirely new world. This unknown sphere remained dormant until recently, when I began a rather sunny yet despondent morning reading an essay of his. Cacciari and his writings are less prominent (among North American English speakers, at least) than they deserve to be. The few excerpts and musings here are meant merely to act as an introduction and point of departure for those of you who know little or nil about his work.

My relationship—if one can call it that, as I probably shouldn’t, given the reputation speculative gossip attributes to him—with Cacciari has always been one of chance. I wasn’t looking for that architecture book, which is one of only four books of his easily accessible to an English-speaking audience, or anything else on nihilism. Nor was I looking for an interview with him, a “sentimental interview” charmingly entitled “Massimo the Incomplete,” when I stumbled across it in L’espresso magazine a couple of years after the first incident. Finally, he came back to haunt me on a recent morning when a book about the classics literally fell off my shelf as I walked past; it’s clearly a physical connection.

Confronting the Classics: in Conversation with the Greeks and Latins was the book I’d inherited from a friend and had not since had time to open. I was in what one could term a very brooding, contemporary mood that morning, reflecting on how many things that fascinate me serve no practical purpose in today’s world, and how planned obsolescence and incessant (and often conspicuous) consumption have come to replace many older, more substantial modes of existence. A walk around New York, or Venice, or any other city of the over-privileged world easily inspires the question of just what, exactly, people did before they spent their lives shopping…but I digress. Just as the superficiality of this particular moment in history overwhelmed me, I saw Cacciari’s essay, “Inactual Abstracts on the Study of the Classics.” Having only read a smattering of classics myself, and never really having studied them officially (whatever that might mean), I wanted to see what he had to say. The Nietzschean reference particularly piqued my interest. I decided to dedicate the morning to this rich eight-page piece.

I mentioned Venice earlier because Cacciari was reelected mayor there on April 3 of last year, after having held the same position from 1993 to 2000. He also served in the Italian Parliament from 1976 to 1983. From the sixties forward he has been publishing his musings, creating a bibliography too vast to address here. His political activity may, to many, seem incongruous with the vocation of philosopher. He is remarkable precisely because so many things about him initially seem incongruous. It remains to be seen whether a coherent whole can be made of this picture.

In North America it is fairly rare to encounter a politician who is also a professor, author, philosopher, and generally curious thinker (I invite all those reading this to refute any of my statements). Italy gave the world such vastly different figures as Benedetto Croce, Gabriele D’Annunzio, Antonio Gramsci, Pier Paolo Pasolini, Leonardo Sciascia, Manfredo Tafuri, and many others, all of whom blurred the borders between politics, philosophy, and aesthetics. While it’s perhaps a little early to add him to such a group, the ability of all these thinkers to assemble a unique opus collecting otherwise fractured and distant fields is certainly echoed in Cacciari’s work.

To return to his essay: in it Cacciari very concisely makes twenty-seven key points about the contemporary study of the classics. He begins with the general feeling of resignation regarding contemporary education—how much it supposedly must adapt to the needs of the day, acting more as something in the service of a technical-economical context rather than a school based on the ideas of culture and real education. Early on he asserts that all words indicating the “school-education complex” (i.e., Schole, paideia, Bildung) refer “not to any specific contents, but to a field of energy; a state that generates potentialities and openness to multiple possibilities rather than orientation toward a precise scope. The goal of the educational process is not the transmission of acquired values…. The real sites of education remain, despite any assertions to the contrary, centers of criticism, discussion, comparison of different trends, and questioning.” I can think of many brilliant people who’ve proven this point after fleeing stifling academic programs that somehow managed to neglect education in this sense of the term.

While that is interesting in itself, and a useful bit for anyone considering study in graduate programs or other accredited, official schooling, it doesn’t yet address the real importance of the classics. Cacciari here clarifies that, in an environment like that of most schools, “which act much like businesses specialized in producing workers, the teaching of the classics can only have a merely ornamental role. The idea of the school-business is metaphysically opposed to all that is classic. Classic, in fact, expresses no return to the past, much less to the dead past, but assuredly a high-spirited contrast to custom, to the present time. Classic is that which is not currently fashion, not the refrain of the day; it carries within it a timbre of battle, an exigency of contra-diction.”

Rather than acting as a throwback to the past, the classics “should arm us to face the present time…. The classic doesn’t flee, it rather challenges. It belongs to the present time, but refuses to serve it…. It speaks of the present time, of this world, even of our daily life, but from a sound distance. Those who weaken the spirit of the classics, transforming it into a sedentary philology; those who make of the classics a cupboard of memories neatly arranged in historical order; those who don’t know how to make them live in divergent agreement with the present time, destroy their essence ten thousand times more than its vulgar detractors.”

I don’t wish to elaborate each of his points here; suffice it to say that various dangerous words—logos, philology, concordia discors, variety, forma mentis, net-workers—and dangerous thinkers—Nietzsche, Celan, Leopardi, Alberti, Kafka, and Arendt—all make appearances in this tightly constructed brief. This is the ultimate précis of the truly liberal arts. This is what made it impossible for me to mope around wallowing in the tragic thought that I’d lost days and years of my life to a pleasurable yet utterly fruitless interest in the classics. This is the artillery with which I will respond to the next person inquiring why many of my colleagues and I are “so obsessed with remote, ancient things” in our work. This helped me shoulder what is indeed the heavy weight of taking up active conversation with the classics, yet that load is lightened when I look around and come to a very visceral awareness that I’ve no other choice.

I imagine that many who have heard about Cacciari will have done so through the abundant gossip circulating about him, which merits no further comment here. I will leave you on your own in getting to know his opus, and can only hope that the very limited number of his writings currently available to English speakers will soon grow.

Lastly, a salute to those with a similar dedication to crossing and dismantling the artificial borders between disciplines and cultural epochs—if only we had leaders here in the United States capable of addressing education and the role of the classics with such depth, or at least of being aware of them.

Selected Minor Works: The Heresy of Intelligent Design

Justin E. H. Smith

I would like to explain why, in the matter of the origins of species, there can be no compromise position, no accommodation by one side of the principal tenets of the other. There can be no way of conceding the basic mechanisms through which evolution works while holding onto an anthropocentric view of the cosmos or a conception of human beings as unique among creatures in their likeness to the creator. It is time, in short, for evolutionists to be clear: you are either with us, or you are against us.

Last year, Christoph Schönborn, an Austrian cardinal with close ties to “Benedict,” brought the Catholic church a step backwards by calling into question the earlier moderate view on evolution put forth by John Paul II.  In an op-ed piece in The New York Times, Schönborn downplayed a 1996 letter in which the former Pope described evolution as “more than a hypothesis.”  The cardinal held forth with the view that “[e]volution in the sense of common ancestry might be true, but evolution in the neo-Darwinian sense –an unguided, unplanned process of random variation and natural selection– is not.”

The cardinal is not, evidently, denying that random variation and natural selection are the basic mechanisms of evolution.  He is only denying that these proceed without guidance and planning.  For the cardinal, every transformation in animal species prior to the emergence of homo sapiens must have been rigged in such a way as to guarantee this eventual outcome, since human beings, in traditional Christian theology, are the very reason God bothered creating all that cosmic dust and hydrogen and mud, all those supernovae and humble worms, in the first place.  It is all for us.  Things could not have unfolded in any other way.

This account of evolution has become quite common in recent years among conservatives trying to take a moderate stance in the debate. They argue that there is nothing impious about the view that God may have worked through evolution in order to arrive at his crowning achievement, homo sapiens. Thus George Will, in a recent article on the woes of the House Republicans, begins with a telling comparison: “Before evolution produced creatures of our perfection,” he writes, “there was a 3-ton dinosaur, the stegosaurus, so neurologically sluggish that when its tail was injured, significant time elapsed before news of the trauma meandered up its long spine to its walnut-size brain” (“How to Evict the ‘Rent-Seekers’,” January 11, 2006). The implication is that God worked through such earlier rough drafts until he arrived at his final goal, namely, us (overlooking the obvious fact that there are still plenty of neurologically sluggish species lumbering around, and that in the Jurassic there were plenty of species that did not suffer from this shortcoming). The Christians can hold onto their anthropocentric cosmology while nonetheless taking good scientific evidence about shared ancestry into account. It’s the best of both worlds!

But is such a compromise tenable?  Let us review some of the basics of the Darwinian account of how exactly “higher forms” (this is Darwin’s own misleading language) are thought to arise from lower ones.

The supreme virtue of an organism, in an evolutionary sense, is fitness to its environment, and fitness does not admit of non-relative degrees. Thus, when it comes to getting one’s oxygen supply underwater, a fish is fitter than I, and thus, I suppose, better. To the extent that we can talk about “better” and “worse”, we must make clear what sort of environmental circumstances we have in mind before we can say whether an organism is better or worse able to live in them. Beyond this, it makes no sense to speak of an organism’s place in some non-relative, hierarchical chain of being. The image of the chain is a vestige of a world-view that is hopelessly at odds with the theory of evolution. (One thing its latter-day supporters frequently leave out is that, traditionally, human beings were not the highest placed on the catena rerum. This spot was reserved for the angels – purely spiritual beings with nothing of the animal in them.)

But even with this circumscribed conception of betterness as fitness, could we not still go along with Schönborn and say that human beings are still God’s best work, moving from the comparative to the superlative on the grounds that, say, human beings are well-adapted not just to some tiny ecosystem, but to the entire globe, and eventually, perhaps, to outer space as well; or on the grounds that they cannot just live in any ecosystem, but can also dominate all of them, and all their inhabitants, by use of reason?  And is it not in virtue of the possession of reason that we are justified in speaking of human beings as the image of God?

The problem here is that, as Schönborn worries, the mechanism of adaptation that ensures the greater fitness of some organisms in some particular environment – whether this fitness involves the evolution of gills, bipedalism, or language – is one that can be better understood in terms of randomness than in terms of intelligent guidance. In any population, there are variable traits. Some organisms have them, some don’t, and the reasons for this variation are random mutations at the genetic level. If Schönborn wants to deny this, he will also have to deny a whole host of elementary facts about genetics that he probably never even noticed were offensive – facts that have nothing to do directly with evolution, and facts the knowledge of which he probably benefits from on a regular basis in his reliance on modern medicine.

Some of the traits will prove more useful in response to certain environmental features, and the subset of individuals in a population that have these traits will be more likely to survive to  reproductive age and pass them on.  If God is working through evolution, then, as Schönborn and Will believe, he will have to be actively rigging not just all genetic mutations, but also all of the environmental changes to which the organisms, in which these genetic mutations occur, prove to be well or poorly equipped to respond.

Let us consider an example, one that is very close to home for us human beings. Paleoanthropologists suggest that a significant moment in our becoming human arrived when our ancestors transitioned from arboreal swinging as their primary form of locomotion to bipedalism. This new and handsome way of getting about is thought to have brought in its wake a number of other adaptive consequences, including, some speculate, the evolution of a vastly larger cranium than that of our ancestors. This, in turn, is what ultimately facilitated the performance of complicated mental feats, including those we today think of as “rational”.

But why did our ancestors go peripatetic in the first place? Unfortunately, this change cannot be accounted for in terms of any innate desire for self-improvement among hominids, nor can it plausibly be explained in terms of God’s plan for their kind. The full story of the evolution of bipedalism will also have to take into account the way in which meteorological and geological events changed parts of the landscape of Africa from rain forest into savannah, and forced the hominids in those parts, on pain of extinction, to start moving about in new ways. If you wish to assert that evolution is a guided process, you must not think only of God pushing his creatures to go down one path rather than another; you must also take into account God’s micromanagement of every single event in the physical world so as to ensure particular outcomes in the biological world. The passing of meteors, landslides, volcanic eruptions along the Mid-Atlantic Ridge, the dissolution of a cumulonimbus here and the emergence of a cumulus there, the decay of this atom as opposed to that one, all of this must be meticulously set up for the sake of desired results among one tiny subset of natural phenomena.

Indeed, what we end up with is a sort of neo-occasionalism, the view that the only true cause of any event in the universe is God, that there can be no talk of causality except in reference to the ultimate cause of everything.  In the 17th century, Gottfried Leibniz derided this view, held by his contemporary Nicolas Malebranche, as recourse to “perpetual miracle.”  Leibniz, like many fellow Christian thinkers of his era, understood that implicating God in the nuts-and-bolts of the universe’s daily maintenance is to assign to God a task that is beneath his dignity, and thus to lapse into impiety.  If miracles like the incarnation or the resurrection are going to count for anything, Leibniz saw, then they are going to have to be set apart from the ordinary flow of nature.  This is what occasionalism would preclude.  How much more Christian it would be to account for natural phenomena not by perpetual miracle or by divine micromanagement, but by appeal to a few simple and regular laws.

One can easily see why intelligent design cannot work as a compromise position.  Prima facie, it is much more plausible to suppose that, had God made the universe with human beings in mind as his ultimate goal, he would not have bothered coming up with such a meandering mechanism, and one that would require so much upkeep.  He would have seen rather to the simultaneous, instantaneous, once-and-for-all creation of all species in their present form, and would have shaved several billion years of build-up off the history of the universe, setting things into motion around, say, 5,000 BC, rather than circa 15,000,000,000 BC.  In other words, God would probably have done things more or less as Genesis would have us believe.  The creationist’s attempt to compromise with science, whether for honest or disingenuous reasons, by taking the middle road of “guided” or “managed” descent from lower forms, cannot fail to lapse into nonsense.  I would certainly prefer to debate a scriptural literalist who sticks to his guns, who only recognizes one source of truth, and is clear about what this is.

What Schönborn is worried about is not so much the proposition that human beings are the kin of “lesser” animals (elsewhere I have argued that it is precisely this worry that guides many creationists). In line with traditional Christian theology – as opposed to the aberrant theology of many fundamentalist protestant sects – the cardinal recognizes that the proper understanding of a human being is as a creature that shares part of its nature, though not all, with the animal kingdom. Rather, Schönborn is concerned that the best scientific theory of how we got here disconnects us from any divine purpose, leaves us to fend for ourselves metaphysically. This is a worry that is not limited to the debate about human origins. Indeed it is one that many were expressing long before the descent of man from lower forms became an issue.

While many early modern thinkers agreed with Leibniz that excusing God from the task of micromanaging the affairs of nature is the best way to exalt him, there were just as many who feared that, with diminished responsibilities, God runs the risk of becoming irrelevant. One early modern vitalist, Ralph Cudworth, the author of a 1678 treatise not-so-humbly entitled The True Intellectual System of the Universe, thought he had the perfect compromise solution: God dispatches a certain “plastick nature” that intelligently guides the unfolding of natural processes in the material world, allowing him to retreat and, I suppose, contemplate his own divine excellence while this subordinate force “doth drudgingly execute” those tasks that are beneath God’s station. What terrified Cudworth was the thought that the things of this world might be accounted for, to use Cicero’s compelling phrase, simply as “a fortunate clash of atoms,” yet he understood that the answer is not to make God himself take care of all the “operose, sollicitous, and distractious” affairs of this lowly world. But this position prompted others to accuse Cudworth of reintroducing the pagan doctrine of the world soul.

One might easily get the sense that, when it comes to characterizing God’s involvement with the world, you just can’t win.  There will always be reasons for denouncing any position as impious.  If I can hope to contribute anything to the unfortunate debate about intelligent design that has developed over the past few years, it is that ID theory is just as suitable a candidate for denunciation on the grounds of heresy as any other account of what God is up to.  The standard criticism of ID is that it is bad science.  I would like to propose that it is bad theology as well. 

But fortunately none of this has anything to do with the prospects of evolutionary biology. Today we have at our disposal biochemistry, genetics, and numerous other promising fields of inquiry that are in a position to explain how atoms, in accord with a few simple laws, really can produce human beings. And at just this promising moment, creationists want to throw in the towel in view of the “irreducible complexity” of it all. This phrase had some resonance 300 years ago – vitalists had good reason to think that billiard-ball-style mechanical physics was inadequate to account for all the phenomena of nature. Now, however, it is nothing more than the proclamation of a preference for ignorance.

Unlike Richard Dawkins and his bright friends, I find people who put too much faith in science obtuse, and I do not think my own life would be easily bearable if I were to abandon all hope for a perfect, eternal order beyond this shoddy, decaying one.  But let us keep our activities straight.  Let us not do interpretive dance in our trigonometry classes, and for God’s sake let us not complicate the teaching of a perfectly autonomous and rigorous science with the problem of finding meaning and purpose in the universe.

Negotiations 6: A Christmas Tale

I go home for Christmas, and it is a vast cacophony of family, with grandparents and siblings, aunts, uncles, boyfriends, in-laws and all manner of cousins present: first cousins once removed, double cousins, first double cousins, second and triple cousins. There are fires in three hearths, each trying to outburn the others; the house is shimmering with heat. There are logs to be hewn, trees to be raised, beds made, furniture moved, carpets taken up, banners unfurled, icons hung, candles lit; and there is food, food, food! to be eaten at all times and in every location: grilled venison sausages, baked salmon stuffed with spinach and feta, steamed mussels, smoked trout, wild rice, pearled onions, boiled peas, roast duck, mince pies with brandy butter, Spanish clementines, Belgian chocolates, Danish marzipan, fudge as dense as flesh, suckers, lollipops, chewies, stickies, gummies and squirmies.

These last are for the children: children crawling from under beds, hanging from rafters, sliding down banisters, and building forts. Children banging drums, bouncing balls and riding bicycles; snot-smeared children, wide-eyed children, children with earaches and bellyaches and toothaches; children hacking, spitting, whispering and howling. Their little fingers are ceaselessly working, pushing into pockets, manipulating trucks and plucking violins; grubby fingers pinching, gouging and tickling; wet fingers squishing into ears and noses; grabby fingers at your sleeve; greasy fingers in the shrimp; fragile fingers curling and uncurling with each breath when like the sea, finally, the children sleep.

I am unaccustomed to such activity. No longer a child, I carry myself within myself. I want to slow this traffic; I want to pluck moments and preserve or heal or burn them. My frenzy is a private thing, a damnable, maddening, lonely thing. Thus it is that I find myself, late this Christmas day, under the pretext of gathering mistletoe, climbing the thick crotch of a dying maple just to gain some solitude, and to breathe and to think.

We are a family of spies, however, and one of us has followed me out. It is the girl we call Bug, full of questions and sugar. She is an elf-child, all blonde and blue, with eyes that glow and blink and swell, and I can feel them glowing and blinking in the winter grayness. She contemplates my activity from below then calls up to me in the gathering sky. “What are you doing?”

I am thirty feet above her now, standing in the limbs of a tree that was a mere sapling at the end of the French Enlightenment. I feel like an affluent worm when I consider this fact. Time weaves fate. This means nothing to her. “I’m looking for mistletoe.” “Can I come up too?” She carries the scar of an immense and terrible wound upon her belly (something went wrong in the pre-life of her mother’s womb), but she is quick and agile and I would like nothing so much as to haul her into the transcendent heights of this massive, wooden thing.

“No. The ladder is not secure.”

“I’m an auto-didactic climber,” she insists.

The last time I sent her into a tree, she ended up in the topmost branches of a magnolia in full bloom; we lost her in the perfume and the blossoms, and she refused to come down until I directed her out onto a limb from which she could leap into the swimming pool below. Her mother was not impressed.

“No,” I repeat.

“Uncle, are you a teacher or an artist?” she calls up to me.

“The ladder is not secure,” I repeat. “I’m coming down now. Let’s go inside.”

We step into the house and I am immediately set upon by a troll. It is the boy we call Moo-shu, on account of his fondness for pork wrapped in pancakes. He has been standing on the stairway, wearing a cape of curly sheepskin, waiting for me to enter, and he flings himself at me from above as though he is plunging into a gorge. His arms go around my neck and he is trying with all his tiny strength to throttle me. It is a game we play; he is a boy without a father in a family of women and he longs for his dad, but I am not that person, and the best I can do is wrestle with him, entangle arms and legs and hair with him, teach him to fight and to run and mingle my male smells with his. I drop to my knees and roll, dislodging his grip and his cape. Like a crab he scuttles away, but I catch his knee and drag him back into the fray. “Now it’s your turn, boy,” I am saying. “Prepare to meet the Sheep of Parnassus!” I am wrapping him up in his cape, as though it would swallow him whole, and at first he is giggling, then a note of panic creeps into his laughter. “No, Uncle, no!” he shrieks. “It is too late for you boy,” I continue. “No flight for you; fight, boy fight!” With that I give him license, we both know this game, and his fear turns to fury. He becomes a small Heracles, seizing my wrists like the fabled serpents and twisting them back with a howl. Our eyes meet for a moment; his loneliness and fear of abandonment fall away like dust and he is just a boy at play in the world, struggling for triumph, and he delivers a good shot with his knee to my stomach. I roll away, doubled up and moaning, and he stands over me, glowering with a grin on his face and his hands on his hips. “You have wounded me, Moo-shu,” I groan, “but the sheep will return!” I make a grab at his ankle, but he scampers up the stairs and is gone.

A family is like a loaded gun: point it in the wrong direction and someone is bound to be killed. We take our shots over dinner, stuffing ourselves with creamy, sauce-laden dishes, then we belch up our vitriol and fire away. “Let’s play a game,” says my grandmother. She is 90 years old and as mean as a switch, with violet eyes that glimmer like thistles in rain. “Let’s say the most insulting things we can possibly think of to each other!”

“Okay, Joanna,” my father responds. “I’ll go first.”

At the children’s table, meanwhile, one of the boys has tipped his plate into his sister’s lap, and he is moaning over his loss. “Clean it up, fatso,” she says to him. Her mother looks over sharply. “Well he is obese, you know,” says the girl. “You said so yourself. You said you would take us all to Hawaii if he lost thirty pounds. You called him obese.”

A cousin is slugging his wine and barking across the table at someone’s boyfriend. “Our president has said that if you are not with us, you are against us. Well, are you with us or against us?” This is a man who considered joining the priesthood but ended up flying jets for the navy instead.

“As I mentioned,” says the boyfriend, “I am from Switzerland. We are a neutral country and I am here to study science, not politics.”

My mother is having a quiet talk with one of my sisters. “Your son has been doing something odd in the bathroom,” she says.

“Mm-hmmm,” says my sister. “Tell me about it.”

“He seems to have taken to smearing his feces on the wall when he defecates.”

Bug is at my elbow at once, tugging away. “Theses? What’s theses?”

“Yes, I’ve noticed that myself,” says my sister. “What do you think it means?”

“I don’t much care what it means,” says my mother, “but it’s staining the finish in the bathroom and I’d rather not have to repaint it.”

“What’s theses, Uncle?” I want to avoid this conversation if I can. “He thinks his name is Martin Luther,” I say to Bug. “Why don’t you ask him if he’s thrown his inkpot at Satan recently?”

My grandmother is clutching at the boy we call, on account of the size of his head, The Squash. He is 13.

“Don’t ever trust a woman,” she is hissing at him. “Once she gets her claws in you, you’ll never get them out.”

My father is chatting with his vegan/neurotic daughter. “You were, without a doubt, the most obstreperous six-year-old I have ever met.”

The boy we call Sharp-Tooth is picking a scab, and my sister-in-law is thinking her Republican thoughts.

“Priests are such funny things,” my aunt is saying. “They’re always shaking things. I wonder one day they don’t shake something out of their noses.”

Someone begins to pray. “Hail and blessed be the hour and moment in which the Son of God was born.” The children are under the mistletoe, performing some weird ceremony. They seem to be making out with one another. The moments we forget likely mean more than the ones we remember. The prayer continues, “…in that hour be pleased, oh Lord, to hear my prayer and grant my desire.”

Bug is at my elbow again. “Uncle.”

“Bug.”

“There’s someone in the tree.”

“No, Bug.”

“There’s someone in the tree.”

“It’s just a memory, Bug.”

“…through the merits of our Lord Jesus Christ.”

“Uncle!” Bug’s eyes are so wide and blue they hurt to look into. “Be an artist!” she says.

“…and his most afflicted mother.”

“Come look!” she says, and we flee.

Dispatches: On Michael Haneke

There are filmmakers who help us learn to watch movies better. Many of them are canonical: Eisenstein, Hitchcock, Ford, Kubrick, Kiarostami, Sokurov, etc. What links the group of directors I am referring to is the way that watching their movies forces the viewer to pay attention to form. Rather than simply immersing one in plot, these artists ask viewers to glean information from the directorial choices being made, from compositions and cuts and such. In Hitchcock, to give the classical example, pretty generic plots combine with a camera eye that makes associations and psychological inferences with startling sharpness. These are moviemaker’s moviemakers. The critic-artists of the nouvelle vague did much to emphasize the aesthetic value of highlighting formal elements, and so the auteur, rather than the studio, became the most important unit to consider when watching movies (they also extended this view backwards to incorporate Hawks and many others). Film formalism is really part of the mid-century revaluation of modernism that extended to criticism and architecture. These days, auteurship mostly serves as the justification for self-absorbed directors whose most urgent message is the advertisement of their own genius.

There are, though, directors working today who respect their audiences enough to command and repay that respect with thought-provoking work that also relies on the audience’s attention to formal features. One of these is the Austrian director Michael Haneke. The sobriety and equipoise of his camera, and the subtlety of his aesthetic choices, make most of his films a pleasure to watch. His recent work has seen him rein in his early tendency towards flashy violence and degradation, as in Funny Games. His adaptation of Elfriede Jelinek’s The Piano Teacher was amazing in its visual translation of that novel’s obsessive tone, and I thought his fondness for menacing quiet moments made Time of the Wolf one of the best post-apocalyptic movies of recent years.

The first shot of his latest, Cache (“Hidden”), is a perfect example of his talent. He holds the wide shot of the main characters’ home for an extremely long time, maybe five minutes. Luckily (or rather, deliberately) the composition is complex enough, and photographically interesting enough, to maintain one’s interest despite the confusing lack of activity. Soon it is revealed that the nature of the first shot is very different from what one at first assumes, and one is forced to revise one’s faith in the basic nature of shots in movies. It’s that clever. As the image becomes a motif, repeatedly returned to over the course of the film, its details become more and more familiar, and our encounters with the same space from different perspectives make it as familiar as if we ourselves had inhabited this street. This is filmmaking: to grasp a space and its complexity and impart that complexity to a viewer, in something like three-dimensionality. The movie is about surveillance, literally and figuratively, and it commands the viewer to confront the ambiguity of looking at the world, and how assumption-laden most of the judgments we make about it are. As with most great formal films, it teaches us about observation by making us observe.

The narrative theme of Cache might be said to be the return of the repressed, globalized. The movie concerns an haute-bourgeois couple – their modernist dwelling, decor, even food, perfectly observed – who are possibly threatened by figures from the husband’s past. The relations between the modern liberal individual, secure in his sanctimonious domain, and the world-at-large (in this case, the French colonial world) are called into question with devastating results. Compared to a movie like Syriana, whose idea of exploring the links between countries is to represent everything through the tired themes of espionage and politics, with human beings a kind of generic afterthought, a plot device, Cache starts from the most locally situated, domestic setting (the ur-Parisian couple of Juliette Binoche and Daniel Auteuil) and gradually expands the circle outward, relentlessly and at times grimly, until you feel the distance between places and places, times and times, unraveling.

At times, in all his films and this one, Haneke can risk dourness. I never actually feel he is a miserablist; more likely, I think, his tone is so even and reserved that easily bored viewers sometimes feel punished. I think his directorial reserve, his lack of flashy camera movements and cuts, is his great strength: it buys him the time to examine people more closely than most filmmakers. If he has a trademark shot, it is the stationary wide shot. These shots, so beautifully composed, are reminiscent of another very systematic artist, Andreas Gursky. But they are much more daring in the cinema than in photography, and they build up great pathos over the long durations for which he holds them. What seems to get exposed by these patient intervals of looking is something like the Pinterian hypocrisy of everyday life, the little lies that must be constantly told and that we must ferret out. The artificial, quick style of the commercial film industry can’t show us this; it substitutes the pleasure of cutting to the beat of music and fetishizing the close-up. Haneke can’t or won’t provide these confectionery pleasures, but he substitutes something richer: visual detail, blocks of color, compositions that combine foreground and background elements. In a way, his work is a defense of cinema against music, and against television (the home of the close-up).

Haneke’s comfort with unsympathetic characters, actions, and styles comes along with something a little less savoury: his attraction to sadism. Violence in his movies, and especially the visceral display of blood, is a bit of an addiction for him, and at times it can feel a little too much. But he shows signs of maturation: where he reveled in brutality in his earlier films, especially Funny Games, The Piano Teacher mostly observes blood so clinically as to reconfirm our aversion to violence. In this sense, Haneke’s violence is the opposite of the kind of celebratory intensity that you find in so many American directors (Scorsese, Tarantino). In Cache, despite a pervasive air of menace, there is only one violent moment, and it irrupts so shockingly into the texture of the film that it at first feels manipulative. Only later does one begin to decide it was earned after all.

The film’s final shot, another elegant stationary composition held for minutes, only furthers the ambiguity of what has come before. The film, full of jokes and setups that defy generic expectation, ends by neglecting to conclude, instead pointing to the unknowability of urban culture. It poses genuinely difficult questions about contemporary French identity and the price of its maintenance. For his recommitment to radically simple cinematic tools; his mastery of tone and pacing; his photographic complexity; his fearless attitude towards unsympathetic characters; and most of all his respect for the viewer’s intelligence, I think Haneke is one of the most interesting directors at work in the world today.

Dispatches:

Divisions of Labor III (NYU Strike)
Divisions of Labor II (NYU Strike)
Divisions of Labor (NYU Strike)
The Thing Itself (Coffee)
Local Catch (Fishes)
Where I’m Coming From (JFK)
Optimism of the Will (Edward Said)
Vince Vaughan…Eve Sedgwick (Homosocial Comedies)
The Other Sweet Science (Tennis)
Rain in November (Downtown for Democracy)
Disaster! (Movies)
On Ethnic Food and People of Color (Worcestershire Sauce)
Aesthetics of Impermanence (Street Art)

Monday Musing: A Moral Degeneracy

One of the few vices to which I have always had an extreme aversion, almost an allergic reaction, is gambling in all its multifarious incarnations. So much so that I have never even learned to play a single card game, because they have all been somehow indissolubly (and probably unfairly) associated with gambling in my mind from an early age. Besides the irrationality of trying to “beat the odds” at a casino, and the elaborately idiotic “systems” that people come up with for doing so, the idea that one is getting some entertainment in exchange for throwing one’s money away is, at the least, irritating to me. Since when is sitting in a near-hypnotized state in front of a gaudily festooned refrigerator-with-a-gearshift-lever-attached for hours, feeding small (and not so small) change into it, and occasionally getting some ducats spat out at one, considered entertainment? And why? I’m sorry, it seems much more like a compulsive sickness to me.

Still, if people want to congregate in some monstrously ugly building and drunkenly give their money to casino owners for nothing in exchange, and find this entertaining, who am I to object? I don’t even begrudge the Native-American tribes that have managed to get something back from the people who have taken everything else from them, by taking advantage of their addictions for a change. I do, however, draw the line at state-sponsored gambling. Human minds have a well-known and well-studied weakness in dealing with probabilistic phenomena. (See this earlier 3QD article, for example.) It is one thing for individuals, or even private corporations, to take advantage of this systematic weakness; it is quite another for government itself to do so by actively and enthusiastically promoting gambling, rather than protecting people from it by making sure that they are aware of its obsessive dangers and basic irrationality. Yes, I am talking here of all the lotteries.

From the point of view of rational choice theory, playing a state lottery is undeniably, unarguably, irrefutably irrational. Lotteries give out as prizes much less than half the money they take in from ticket sales; the rest goes to the cost of administering the lottery, and whatever is left over is used for the benefit of citizens in supporting educational and other governmental programs. It is like someone telling you, “I will flip a coin and you call it. If you lose, you pay me $10. If you win, I will pay you $4.” Would you keep playing this game? Well, if you play the lottery, this is exactly the game you are playing. The bigger the potential payoff, the more willing even otherwise-rational people become to suspend all reason, even if the odds of losing have grown to astronomical proportions. The government of my home state of New York constantly takes advantage of this very human mental laziness by running ads on TV whenever the total amount available in the lottery exceeds some figure, like 10 million dollars. Whenever the amount is over 100 million dollars, the glamor-promising promotion is constant, and it seems to work very well in replacing any misgivings people may have about the basic stupidity of buying lottery tickets with ineluctably seductive visions of nearly unimaginable wealth. People drive miles from Connecticut and New Jersey to come to NYC to buy Lotto tickets when this happens, no doubt planning what they will do with their winnings along the way. Someone once pointed out that for a round trip of more than 14 miles, there is a greater chance of dying in a traffic accident than there is of winning 100 million dollars in the NY State Lotto. Still, there are plenty willing to take their chances.
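To make the arithmetic concrete, here is a minimal sketch in Python of the expected value of both games; the 40-cent payout rate is an illustrative assumption, since the exact figure varies from lottery to lottery:

    # Expected value of a game: sum of (probability x net payoff) over outcomes.
    def expected_value(outcomes):
        return sum(p * payoff for p, payoff in outcomes)

    # The coin game: half the time you pay $10, half the time you win $4.
    coin_game = [(0.5, -10.0), (0.5, 4.0)]
    print(expected_value(coin_game))   # -3.0: you lose $3 per flip, on average

    # A $1 ticket where 40 cents of every dollar wagered returns as prizes
    # (an assumed figure standing in for "much less than half").
    payout_rate = 0.40
    ev_per_ticket = payout_rate - 1.0
    print(ev_per_ticket)               # -0.6: each ticket costs 60 cents, on average

Any game with a negative expected value loses you money in the long run, no matter what “system” you use to pick your numbers.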

So what about the supposed benefits to society? A slogan on the website of the NY State Lotto proudly proclaims: “Raising billions to educate millions!” Well, let’s take a look at where these billions-for-millions are coming from. As you are no doubt aware, demographic studies in many different states show that lottery players are dramatically skewed toward African Americans, low-income groups, and those with low levels of education, with a significant number of senior citizens added to the mix. It is hardly surprising that those who feel most desperate about their situations in society, and have the least hope or avenue of uplift, would be most likely to risk their precious few dollars on the futile dream of escaping their condition. Look, let’s just call this what it is: another tax on the poor, and a racist one at that. What is remarkable in this case is that it is liberals who support such efforts, mistakenly thinking that the monies raised will be spent on the needy, while conservatives, such as George F. Will, vociferously oppose lotteries on the moral grounds that the government should not be in the business of promoting gambling.

I am not sure why otherwise reasonable people are so taken with the idea of lotteries as a great way of raising money for good causes. I was once told by a very distinguished diplomat (whom I respect and admire immensely) that he had suggested the idea of a worldwide lottery administered by the UN to Kofi Annan, as a way of raising money for worthy UN causes. Maybe he was joking. But proposals for new lotteries are everywhere. Look at a sample I just found from today, for instance:

Elmer L. Forbath proposes this on Space.com:

A Space Lottery: An Idea Whose Time Has Come

The National Space Society should promote creation of a National Space Lottery. Ideally, this might become an International Space Lottery, and would offer the possibility of space flight, as a prize, to every man, woman and child on earth…

The problem with funding space efforts with tax dollars is that many say, “What’s in it for me?” To date, space has been reserved for scientists and rich tourists, like Dennis Tito and few imagine themselves as having a chance. A National Space Lottery will offer the possibility of space travel to everyone, rich or poor!

Lotteries generate huge amounts: One multi-state jackpot reached $363 million! The lottery for New York has the motto “A Dollar and a Dream.” The dream offered by a ticket in the “space” lottery could be a ride on an F-16 or the “Zero G” airplane, suborbital flight on SpaceShipOne, a trip to the International Space Station, or eventually to our lunar colony.

Yes, why not have the desperate and the destitute of the world pay for our increasingly controversial plans for space missions? Maybe we can’t con our taxpayers, or even their usually easily bought representatives in Washington in this case, but, hey, we can always exploit the silly dreams of wealth that the extremely poor and illiterate of the world can always be counted on to indulge! I can just imagine the long lines at the pale blue cash-in-dollars-only UN Lottery terminals in Malawi. While we’re at it, I have a modest proposal of my own: why doesn’t the government also get into the business of hawking Hock? It’s legal, after all, and maybe we could raise enough money to start treatment programs for alcoholics. Maybe there would even be enough left over to house the homeless!

Have a good week!

My other Monday Musings:
In the Peace Corps’ Shadow
Richard Dawkins, Relativism and Truth
Reexamining Religion
Posthumously Arrested for Assaulting Myself
Be the New Kinsey
General Relativity, Very Plainly
Regarding Regret
Three Dreams, Three Athletes
Rocket Man
Francis Crick’s Beautiful Mistake
The Man With Qualities
Special Relativity Turns 100
Vladimir Nabokov, Lepidopterist
Stevinus, Galileo, and Thought Experiments
Cake Theory and Sri Lanka’s President

Monday, January 16, 2006

Monday Musing: Awareness of Mortality, Conservatism, and Local News

I’ve generally been dissatisfied with the idea that media outlets such as Fox TV and Sinclair are the drivers behind the rightward move in politics in recent decades. It’s not that I find the claims impossible. I certainly have come across enough people whose arguments for this or that right-wing view pretty much echo Rush Limbaugh or Bill O’Reilly, or, at better moments, Bill Kristol. I hear these arguments as well, and remain unconvinced. The smarter conservatives have heard the opposite views as well, and they remain unconvinced. There is plenty of space for reasonable disagreement, for the simple reason that we disagree about which factors are relevant and how they should be weighted, and that we have differing commitments as to which values should be prioritized. So none of that surprises me. But the trend has been a steady one, or had been a steady one. And while we have reasons for our positions, and while in some broad sense these reasons can and do act as causes of our political preferences, something else seemed to be at play, and the media seems as good a place as any to look—all the more so if reasons, and the information they appeal to, matter.

Conservatives often point to how media and entertainment alter values by altering our attitudes and psychological stances on sex, violence, and authority. To the extent that political attitudes depend on psychology, on sentiment, their claim that the media alter sentiments and thus political preferences may be at least as true as the left-liberal claim that the media distort information.

The appeal to the psychological basis of political preferences, and especially to the patterning of political preferences, is old. Studies of crowd psychology, mass movements, fascism, obedience to authority and the like have appeared steadily, and the turn towards rational choice and strategic behavior, with its assumption that action is motivated by self-interest, has not managed to dislodge it.

I was thinking of politics and psychology this past holiday season while back in Houston for a visit. More and more of my family members were becoming conservative, and they were also becoming more and more fearful.

Recent studies in psychology suggest that people effectively become more conservative when they are made aware of mortality. This new approach, “terror management theory”:

holds that cultural worldviews or systems of meaning (e.g., religion) provide people with the means to transcend death, if only symbolically. The cornerstone of this position is that awareness of mortality, when combined with an instinct for self-preservation, creates in humans the capacity to be virtually paralyzed with fear. Fear of death, in turn, engenders a defense of one’s cultural worldview. Consequently, the theory predicts that if the salience of one’s mortality is raised, the worldview will be more heavily endorsed to buffer the resulting anxiety. Under conditions of heightened mortality salience, defense and justification of the worldview should be intensified, thereby decreasing tolerance of opposing views and social, cultural, and political alternatives.

The relevance of terror management theory to the psychology of conservatism should be apparent. When confronted with thoughts of their own mortality (Greenberg et al., 1990; Rosenblatt et al., 1989), people appear to behave more conservatively by shunning and even punishing outsiders and those who threaten the status of cherished worldviews. This perspective is especially consistent with the notion of conservatism as motivated social cognition; terror management theory holds that social intolerance is the consequence of worldview-enhancing cognitions motivated by the need to buffer anxiety-inducing thoughts.

The theory was founded by Sheldon Solomon, Jeff Greenberg, and Tom Pyszczynski, who point to considerable empirical support for their claims.

Empirical support for TMT has been obtained in over 200 experiments by researchers in 13 countries, primarily by demonstrating that reminders of death (mortality salience) in the form of open-ended questions, death-anxiety questionnaires, pictures of gory accidents, interviews in front of funeral parlors, and subliminal exposure to the words “death” or “dead,” instigate cultural worldview defense. For example, after mortality salience, people: 1) have more favorable evaluations of people with similar religious and political beliefs and more unfavorable evaluations of those who differ on these dimensions; 2) are more punitive toward moral transgressors and more benevolent to heroic individuals; 3) are more physically aggressive toward others with dissimilar political orientations; and 4) strive more vigorously to meet cultural standards of value. In addition, research has shown that mortality salience does not influence conscious affect or physiological arousal, and its effects are greatest following a delay, when death thought is highly accessible but outside of focal attention. Recent work has demonstrated that it is the potential for anxiety signaled by heightened death thought accessibility, which motivates worldview defense and self-esteem bolstering, which in turn reduces death thought accessibility to baseline levels. . .

President Bush’s popularity soared after the massive mortality salience induction produced by the attacks of 9/11; since then, Bush has emphasized the greatness of America and his commitment to triumphing over evil. . . Do reminders of mortality increase the appeal of such a leader? Studies published in the September 2004 issue of Personality and Social Psychology Bulletin suggest that they do. In Study 1, a mortality salience induction dramatically increased support for President Bush and his policies in Iraq. In Study 2, subliminal reminders of 9/11 or the World Trade Center increased the accessibility of implicit thoughts of death; for Americans then, even non-conscious intimations of the events of 9/11 arouse concerns about mortality. Accordingly, in Study 3 participants were asked to think about death, the events of 9/11, or a benign control topic; both mortality and 9/11 salience produced substantial increases in support for President Bush among liberal as well as conservative participants. Finally, in Study 4, whereas participants rated John Kerry more favorably than George Bush after thinking about being in intense pain, after a reminder of death, evaluations of Bush increased and Kerry decreased, such that Bush was more favorably evaluated than Kerry.

And the mechanism seems to be a fairly straightforward in-group/out-group dynamic. I recalled these studies, which I had come across not long before, while I was in Houston, because the fear and stigmatization of refugees from New Orleans was not simply palpable but open. More importantly, it was on the local news all the time.

I stopped watching local news a long time ago, mostly because of headlines like “Potholes: what you don’t know might kill you.” (That’s not a made-up headline.) The alleged crimes associated with refugees from New Orleans were particular instances of the sensationalist fear-mongering that is a staple of local news. If there’s anything to this research, then Fox News and Sinclair may be products rather than sources of the changes in political preferences, and the sources themselves may be seemingly innocuous media.

Critical Digressions: Beyond Winter in Karachi (or the Argumentative Pakistani)

Ladies and gentlemen, boys and girls,

Recently in town for the British tour, cricket reporter Andrew Miller observes:

“One of the first things you notice about Karachi, so long as you’re not being hot-boxed in a rickshaw as the morning traffic crawls to a halt, is the improbable clarity of the air. Despite being home to 14 million intensely active inhabitants, there’s none of the oppressive smog that lingers over Lahore like a caggy blanket. As the sea breezes work their magic and dissipate the city’s exhaust fumes, it’s possible too to see through some of the thick layers of misconception that abound about the place.”

During winter in Karachi, the sunlight is soft and milky during the day, and after dusk the air becomes cool and carries the scent of firewood and the sea. The billboards down Shahrah-e-Faisal flash and buzz, and the wedding halls in and around the wide boulevards of Nazimabad and Hyderi are lit like carnivals. Winter is wedding season, Jinnah’s birthday, Christmas time, the new year. In Saddar, St. Patrick’s Cathedral, garlanded by Christmas lights, glows something like a medieval structure in downtown Prague. At Clifton beach, floodlights animate the swelling gray sea and the silhouettes of families skipping on the silt. On New Year’s, tens of thousands of Karachiites flood the beachfront on the backs of motorbikes, chanting, waving flags, celebrating themselves and the city.

This year, however, a shadow has fallen over the city’s winter pageantry. A few months ago the earth opened up in the north and swallowed up mountains, roads, schools, villages, people. The earthquake in Kashmir is a catastrophe of epic proportion: a hundred thousand dead, three million displaced. The numbers bewilder; and grow: the severe northern winter is now claiming thousands every day. (We urge you to contribute generously, immediately.)

Although far away, the tremors of the earthquake have reached Karachi: not only does the city host a large Kashmiri population, but it also possesses the requisite infrastructure to provide relief. From the city’s efficient political machinery – notably the MQM and the Jamaat – to its remarkable civic organizations – the Edhi Welfare Trust and the Citizens Foundation – all have been involved in the relief and present reconstruction effort. Moreover, students from high schools and colleges, some of whom have never left Karachi, are volunteering in far-flung Muzaffarabad and Balakot.

Mahnaz Isphahani, a fellow at the Council on Foreign Relations, notes that “an August 2000 study by the Agha Khan Development Network rated Pakistan as one of the most charitable countries,” a remarkable statistic for a developing country. “It shows. The private sector, non governmental organizations, political parties and thousands of volunteers led the relief efforts. The earthquake has driven a unique mobilization of Pakistan’s civil society.” Rugged individualists, Pakistanis don’t come together often. But when they do, they seem to move mountains.

Pakistan routinely makes headlines for being a “frontline state” in the “War on Terror,” a function of its geography, which is defined by a collection of tribal fiefdoms in the north, Muslim fundamentalist Iran to the west, and, until recently, Hindu fundamentalist India to the east. Consequently, other developments escape the discourse. Save a piece or two, there has been no coverage of the remarkable summoning of national resources toward relief and reconstruction. Indeed, the earthquake has often been essentialized in the mainstream media as a matter of five or six Indian army helicopters that were not accepted by the government. As usual, many dimensions have escaped scrutiny.

Writing in the Wall Street Journal, however, political commentator Husain Haqqani picks up on an interesting development:

“So much for the popularly peddled view that anti-Americanism in the Muslim world is so pervasive and deep-rooted it might take generations to alter. A new poll from Pakistan, a critical front-line in the war on terror, paints a very different picture – by revealing a sea-change in public opinion in recent months…Pakistanis now hold a more favorable opinion of the U.S. than at any time since 9/11…The direct cause for this dramatic shift in Muslim opinion is clear: American humanitarian assistance for Pakistani victims of the Oct. 8 earthquake that killed 87,000. The U.S. pledged $510 million for earthquake relief in Pakistan and American soldiers are playing a prominent role in rescuing victims from remote mountainous villages.”

There are other seismic developments in the north that have escaped attention. A few months ago, the Hasbah bill made headlines everywhere. The procrustean legislation, passed by the provincial assembly in Peshawar (the capital of NWFP, the province bordering Afghanistan), would have created a moral police. In December, however, the Supreme Court definitively struck it down. This, of course, did not make the news anywhere. (We urge you to do a Google search on the issue.) Furthermore, the sponsors of the bill, the MMA, the Islamist alliance that won some seats after the second Afghan campaign, have been unable to pursue it since. Sirajul Haq, a senior provincial minister in the MMA, claims in the Herald that “some people in Islamabad are allergic to the word Islam.” Indeed, Musharraf’s administration came down like a ton of bricks on the mullahs.

Of course, Haq’s Islam is not Pakistani Islam; the personality of Pakistani Islam is inherently accommodative. Whenever we return home, for instance, we visit the shrines of Sufi saints in and around Karachi, from Mayvah Shah, deep in a necropolis featuring a Jewish cemetery, to perhaps one of the most intriguing tombs in all of South Asia, Udero Lal (or Duryah Shah). Here, people congregate on Thursday nights, singing, dancing, smoking chars. The weekly festival features fortune-tellers, wrestling competitions, food stalls. Near the beach, at Abdullah Shah Ghazi, we pay homage to the saint who is said to have saved Karachi from the sea. In the limestone cave behind the tomb there is talk of other miracles. This time around we visited a couple of Hindu temples, including Shree Ratneswar Mahdevi, a five-minute saunter from Abdullah Shah. There, we were taken to a limestone cave underneath and briefed about miracles. No doubt, the two caves are connected. Whether you talk about the Barelvi Punjabi countryside or the Maulai Sindhi interior, the wedding rituals or notions of hygiene, the infrastructure of Pakistan’s Islam rests on a syncretistic heritage. And Islam got more civilized the further it moved from Arabs and Arabia (and is arguably most accommodative in the Far East).

Another exciting development in the north is the emergence of Sajid and Zeeshan, a solidly middle-class, Peshawar-based electronic pop duo. Their thoughtful but finger-snapping, hip-shaking singles “King of Self” and “Freestyle Dive” (which we urge you to download from their website) have topped the charts, and if marketed successfully, their upcoming album could take dancefloors worldwide by storm. The latter’s animated video – featuring a bank robber who suffers a pang of conscience – was nominated for best video at the Indus Music Awards (held, by the way, with great pomp on the lawns of the Karachi Parsi Institute on December 24th). Sajid and Zeeshan are the face of contemporary Pakistan, resolute members of the Media Generation, heterodox rockers who, unlike the mullahs, don’t worry about what Pakistan should be, about silly notions of authenticity, but are confident that what they do is, by definition, Pakistani.

Indeed, the Media Generation is redefining notions of self and sovereignty in a way that no prior generation has, a phenomenon we have covered in this column before. This winter, the Media Generation is responsible for the superb Kara Film Festival and for bringing Bryan Adams to Karachi later this month for a benefit concert (where some twenty thousand are expected to attend). It was also responsible for local channels broadcasting Christmas programming on Christmas Eve: not only was there a Christmas address by the Prime Minister to a large gathering on PTV, but also Christmas carols on TV1, and on GEO (the most watched entertainment channel), a prime-time serial featuring a Pakistani Christian family, a first. This is the stuff that changes sensibilities. For instance, at Nasra School, one of the largest private urban school networks for lower-middle and middle class students, the topic for the annual middle school debates this year (reminiscent of the program “50 Minutes” on GEO) was, “Should Pakistan develop relations with Israel?”

The sports channels are commemorating another event: Pakistan walloping England in cricket. For years, Pakistan had been in the wilderness, a team with few stars, little direction, no guts. Things began turning in 2005 with Pakistan’s resounding defeat of India in India. England, on the other hand, was coming off an historic victory over Australia, the best team in the world, and was favored to win. But as with India, the Pakistan boys made chapli kebab out of them. Under the squinty, watchful gaze of Inzi, “The Big Easy,” their towering Punjabi captain who has finally come into his own, Danesh Kaneria, the Hindu leg spinner, took wicket after wicket after wicket as Shahid Afridi, the blue-eyed Pathan from Karachi, transformed from a streaky opener into the most explosive batsman in cricket today, swatting pitches as if he were playing street cricket with a tape ball. Cricket reporter Kamran Abbasi avers, “There is a Pakistani way in cricket, abundant talent abundantly flawed, that leaves you holding your breath in anticipation of the next act and staring in disbelief if it comes off.” We see things somewhat differently. The cricket team can be thought of as a proxy for the nation: rugged individualists with varied styles and backgrounds who, against all odds, somehow come together at critical junctures to come out on top.

Other Critical Digressions:

Dispatch from Karachi

The Media Generation and Nazia Hassan

The Naipaulian Imperative and the Phenomenon of the Post-National

Literary Pugilists and Underground Men

Gangbanging and Notions of the Self

Dispatch from Cambridge (or Notes on Deconstructing Chicken)

Poetry of the Real: Six Feet Over

Australian poet and author Peter Nicholson writes 3QD’s Poetry and Culture column (see other columns here). There is an introduction to his work at peternicholson.com.au and at the NLA.

You quite often see poetry used in contemporary culture in interesting and creative ways. For example, just last week I heard some Emily Dickinson used in the Kneehigh Theatre’s production of Tristan & Yseult at the Sydney Festival, and I’m pretty sure I heard a line from Larkin in Jerry Springer: The Opera, which was shown here recently. What is a lot less common, I think, is to find what one might call the poetic, or the poetry of the real, in actual television shows. Some would say that all television is poetic in the sense that it heightens the aesthetic experience in ways that can intensify viewing and listening pleasure. The mordant satire of The Simpsons or the social comedy of Frasier or Seinfeld is television of a very sophisticated and pleasurable kind. However, programs that get through to the wound of living, and are equal to the realities of how we know our lives to be, are much rarer.

Six Feet Under is surely one of the most effective television series from this viewpoint. Here is the poetry of the real, written and produced to entertain, but getting very close to the place where poetry lives. True, the poetry is often near to incomprehension, pain and disillusionment. There are no one-liners to relieve the cataclysms in the heart. From the very opening shots, with their intimations of Poe’s ‘Quoth the raven, ‘Nevermore’’ (or is it Macbeth?), and the necessary death we witness at the commencement of each episode, we are put in the way of the strangeness and unpredictability of death, and therefore of the strangeness and unpredictability of our lives.

It is a very important creative achievement to bring about this kind of intensification for five seasons of programming. Here, I feel, there are characters who could walk out of the screen and I would recognise them as of my own kind and time. They speak truly, and their silences are true too. And I like the way the characters change our perceptions of them, how our reactions to them vary. Brenda Chenowith, for example, can seem liberated and perceptive, and then self-obsessed and narcissistic—not quite the same thing. Claire’s adolescent wilfulness can be irritating until we remember we were just like that not so very long ago. But Claire is also strong enough to resist the commodifications that want to turn her generation into mindless consumers. Just as David Fisher’s forbearance is of the kind we know is needed to get through, so too you sometimes want to shout at the screen for him to tell Keith where to get off. The southern Californian skies may not be our milieu, and we may not take as many hard substances as the Fisher family and their friends do, but all the same, these people are like us with their desire and hope, their frailty and strength. There are no saints here, just people getting through the rhythm of their days, aiming for good, and often falling short. Alan Ball and his production team have done the most effective job in conveying the wing of their joys and discontents. The program has to be packaged as entertainment, but it is entertainment of the first order where there has been no compromise to get ratings points.

There is one aspect of this program that wouldn’t necessarily strike the average viewer the same way it would an independent writer like myself: the fact that Fisher & Sons is an independent business, trying to survive against the onslaught of Kroehner Service International and the shark-toothed Mitzi, who is always waiting to devour the Fisher family in her maw. I have often felt every sympathy for Ruth Fisher and for the ghost of Nathaniel Fisher, who makes unpredictable sorties into the psyche, especially of elder son Nate. They held onto an independent family business for all their working lives, and it has often looked like the business was going to be swallowed up in some capitalist conglomeration. The poet Joseph Brodsky once wrote that some poets store up malice as a kind of life insurance. They have little to give except their bile, and rampant verse offerings. The equivalent of Kroehner Service International is well and truly alive in the arts. David and Nate have to fight off attempts to take over their business at every turn, and so far they have succeeded. Federico, their brilliant mortician who gets people looking as good in death as they ever did in life, also wants in on the action, and he is properly put out when, earlier in the series, the Fishers don’t seem to have any time left over for his ambitions. I think this solidarity against the predatory and levelling tendencies of capitalism in its late phase partly explains the appeal of Six Feet Under to so many different kinds of viewers, just as each of the characters summons up some aspects of our own lives, which can be as inspiring as Ruth’s willingness to shed her old skins or as strange as Brenda’s psychiatrist parents’ shenanigans.

To me, this series attains what is very close to a Wagnerian Gesamtkunstwerk, a total work of art, where all the contributory factors—the writing, the music, the acting, the sets, the editing—go to make the cathartic whole. SFU is one of those productions against which subsequent series that aim at dramatic credibility will be judged. One can’t, and shouldn’t, go around all the time with one’s head in aesthetic clouds. A series like Six Feet Under brings you down to earth with a thump, but the kind of thump where poetry can be real for you, and the words spoken and the feelings experienced transcend the passing moment, and you, as viewer, go through to another level of acceptance or recognition; you are somewhere beyond the crassness that some contemporary culture insists is your due. And you are over, at least six feet over, all that detritus and failure.

                                                                      *

        Poem Of The Real

Child soldiers of sex slaves
Cut off lips with razors.
Time for some aesthetics.

Ocean rears from bad dreaming,
Swallowing your family whole.
What about hermeneutics?

Poem of the real is this world
Spun in violent fractals.
Insoluble acrostic.

Written 2005

Monday, January 9, 2006

Monday Musing: Being Polish

I’ve decided to become Polish. This will be slightly easier for me than for some because I happen to be almost completely Polish on my mother’s side. But only slightly easier. The Polishness of my Polishness never got going. The things that happen to national identities in the American experience happened to my Poles. The Polishness got filtered out over the years, a couple of generations. It is only a name now, a word that points to origins that stopped explaining things. Calling myself Polish explains almost nothing about me.

But I’ve decided to make it explain something. There are some names associated with this decision. One of them is Czeslaw Milosz, another is Adam Zagajewski. And what about Gombrowicz, and more recently Adam Michnik? There is also Ryszard Kapuscinski. There are others; names I’m still discovering and exhuming from the 20th century. In a way, the 20th century is a Polish century. That is, if history should sometimes be written by the losers. And probably it sometimes should. Not that Polish hands aren’t stained with the blood of others and stigmatized by the same horrors that marked so many during that terrible century just passed. But Polish Letters, the Polish essay, is profoundly marked by the tragic sense of history that defines the Central and Eastern European mindset that watched, mostly helplessly, as Nazism handed them off to the Soviets.

The Polish essay is about individual acts of resistance against the eradication of the mind. Sometimes these essays are conservative, sometimes they are grasping for something new. Sometimes they feel profoundly European, like faded scraps of parchment, testaments to a world that was destroyed by the very hands that had built it. Milosz feels that way most of the time, like a character from one of Sebald’s novels, like a memory waiting to disappear. Milosz is a million miles away, talking about his Polishness in ways that don’t even completely make sense. And he is so good that he doesn’t have to care. He writes:

My work for foreigners has been of a practical, even pedagogic nature–I do not believe in the possibility of communing outside a shared language, a shared history–while my work in Polish has been addressed to readers transcending a specific time and place, otherwise known as ‘writing for the Muses’.

But Milosz too was an exile and he had to take his Polish with him. Polish essay writing always has some aspect of exile mentality. The Polish 20th century is about the tenuousness and transmutability of physical space. And it is about the power of mental space in the face of that fragility. Zagajewski writes about Gombrowicz:

And yet, despite all his theories, polemics, and quasi-philosophical and anthropological lectures, it is not in the sphere of ideas that we should seek his greatness, but deeper, in a more elementary realm. Through all of his disputes and debates, Gombrowicz, a restless spirit provoked by time, by modernism and recent history, expresses himself, and speaks—not straightforwardly, which is precisely what is so engaging—about himself, his adventures, his sufferings; about pain and about joy. He is like an Everyman for our time; he is our fellow, tormented not only by sickness, emigration, poverty, and loneliness, but also by ideas.

That is exile writing too. It’s tormented but it has found some strength in that condition. The exile in the Polish essay isn’t a victim. The Polish essay bitches and moans but then laughs about it. The Polish essay can always draw on totalitarian humor, the blackest and often most painfully humorous of humors.

I think that the exiled fragments of experience that have come down to us from the 20th century in the Polish essay are something to identify with as ruins. In these ruins are the best, if broken, parts of the human mental landscape. That is the kind of Polish I’ve decided to try and be.

Poison In The Ink: Darwinian Grandparenting

Most grandparents would never admit it, but studies consistently reveal that they treat some grandchildren better than others.

When surveyed, adults said they felt closest to their maternal grandmothers, followed by their maternal grandfathers, then paternal grandmothers, and finally paternal grandfathers.

The pattern was the same whether the researchers tested for emotional closeness, the amount of time spent per week with a grandchild or the money spent on them each month.

It was also the same whether the adults surveyed were from America, Germany, Greece or Australia, and even when such things as the grandparents’ age, the distance they lived from the grandchild, and the number of living grandparents were controlled for.

One of the most intriguing explanations for this trend comes from evolutionary biology. The idea is that the investment a grandparent makes in a grandchild reflects how certain they are that they are actually related to them.

Biologists refer to an organism’s ability to survive and produce offspring as “fitness.” From a Darwinian point of view, the goal of grandparents is to help their children have as many children of their own as possible. By doing so, the grandparents not only increase their children’s fitness, but their own as well.

Evolutionary theory therefore predicts that a maternal grandmother will be most likely to invest in her grandchild because in nearly all cases, she can be 100 percent sure that the grandchild born of her daughter is really related to her.

It also predicts that a paternal grandfather will have the least incentive to invest in his grandchild: not only is he unsure whether his son is really his grandchild’s father (the daughter-in-law may have cheated on her husband), he also can’t be sure whether his son is really his son (his wife may have cheated on him).

But while evolutionary theory does a good job of explaining why maternal grandmothers invest the most in their grandchildren and paternal grandfathers the least, it doesn’t explain why adults consistently said they felt closer to their maternal grandfathers than to their paternal grandmothers.

If all that matters is relatedness, both these grandparents should show similar levels of investment since both have an uncertain genetic link to their grandchildren: the paternal grandmother can’t be completely sure that her son was really the father of her grandchild and the maternal grandfather can’t be completely sure that the mother of his grandchild is really his daughter.
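The logic can be put as a toy calculation (my own sketch, not from the research; the 0.9 paternity-certainty figure is an arbitrary assumption, not a measured rate). Score each grandparent’s confidence of relatedness as the product of the two parent-child links connecting them to the grandchild, where maternal links are certain and paternal links are not:

    # Toy model of grandparental certainty of relatedness.
    P_MATERNITY = 1.0   # a mother is certain the child is hers
    P_PATERNITY = 0.9   # assumed chance that a presumed father is the real one

    grandparents = {
        # (grandparent -> parent link, parent -> grandchild link)
        "maternal grandmother": (P_MATERNITY, P_MATERNITY),
        "maternal grandfather": (P_PATERNITY, P_MATERNITY),
        "paternal grandmother": (P_MATERNITY, P_PATERNITY),
        "paternal grandfather": (P_PATERNITY, P_PATERNITY),
    }

    for name, (link1, link2) in grandparents.items():
        print(f"{name}: {link1 * link2:.2f}")
    # maternal grandmother: 1.00
    # maternal grandfather: 0.90  <- tied with the paternal grandmother,
    # paternal grandmother: 0.90     which is exactly the anomaly in question
    # paternal grandfather: 0.81

Relatedness alone predicts a tie in the middle; the observed ranking breaks it.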

A possible explanation for this anomaly was suggested by William von Hippel, a psychologist at the University of New South Wales in Sydney, Australia, and colleagues, in a paper published last year in the journal of the Society for Personality and Social Psychology.

According to von Hippel, paternal grandmothers are more distant than maternal grandfathers because they have another, safer bet when it comes to investing their time and resources: your cousins.

The reasoning behind this idea is simple: while your paternal grandmother may be uncertain about the genetic link between her son (your father) and you, she can be 100 percent sure of her relatedness to her daughter’s (your aunt) child (your cousin).

This hypothesis therefore predicts that your paternal grandmother will invest more time in your cousins, if they are the children of her daughter, than in you. Your maternal grandfather, on the other hand, is as uncertain about his relation to you as to your cousins, and therefore has no incentive to prefer one over another. The researchers also predicted that in cases where the paternal grandmother had no grandchildren through daughters, this effect would disappear.

To test their hypothesis, the researchers surveyed 787 students from the University of New South Wales. They asked the students to rate their emotional closeness to their grandparents and also to indicate whether they had cousins, and if so, whether they were from paternal or maternal aunts and uncles. The results followed the exact pattern the researchers predicted; however, the effect was only marginally significant.

The researchers were unfazed though. “Rather than being unimpressed by the small size of these effects, one might instead be impressed that such an effect emerges at all,” they write.

“Of all the reasons to feel close or distant to a grandparent, the fact that genetic uncertainty and preferred investment outlets led to predicted differences in closeness testifies to the potency of evolutionary principles.”

The researchers hope to replicate the experiment in non-Western cultures and to use more direct measures of grandparental investment, such as gifts given.

Rx: Reductionist vs. Pluralist Views of Cancer

Cancer, the malignant evil that corrodes fatally, is supposed to start in one cell. In appearance and behavior, this cell and its daughters are so different from their “normal” predecessors and counterparts that they appear to represent a new species. In this essay, I suggest that the transformation of a non-malignant cell into a frankly malignant state, accompanied by all the biologic changes that define cancer as a disease (expansion, angiogenesis, metastasis), may follow the rules of evolutionary biology during speciation. In the strictest sense, speciation refers to reproductive isolation, which is obviously not the case here; I will therefore speak of “clones” of cells in lieu of species. How this clone develops a growth advantage over its surrounding neighbors and, at the same time, manages to suppress the growth of its normal counterparts is not well understood. The conventional approach of most scientists to such a problem is that of reductionism, where an attempt is made to break the cell down into its individual components and concentrate on identifying abnormalities that could explain the malignant characteristics. Reductionists would view the initiation and subsequent expansion of a cancer cell into an overwhelming clone as being driven by events related predominantly to the cell itself, for example the dysregulation of genes by mutations or deletions. Although the reductionist method constitutes the backbone of solid science, the transformation of a normal cell into a frankly malignant one is a gradual process involving multiple steps, making it difficult to apply the reductionist approach to the problem. These steps are not confined to the cell alone, but also involve a dynamic microenvironment which affects, and is in turn affected by, the expanding population of abnormal cells. Thus the cell and its microenvironment, or the seed and the soil, constitute a complex system, and pluralists would argue that complex systems cannot be reduced to simple properties of their individual components. Or, to paraphrase Einstein, one can make the problem as simple as possible, but no simpler.

Thousands of putative cancer cells are produced in the body each day, but die without further expansion because they are not well equipped to survive in an environment optimized for the support of normal cells. An ongoing interaction between a potential cancer cell and its microenvironment is therefore a necessary requirement for their co-evolution towards a malignant disease state. In other words, even as thousands of cancer cells are produced in the body on a daily basis, the clinical disease with all its malignant manifestations does not appear unless the cancer cell has had a chance to “evolve”. In fact, the situation has many parallels with the ongoing lively debate between two groups of evolutionary biologists regarding speciation. The orthodox Neo-Darwinians (John Maynard Smith, Richard Dawkins, Daniel Dennett) are reductionists who believe that natural selection is the sole engine driving evolution. The proponents of the punctuated equilibrium hypothesis (Niles Eldredge, the late Stephen Jay Gould, and Richard Lewontin) see evolution as being more complex, so that natural selection may be the primary but not the exclusive source of modification. They are the pluralists. Application of the broad principles of evolutionary biology to carcinogenesis may define the sequence of events involved in the development of a malignancy, thereby locating therapeutic targets where intervention is likely to lead to an arrest, if not a reversal, of the process.

Let us take the example of the human bone marrow, an exceedingly dynamic compartment in which billions of cells of many different varieties are produced, as well as programmed to die, on an hourly basis. Deviancy is not well tolerated in this high-throughput factory. Darwinian tenets hold that natural selection acts to maintain stasis in a population by jettisoning the anomalous. Survival of a potential cancer cell is clearly incongruous in this setting, since it should have been weeded out long before its daughters were able to overwhelm the marrow – but not if the initiation of cancer is a serendipitous phenomenon. Within every population, there are cells with minor variations; some cells are more “fit” to survive than others. Cellular proliferation in the bone marrow occurs in “niches” where the balance between negative and positive growth signals is tilted towards the latter. Imagine that a population of cells happened to become isolated in a microenvironmental niche that provided less than ideal support (for example, a slightly hypoxic environment) for the growth of normal cells. Some of the trapped cells may have been better able to survive in this abnormal environment than the normal cells that died, perhaps because they were smaller in size, or divided faster, or could withstand hypoxia better, or lacked a surface protein necessary for recognition by a death effector. In short, cancer cells may be able to survive and outnumber normal cells in certain “abnormal” microenvironments precisely because of their inability to compete with normal cells in the “normal” microenvironment. The abnormality is best manifested as a growth advantage. If a cancer cell enjoys even a slight growth advantage, it will outnumber its normal counterparts within a surprisingly small number of generations, something that for rapidly dividing cells can happen in a matter of weeks or months as far as the human body is concerned.
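To see how quickly a small edge compounds, here is a minimal sketch with toy numbers (the 10% advantage and the starting population are illustrative assumptions, not measured values):

    # One variant cell among a million normal cells; the variant clone
    # grows 10% faster per generation than the normal population.
    variant, normal = 1.0, 1_000_000.0
    ADVANTAGE = 1.10  # assumed per-generation edge; any figure > 1 will do

    generations = 0
    while variant < normal:
        variant *= 2.0 * ADVANTAGE  # the variant more than doubles each cycle
        normal *= 2.0               # normal cells merely double
        generations += 1

    print(generations)
    # ~145 generations: for marrow cells cycling on the order of a day,
    # a 10% edge turns one cell in a million into the majority in months.

The point is only that the ratio of variant to normal cells grows geometrically, so even a modest advantage, sustained, is decisive.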

I would like to posit that, at least in some instances, the initiation of cancer involves the isolation and entrapment of variant cells in a microenvironmental milieu that is not conducive to the proliferation of normal cells. Any variation that enhances the likelihood of survival and reproduction will then be passed from one generation to the next simply as a result of natural selection. The accumulation of even subtle genetic changes over many generations could eventually have a dramatic effect.

An example is that of fatty foods causing gastrointestinal cancer. In rather simplistic terms, there is a burst of secretion of bile acids in the gut following the ingestion of a fatty meal. These bile acids perform their metabolic function efficiently but, as a side effect, also induce programmed cell death in the surrounding mucosal cells. With frequent fatty meals and repetition of this cycle, the stressed cells facing the bile acid assaults fight back by developing survival strategies in this noxious environment. Eventually, one cell will be selected for survival, either because of its “differential fitness” or because it has silenced the genes that mediate programmed cell death. An epigenetic mechanism that cancer cells have been widely shown to employ for silencing genes for death and differentiation is hyper-methylation. Simply by adding methyl groups to the cytosines (CpG islands) in the promoter sites of critical genes, the cell can block transcription of those genes. Such a cell develops the ability to thrive in a microenvironment which is killing its normal counterparts. A survival phenotype is a cancer phenotype.

Chance factors could operate to facilitate the survival of a variant clone of cells, slightly different from the normal cells, but it is still natural selection that does the rest of the work. The role of natural selection is to improve the “fit” between an organism and its environment. Expansion of the clone of cells must be accompanied by co-evolution of the seed (the cells) and the soil (the microenvironment). Cancer cells may proliferate continuously either because the soil is providing growth factors, or because the cell is constitutively turned “on” by a genetic mutation. The cancer cell must not only divide and expand its own population continuously, it must also shut off the proliferation of normal cells. One way this may be accomplished is by developing the ability to proliferate in response to signals that are inhibitory to normal cells, as illustrated in the following example.

Cells communicate and transmit signals through proteins called cytokines. Tumor necrosis factor, or TNF, is a cytokine that induces normal cells to undergo programmed cell death. Some leukemia cells, on the other hand, are stimulated to proliferate by TNF. Let us go back to our statement that within every population there are cells with minor variations; some cells are more “fit” to survive than others. Now imagine what happens when a number of stem cells with varying “fitness” are trapped in a microenvironmental niche with a higher than normal level of TNF. The “normal” cells will be inhibited from proliferating while the slightly “abnormal” one will begin to proliferate. With time, the more TNF is produced, the better the abnormal cell fits the environment and expands its population at the expense of its normal counterpart. In fact, the abnormal cell itself may start producing TNF to enhance its own growth while at the same time suppressing that of the normal cells.
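This feedback loop can be caricatured in a few lines of Python (the coefficients are invented for illustration, not measured biology):

    # TNF suppresses normal-cell growth, stimulates the variant, and the
    # variant secretes more TNF, closing the loop described above.
    normal, variant, tnf = 1000.0, 10.0, 1.0

    step = 0
    while variant < normal:
        normal *= max(0.0, 1.2 - 0.1 * tnf)  # rising TNF stalls normal growth
        variant *= 1.0 + 0.1 * tnf           # rising TNF speeds the variant
        tnf = 1.0 + 0.01 * variant           # the variant raises TNF itself
        step += 1

    print(step, normal, variant)  # the variant clone eventually dominates

However crude, the sketch captures the essential point: the seed and the soil become abnormal together, each change reinforcing the other.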

The microenvironment of cancer cells in the body not only consists of stromal cells capable of producing cytokines such as TNF, but in addition harbors components of the immune system as well as newly formed blood vessels, which directly affect the growth and perpetuation of the abnormal clone of cells. An important implication of these biologic insights is that the “cause” of cancer as a disease entity is at least in part related to the changed microenvironment, and not something restricted to the intrinsic properties of the cancer cell. Consequently, strategies directed at eliminating the malignant cell alone, no matter how efficient, will at best solve only part of the problem, and only temporarily. Even if 99% of the abnormal cancer cells are destroyed, if the microenvironment is left intact with all its abnormal features, normal cells will not be able to survive for long in that setting, and the growth advantage will be redistributed back to a “more fit”, abnormal cell, causing relapse. This scenario is unfortunately all too familiar in the treatment of most cancers. Chemotherapy can produce striking complete remissions, but the cancers eventually relapse, and the second time around they are more resistant to therapy, since the cells causing the relapse have already survived a Darwinian selection process in the presence of the noxious drug. In order to obtain complete and durable responses, both the seed and the soil would need to be targeted.

Developing models like this is not just of theoretical interest; there are immediate and practical applications to the human condition. The conclusion is that it should not be a case of “either/or” between the reductionist and pluralistic views of cancer, but a combination of the two as far as planning effective treatment is concerned. In order to kill the seed, or the cancer cell, a reductionist approach must be used to identify the key steps involved in the perpetuation of the clone. Targeted therapies should be developed to interfere with specific intracellular steps, for example an abnormal protein produced by a mutated gene. In addition, with the pluralistic view of cancer in mind, the extracellular components should be targeted simultaneously, for example blood vessels or cytokines such as TNF. The future success of cancer treatment will depend on how rapidly and how effectively we learn to combine therapies which simultaneously attack several targets in the cell as well as in the microenvironment. Studying cancer cells in isolation, without their natural in vivo microenvironment, or through artificial mouse models, will yield only limited information.

In summary, then, cancer initiation could be the result of the serendipitous presence of an abnormal cell in an abnormal microenvironmental niche. Natural selection then works to improve the fitness between the seed and the soil, making both increasingly abnormal. The rate at which this occurs depends at least in part on the body’s ability to mobilize the immune system to mount a counterattack, and on the ability of the cells to expand their clone, for example through the formation of new blood vessels. Thus, the time from initiation to actual disease manifestation could vary considerably, depending on the forces driving the fitness landscape. The famous quip by a Neo-Darwinist (the camp that believes evolution is a gradual process) criticizing the punctuated equilibrium theory, that he “did not believe in evolution by jerks”, was answered by the Gould group (who suggest that periods of stasis are punctuated by sudden proliferations of species) with the retort that they “did not believe in evolution by creeps”. The evolution of cancer is probably best described by both jerks and creeps.

Previous Rx Columns:
Spicing Cancer Treatment
The War on Cancer

Monday, January 2, 2006

Monday Musing: In the Peace Corps’ Shadow

A couple of weeks ago the travel writer and memoirist Paul Theroux published an opinion piece entitled “The Rock Star’s Burden” in the New York Times. It is an article full of bitterness and bile in which, in a display of almost unbelievable hubris, Theroux expresses a thinly disguised disappointment that the country of Malawi, where he worked as part of the Peace Corps 40 years ago, has not been able to convert his (and others’) generous donation of time and energy into becoming more like a grateful version of Switzerland:

Those of us who committed ourselves to being Peace Corps teachers in rural Malawi more than 40 years ago are dismayed by what we see on our return visits and by all the news that has been reported recently from that unlucky, drought-stricken country. But we are more appalled by most of the proposed solutions.

I am not speaking of humanitarian aid, disaster relief, AIDS education or affordable drugs. Nor am I speaking of small-scale, closely watched efforts like the Malawi Children’s Village. I am speaking of the “more money” platform: the notion that what Africa needs is more prestige projects, volunteer labor and debt relief. We should know better by now. I would not send private money to a charity, or foreign aid to a government, unless every dollar was accounted for — and this never happens.

He then takes his misguided judgment of the causes of problems in Malawi and, predictably enough, generalizes it to all of Africa:

Teaching in Africa was one of the best things I ever did. But our example seems to have counted for very little. My Malawian friend’s children are of course working in the United States and Britain. It does not occur to anyone to encourage Africans themselves to volunteer in the same way that foreigners have done for decades. There are plenty of educated and capable young adults in Africa who would make a much greater difference than Peace Corps workers.

The emigration of Africans to the preposterously prosperous countries of the West particularly galls Theroux; after all, didn’t he go there to try and help them? Why can’t they stay and help themselves? Is he seriously suggesting that if Malawians, with an average income of around 50 cents per day, 900,000 of whom are infected with AIDS, and who have a basic literacy rate of barely 50 percent, were to just stay home and “volunteer in the same way that foreigners have done for decades,” Malawi’s problems would go away? It does not seem to have occurred to Theroux that while he had the education and the luxury of taking a couple of years off in his youth to indulge his idealistic fantasies (and turn the experience into a lucrative career writing about it; it takes the average Malawian a year to earn what Theroux probably makes in a day) through a program, the Peace Corps, explicitly designed as a propaganda tool for the American government in the Cold War years, most Malawians cannot take a few years off to “volunteer” for the betterment of their country. Of course, those (and there are really very few) who are able to get to the West to make a better life for themselves will do so. And why shouldn’t they? (Mr. Theroux seems to have no idea how difficult it is for anyone in the third world to obtain a visa to the West.)

Bono, through his high-profile campaigns for African debt relief, serves as the main lightning rod for Theroux’s odious and acidic attacks:

There are probably more annoying things than being hectored about African development by a wealthy Irish rock star in a cowboy hat, but I can’t think of one at the moment. If Christmas, season of sob stories, has turned me into Scrooge, I recognize the Dickensian counterpart of Paul Hewson — who calls himself “Bono” — as Mrs. Jellyby in “Bleak House.” Harping incessantly on her adopted village of Borrioboola-Gha “on the left bank of the River Niger,” Mrs. Jellyby tries to save the Africans by financing them in coffee growing and encouraging schemes “to turn pianoforte legs and establish an export trade,” all the while badgering people for money.

And also:

Bono, in his role as Mrs. Jellyby in a 10-gallon hat, not only believes that he has the solution to Africa’s ills, he is also shouting so loud that other people seem to trust his answers. He traveled in 2002 to Africa with former Treasury Secretary Paul O’Neill, urging debt forgiveness. He recently had lunch at the White House, where he expounded upon the “more money” platform…

By coincidence, at the time that I read Theroux’s hysterical screed against any money for Africa (keep in mind his saying, “I would not send private money to a charity, or foreign aid to a government, unless every dollar was accounted for — and this never happens”), I had just finished reading The End of Poverty by Jeffrey Sachs, with a foreword by the much-maligned Bono. Sachs is an extremely well-respected economist, and was named one of Time Magazine’s 100 Most Influential People. He is also the director of the Earth Institute at Columbia University. I recommended Sachs’s book in 3QD’s year-end round-up of the best books of 2005, and he does such a good job not only of explaining the “poverty trap” that some African (and other extremely poor) countries find themselves in, but also of anticipating and answering the objections of the likes of Theroux, that I will let him do most of the talking now:

When poverty is very extreme, the poor do not have the ability–by themselves–to get out of the mess. Here is why: Consider the kind of poverty caused by a lack of capital per person. Poor rural villages lack trucks, paved roads, power generators, irrigation channels. Human capital is very low, with hungry, disease-ridden, and illiterate villagers struggling for survival. Natural capital is depleted: the trees have been cut down and the soil nutrients exhausted. In these conditions the need is for more capital–physical, human, natural–but that requires more saving. When people are poor, but not utterly destitute, they may be able to save. When they are utterly destitute, they need their entire income, or more, just to survive. There is no margin of income above survival that can be invested for the future.

This is the main reason why the poorest of the poor are most prone to becoming trapped with low or negative economic growth rates. They are too poor to save for the future and thereby accumulate the capital per person that could pull them out of their current misery…

[The saving rate, for example, of upper-middle-income countries was 25% as opposed to 10% for the least-developed countries, according to a 2004 World Bank study.]

In fact, the standard measures of domestic saving, based on the official national accounts, overstate the saving of the poor because these data do not account for the fact that the poor are depleting their natural capital by cutting down trees, exhausting soils of their nutrients, mining their mineral, energy, and metal deposits, and overfishing… When a tree is cut down and sold for fuelwood, and not replanted, the earnings to the logger are counted as income, but instead should be counted as a conversion of one capital asset (the tree) into a financial asset (money). (TEoP, p.57)
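The mechanics of the trap are easy to sketch numerically. Below is a minimal toy model of the argument just quoted; the parameter values are my own illustrative assumptions, not Sachs’s estimates. Output grows with capital, people can save only what is left above subsistence, and capital depreciates: an economy that starts below a threshold level of capital cannot out-save depreciation and shrinks, while an otherwise identical one that starts above the threshold grows toward a high steady state.

    # Toy Solow-style poverty trap with a subsistence consumption floor.
    # All parameter values are illustrative assumptions, not Sachs's data.
    ALPHA = 0.3    # capital's share of output
    DELTA = 0.05   # annual depreciation of capital
    SAVE = 0.3     # fraction of above-subsistence income saved
    C_MIN = 0.8    # subsistence consumption per person

    def capital_after(k, years=100):
        """Evolve capital per person k over the given number of years."""
        for _ in range(years):
            y = k ** ALPHA                       # output per person
            saving = SAVE * max(0.0, y - C_MIN)  # nothing to save below subsistence
            k = (1 - DELTA) * k + saving         # accumulation net of depreciation
        return k

    # Two economies identical except for their starting capital:
    print(f"starts poor:   k = {capital_after(0.5):.2f}")  # decays toward zero
    print(f"starts richer: k = {capital_after(1.5):.2f}")  # climbs to a high steady state

The case for aid as a capital infusion amounts to moving an economy from the first starting point to the second.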

There is much more to this, but you will have to read the book yourself to get all the details, which Sachs does an admirable job of laying out for the non-specialist reader. Much of the book is spent in showing that it is possible, using available data, to estimate fairly accurately the amounts of capital infusion needed by a country to escape the poverty trap. It’s better to just let Sachs take it from there:

Africa needs around $30 billion per year in order to escape from poverty. But if we actually gave that aid, where would it go? Right down the drain if the past is any guide. Sad to say, Africa’s education levels are so low that even programs that work elsewhere would fail in Africa. Africa is corrupt and riddled with authoritarianism. It lacks modern values and the institutions of a free market economy needed to achieve success… And here is the bleakest truth: Suppose that our aid saved Africa’s children. What then? There would be a population explosion, and a lot more hungry adults. We would have solved nothing.

If your head was just nodding yes, please read this chapter with special care. The paragraph above repeats conventional rich-world wisdom about Africa, and to a lesser extent, other poor regions. While common, these assertions are incorrect. Yet they have been repeated publicly for so long, or whispered in private, that they have become accepted as truths by the broad public as well as much of the development community, particularly by people who have never worked in Africa.  I use the case of Africa because prejudices against Africa run so high, but the same attitudes were expressed about other parts of the world before those places achieved economic development and cultural prejudices could not hold up. (TEoP, p. 309)

Hmmm, does the first paragraph above remind you of something you’ve read lately? In the rest of the chapter, Sachs answers these and other objections to aid for Africa in careful detail, with section headings such as:

  • Money down the drain
  • Aid programs would fail in Africa
  • Corruption is the culprit
  • A democracy deficit
  • Lack of modern values
  • The need for economic freedom
  • A shortfall of morals

Just to give a flavor of how Sachs’s refutations of these clichéd arguments go, let me first quote our self-appointed Africa expert, Mr. Theroux, one last time:

When Malawi’s minister of education was accused of stealing millions of dollars from the education budget in 2000, and the Zambian president was charged with stealing from the treasury, and Nigeria squandered its oil wealth, what happened? The simplifiers of Africa’s problems kept calling for debt relief and more aid. I got a dusty reception lecturing at the Bill and Melinda Gates Foundation when I pointed out the successes of responsible policies in Botswana, compared with the kleptomania of its neighbors. Donors enable embezzlement by turning a blind eye to bad governance, rigged elections and the deeper reasons these countries are failing.

Now here is Sachs again:

In the past, the overwhelming prejudices against Africa have been grounded in overt racism. Today the ever repeated assertion is that corruption–or “poor governance”–is Africa’s venal sin, the deepest source of its current malaise. Both Africans themselves and outsiders level this charge…

The point is that virtually all poor countries have governance and corruption indicators that are below those of the high-income countries. Governance and higher income go hand in hand not only because good governance raises incomes, but also, and perhaps even more important, because higher income leads to improved governance…

Africa’s governance is poor because Africa is poor. Crucially, however, two other things are also true. At any given level of governance (as measured by standard indicators), African countries tend to grow less rapidly than similarly governed countries in other parts of the world… Something else is afoot; as I have argued at length, the slower growth is best explained by geographical and ecological factors. Second, Africa shows absolutely no tendency to be more or less corrupt than other countries at the same income level. (TEoP, p. 311)

As for Africa’s lack of democracy, Sachs notes that:

Africa’s share of free and partly free countries, 66 percent, actually stands above the average for non-African low-income countries in 2003, 57 percent…

Democratization, alas, does not reliably translate into faster economic growth, at least in the short term. The links from democracy to economic performance are relatively weak, even though democracy is surely a boon for human rights and a barrier against large-scale killing, torture, and other abuses by the state. The point is not that Africa will soar economically now that it is democratizing, but rather that the charge of authoritarian rule as a basic obstacle to good governance in Africa is passé. (TEoP, p. 315)

Well, you get the idea. Buy the book and read it. As for Theroux, he should stick to doing what he does best: writing gossipy accounts of much better writers than himself, like In Sir Vidia’s Shadow, his book trashing his former mentor, V. S. Naipaul. And more power to Bono!

From Sachs’s website: How You Can Help End Poverty.

Have a good week!

My other recent Monday Musings:
Richard Dawkins, Relativism and Truth
Reexamining Religion
Posthumously Arrested for Assaulting Myself
Be the New Kinsey
General Relativity, Very Plainly
Regarding Regret
Three Dreams, Three Athletes
Rocket Man
Francis Crick’s Beautiful Mistake
The Man With Qualities
Special Relativity Turns 100
Vladimir Nabokov, Lepidopterist
Stevinus, Galileo, and Thought Experiments
Cake Theory and Sri Lanka’s President

Selected Minor Works: Oh. Canada.

Justin E. H. Smith

One often hears that Montreal is the New York of Canada. It seems to me one may just as well say that Iqaluit is the New York of Nunavut. Both analogies are true enough, insofar as each settlement in question is the undisputed cultural capital of its region. But analogies can often work simply in virtue of the similitude of the relation in each of the pairs, even when the two pairs are vastly different the one from the other. Montreal is the New York of Canada, to be sure. But Canada, well… Canada is the Canada of North America.

This will be the first of two articles in which I lay out a scurrilous and wholly unfounded diatribe against the place I now call home. The second part will consist in a screed against Canada as a whole; today I would like to direct my bile towards Montreal in particular.

Sometime in early 2002, there was an amusing article in the New York Times, chronicling the fates of a few New York families that had fled to re-settle with relatives in Canada for fear of further attacks. Within a few months, they were back. As I recall, one man was quoted as saying something like: I’d rather go up in a fireball, I’d rather be vaporized, than live out the rest of my days up there.

New York pride is not merely quantitative, but it is interesting to note that there was more square footage in the World Trade Center than in all the highrises of Montreal combined. Still, in terms of square feet, if not of lives, September 11 scarcely made a dent in Manhattan. It is of course not everywhere that the greatness of a city is measured by the number of skyscrapers it hosts. If this were the universal measure, Dallas would have London beat by a long shot. But in Montreal the skyline is constantly pushed, on the ubiquitous postcards and tchotchkes sold along St. Catherine Street, as though this were some great accomplishment of human ingenuity, rather than a paltry imitation, a mere toy model, of the envied city to the south.

Les gratte-ciel are also celebrated shamelessly in Quebecois art and cinema. Take Denys Arcand, the tiresome and repetitive director of The Decline of the American Empire and its sequel The Barbarian Invasions, as well as of the slightly more compelling 1989 film, Jesus of Montreal. The way he cuts to new scenes with panoramic shots of the city’s skyscrapers at night, alto saxes blaring, you would think you were watching a promotional segment of the in-flight entertainment program on an incoming Air Canada plane. You would almost expect this schmaltzy segue to be followed by scenes of children getting their faces painted at a street fair, of horse carriages in the old town, or of a group of young adults, sweaters tied around their necks, laughing in a restaurant booth as a man in a chef’s hat serves them a flaming dessert. And yet this is not Air Canada filler, but the work of a supposedly serious director, himself only one example of a very common phenomenon in French Canadian movies. Every time I see the Montreal skyline glorified in Quebecois cinema, I think to myself: if Nebraska had a state-subsidized film industry, Omaha too would be portrayed as a metropolis.

But pay attention to the panorama, and you will see that there is simply not much there. Montreal is probably a notch closer to Iqaluit than it is to New York on the scale of the world’s great cities. I place it just behind Timisoara, and just ahead of Irkutsk, Windhoek, and Perth. It is admittedly not just an aluminum shed and a ski-doo or two. But still one gets the sense there that the entire settlement could be easily dismantled and quitted overnight, as one might pack up a polar research station. I’ve lived in Montreal for three years, and still, every time a Canadian commences another soporific paean to the place I think to myself: where is this city you keep mentioning? I must still be lost in the banlieue. I must not have discovered that dense and vital core of the place that would justify all this effusive praise. And so I consult the map repeatedly, and determine to my confusion that I have by now been just about everywhere in the city, indeed that I live in the centre-ville. In New York, in contrast, I always know, in the same way I know I exist, that I am most assuredly, metaphysically there. You cannot be in New York and doubt that you are in New York.

A student of mine recently returned from her first trip to New York and announced that it is ‘not all that different’ from Montreal. She noted that there is virtually the same concentration of hipsters in each place, and that many New York hipsters are listening to Montreal bands such as Les Georges Leningrad. Call it ‘the hipster index’. In Baltimore, Tucson, Cincinnati, and even Edmonton, there are plenty of ruddy youngsters who collect vinyl, make objets d’art with trash they find, do yoga, declare ‘I’m not religious per se, but I consider myself a very spiritual person,’ read Jung and Hesse and Leary and (‘just for fun’) their horoscopes, have spells of veganism, try to build theremins, decorate with Betty Page artifacts, and speak disdainfully of that empty abstraction, ‘Americans’. I’ve been to these places, and seen them with my own eyes. All these places rank very high on the hipster index. I’m afraid, though, that I am reaching a period of my life in which I measure the greatness of a place by other indices. Like beauty, for instance, and the intensity and importance of the things the grown-ups there are up to.

The other city often invoked in order that Montreal might borrow a bit of greatness is, of course, Paris. The city on the Seine, but without the jet-lag, is how the tourism industry packages it. I think this has something to do with the fact that a French of sorts is spoken in the province. But an English (of sorts) is spoken in Alabama, and nobody thinks to invoke London to try to get people to go there. It is odd, when you think about it, to make a claim to greater affinity with the Old World on the mere basis of la francophonie. After all, every major language of the New World (excluding those of the First Nations) is part of the European branch of the Indo-European family, but this doesn’t give Brazil, Panama, or the United States any special foothold in Europe.

I have been to Paris, and stood at intersections waiting to see pick-up trucks pass by with bumper stickers exclaiming the French equivalent of ‘This vehicle protected by Smith & Wesson,’ or ‘U toucha my truck, I breaka u face.’ They don’t have these there. They don’t have strip malls, or ‘new country’, or donuts, or (regrettably) coffee to go, and WWF wrestling has not made much of an impact.

The situation is quite different in Quebec. La belle province is 100% American, in the early-18th-century sense of the term, and Montreal is but an outlying provincial capital. The metropolitan capital to which Montreal is subordinated is New York. What counts as center and what as periphery does not, of course, stay the same forever. A few more decades of incompetent US government and global warming may change the balance between the two cities. For now, anyway, this is just how things are.

A very happy new year from 125th Street in Harlem. I will be returning to my usual, deracinated life up north a few days from now. If they’ll still let me in.

Dispatches: Divisions of Labor III

Strikes have engulfed New York City this winter. While members of the Transit Workers Union have gone back to work, NYU graduate assistants are preparing to resume picketing with the start of term on January 17th (usual disclaimer: me too). The situation is simultaneously encouraging and grim. Administrative threats of three semesters’ loss of work and pay have caused some attrition, but, impressively, have not broken the strike. By comparison, the 1995-6 Yale grade strike ended after threats of a similar variety – perhaps having already had union recognition and a contract has made the NYU graduate assistants more optimistic. Individual departments’ attempts to protect students from the severity of the administration’s punitive measures have mostly fallen short of extending any promises to those who continue picketing on the 17th. The climate, then, has become inhospitable to assistants who, for entirely legitimate reasons (among them, concerns over visa status, financial hardship, and impeded career advancement), no longer find enough certainty with respect to escaping potential reprisals. So far from signifying dissent from the union, however, these losses measure instead the level of vituperation with which the university sees fit to treat its members – the preservation of a ‘collegial’ relation to whom supposedly necessitates the union’s destruction. Here, rather than attempt an ethical adjudication (a perusal of the relevant documents will allow you to do that for yourself), I think it might be useful both to narrow and widen the usual perspective, which sees the university as the relevant object of focus, in order to consider some relevant internal differences as well as some external factors in this conflict. (For the basic dossier, see the Virtual Mind strike archive.)

To begin with, a narrower focus. Much discussion of late has had to do with the alleged concentration of strikers in the humanities and social sciences. Like many assertions in this debate, it usually remains unsubstantiated, circulating instead as a dark hint that the strike is the result of naive idealism. Consequently, NYU President John Sexton often describes graduate assistants in infantilizing terms,  reinforcing the idea that their grievances are an immature form of teenage rebellion. Furthermore, such infantilizing rhetoric carries with it the paternalistic notion that the university administration should be trusted to have its charges’ best interests at heart, even and especially when said charges are misbehaving. The longstanding association of the humanities with countercultural protest, amplified by the academic “culture wars,” in this case serves to delegitimize, and render strictly cultural, complaints of exploitation by graduate students. Strategically, then, this emphasis on the culture of protest over social analysis is a favored tactic of the administration and its supporters: as one anti-union philosophy professor put it on a weblog discussion of the strike, “if graduate students don’t want to be treated like spoiled children, they should stop behaving like spoiled children.” (Of course, the irony of this tautological ad hominem attack is that graduate assistants are attempting to dispute just this characterization of their position.)

Here I might return to the theme of “collegiality.” The picket line, with its chanting, drumming, singing – in short, its performativity – is by its nature often carnivalesque: not only the ordinary collegial etiquette, but the very habitus, or social and bodily disposition, of university life is suspended by it. The result is an unleashing of pent-up energies and frustrations of many kinds, including elements that exceed the basis of the conflict, such as the offensive nature of the university’s communications with graduate assistants. This is why the defense of collegiality has become an important high ground to the administration: harping on it allows the picket line’s symbolic excess to be depicted as a form of reactive immaturity. Paradoxically, immaturity is also seen to be a form of belatedness: Sexton’s euphemistic corporate terminology of an “Enterprise University” and “University Leadership Team” leaves no room for such “dated” practices as strikes and protests, and the supposedly expired sixties radicalism from which they are thought to stem. Just as the domain of the humanities is linked to anachronistic countercultural protest, so then is the social practice of picketing. On both counts, we’re at once too young and too old, past our sell-by date before we grow up. This argumentative tack, however, allows for the obfuscation of the original conflict. Even so, analyzed as a cultural form, the picket line performs an important function: it inscribes and instantiates the strike both to observers and in the minds and bodies of those striking. As Louis Althusser might have said, it “interpellates” (roughly, allows the self-recognition of) those who take part, and thus functions as a radicalizing action. Insofar as it refuses collegial dialogue and substitutes the implacable presence of the bodies of strikers, picketing belongs all the more purely to the category of action.

Whatever the ideological hailing effects of picketing, if humanities students are strongly in support of striking, the true cause is not a nostalgic commitment to counterculture. The sociological facts on the ground, which are cleverly obscured by the strategy of infantilization, provide much more compelling justification. Unfortunately for the University Leadership Team’s propaganda efforts, graduate study these days tends to include discussion of the sociology of graduate education itself, which has become an important sub-field in literature departments. Doctoral students thus know all too well that fewer than half of them receive tenure track jobs within a year of receiving a diploma; that the number of non-tenured teachers continues to grow at a much faster rate than that of tenured faculty across the disciplines; that universities continue to rely on graduate and adjunct labor, while relatively fewer and fewer tenured professors enjoy the privilege of teaching only upper-level and graduate courses; that graduate assistants teach nearly all introductory courses in language and literature; and that collectivization is the rational response to the exploitation of a labor pool. These are not cultural differences between bohemian graduate students and technocratic administrators; they are social realities. And although these realities are not restricted to the language and literature programs – not at all – these departments have been affected very deeply by this macrocosmic shift in the structure of university teaching.

For this reason, which the “U.L.T.” knows as well as we do, a “New Policy” was announced in November by the university’s deans, which stipulates that graduate assistants’ normal teaching load of two stand-alone courses per semester will be reduced to one (this will primarily affect language and literature graduate assistants, as they teach most of the stand-alone courses). On the face of it, an early Christmas present, no doubt unrelated to the strike. In practice, however, it means three things: first, the university is suddenly authorizing itself to hire large numbers of new adjuncts to fill the newly vacated positions, in contradiction to its expressed aim of reducing the amount of contingent (adjunct) labor, without it looking like these are replacements for striking workers. Why, they’re simply being brought in to fill brand-new positions. The fact that these adjunct professors might conveniently be asked to substitute for striking workers is doubtless a coincidental side benefit. Second, it nourishes the university’s paternalist stance: reducing the teaching load strengthens their claim that graduate teaching is nothing more than apprenticeship or training, and that long-term shifts towards graduate and adjunct labor are being magically reversed. They really care! And third, most disturbingly, graduate assistants who choose to take on the heretofore normal load of two courses next semester can “bank” the extra course, and collect a free semester of funding in the fall. That’s right: teachers who strike this spring semester will lose their work and pay for the next three semesters, according to the Provost, whereas those who return to work and teach what until now was the standard two courses will receive a semester of free money. It might be supposed this will not foster a collegial atmosphere amongst teachers. Best of all, for the administration, this policy will primarily affect the language and literature programs, where students have a clear-eyed view of the labor issues involved because of their disciplinary location and thus strongly support the union. One is perversely impressed with the shrewdness of this policy, although one is also sure that the law firm NYU employs to eradicate the union is more straightforwardly proud.

Finally, by way of briefly widening the focus beyond the institution of the university, let us consider NYU in a larger context. As this investigative piece in the Nation reveals, the MTA’s leadership has been engaged in a number of lucrative business dealings involving renting office space to its corporate sub-contractors. All this has been financed through public debt, and overseen by the presence on the MTA of the very people who stand to gain the most from such arrangements, but whose interest in public transportation is unclear. At NYU, the body with whom ultimate authority rests is the Board of Trustees (here is some background on its chair and vice-chairs). In an example of determination in the last instance by the economic sphere, to again allude to Louis Althusser, this board is populated by people with very different interests from those of university teachers. Composed largely of financiers, corporate lawyers, real estate developers, and the leaders of media conglomerates, the board has shown very little interest in the sympathetic appeals of graduate assistants and our claim that the union palpably improved working and learning conditions at NYU. Of course, the commonly held conception of the university as the privileged space outside of the dominance of corporations in American society tends to disable the recognition that, in fact, universities reside within the sphere of economic determination, and are not necessarily any more amenable to arguments based on social justice than any other type of institution. The indifference of the board to the measurable benefits of unionized graduate assistants only reconfirms this. In fact, perhaps one can go so far as to postulate a direct relation between the progressive prestige of a university and its hostility to a collectivized workforce: as evidence, one can adduce the intensely anti-union positions of the Ivy League schools. An ambitious school such as NYU is no doubt under immense pressure from the administrators of its more established siblings to resist precedent-setting unionization, and along the way to absorb all the costs and bad publicity that accrue to union busting. Sadly, NYU seems more than happy to take one for the team it wishes to join, and thus to leave in place this inversion by which the institutions that most loudly espouse progressive agendas in their publicity materials are the same ones that most viciously fight to prevent those agendas from gaining any ground. A consolation: if we win, perhaps they will eventually realize that they have, too.

Dispatches:
Divisions of Labor II (NYU Strike)
Divisions of Labor (NYU Strike)
The Thing Itself (Coffee)
Local Catch (Fishes)
Where I’m Coming From (JFK)
Optimism of the Will (Edward Said)
Vince Vaughan…Eve Sedgwick (Homosocial Comedies)
The Other Sweet Science (Tennis)
Rain in November (Downtown for Democracy)
Disaster! (Movies)
On Ethnic Food and People of Color (Worcestershire Sauce)
Aesthetics of Impermanence (Street Art)

Monday, December 26, 2005

Monday Musing: ‘Tis the Season for Lists

For many years Abbas and I have spent the occasional evening composing lists of the greatest this, the smartest that, and the most overrated other. As you can imagine, it usually comes at some late moment when we’re tipsy. It’s a silly act of camaraderie which I would share with very, very few others. For me, it’s also a very private affair, which is precisely the opposite of the role that lists play in society.

I was reminded of it this holiday season, as I am every holiday season, because it is the season for collective judgment. Sometime between the beginning of November and the end of January, we are bombarded with lists, usually top 10 lists—and not just the best books, best fiction, best non-fiction, best movies, best albums, best songs, and their complementary “worsts,” but also worst disasters, worst web design mistakes, best and worst toys, and industry- or sub-culture-specific objects that are, so to speak, too numerous to list.

A list is different in kind and in effect from a simple “person of the year” or other declaration of a superlative. The latter sorts of things usually require some extensive justification of the judgment. If I were to say that Tony Judt’s Postwar was the best book that came out this year, you might reasonably ask why I thought so. And I would give a host of reasons to defend my claim. (In this instance, the claim is hypothetical.) But once I list runners-up, I’m forced to answer different questions—why a work of history over fiction? why this prose style over that one?

This comparative quality of lists is the seductive virtue that turns the whole affair into a participatory event. (I was thinking about this when Abbas was soliciting top 10 books of the year from 3Quarks editors.) Relative judgments seem to engage us more than absolute ones. Say Hitler is a monster, and you have no quarrel. Say Hitler is a worse monster than Stalin, and then you have a debate. Or if that’s too contentious, try: Franklin Roosevelt was a great wartime leader, against Franklin Roosevelt was a greater wartime leader than Churchill. This is not to say that judgments of the former kind aren’t debated, but that the latter elicit more responses and wider audiences. The Prospect/FP poll of global public intellectuals probably did far more to create an audience for Oliver Kamm (with his neurotic Chomsky-phobic rant) than it did for Chomsky. Kamm was part of the debate; Chomsky was its object. And for the wider circles, Chomsky’s ordinal rank relative to Daniel Dennett, Richard Posner, or Slavoj Žižek is a more contentious affair than whether he is a well-known and well-respected public intellectual (at least in many circles).

This fun-silly exercise is not restricted to dilettantes such as yours truly. Sidney Morgenbesser once recounted a dinner with Isaiah Berlin spent classifying philosophers into gods, geniuses, brilliant men, smart guys, and some further category, whose title I don't recall. They got into a fight over where to place Leibniz, and wound up creating the category of demigods, which became populated solely by Leibniz. The story made me feel less silly.

Now, with the audience that Amazon.com brings, these exercises grow more and more common, so much so that the site has a feature called Listmania. (But sometimes I wonder whether this need to state our judgments, even over matters of taste, to wider and wider audiences doesn't make us kin to Judge Judy or the mobs found on Jerry Springer.)

Criticisms or reflective assessments of lists commonly begin with something like: “Lists say more about those who construct them than they do about . . .” the object, or the real world, or whatever else they’re supposed to tell us about. That of course is trivially true, in the sense that any made object tells us something, often a lot, about its maker. But it is true that lists generally deflect attention away from the criteria for judgment and, quite often, the judge. (“Judge, lest ye be judged,” Karl Kraus once said.) This is so even when the criteria for judgment are made fairly explicit.

Interesting lists offer us not so much new rankings as new dimensions for evaluation. The lists that fill much of the Pillow Book of Sei Shonagon, lady-in-waiting to the Empress Sadako (or Teishi) during the Heian period, are wonders. Each list evokes memories and sensations rather than judgments and thereby disagreements. Some of my favorite lists of Shonagon’s:

109. Things That Are Distant Though Near

Festivals celebrated near the Palace

Relations between brothers, sisters, and other members of a family who do not love each other.

The zigzag path leading up to the temple at Kurama

The last day of the Twelfth Month and the first of the First

And especially,

44. Things That Cannot Be Compared

Summer and winter. Night and day. Rain and sunshine. Youth and age. A person's laughter and his anger. Black and white. Love and hatred. The little indigo plant and the great philodendron. Rain and mist.

When one has stopped loving somebody, one feels that he has become someone else, even though he is still the same person.

In a garden full of evergreens the crows are all asleep. Then, towards the middle of the night, the crows in one of the trees suddenly wake up in a great flurry and start flapping about. Their unrest spreads to the other trees, and soon all the birds have been startled from their sleep and are cawing in alarm. How different from the same crows in daytime!

The lady Murasaki Shikibu, author of the Tale of Genji, one of the earliest novels ever written (circa 1000 A.D.) and contemporary of Shonagon, described her as “frivolous”, and concluded that “[h]er chief pleasure consists in shocking people, and, as each eccentricity becomes only too painfully familiar, she gets driven on to more and more outrageous methods of attracting notice.” But this is precisely the virtue of lists such as Shonagon’s; they get people to notice by pointing to new dimensions and new collections, and not simply to our judgment. If we can't be outrageous with the playful, where can we be?

The lists don’t have to consist of exotica. Nick Hornby did a remarkable job of using simple lists to construct a seductive story in High Fidelity. But when they do consist of exotica they really seduce, as in the case of many of Borges' stories. It’s probably a little late now, but for next season, I suggest new kinds of lists, ones that speak of our wit, creativity, and even whim.

Happy Monday and a Happy New Year.

Sunday, December 25, 2005

Happy Newton’s Day!

Despite the fact that December 25th happens to be the birthday of a number of important historical figures (for example, Mohammad Ali Jinnah, the founder of Pakistan, which is where I am from), last year we at 3 Quarks Daily thought we would celebrate Newton’s birthday on this date. Unbeknownst to us, Richard Dawkins had just published an article suggesting the same thing. We were flattered. So here we are again, on Newton’s Day!

To commemorate this auspicious occasion, I thought I would try to deal with the apple today. You know what I am talking about: the apple that supposedly fell on Sir Isaac’s head while he was resting under a tree, and which jarred him into formulating the theory of gravity. The story is almost certainly apocryphal (no getting away from the Bible, is there?), but what could it mean? There are many ways to try and understand this story, but I just want to point out something simple but very cool: look at my drawing of me lobbing a ball over to a friend of mine below.

[Drawing 1: a ball lobbed to a friend along a parabolic arc]

The ball follows a parabolic path from my hand to those of my friend. But what if my friend wasn’t there, and neither was the surface of the Earth? What if the ball could just pass through the Earth as if all its (the Earth’s) mass were concentrated at its center? What would happen then? Look at my next drawing below.

[Drawing 2: the ball’s path continued through the Earth, tracing an ellipse with one focus at the Earth’s center]

The ball would go in an elliptical path, with one of its foci being the center of the Earth! The parabola above the surface of the Earth is just one end of the bigger ellipse! Where had Newton heard of ellipses before? Yep, from Kepler, who had shown that planets travel in elliptical orbits around the Sun. How’s that for a connection between small objects falling on Earth, and the heavenly spheres? Of course, we’ll never know Newton’s real line of reasoning, but here’s a possible one (with a little numerical check after the list):

  • apple falls on Sir Isaac’s head
  • he starts to think about how freely falling objects behave
  • he generalizes to objects following parabolic paths
  • he imagines what happens if the surface of the Earth doesn’t stop the object
  • he realizes the object falls into an elliptical path
  • he realizes the planets are just “falling” around the Sun
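And here is that little numerical check: a minimal sketch, assuming a point-mass Earth, no air resistance, and a ball free to fall through the ground. From the two conserved quantities of the motion it computes the conic section for a gentle 20 m/s lob; the energy comes out negative and the eccentricity just under 1, meaning the full path really is a long, skinny ellipse with one focus at the Earth’s center, of which the visible arc merely looks parabolic.

    # Numerical check of the ellipse claim, assuming a point-mass Earth,
    # no air resistance, and a ball free to fall through the ground.
    import math

    GM = 3.986e14    # Earth's gravitational parameter, m^3/s^2
    R = 6.371e6      # Earth's radius, m

    # Lob a ball from the surface at 20 m/s, 45 degrees above horizontal.
    speed, angle = 20.0, math.radians(45.0)
    x, y = 0.0, R
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)

    # Two conserved quantities pin down the conic section of the orbit:
    energy = 0.5 * speed**2 - GM / R   # specific orbital energy (negative = bound)
    h = x * vy - y * vx                # specific angular momentum

    a = -GM / (2 * energy)                        # semi-major axis
    e = math.sqrt(1 + 2 * energy * h**2 / GM**2)  # eccentricity
    print(f"semi-major axis = {a / 1000:.0f} km, eccentricity = {e:.7f}")
    # energy < 0 and e just below 1: a long, skinny ellipse with one focus
    # at the Earth's center; the short arc above ground looks parabolic.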

Okay, it probably wasn’t that way, but I still think it’s a nice thought. And in case you’re wondering just how big Newton’s intellectual reputation is, check this out from the London Times:

Newton trounces Einstein in vote on their relative merits

His most famous equation, E=mc², is 100 years old, and 2005 has been named Einstein Year in his honour, but Albert Einstein has been trounced in a scientific beauty contest held to celebrate his own greatest achievements.

The most famous head of hair in science was soundly beaten by Sir Isaac Newton yesterday in a poll on the relative merits of their breakthroughs, with both scientists and the public favouring the Englishman by a surprisingly wide margin.

Asked by the Royal Society to decide which of the two made the more important contributions to science, 61.8 per cent of the public favoured the claims of the 17th-century scientist who developed calculus and the theory of gravity.

More here.  And, oh, what the… 

MERRY CHRISTMAS!

[This post dedicated to LWP.]

Monday, December 19, 2005

Dispatches: The Thing Itself, or the Sociology of Coffee

In the movie “My Dinner With Andre,” a touchstone for the antic film buff, Wally Shawn muses about the things that make life bearable despite the heavy weight of human suffering and existential dread that torment his friend Andre Gregory. “I just don’t know how anybody could enjoy anything,” he says, “more than I enjoy… you know, getting up in the morning and having the cup of cold coffee that’s been waiting for me all night, that’s still there for me to drink in the morning, and no cockroach or fly has died in it overnight – I’m just so thrilled when I get up, and I see that coffee there, just the way I want it, I just can’t imagine enjoying something else any more than that.”

This little reverie has always struck me as a note-perfect piece of writing (or speaking) by Shawn, who here depends on a long-running association of coffee with a form of escape from the prosaic, even as it fuels that most prosaic form of labor, writing. The social meaning of coffee combines its conception as the fuel upon which workers of all kinds rely with the notion of the coffee break, the oasis in the day in which workers are temporarily freed from adherence to their routinized schedules and can indulge in idleness. The twin sites of coffee drinking, the coffee shop and the cafe, represent the two class locations in which these escapes can occur: the coffee shop for laborers and the cafe for the intellectual, who turns her own idle philosophizing into her special form of production.

The association of coffee with both labor and the emancipation from labor is a long one. In the standard narrative of the Enlightenment, coffee shops in London in the late seventeenth and early eighteenth centuries play a large role as sites that hosted the workingmen’s collectives and other forms of nascent intelligentsia. Jürgen Habermas, for example, famously identified the London coffee houses as the birthplace of the modern critique of aristocratic power in the name of liberty in his influential The Structural Transformation of the Public Sphere. Habermas’ claim that the public sphere expanded and developed into an inclusive site in which middle-class interests could be voiced also gestures at the interesting social connotations of coffee drinking: a practice that bridges the public world of letters with the private world of internal reflection, a duality that remains in effect to the present day. Coffee is the special beverage of intellectual labor and mental stimulation, and along with other products of the tropical colonial world, such as tea, sugar, and spices, perhaps accrued its social meaning precisely because of its novelty and the absence of pre-existing traditional associations with its consumption (as would be the case in Europe with beer, for example).

Until 1690 or so, nearly all the coffee imported to Europe came from Yemen, after which time the West Indies began to dominate, due to large plantations established by European colonialists, until roughly 1830. London in this period was the major trading center for the world’s coffee supply, supplanted by Rotterdam and coffee from Java later in the nineteenth century, and in the twentieth by Brazil’s production and New York’s factorage, or management of trade. What is interesting about this extremely truncated potted history is how little known it is, beyond the vaguest associations with these locations and coffee drinking: Java, for instance, or Colombia more recently, being places generally associated with the commodity. The fact that coffee as a crop is extremely amenable to the large-scale plantation system had much to do with its spread around the world, and also with the inculcation of the desire to drink it. Coffee as a commodity has also been extremely important to the development of the global economy, perhaps second only to oil. Between the world wars, coffee surpluses in Brazil grew so large that enough beans to supply the world for two and a half years were destroyed, prompting the development of international agreements to govern the flow of trade and prevent the destructive influence on prices of huge surpluses.

As you will have guessed, what I’m interested in here is the caesura between the social meanings of coffee and its consumption, on the one hand, and the economic and historical conditions in which it is produced, on the other. Note that our airy and metaphysical associations of coffee with scribal labor, and our notion of cart coffee as the fuel for wage workers, show no trace of the globally instituted plantation system of production and distribution that allows for its availability. Now, a sociologist friend of mine, upon hearing my thoughts on this subject, remarked dryly: “Well, yes, standard Marxism, the commodity always conceals the conditions of its production.” Which is true, yes, my friend, but I think there’s a bit more to it than that, when we come to our present age of late capitalism (to adopt the favored descriptor). For we specialize in nothing so much as the inflection of meanings in order to create and reinforce markets for products: the process called branding. Coffee has presented an interesting problem for marketers because it suffered from the problem of inelastic demand.

What this bit of jargon means is simply that coffee drinking was typically habitual and not generally considered to be divisible into gradations of luxury. In other words, people do not make fine distinctions when it comes to coffee – and indeed, the world’s coffee market is dominated by one varietal, arabica (though robusta is often used in cheaper brands as a blending ingredient – in fact, the whole question of why arabica is considered superior to robusta is of interest, though not of sufficient relevance here). Or at least, coffee was considered to be this type of commodity through the nineteen-eighties. At that point, a revolution occurred with the application of European connoisseurship to coffee-drinking. I am referring, of course, to the vogue for Italian coffee that swept the world at this time. Finally, with the nomenclature of espresso, macchiato, cappuccino, etcetera, marketers had an opportunity to make gradations, to identify a style of coffee drinking with sophistication and taxonomies of taste in such a way as to basically invent a whole lifestyle involving coffee preferences, and thereby to supplant the inelasticity of demand that was preventing consumers from changing their buying patterns.
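(If you want the textbook ratio behind that jargon: demand is inelastic when the percentage change in quantity bought is smaller in magnitude than the percentage change in price that caused it. A hypothetical illustration, with numbers invented for the purpose:)

    # Price elasticity of demand: the standard textbook ratio.
    # The figures below are hypothetical, only to illustrate "inelastic."
    def elasticity(pct_change_in_quantity, pct_change_in_price):
        return pct_change_in_quantity / pct_change_in_price

    # A 10% price rise that cuts coffee purchases by only 2%:
    print(elasticity(-2.0, 10.0))   # -0.2; magnitude < 1, so demand is inelastic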

Ironically, of course, most of these gradations have little to do with the coffee itself; rather, they involve the milk, whether to steam it or froth it, add it or not, or whether to adulterate the espresso with hot water (americano), and so on. The effect is the same: making coffee drinking into a form of connoisseurship. My sociologist friend and I recently walked by a Starbucks, the apotheosis, of course, of the current technocratic style of coffee drinking. Outside was a chalkboard, the faux-handwritten message on which inspired these reflections: “The best, richest things in life cannot be seen or bought… but felt in the heart. Let the smooth and rich taste of eggnog latte fulfill your expectations.” Depressingly contradictory, the message also advertises a beverage which may or may not include coffee itself, though the misuse of the Italian word for milk, in the world of Starbucks, usually signifies its presence. Yet there’s also something honest about it, in that it baldly announces the contradictions that the drinking of coffee embodies. Seen to be the escape from the prosaic, the fuel for the laborer, the joiner of public and private, the psychoactive stimulant that incites philosophy, coffee, so far from a purely metaphysical vapor, contains all the strange compressed complexity of the world of real objects and the webs of relations that bring them to our lips.

Dispatches:

Divisions of Labor II
Divisions of Labor I
Local Catch
Where I’m Coming From
Optimism of the Will
Vince Vaughan…Eve Sedgwick
The Other Sweet Science
Rain in November
Disaster!
On Ethnic Food and People of Color
Aesthetics of Impermanence

Federico Fellini: Circus Maximus

Australian poet and author Peter Nicholson writes 3QD’s Poetry and Culture column (see other columns here). There is an introduction to his work at peternicholson.com.au and at the NLA.

The Circus Maximus was the arena for mass entertainment instituted by the Etruscan kings and then expanded by subsequent rulers to hold tens of thousands of spectators. Trajan, Julius Caesar and Augustus, among others, added to and embellished the structure. Its chariot races were a particular feature of its activities in later times.

In the theatre of his mind, the great Italian auteur Federico Fellini casts forth his films as entertainment, serious entertainment worthy of the greatest art the twentieth century produced. However, it is no good coming to Fellini looking for Thomas Mann or Persona. Fellini is just not that kind of artist. He puts his trust in his feelings, and he believes that feeling is the way to discover the reality of the world. He doesn’t believe in intellectualising about life or art, or in theorising about art either. But to say Fellini is not intelligent in his films would be wrong. Fellini is supremely intelligent as a film director. He shapes his films as carefully as any novelist or poet does in the silence of their rooms. Circus Maximus translates from the Latin as largest circle, and it is this largest circle which Fellini draws around the world, enclosing in its phantasmagoric visions the poetry and pain of a loving heart. He invites his audience to participate in his films as would the audience at the Circus Maximus for some games spectacular. You may sit around the edge of the circle and enjoy the surreal passing parade, smell the sawdust, see the most startling use of colour, and of black and white. If he allowed himself to be styled emperor in his domain, Cinecittà, he always did so with a light touch, and he could be scathing about his own persona—he virtually accuses himself of fraudulence in Otto e mezzo. Of course Fellini was no fraudster but a subtle artist of the most unusual kind. The caricaturist from Rimini went on to become a true maestro.

Fellini’s films are musical, and the word maestro is not inappropriate to use in association with his work. His orchestra is his production team—and what a singular group of artists he gathered together for his purposes. It is doubtful that films like Fellini’s could ever be made without this kind of team to work with. Underwriting the whole endeavour is the music of Nino Rota, who provides such an insouciant soundtrack to Fellini’s visual panoramas, by turns tender, melancholic, wistful, or vital and exuberant, music for eating, laughter, dancing and loving. But Fellini knows when to keep the soundtrack silent too. Usually, somewhere in a Fellini film, there is that sudden silence followed by the sound of wind, premonition of an ending Fellini doesn’t try to understand. He simply accepts death as part of the spectacle we must all participate in.

It is to be regretted that the main way people now come to Fellini is through re-release on DVD, or on television. If ever a director needed the big screen it is Fellini, who designed his films for a medium with a participatory audience. I remember two experiences in my early years of picture-going. The first time I saw Otto e mezzo it was a revelation to me, and I also found it profoundly moving. And I almost hurt myself laughing, along with the rest of the audience, when I saw the family argument around the table in Amarcord. I doubt I would have had these reactions if my first viewing of these films had been via the television set. Fellini embraces you through the screen. If you can’t participate in the manner of an Italian feast, you won’t get the best out of his films. These are not works of art for people who want to sit at a distance in judgement. They are meant for enjoyment, involvement. His camera is lascivious, and it gets very close to its subject matter, which some people find disturbing. And people who think Pulp Fiction instituted some new kind of film narrative need to have another look at Fellini’s work, especially from Otto e mezzo onwards, just as clichéd ideas about Fellini’s sexism ignore his lifelong preoccupation with the facade of Italian machismo.

For some, Fellini’s films can be a stretch; they ‘don’t wear well’, his sensibility, with its strangely compatible dual carriageway of sensuality and moral prodding, being at odds with present conformities. Satyricon especially is difficult to get hold of. On one hand it is a spectacle which Fellini fills with characteristic striking visuals. On the other, it comes across as cold, as if one were visiting a moonscape. Fellini called it science fiction of the past. Perhaps it was his comment on what he saw going on about him in supposedly liberated times. True, you don’t want to sit down to a tranche of his films in one go. His work is intense, baroque. There is maximum sensory overload. He is like Emily Dickinson and Bruckner in that way; you can’t take on too much of their intensity at one time. It would be a mistake to try to, because then the appetite sickens. These are artists who are for a lifetime. You can always come back to them, and their depth and seriousness will always be there for you when you have need of it. The fact that Fellini insists on joy being part of his sensibility makes him the major artist he is—he refuses to degrade himself in the manner of so many European intellectuals and artists who mortify themselves with doubt, self-hate and cynicism. It is not as if Fellini avoids tragedy. Who could forget Zampano’s despair on the beach at the end of La Strada, or Marcello’s horror at Steiner’s suicide and the murder of his children in La Dolce Vita? Fellini is the realist who accepts suffering but who nevertheless insists on the pleasure principle too. One of the things Fellini takes most pleasure in is the human face. For him, it is endlessly fascinating. Fellini does not, contrary to a lot of Fellini criticism, put freaks in his films, but the variousness and beauty of the human face and form. In that sense he is a portrait painter, filling the screen with characters that give witness to the strangeness and majesty of the human: the alluring image of Anita Ekberg standing in the silent Trevi Fountain, the fantastical ecclesiastical fashion parade in Roma, the out-of-touch aesthetes on board the Gloria N. in E la nave va.

Was ever a director luckier than to have Giulietta Masina as a life companion? How one marvels at this actor’s performances in Fellini’s films. I am especially fond of her work in Giulietta degli spiriti. Here was companionship that led to beauty and greatness. But all Fellini’s actors seem to belong to a troupe. The circus master may crack the whip, but what performances he gets from his casts. How vital his characters seem with their dreams and delusions, their grandeur and pettiness, their gross appetites, their inwardness and hopefulness.

Opera, theatre, cabaret, vaudeville, circus. Luminous and celebratory, fantastical but all too real. Cinema. Art. Fellini is all of these things. For me, his films are inimitable, poetic, unforgettable.

— *** —

The following poem was written in late October 1993 when the press reported Fellini’s stroke.

               Intervista
        Federico Fellini 1920–1993

Maestro, lover, dreaming poet,
Must we say farewell just now?
Here on this uncertain street
Of a tawdry century
You encompassed multitudes,
From the fountain’s quietude
To a seaside ecstasy.

Trumpets at the darkened gates?
But how are we, who trust you still,
Beyond the failings that we share,
To live without your gaiety,
Except that in film flickering
And in Giulietta’s eyes
We know your passion will be strong.

Soon to sawdust you must drop
And circus clowns will hang their heads,
But while you live the world seems good,
For a pure heart brings such grace
And mischief that shows kindliness.
Stay to see our wretchedness
You maker of the marvellous!

But stillness is approaching now,
Tender, as this last spool spins
To silence in unending night.
Ciao, dear artist. May you slip
Quickly to that other side
Behind the screen, and leave us with a smile
Whose joy is deep, whose laughter was so wise.

Intervista: Fellini’s penultimate film
Giulietta’s eyes: Giulietta Masina, Fellini’s wife

Written 1993