Head to Toe

From Harvard Magazine:

Daniel Lieberman tracks the evolution of the human head:

“The head presents an interesting evolutionary paradox,” explains Lieberman, chair of the new department of human evolutionary biology, “because on the one hand it is so complicated that if anything goes wrong, the organism dies. On the other hand, it is where natural selection can and has acted powerfully to make us what we are.” Everything is closely connected. For example, the roof of the orbits is the floor of the brain—if one changes, they both do.

“How is it,” he asks, “that something so complicated and so vital can also be so evolvable?” One explanation involves modularity and integration. Not only do heads contain many modules (instructions for building an eye to see, for example, or an ear to listen), but each module is itself “intensely integrated in terms of development, structure, and function….Changes to the size, the shape, or the relative timing of development of each of the head’s many modules offer a variety of opportunities for change.” Studying the head’s modules, Lieberman writes, may help us understand why “the human head has changed substantially since our lineage diverged from the chimpanzee’s lineage.” It also provides an opportunity for “exploring how nature tinkers with development in ways that affect function and permit the evolution of complex structures.”

More here.

Unlocking the Mystery of Human Nature

From The Telegraph:

In 1864, during a charged debate on Darwin’s theory of evolution, Disraeli asked whether we are apes or angels. Over this question, the highly regarded (perhaps too highly regarded) neuroscientist V S Ramachandran sits proudly on the fence. We are both apes and angels, he suggests.

Humans, unlike other creatures, possess language, empathy, humour, plus the capacity for abstract thinking and self-awareness. But our uniqueness is based on structures that evolved for other reasons – our hearing, for example, derived from our chewing (two redundant jaw bones worming their way into the ear). Vital to what makes us special is our brain. As Ramachandran put it in his 2003 Reith Lectures: “Science tells us we are merely beasts, but we don’t feel like that. We feel like angels trapped inside the bodies of beasts, forever craving transcendence.” That, in a nutshell, is for Ramachandran the human predicament, and in The Tell-Tale Brain he sets out to crack it.

More here.

Learning From the Master

Gabriel Josipovici reviews Colm Tóibín's All a Novelist Needs: Colm Tóibín on Henry James, in The Irish Times:

There are great scholar critics, such as Erich Auerbach, Northrop Frye and Christopher Ricks, and great cultural critics, such as Walter Benjamin and Roland Barthes, but by and large the best literary criticism has always come from the practitioners themselves: Coleridge, Eliot, Proust, Auden, Jarrell, Hill. This is not surprising: more is at stake for the writer than for most readers as he seeks to grapple with the mystery of why a predecessor feels so significant, has helped release so much in his own art. Colm Tóibín’s relation to Henry James is of this kind.

In devoting several years of his life to re-creating a small period of James’s life for his novel The Master he was of course devoting them, as any artist devotes his working life, to trying to discover what it was he needed and wanted to say. In other words he wants to understand James, his life and his art, because in that way he will come to understand himself. We can feel sure, therefore, that a collection of his incidental essays on James, written between 2002 and 2009, the years surrounding his writing of The Master, will enrich our understanding of both artists.

And the book does not disappoint. The essays may be incidental – reviews, introductions, lectures – but each conveys a sense of Tóibín’s deep engagement with his subject and his writer’s way with words. Reviewing Sheldon Novick’s biography of James he quickly but firmly insists on replacing the biographer’s easy conflation of silence with sexual repression with something more subtle but to my mind far more convincing: the artist’s reticence. “When Novick says in his prologue that James wrote ‘frank love letters’ to Henrik Andersen (xviii) and adds soon afterwards that James’s ‘only indisputable love letters were written to men’ (13), the reader who knows these letters is entitled to feel that Novick’s reading skills are not subtle. These letters . . . are many things, but they are not ‘frank’ and they are not ‘indisputable’. James was not given to frankness or indisputability. That is why we read him.”

In other words the web of allusions may protect not a secret but the sanctity and complexity of life. Tóibín comes to this in the most profound piece in this volume, an essay that would by itself be worth the price of the book, A More Elaborate Web: Becoming Henry James. This is his account of how and why he wrote The Master, and it is one of the best essays on how a work of art comes into being that I know.

Nowhere Man

Adam Kirsch on my old teacher, in Tablet:

With most writers, the passage of time helps to consolidate their achievement and fix their reputation. Fifteen years after a poet’s death would seem like ample time for this posthumous process to be completed—especially in the case of a poet as famous as Joseph Brodsky, who became internationally known in his twenties and won the Nobel Prize in 1987. Certainly there is no mystery about the standing of poets like Seamus Heaney or Derek Walcott, Brodsky’s friends, contemporaries, and fellow-laureates. Whether you enjoy reading Heaney or not, the shape of his achievement is clear; his name stands for a certain kind of writing and thinking.

Brodsky, however, continues to look a little blurry to American readers. His work does not have the currency or influence, among younger poets, that his reputation would suggest. Some critics, especially in England, are prepared to dismiss him entirely, to call his work overrated and his reputation unearned. But most simply ignore him, as though he did not belong to the same conversation that includes Heaney or John Ashbery or Adrienne Rich.

In one crucial sense, of course, he does not. All those poets write in English; but Iosif Aleksandrovich Brodsky, born in Leningrad in 1940, was a Russian poet. This means that it is Russian readers, familiar with Brodsky’s language and literary tradition, who must decide his claims to greatness. And as Lev Loseff shows in Joseph Brodsky: A Literary Biography, his clarifying new book, the best Russian judges have been unanimous about Brodsky from the beginning.

When he was 21 years old, for instance, he was introduced to Anna Akhmatova, the tragic heroine of 20th-century Russian poetry. Loseff, a poet and friend of Brodsky’s, explains that such “pilgrimages” to Akhmatova were common for young writers, who would arrive “bearing flowers and notebooks full of poetry.” Unsurprisingly, the encounter made a deep impression on Brodsky: “I suddenly realized—you know, somehow the veil suddenly lifts—just who or rather just what I was dealing with.” What is more surprising is that Akhmatova, then 72 years old, immediately accepted Brodsky as an equal: “Iosif, you and I know every rhyme in the Russian language,” she told him. In 1965, after reading a poem of Brodsky’s, she wrote in her diary: “Either I know nothing at all or this is genius.”

There is nothing new about English readers being baffled by poetry that Russians adore. On the contrary, it’s a critical truism that Russian poetry doesn’t translate well. Pushkin occupies the same place in Russian literature as Shakespeare does in English, but it has always been hard for us to really understand why. Twentieth-century masters like Osip Mandelstam and Marina Tsvetaeva are probably as well known in America for their life stories as for their writings. If Brodsky belongs in their company, then it makes sense for him to remain a little obscure to Americans, just as they do.

What makes Brodsky’s case so unusual is that this Russian poet spent almost half his life in America.

Lords of the Rings: Understanding Tree Ring Science

Tim De Chant in ars technica:

Ask any second grader what you can do with the rings on a tree, and they'll respond, “Learn the age of the tree!” They're not wrong, but dendrochronology—the dating of trees based on patterns in their rings—is more than just counting rings. The hundred-year-old discipline has given scientists access to extraordinarily detailed records of climate and environmental conditions hundreds, even thousands of years ago.

The ancient Greeks were the first people known to realize the link between a tree's rings and its age but, for most of history, that was the limit of our knowledge. It wasn’t until 1901 that an astronomer at Arizona's Lowell Observatory was hit with a very terrestrial idea—that climatic variations affected the size of a tree's rings. The idea would change the way scientists study the climate, providing them with over 10,000 years of continuous data that is an important part of modern climate models.

A. E. Douglass, the astronomer in question, is revered as the father of dendrochronology even though one of the field's basic concepts—crossdating, or the matching of ring patterns between trees—was independently discovered on four earlier occasions. (Pioneering computer scientist Charles Babbage was among that group.) Douglass was the first to apply truly scientific rigor to the study of tree rings, using a quantitative approach to tie variations in ring width to available climate records.

For the next dozen years, Douglass scoured Arizona for Ponderosa pine—dead or alive—to construct his first chronology. Completed in 1914, Douglass's chronology stretched back nearly 500 years, a feat accomplished by crossdating. Months later, Douglass teamed with an anthropologist to date timbers in pueblos in the American Southwest. For the rest of his life, Douglass continued to develop the science of dendrochronology. Though he was never able to tie sunspot activity to ring patterns—his original inspiration—his new field found favor with climatologists.

Picking the Wrong Witch

Richard Byrne on the books of Dubravka Ugrešić in The Common Review:

Once upon a time there was a magical empire of letters called Central Europe. Its borders were fuzzy but recognizable. Vienna was its capital. The receding Ottoman Empire provided more of its territory. It was a place that existed largely in cafés and castles, train stations and brothels. The empire’s writers found inspiration in the uneasy play of imperialism, capitalism, and burgeoning nationalism in its borders. Psychoanalysis and Marxism and Zionism overlapped and clashed and conspired, depending on whom you asked. Austria’s defeat in the First World War did not end that empire— far from it. The new states formed after Versailles solidified and expanded its reach. The sustained and vicious assault of Nazism could not eradicate it, either. Many of its leading lights survived even that horror, through Holocaust and exile, to find themselves at the front lines of the Cold War, their fame fanned by the exigencies of dissidence and samizdat.

Dubravka Ugrešić, daughter of a Croatian father and a Bulgarian mother, was born into that Central Europe in 1949. It was a literary empire built by the likes of Franz Kafka, Jaroslav Hašek, Robert Musil, and Karl Kraus, and its expansion had writers from Yugoslavia—Miroslav Krleža, Ivo Andrić, and Meša Selimović—busy discovering new vistas.

But that Central Europe, which survived two wars, did not survive a third—the Cold War that ended with the fall of the Berlin Wall. Ironically, the greatest writers of the empire when it finally disappeared—Václav Havel and Danilo Kiš—were outsized figures in events that led to its vanishing. Central Europe’s end, though sad, was largely peaceful—the empire itself dissolving fizzily in the great political and economic scramble westward to the European Union and NATO. The difference that was dissidence was erased. Central Europe’s writers found themselves the wards of small nations competing in the larger European marketplace. But in the former Yugoslavia, Ugrešić’s neighborhood, the empire collapsed in a spasm of blood and fire. Many of her fellow writers sought protection by dividing themselves into competing camps. But Ugrešić did not join a pack. She stood aloof at first, and then ran off to the woods, shouting aloud about the perfidy and terror of it all. By doing so, she became so strange and powerful that those whom she would not join branded her a witch. Ugrešić and four other women writers were attacked in a prominent Croatian newspaper as “unpatriotic” and as “witches,” and the novelist found herself ostracized and isolated in a newly independent country that she never wanted to live in—cast into exile.

These attacks on Ugrešić made her more powerful still.

the pen is mightier…

Down a leafy cul-de-sac in Aldershot, south-west of London, tucked away among suburban officers’ married quarters, sits the Prince Consort’s Library. The library, which last year celebrated its 150th anniversary, houses the British army’s collection of specialist military literature. If current publishing trends are anything to go by, it may soon need to expand. The popularity of military books is sky-high, the constant appetite for military history – particularly of the two world wars – augmented by the dramatic rise in recent years of books about current conflicts, notably Iraq and Afghanistan. The library also hosts the British Army’s Military Book of the Year Award, an accolade that is all the more keenly sought after because the judges are all soldiers. Last year’s deserved winner was Andrew Roberts’s The Storm of War, but what was extraordinary was the fact that six finalists were drawn from among the several thousand military books published in the UK in 2009. Against this background, General Sir Robert Fry, the former senior British military representative in Iraq, was only voicing the concerns of many inside and outside the military when he was reported to have expressed unease at the “excessive reverence” in which the British military is held. Criticism of the armed forces seems to be off-limits, while soldiers are uncomfortably paraded on The X-Factor – all for excellent charitable causes, of course, but further evidence of the “mawkishness” that worried the general. Whether the slew of military books loading the shelves of bookshops and supermarkets across the country is a cause or a symptom of this condition is not immediately obvious.

more from Patrick Hennessey at the FT here.

How I Killed Pluto

Pluto. Poor little guy. He never wanted much. The others could be bigger, they could be better-looking or brag about themselves (“I’m burning hot!” or “I have rings!” or “I support life!”). He didn’t care. All he wanted was to be part of the planet club. And for about 75 years, that tiny frozen world billions of miles from the sun was a card-carrying member. Then, in 2006, Pluto was kicked out — reclassified as a dwarf planet. The credit — or, for the outraged nine-planet fans, the blame — goes to the International Astronomical Union. It also goes to Caltech astronomer Mike Brown, who just couldn’t help finding other tantalizing objects at the edges of the solar system that challenged Pluto’s planetary status. “I would hear from many people who were sad about Pluto,” Brown writes in “How I Killed Pluto: And Why It Had It Coming.” “And I understood. Pluto was part of their mental landscape, the one they had constructed to organize their thinking about the solar system and their own place within it.”

more from Nick Owchar at the LA Times here.

not weird bad

Oh boy, yet another book about yet another modern thinker who suggests that “electronic interdependence” is the defining aspect of our time. All very ho-hum, except Marshall McLuhan, the subject of this book, figured it out 50 years before anybody ever updated his Facebook page or posted his whereabouts on Twitter. “Marshall McLuhan: You Know Nothing of My Work!” is an odd title for a weird book. Not weird bad, just weird in a way that makes you stop and think about what precisely the author, Douglas Coupland, is up to. Like the man it chronicles, Coupland’s book is full of unconventional angles, ricochets and resonances. Rather than offering a doorstop-size addition to the Great Man canon, it comes in at just over 200 pages that nonetheless sprawl and unfold to their own idiosyncratic rhythm. This is the kind of book that will deliver major annoyance to academics who have made a career out of deconstructing McLuhan’s effort to define the modern media ecosystem. But to a reader interested in a little serious fun, a dip into someone we pretend to understand but don’t really know, “You Know Nothing of My Work!” is a welcome taunt. The book rewards by refusing to slip into the numbing vortex of academic discourse, taking a fizzy, pop-culture approach to explaining a deep thinker, one who ended up popularized almost in spite of himself.

more from David Carr at the NY Times here.

How do we deal with a purposeless universe and the finality of death?

From The Guardian:

The séance that Charles Darwin attended in January 1874 at the house of his brother Erasmus brought the pioneering biologist together with Francis Galton, eugenicist and one of the founders of modern psychology, and the novelist George Eliot. All three were anxious that the rise of spiritualism would block the advance of scientific materialism. They were unimpressed with what they witnessed – Darwin found the experience “hot and tiring” and left before sparks were seen and rapping heard – but they would have been seriously concerned had they known the future career of a fourth participant in the séance, the classical scholar and psychologist FWH Myers.

The inventor of the word “telepathy” and the writer who first introduced the work of Freud into Britain, Frederic Myers went on to become one of the founders of the Society for Psychical Research. Supported by some of the leading figures of the day, including the Cambridge philosopher Henry Sidgwick and Arthur Balfour, president of the society and later prime minister, the psychical researchers believed human immortality might prove to be a scientifically demonstrable fact. Their quest for an afterlife was partly driven by revulsion against materialism. Science had revealed a world in which humans were no different from other animals in facing oblivion when they died and eventual extinction as a species. For nearly everyone the vision was intolerable. Not fully accepted by Darwin himself, it led the biologist and explorer Alfred Russel Wallace – acknowledged by Darwin as the co-discoverer of natural selection – to become a convert to spiritualism. Wallace insisted he did not reject scientific method. Like Sidgwick and Myers, he was convinced science could show the materialist view of the world to be mistaken.

More here.

Cyberspace When You’re Dead

From The New York Times:

Suppose that just after you finish reading this article, you keel over, dead. Perhaps you’re ready for such an eventuality, in that you have prepared a will or made some sort of arrangement for the fate of the worldly goods you leave behind: financial assets, personal effects, belongings likely to have sentimental value to others and artifacts of your life like photographs, journals, letters. Even if you haven’t made such arrangements, all of this will get sorted one way or another, maybe in line with what you would have wanted, and maybe not.

But many of us, in these worst of circumstances, would also leave behind things that exist outside of those familiar categories. Suppose you blogged or tweeted about this article, or dashed off a Facebook status update, or uploaded a few snapshots from your iPhone to Flickr, and then logged off this mortal coil. It’s now taken for granted that the things we do online are reflections of who we are or announcements of who we wish to be. So what happens to this version of you that you’ve built with bits? Who will have access to which parts of it, and for how long?

More here.

Rising-Tide Economics

In the twenty-first-century economy, growth and equality must go hand in hand.

Gene Sperling has just been appointed the new director of the National Economic Council by President Obama. To get a sense of his ideas and philosophy, one might look at this essay he published in 2007 in Democracy:

In my White House days, I was known for tormenting the speechwriters by insisting that we should rip off Ben Franklin’s caution that we “must indeed all hang together, or … hang separately” with the economic refrain “we will grow together or grow apart.” My line never made it into a speech, but with the spread of globalization it has never been more apt. Indeed, the question of whether spreading globalization and information technology (IT) is strengthening or hollowing out our middle class may be the most paramount economic issue of our time.

Perhaps a better phrase to capture the notion of shared prosperity was John F. Kennedy’s observation that “a rising tide lifts all boats.” For progressives, the rising-tide metaphor is not a causal assumption that growth will automatically raise everyone. Rather, it is the aspiration and test for economic policy: Does it both raise the tide and lift all boats? This vision of shared prosperity is not only demanded by the global, interdependent economy, but rooted in the historic values of the progressive vision of the United States. Moving forward, we must recognize that the economy is undergoing a profound transformation, making it distinct from both the industrial era and even the beginnings of the Internet Age just a decade ago. In such a world, economic growth can be explosive, but growth alone is not enough. For Americans, shared prosperity, an opportunity for upward mobility, and economic outcomes determined more by merit than the accident of birth are fundamental to who we are as a nation.

More here.

India Calling: An Intimate Portrait of a Nation’s Remaking

Gaiutra Bahadur in the New York Times Book Review:

In the middle of his accomplished book, “India Calling,” Anand Giridharadas tells of meeting a Maoist revolutionary in Hyderabad. The city, nicknamed Cyberabad, serves as a base for both the globalized Indian economy and an armed insurgency at war against the country’s inequalities, rooted and new. India’s Maoist — or Naxalite — movement began as a rural struggle against exploitative landlords in a caste-conscious, socialist nation but has now arrayed itself against the forces of global capitalism reshaping India. When Giridharadas pushes the Naxalite — What does one fight have to do with the other? — the man answers with a striking notion: globalization is reducing people to their specific economic task, stripping them of their humanity, just as caste had done. And software engineers in gated communities have become the new Brahmins. Giridharadas follows the curve of this argument, allowing it to seduce us. Then, he reveals that this rebel, although waging revolution by night, reports by day for a newspaper he himself describes as a shill for the multinational transformation of India. “I have to earn my lunch,” the man explains. “I’m not a whole-timer for revolution.”

The scene accentuates Giridharadas’s appeal as a writer. “India Calling” has what Hanif Kureishi once described as “the sex of a syllogism.” Full-figured ideas animate every turn. So, simultaneously, does Giridharadas’s eye for contradiction. The combination both pleases us and makes us wary — distrustful of shapely ideas, including the author’s own.

More here.

Timothy Donnelly’s 6 favorite contemporary poets

The Boston Review's poetry editor recommends authors whose poetry collections are as surprising as they are insightful.

From The Week:

Ten Walks/Two Talks by Jon Cotner and Andy Fitch (Ugly Duckling, $14). In this book about New York, inspired by the travel diaries of the Japanese poet Basho, Cotner and Fitch perfected a style—hip, wry, goofy, chill, patient, wide-eyed, curious, wise—that’s as difficult to pin down as it is infectious. Reading this book enhances the way you perceive what’s new as it gently reanimates what you think you already know.

The Waste Land and Other Poems by John Beer (Canarium, $14). This tongue-in-cheek homage to various literary monuments (including works by Marx, Rilke, and, of course, Eliot) is also a serious sendup of literary momentousness. Beer might have found 100 ways to go wrong in this audacious debut, but he writes his way around all of them and triumphs.

English Fragments: A Brief History of the Soul by Martin Corless-Smith (Fence, $19). With great discernment and one of the best-tuned ears in poetry today, British-born Corless-Smith sifts excerpts from his vast reading into lyric fragments of rare elegance.

More here.

7 Billion

By 2045 global population is projected to reach nine billion. Can the planet take the strain?

Robert Kunzig in National Geographic:

One day in Delft in the fall of 1677, Antoni van Leeuwenhoek, a cloth merchant who is said to have been the long-haired model for two paintings by Johannes Vermeer—“The Astronomer” and “The Geographer”—abruptly stopped what he was doing with his wife and rushed to his worktable. Cloth was Leeuwenhoek’s business but microscopy his passion. He’d had five children already by his first wife (though four had died in infancy), and fatherhood was not on his mind. “Before six beats of the pulse had intervened,” as he later wrote to the Royal Society of London, Leeuwenhoek was examining his perishable sample through a tiny magnifying glass. Its lens, no bigger than a small raindrop, magnified objects hundreds of times. Leeuwenhoek had made it himself; nobody else had one so powerful. The learned men in London were still trying to verify Leeuwenhoek’s earlier claims that unseen “animalcules” lived by the millions in a single drop of lake water and even in French wine. Now he had something more delicate to report: Human semen contained animalcules too. “Sometimes more than a thousand,” he wrote, “in an amount of material the size of a grain of sand.” Pressing the glass to his eye like a jeweler, Leeuwenhoek watched his own animalcules swim about, lashing their long tails. One imagines sunlight falling through leaded windows on a face lost in contemplation, as in the Vermeers. One feels for his wife.

More here.

Journal’s Paper on ESP Expected to Prompt Outrage

Benedict Carey in the NYT:

One of psychology’s most respected journals has agreed to publish a paper presenting what its author describes as strong evidence for extrasensory perception, the ability to sense future events.

The decision may delight believers in so-called paranormal events, but it is already mortifying scientists. Advance copies of the paper, to be published this year in The Journal of Personality and Social Psychology, have circulated widely among psychological researchers in recent weeks and have generated a mixture of amusement and scorn.

The paper describes nine unusual lab experiments performed over the past decade by its author, Daryl J. Bem, an emeritus professor at Cornell, testing the ability of college students to accurately sense random events, like whether a computer program will flash a photograph on the left or right side of its screen. The studies include more than 1,000 subjects.

Some scientists say the report deserves to be published, in the name of open inquiry; others insist that its acceptance only accentuates fundamental flaws in the evaluation and peer review of research in the social sciences.

“It’s craziness, pure craziness. I can’t believe a major journal is allowing this work in,” Ray Hyman, an emeritus professor of psychology at the University of Oregon and longtime critic of ESP research, said. “I think it’s just an embarrassment for the entire field.”

Are English Departments Killing the Humanities?

Feisal G. Mohamed in Dissent:

The focus of this post is not the thousand-and-one times told tale of how the corporatization of the university and state divestment from higher education have had a particularly disastrous impact upon humanities departments. There are several informed and important books on those economic realities, of which Stanley Fish provides a partial bibliography in a recent blog post and David A. Bell provides a review essay in the Fall 2010 print issue of Dissent. We can treat these realities as facts to be taken for granted. But even as we strain against such pressures, we can engage in difficult self-scrutiny. We might wonder if there are conditions of intellectual deprivation for which the institutional structures governing the humanities are partly to blame. And any such consideration must look squarely at that elephant in the olive grove, the English department, and ask if it does more harm than good.

This upstart institution has had a brief if also voracious life. Professorships of English language and literature began to appear in earnest in the late nineteenth century, spurred by an unlikely and uneasy alliance between philological study and the kind of civilizing errand of literature one might associate with Matthew Arnold. For Arnoldians, literature would play the cultural role once occupied by religion, with beauty civilizing the modern individual. Such views reached their climax in the era of the Second World War, with the cultural mobilization against fascism that made liberal values seem all the more worth cultivating in their fragility.

Interactive Art: What Video Games Can Learn from Freud

Rob Goodman in The Millions:

What if the best thing art has to offer is freedom from choice?

There’s a reason it’s high praise, not criticism, to say that a film or a piece of music or a good novel “sweeps you along.” There’s a selflessness in it: not just the pleasure in pausing the parts of the brain that plan and calculate and select, but in the temporary surrender of investing in someone else’s choices. Good art can be where we go for humility: when we’re encouraged to treat each of our thoughts as worthy of being made public, it can be almost counter-cultural to admit, in the act of being swept along, that someone else is simply better at arranging the keys of a song or the twists of a book and making them look like fate.

Freedom from choice is a seductive way of thinking about art—and it’s at the heart of the debate over the cultural value of video games. Video games, for their cultural boosters, promise an art based on choice: an interactive art, possibly the first ever. For their detractors, “interactive art” is a contradiction in terms. Critics can point to video games’ narrative clichés or sloppy dialogue or a faith in violence as the answer to everything; but at base, they seem to be bothered by the idea of an art form that can be “played.” Choice is their bright line.

Last spring, Roger Ebert nominated himself to hold that bright line on his blog. And though the 4,799 comments (to date) on his original post weighed in overwhelmingly against his claim that “Video games can never be art,” and encouraged him to back off of that blanket assertion, he summed up as eloquently as anyone the danger posed to narrative by video games’ possibility of limitless choice:

If you can go through “every emotional journey available,” doesn’t that devalue each and every one of them? Art seeks to lead you to an inevitable conclusion, not a smorgasbord of choices. If next time I have Romeo and Juliet go through the story naked and standing on their hands, would that be way cool, or what?

It’s possible, as Owen Good did, to write off the whole argument as empty, just a chest-thumping proxy war between generations or subcultures: “Art is fundamentally built on the subjective: inspiration, interpretation and appraisal. To me that underlines the pointlessness of the current debate for or against video games as art….Is there some validation the games community seeks but isn’t getting right now?”