From The Atlantic Monthly (this review was originally published in August 1906):
When Mrs. Wharton’s stories first appeared, in that early period which, as we have now learned, was merely a period of apprenticeship, everybody said, “How clever!” “How wonderfully clever!” and the criticism — to adopt a generic term for indiscriminate adjectives — was apt, for the most conspicuous trait in the stories was cleverness. They were astonishingly clever; and their cleverness, as an ostensible quality will, caught and held the attention. And yet, though undoubtedly correct, the term owes its correctness, in part at least, to its ready-to-wear quality, to its negative merit of vague amplitude, behind which the most diverse gifts and capacities may lie concealed. No readers of Mrs. Wharton, after the first shock of bewildered admiration, rest content with it, but grope about to lift the cloaking surtout of cleverness and to see as best they may how and by what methods her preternaturally nimble wits are playing their game, — for it is a game that Mrs. Wharton plays, pitting herself against a situation to see how much she can score.
To most people the point she plays most brilliantly is the episode, which in the novel is merely one of the links in the concatenation of the plot, but in the short story is the form and substance, the very thing itself; and so to be mistress of the art of the episode almost seems to leave any other species of mastery irrelevant and superfluous. In Mrs. Wharton this aptitude is not single, but a combination. It includes the sense of proportion, and markedly that elementary proportion of allotting the proper space for the introduction of the story, — so much to bring the dramatis personae into the ring, so much for the preliminary bouts, so much for the climax, and, finally, the proper length for the recessional. It includes the subordination of one character to another, of one picture to another, the arrangement of details in proper hierarchy to produce the desired effect.
More here.
From Scientific American:
A new study finds that neural stem cells may be able to save dying brain cells without transforming into new brain tissue, at least in rodents. Researchers from the University of California, Irvine, report that stem cells rejuvenated the learning and memory abilities of mice engineered to lose neurons in a way that simulated the aftermath of Alzheimer’s disease, stroke and other brain injuries.
Researchers expect stem cells to transform into replacement tissue capable of replacing damaged cells. But in this case, the undifferentiated stem cells, harvested from 14-day-old mouse brains, did not simply replace neurons that had died off. Rather, the group speculates that the transplanted cells secreted protective neurotrophins, proteins that promote cell survival by keeping neurons from inducing apoptosis (programmed cell death). Instead of dying, the once ill-fated neurons strengthened their interconnections and kept functioning.
More here.
Thursday, November 1, 2007
Lisa Belkin in the NYT:
DON’T get angry. But do take charge. Be nice. But not too nice. Speak up. But don’t seem like you talk too much. Never, ever dress sexy. Make sure to inspire your colleagues — unless you work in Norway, in which case, focus on delegating instead.
Writing about life and work means receiving a steady stream of research on how women in the workplace are viewed differently from men. These are academic and professional studies, not whimsical online polls, and each time I read one I feel deflated. What are women supposed to do with this information? Transform overnight? And if so, into what? How are we supposed to be assertive, but not, at the same time?
“It’s enough to make you dizzy,” said Ilene H. Lang, the president of Catalyst, an organization that studies women in the workplace. “Women are dizzy, men are dizzy, and we still don’t have a simple straightforward answer as to why there just aren’t enough women in positions of leadership.”
Jess deCourcy Hinds at Small Spiral Notebook:

Rotten English is the freshest anthology you’ll pluck off the shelves this summer. There’s simply no other book like it. A collection of two centuries of world literature, each page snaps with flavor and color. So why is it considered “rotten”? Because this vernacular writing breaks grammatical rules and alters spelling to capture the nuances of pronunciation. Here, you’ll find Langston Hughes, Rudyard Kipling, Amy Tan, and many lesser-known but equally compelling voices that demonstrate the boldness of vernacular writing—artistically and politically.
“If Black English Isn’t Language, What Is?”, the title of a James Baldwin essay in this collection, sums up the message of this book. Just replace the word “black” with the other nationalities included here. In other words, if these Pakistani, Chinese, Chicano, New Zealander, and Jamaican vernacular Englishes aren’t worthy, then what is?
In Bookforum, Gershom Gorenberg on Making Israel (Benny Morris, ed.), Nakba (eds., Ahmad H. Sa’di and Lila Abu-Lughod), and historiography:
Within Israeli discourse, Morris’s devaluation of oral testimony served to break the hegemony of the founding generation, those who remembered the war. As seen by Palestinians, though, he is maintaining Israeli power over history. He is silencing the victims, who do not have archives precisely because of the catastrophe of 1948. For Sa’di, a lecturer in politics and government at Ben-Gurion University in Beersheba, justice demands affirming the victims’ story, but it also requires listening to them speak. Memory serves the Palestinians as plaintiffs, as he and Abu-Lughod write in their introduction: It “asserts Palestinian political and moral claims to justice, redress, and the right to return.”
Sa’di asserts that the Israeli archival evidence complements Palestinian testimony. Samera Esmeir argues the opposite in an essay on the dispute over whether Israeli soldiers carried out a massacre at the village of Tantura in 1948. The issue reached an Israeli court in 2000 in a libel suit by veterans against an Israeli researcher, Theodore Katz. Esmeir challenges the court’s preference for Israeli state documents over Palestinian testimony. She asserts that “the very project of the state” requires erasing atrocities against Palestinians. Palestinian villagers’ oral testimony is useful, she argues. If it is contradictory, incomplete, and incoherent, that’s partly a result of the traumatic experience and the shattering of community.
Doree Shafrir in The New York Observer:

One of the [Paris Review’s] current board members, Antonio Weiss, who is a managing director in Paris at the investment bank Lazard, is Plimpton’s former assistant and a former editor at the magazine, and is married to the magazine’s Paris editor, Susannah Hunnewell. He recalled that he was an editor of the literary magazine as an undergraduate at Yale, “which was sort of a link into The Paris Review,” he told The Observer by phone. “I got to know George just by being around.”
Does that New York really still exist? In some ways, that’s the question that faces Mr. Gourevitch’s Paris Review. He probably wouldn’t put it that way, but he does think that a magazine has to be relevant, has to be of its time.
“Even the ones that are really great, they belong to a moment, a certain kind of getting together of energy and taste,” he said. “And often the editors themselves are new writers, and everyone either fails miserably or succeeds spectacularly, and the energy is not in that place anymore and another group starts up another magazine.”
Mr. Gourevitch’s Paris Review is another magazine. Though he never, exactly, criticizes his predecessor, and certainly not by name, Mr. Gourevitch seems to regard Plimpton’s tenure as one of some rather unrealized potential.
Zinovy Zinik in Eurozine:
Is the notion of the émigré author a dated phenomenon that has outlived itself in the age of global communications? I don’t think so. I think it is still a useful concept to define a specific type of literature. While the native author deals with moral ambiguities by proxy, using his characters, the personality of the émigré writer is part of his fiction’s plot – he himself has to decide on which side of the border his mind is. What an ordinary human being lives through, the writer tries to describe. What for an ordinary writer is mental exercise, for the émigré author is lived experience. The émigré writer physically lives this metaphor of life in transit. (Elias Canetti, my neighbour in Hampstead, preferred to write sitting in his car, parked in front of his house.) The dilemma of the émigré author is, therefore, linked with his sense of belonging; and since he is a writer, the question arises for whom he writes and where his audience is located. The citizenship of the émigré writer is not necessarily that of the country of his main readership, and his sense of belonging or his religion might differ from his loyalty as a citizen of the country of his residence. That is, the émigré writer is the one who feels he is displaced – geographically or in language, separated from his readers in one way or another.
Vampires, doomed to exist between two worlds forever, provide the ultimate example of the mental state of exile. But they are émigrés of a very specific kind: they don’t cast a shadow. In other words, they have no real identity in this world. The writer’s existence in the outside world is measured by the influence that his creation exerts – by the shadows his words cast. Vampires are like émigré writers, understood neither in the country of their dwelling, nor able to reach across the border to their readers in the motherland.
Daniel Grant in the Chronicle of Higher Education:
Job security is a relatively new concept in the ancient field of teaching art. Historically artists have created, and been judged on, their own credentials — that is, their art. And the master of fine-arts degree, often described as a “terminal degree,” or the endpoint in an artist’s formal education, has long been sufficient for artists seeking to teach at the college level. But significant change may be on the horizon, as increasing numbers of college and university administrators are urging artists to obtain doctoral degrees.
We shouldn’t be surprised; the M.F.A. has been under attack for some time now. The M.F.A. has become a problem for many administrators, who are increasingly uncomfortable with different criteria for different faculty members. They understand the lengthy process required to earn a doctorate — of which the master’s degree is only a small, preliminary part — and see hiring a Ph.D. over an M.F.A. as the difference between buying a fully loaded showroom automobile and a chassis. Administrators like the background Ph.D.’s have in research, publishing, and grant writing (though if their principal concern were the teaching of studio art to undergraduates, they wouldn’t focus so much on the doctorate).
More here.
Theodore Dalrymple in City Journal:
The British parliament’s first avowedly atheist member, Charles Bradlaugh, would stride into public meetings in the 1880s, take out his pocket watch, and challenge God to strike him dead in 60 seconds. God bided his time, but got Bradlaugh in the end. A slightly later atheist, Bertrand Russell, was once asked what he would do if it proved that he was mistaken and if he met his maker in the hereafter. He would demand to know, Russell replied with all the high-pitched fervor of his pedantry, why God had not made the evidence of his existence plainer and more irrefutable. And Jean-Paul Sartre came up with a memorable line: “God doesn’t exist—the bastard!”
Sartre’s wonderful outburst of disappointed rage suggests that it is not as easy as one might suppose to rid oneself of the notion of God. (Perhaps this is the time to declare that I am not myself a believer.) At the very least, Sartre’s line implies that God’s existence would solve some kind of problem—actually, a profound one: the transcendent purpose of human existence. Few of us, especially as we grow older, are entirely comfortable with the idea that life is full of sound and fury but signifies nothing.
More here.

It’s 39 years since Jane Birkin fell in love with Serge Gainsbourg, 27 years since they split up, and 16 years since Gainsbourg died, but you’d never guess. Paris has never let its most iconic couple separate – you can, Birkin says, still not get through a day in this city without hearing the immortal intimacies of ‘Je t’aime… moi non plus’ from somewhere – and anyway Birkin herself, at 60, has chosen to be living proof that love can survive divorce and death. She still spends most nights with Gainsbourg, singing his songs on an endless cabaret tour, breathing life into words he wrote with her, his muse, in mind. Birkin’s apartment, just off the Boulevard St Germain, decked in crimson silk, cast in permanent twilight, crammed with old photographs and a collection of stuffed animals, is made for this perpetual seance. She shares it with a corpulent bulldog, Dora, who lounges on a chaise.
more from The Observer Review here.

It is not hard to understand why John Cowper Powys has never had the recognition he deserves as one of the twentieth century’s most remarkable novelists. Until he was nearly sixty he earned his living as an itinerant lecturer, much of the time in America, where he thrilled his audiences by his seeming ability to transmit, medium-like, the inmost thoughts of the writers he loved. For a time Powys’s electrifying performance as a kind of literary magus attracted a considerable following. His admirers included some of America’s best-known writers – Theodore Dreiser was a notable supporter, for example – but Powys’s method of ‘dithyrambic analysis’ never caught on. An idiosyncratic exercise that he described as ‘hollowing himself out’ so he could become the writer he was interpreting, it was too obviously adapted to the needs of the lecture circuit and the quirks of Powys’s personality to have any lasting influence. Powys removed himself further from any kind of critical acceptance when, in an effort to generate an income that would enable him to give up lecturing, he published a series of self-help manuals. With titles like The Art of Forgetting the Unpleasant and In Defence of Sensuality, these forays into popular psychology were refreshingly unorthodox in their prescriptions for personal happiness; but they reinforced the perception of Powys as an eccentric figure flailing about on the outer margins of literary and academic respectability.
more from Literary Review here.

THE BELIEVER: In your experience of knowing artists, do you think there’s a discrepancy between why artists tell themselves they’re making art, and the actual reason you perceive them to be making art?
DAVE HICKEY: In my experience, you always think you know what you’re doing; you always think you can explain, but you always discover, years later, that you didn’t and you couldn’t. This leads me to suspect that the principal function of human reason is to rationalize what your lizard brain demands of you. That’s my idea. Art and writing come from somewhere down around the lizard brain. It’s a much more peculiar activity than we like to think it is. The problems arise when we try to domesticate the practice, to pretend that it’s a normal human activity and that “everybody’s creative.” They’re not. Honestly, I never sit down to write anything without thinking, This is a weird thing to be doing! Why am I sitting here writing? Why am I looking at the Ellsworth Kelly on my wall? I don’t know. It feels funny to do these things, but it feels funnier not to, so I write and look.
more from The Believer here.

His reticence, it might be said, extended to narrative itself. Beyond brilliantly meshing visual form with theme—empty canvases with empty lives—Antonioni contributed early to cinema’s migration from Victorian narrative modes, as necessary and welcome a move as was that from Great Expectations to Mrs. Dalloway for literature. Beginning with L’avventura, his films are firmly liberated from Hollywood’s obsessive insistence on the conclusive denouement, on tying things up, whether for better (Mildred Pierce; Stagecoach) or worse (Sunset Boulevard). This was not easy or profitable for the director. The sophisticated audience at Cannes in 1960 was no more prepared than the general public to watch a film whose ostensible heroine not only disappears but is forgotten by the other characters. Probably expecting another film noir, where the body would be found and the mystery solved, the Cannes crowd booed vigorously. But, as Antonioni explained, L’avventura was a noir in reverse. Fortunately, the audience’s disapproval was quickly rejected by great cineastes and critics alike.
more from Artforum here.
From Edge:
What is the self? How does the activity of neurons give rise to the sense of being a conscious human being? Even this most ancient of philosophical problems, I believe, will yield to the methods of empirical science. It now seems increasingly likely that the self is not a holistic property of the entire brain; it arises from the activity of specific sets of interlinked brain circuits. But we need to know which circuits are critically involved and what their functions might be. It is the “turning inward” aspect of the self — its recursiveness — that gives it its peculiar paradoxical quality.
In a wide-ranging talk, Vilayanur Ramachandran explores how brain damage can reveal the connection between the internal structures of the brain and the corresponding functions of the mind. He talks about phantom limb pain, synesthesia (when people hear color or smell sounds), and the Capgras delusion, when brain-damaged people believe their closest friends and family have been replaced with imposters.
More here.
From Science:
What makes Maine coons cuddly and Russian shorthairs standoffish? The answer may lie in the first sequence of the cat genome, published today. In all, geneticists have turned up 20,285 genes, the analysis of which could shed light on everything from human diseases to the underpinnings of feline domestication. The latest sequence comes courtesy of a 4-year-old Abyssinian cat named Cinnamon. The analysis is only a first pass and so is not as complete as that done for humans or dogs. Still, evolutionary biologist Stephen O’Brien and bioinformaticist Joan Pontius of the National Cancer Institute in Frederick, Maryland, and their colleagues were able to decipher about 65% of the gene-containing parts of the genome, or more than 20,000 genes. That’s perhaps 95% of the total, based on comparisons with other genomes.
Based on the structure of the chromosomes, the cat genome much more closely resembles that of humans than that of other nonprimate species. Unlike cats and humans, pieces of chromosomes in “the dog, mouse, rat, and others have been reshuffled like a poker deck,” says O’Brien. This conservation suggests that the cat’s genome has more in common with the ancient ancestor of cats, humans, and other mammals than, say, the dog’s does. Other surprises included an excessive amount of mitochondrial DNA stuck into the cat genome, although researchers have yet to figure out the reason or significance.
More here.
Wednesday, October 31, 2007
From Nature:
It’s not often that research results look this good. An elegant new way to visualize individual brain cells not only provides a major boost to scientists trying to understand how the brain works, but has also won one of its developers a major prize in science photography. The method — described by neuroscientists at Harvard University in Cambridge, Massachusetts, in today’s Nature — allows researchers to see more clearly how individual neurons connect with each other by colouring each one from a palette of about 90 shades. In this way they will be able to build up a detailed diagram of the brain’s wiring, which will help scientists study how it computes.
More than a century ago, neuroscientists developed the first method of staining individual neurons — with silver chromate. Work with this technique was the basis of the Nobel Prize in Physiology or Medicine in 1906. But this could only stain neurons with one colour. Only in the last decade have scientists improved on this technique, using genetic engineering to transfer genes for fluorescent proteins into mice such that they are expressed in neurons. But until now they could transfer no more than two fluorescent-protein genes at a time, lighting up the brain with two colours. “It was clear that two colours were not enough to map connections efficiently in the brain’s complex tangle of neurons,” says Joshua Sanes, one of the paper’s senior scientists.
More here.
Natasha Tsangarides in Electronic Intifada:

Despite the promise of the international community, the camp [Shatila] was later desecrated. Using oral histories, Naji and the audience are taken into the world of Shatila refugee camp, where it is estimated that between 2,000 and 3,500 people were murdered in 1982. Here, we learn of the testimonies of a population displaced not once, but several times, who endure hardship within a hostile environment.
The play [Sunlight at Midnight] is thought-provoking, bringing into question many themes such as identity formation, exile and the power of memory, while simultaneously highlighting the failures of the international community and the need to commemorate this tragedy.
As we are introduced to different characters throughout the play, we gain an insight into selective memory and historical narrative. The process of exile has affected each person differently, with the sense of belonging to a Palestinian heritage stronger within the camp. History adopts two meanings in the two worlds we enter: one is associated with power, knowledge and purity, and the other with something threatening or irrelevant.
In the LRB:
This picture – that our minds were formed by processes of evolutionary adaptation, and that the environment they are adapted to isn’t the one that we now inhabit – has had, of late, an extraordinarily favourable press. Darwinism has always been good copy because it has seemed closer to our core than most other branches of science: botany, say, or astronomy or hydrodynamics. But if this new line of thought is anywhere near right, it is closer than we had realised. What used to rile Darwin’s critics most was his account of the phylogeny of our species. They didn’t like our being just one branch among many in the evolutionary tree; and they liked still less having baboons among their family relations. The story of the consequent fracas is legendary, but that argument is over now. Except, perhaps, in remote backwaters of the American Midwest, the Darwinian account of our species’ history is common ground in all civilised discussions, and so it should be. The evidence really is overwhelming.
But Darwin’s theory of evolution has two parts. One is its familiar historical account of our phylogeny; the other is the theory of natural selection, which purports to characterise the mechanism not just of the formation of species, but of all evolutionary changes in the innate properties of organisms. According to selection theory, a creature’s ‘phenotype’ – the inventory of its heritable traits, including, notably, its heritable mental traits – is an adaptation to the demands of its ecological situation. Adaptation is a name for the process by which environmental variables select among the creatures in a population the ones whose heritable properties are most fit for survival and reproduction. So environmental selection for fitness is (perhaps plus or minus a bit) the process par excellence that prunes the evolutionary tree.
More often than not, both halves of the Darwinian synthesis are uttered in the same breath; but it’s important to see that the phylogeny could be true even if the adaptationism isn’t.

Fifty years ago, New American Library published the Mentor Philosophers series, each with a title beginning The Age of . . . Belief, Ideology, Reason, and so on; the 20th-century selections bore the title The Age of Analysis. Had the series continued to the end of that century and into this, the volume should no doubt be The Age of Apology. Our postmodern ethos seems to hold that if anything can be proved to have happened, then surely someone needs to apologize for it.
We live amid a veritable tsunami of apology. The Catholic Church, which, of course, has much to apologize for, has, of late, offered mea culpas to Galileo, the Jews, the gypsies, Jan Hus, whom it burned at the stake in 1415, even to Constantinople (now Istanbul) for its sacking 800 years ago by the knights of the Fourth Crusade, an event for which the late John Paul II expressed “deep regret.” No wonder that a group in England, claiming descent from the medieval Knights Templars, is asking the Vatican to apologize for the violent suppression of the order and for torturing to death its Grand Master Jacques de Molay in 1314, an apology timed to commemorate the 700th anniversary of that fell deed.
more from The American Scholar here.

Thousands of Cubans and foreigners have been flocking to a mausoleum in central Cuba to commemorate the 40th anniversary of Che Guevara’s death. For 10 years, the Cuban government has been telling the world that the body inside the mausoleum is that of the famous guerrilla.
It’s a lie designed to bamboozle the population into worshiping the Argentine-born revolutionary as if he were a saint–and the Cuban Revolution as if it were a religion. A brilliant investigation by French journalist Bertrand de la Grange, recently published in Spain’s El Pais newspaper, demolishes the official version.
In 1995, Bolivian Gen. Mario Vargas, who had fought Che’s guerrillas in the 1960s, revealed that the revolutionary’s body was buried a few meters from the airport runway in Vallegrande, a town close to La Higuera, the village in eastern Bolivia where Guevara was killed on Oct. 9, 1967. (Guevara had been executed after the Bolivian president ordered the soldiers who ambushed and captured him to get rid of him.) Cuba sent a forensic, diplomatic and legal team to Vallegrande. On June 28, 1997, they claimed to have found the body, which was brought to Cuba a few weeks before the 30th anniversary of the guerrilla’s death.
Numerous facts belie the Cuban claim.
more from TNR here.