Zinovy Zinik in Eurozine:
Is the notion of the émigré author a dated phenomenon that has outlived itself in the age of global communications? I don’t think so. I think it is still a useful concept to define a specific type of literature. While the native author deals with moral ambiguities by proxy, using his characters, the personality of the émigré writer is part of his fiction’s plot – he himself has to decide on which side of the border his mind is. What an ordinary human being lives through, the writer tries to describe. What for an ordinary writer is mental exercise, for the émigré author is lived experience. The émigré writer physically lives this metaphor of life in transit. (Elias Canetti, my neighbour in Hampstead, preferred to write sitting in his car, parked in front of his house.) The dilemma of the émigré author is, therefore, linked with his sense of belonging; and since he is a writer, the question arises for whom he writes and where his audience is located. The citizenship of the émigré writer is not necessarily that of the country of his main readership, and his sense of belonging or his religion might differ from his loyalty as a citizen of the country of his residence. That is, the émigré writer is the one who feels he is displaced – geographically or in language, separated from his readers in one way or another.
Vampires, doomed to exist between two worlds forever, provide the ultimate example of the mental state of exile. But they are émigrés of a very specific kind: they don’t cast a shadow. In other words, they have no real identity in this world. The writer’s existence in the outside world is measured by the influence that his creation exerts – by the shadows his words cast. Vampires are like émigré writers, understood neither in the country of their dwelling, nor able to reach across the border to their readers in the motherland.
Daniel Grant in the Chronicle of Higher Education:
Job security is a relatively new concept in the ancient field of teaching art. Historically artists have created, and been judged on, their own credentials — that is, their art. And the master of fine-arts degree, often described as a “terminal degree,” or the endpoint in an artist’s formal education, has long been sufficient for artists seeking to teach at the college level. But significant change may be on the horizon, as increasing numbers of college and university administrators are urging artists to obtain doctoral degrees.
We shouldn’t be surprised; the M.F.A. has been under attack for some time now. The M.F.A. has become a problem for many administrators, who are increasingly uncomfortable with different criteria for different faculty members. They understand the lengthy process required to earn a doctorate — of which the master’s degree is only a small, preliminary part — and see hiring a Ph.D. over an M.F.A. as the difference between buying a fully loaded showroom automobile and a chassis. Administrators like the background Ph.D.’s have in research, publishing, and grant writing (though if their principal concern were the teaching of studio art to undergraduates, they wouldn’t focus so much on the doctorate).
More here.
Theodore Dalrymple in City Journal:
The British parliament’s first avowedly atheist member, Charles Bradlaugh, would stride into public meetings in the 1880s, take out his pocket watch, and challenge God to strike him dead in 60 seconds. God bided his time, but got Bradlaugh in the end. A slightly later atheist, Bertrand Russell, was once asked what he would do if it proved that he was mistaken and if he met his maker in the hereafter. He would demand to know, Russell replied with all the high-pitched fervor of his pedantry, why God had not made the evidence of his existence plainer and more irrefutable. And Jean-Paul Sartre came up with a memorable line: “God doesn’t exist—the bastard!”
Sartre’s wonderful outburst of disappointed rage suggests that it is not as easy as one might suppose to rid oneself of the notion of God. (Perhaps this is the time to declare that I am not myself a believer.) At the very least, Sartre’s line implies that God’s existence would solve some kind of problem—actually, a profound one: the transcendent purpose of human existence. Few of us, especially as we grow older, are entirely comfortable with the idea that life is full of sound and fury but signifies nothing.
More here.

It’s 39 years since Jane Birkin fell in love with Serge Gainsbourg, 27 years since they split up, and 16 years since Gainsbourg died, but you’d never guess. Paris has never let its most iconic couple separate – you can, Birkin says, still not get through a day in this city without hearing the immortal intimacies of ‘Je t’aime… moi non plus’ from somewhere – and anyway Birkin herself, at 60, has chosen to be living proof that love can survive divorce and death. She still spends most nights with Gainsbourg, singing his songs on an endless cabaret tour, breathing life into words he wrote with her, his muse, in mind. Birkin’s apartment, just off the Boulevard St Germain, decked in crimson silk, cast in permanent twilight, crammed with old photographs and a collection of stuffed animals, is made for this perpetual seance. She shares it with a corpulent bulldog, Dora, who lounges on a chaise.
more from The Observer Review here.

It is not hard to understand why John Cowper Powys has never had the recognition he deserves as one of the twentieth century’s most remarkable novelists. Until he was nearly sixty he earned his living as an itinerant lecturer, much of the time in America, where he thrilled his audiences by his seeming ability to transmit, medium-like, the inmost thoughts of the writers he loved. For a time Powys’s electrifying performance as a kind of literary magus attracted a considerable following. His admirers included some of America’s best-known writers – Theodore Dreiser was a notable supporter, for example – but Powys’s method of ‘dithyrambic analysis’ never caught on. An idiosyncratic exercise that he described as ‘hollowing himself out’ so he could become the writer he was interpreting, it was too obviously adapted to the needs of the lecture circuit and the quirks of Powys’s personality to have any lasting influence. Powys removed himself further from any kind of critical acceptance when, in an effort to generate an income that would enable him to give up lecturing, he published a series of self-help manuals. With titles like The Art of Forgetting the Unpleasant and In Defence of Sensuality, these forays into popular psychology were refreshingly unorthodox in their prescriptions for personal happiness; but they reinforced the perception of Powys as an eccentric figure flailing about on the outer margins of literary and academic respectability.
more from Literary Review here.

THE BELIEVER: In your experience of knowing artists, do you think there’s a discrepancy between why artists tell themselves they’re making art, and the actual reason you perceive them to be making art?
DAVE HICKEY: In my experience, you always think you know what you’re doing; you always think you can explain, but you always discover, years later, that you didn’t and you couldn’t. This leads me to suspect that the principal function of human reason is to rationalize what your lizard brain demands of you. That’s my idea. Art and writing come from somewhere down around the lizard brain. It’s a much more peculiar activity than we like to think it is. The problems arise when we try to domesticate the practice, to pretend that it’s a normal human activity and that “everybody’s creative.” They’re not. Honestly, I never sit down to write anything without thinking, This is a weird thing to be doing! Why am I sitting here writing? Why am I looking at the Ellsworth Kelly on my wall? I don’t know. It feels funny to do these things, but it feels funnier not to, so I write and look.
more from The Believer here.

His reticence, it might be said, extended to narrative itself. Beyond brilliantly meshing visual form with theme—empty canvases with empty lives—Antonioni contributed early to cinema’s migration from Victorian narrative modes, as necessary and welcome a move as was that from Great Expectations to Mrs. Dalloway for literature. Beginning with L’avventura, his films are firmly liberated from Hollywood’s obsessive insistence on the conclusive denouement, on tying things up, whether for better (Mildred Pierce; Stagecoach) or worse (Sunset Boulevard). This was not easy or profitable for the director. The sophisticated audience at Cannes in 1960 was no more prepared than the general public to watch a film whose ostensible heroine not only disappears but is forgotten by the other characters. Probably expecting another film noir, where the body would be found and the mystery solved, the Cannes crowd booed vigorously. But, as Antonioni explained, L’avventura was a noir in reverse. Fortunately, the audience’s disapproval was quickly rejected by great cineastes and critics alike.
more from Artforum here.
From Edge:
What is the self? How does the activity of neurons give rise to the sense of being a conscious human being? Even this most ancient of philosophical problems, I believe, will yield to the methods of empirical science. It now seems increasingly likely that the self is not a holistic property of the entire brain; it arises from the activity of specific sets of interlinked brain circuits. But we need to know which circuits are critically involved and what their functions might be. It is the “turning inward” aspect of the self — its recursiveness — that gives it its peculiar paradoxical quality.
In a wide-ranging talk, Vilayanur Ramachandran explores how brain damage can reveal the connection between the internal structures of the brain and the corresponding functions of the mind. He talks about phantom limb pain, synesthesia (when people hear color or smell sounds), and the Capgras delusion, when brain-damaged people believe their closest friends and family have been replaced with imposters.
More here.
From Science:
What makes Maine Coons cuddly and Russian shorthairs standoffish? The answer may lie in the first sequence of the cat genome, published today. In all, geneticists have turned up 20,285 genes, the analysis of which could shed light on everything from human diseases to the underpinnings of feline domestication. The latest sequence comes courtesy of a 4-year-old Abyssinian cat named Cinnamon. The analysis is only a first pass and so is not as complete as that done for humans or dogs. Still, evolutionary biologist Stephen O’Brien and bioinformaticist Joan Pontius of the National Cancer Institute in Frederick, Maryland, and their colleagues were able to decipher about 65% of the gene-containing parts of the genome, or more than 20,000 genes. That’s perhaps 95% of the total, based on comparisons with other genomes.
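A quick sanity check on those figures — a two-line sketch, taking the article’s 95% estimate at face value:

```python
# If the 20,285 genes recovered so far are ~95% of the cat's full gene set
# (the article's estimate, based on comparison with other genomes),
# the total complement works out to roughly 21,000 genes.
genes_found = 20_285
coverage = 0.95
print(f"estimated total: ~{genes_found / coverage:,.0f} genes")  # ~21,353
```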
Based on the structure of the chromosomes, the cat genome much more closely resembles that of humans than that of other nonprimate species. Unlike cats and humans, pieces of chromosomes in “the dog, mouse, rat, and others have been reshuffled like a poker deck,” says O’Brien. This conservation suggests that the cat’s genome has more in common with the ancient ancestor of cats, humans, and other mammals than, say, the dog’s does. Other surprises included an excessive amount of mitochondrial DNA stuck into the cat genome, although researchers have yet to figure out the reason or significance.
More here.
Wednesday, October 31, 2007
From Nature:
It’s not often that research results look this good. An elegant new way to visualize individual brain cells not only provides a major boost to scientists trying to understand how the brain works, but has also won one of its developers a major prize in science photography. The method — described by neuroscientists at Harvard University in Cambridge, Massachusetts, in today’s Nature — allows researchers to see more clearly how individual neurons connect with each other by colouring each one from a palette of about 90 shades. In this way they will be able to build up a detailed diagram of the brain’s wiring, which will help them study how it computes.
More than a century ago, neuroscientists developed the first method of staining individual neurons — with silver chromate. Work with this technique was the basis of the Nobel Prize in Physiology or Medicine in 1906. But it could stain neurons in only one colour. Only in the last decade have scientists improved on this technique, using genetic engineering to transfer genes for fluorescent proteins into mice such that they are expressed in neurons. But until now they could transfer no more than two fluorescent-protein genes at a time, lighting up the brain in two colours. “It was clear that two colours were not enough to map connections efficiently in the brain’s complex tangle of neurons,” says Joshua Sanes, one of the paper’s senior authors.
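The ninety-odd shades come not from ninety different proteins but from combinatorics: multiple copies of the construct land in each neuron, each copy independently settles on one of a few fluorescent proteins, and the cell’s hue is the resulting mixture. A minimal counting sketch — the copy and protein numbers here are illustrative assumptions, not figures from the paper:

```python
from math import comb

def palette_size(copies: int, proteins: int) -> int:
    """Distinguishable colours when each of `copies` cassette copies
    independently expresses one of `proteins` fluorescent proteins.
    A cell's hue is the multiset of choices (the expression ratio),
    so count multisets of size `copies` drawn from `proteins` options."""
    return comb(copies + proteins - 1, proteins - 1)

# Three fluorescent proteins and a dozen independently recombining
# copies already give 91 distinguishable mixtures -- the order of
# the ~90 shades the article mentions.
print(palette_size(12, 3))  # 91
```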
More here.
Natasha Tsangarides in Electronic Intifada:
Despite the promise of the international community, the camp [Shatila] was later desecrated. Through oral histories, Naji and the audience are taken into the world of the Shatila refugee camp, where it is estimated that between 2,000 and 3,500 people were murdered in 1982. Here we encounter the testimonies of a population displaced not once but several times, enduring hardship in a hostile environment.
The play [Sunlight at Midnight] is thought-provoking, taking up themes such as identity formation, exile and the power of memory, while simultaneously highlighting the failures of the international community and the need to commemorate this tragedy.
As we are introduced to different characters throughout the play, we gain an insight into selective memory and historical narrative. The process of exile has affected each person differently, with the sense of belonging to a Palestinian heritage stronger within the camp. History adopts two meanings in the two worlds we enter: one is associated with power, knowledge and purity, and the other with something threatening or irrelevant.
In the LRB:
This picture – that our minds were formed by processes of evolutionary adaptation, and that the environment they are adapted to isn’t the one that we now inhabit – has had, of late, an extraordinarily favourable press. Darwinism has always been good copy because it has seemed closer to our core than most other branches of science: botany, say, or astronomy or hydrodynamics. But if this new line of thought is anywhere near right, it is closer than we had realised. What used to rile Darwin’s critics most was his account of the phylogeny of our species. They didn’t like our being just one branch among many in the evolutionary tree; and they liked still less having baboons among their family relations. The story of the consequent fracas is legendary, but that argument is over now. Except, perhaps, in remote backwaters of the American Midwest, the Darwinian account of our species’ history is common ground in all civilised discussions, and so it should be. The evidence really is overwhelming.
But Darwin’s theory of evolution has two parts. One is its familiar historical account of our phylogeny; the other is the theory of natural selection, which purports to characterise the mechanism not just of the formation of species, but of all evolutionary changes in the innate properties of organisms. According to selection theory, a creature’s ‘phenotype’ – the inventory of its heritable traits, including, notably, its heritable mental traits – is an adaptation to the demands of its ecological situation. Adaptation is a name for the process by which environmental variables select among the creatures in a population the ones whose heritable properties are most fit for survival and reproduction. So environmental selection for fitness is (perhaps plus or minus a bit) the process par excellence that prunes the evolutionary tree.
More often than not, both halves of the Darwinian synthesis are uttered in the same breath; but it’s important to see that the phylogeny could be true even if the adaptationism isn’t.

Fifty years ago, New American Library published the Mentor Philosophers series, each volume with a title beginning The Age of . . . Belief, Ideology, Reason, and so on; the 20th-century selection bore the title The Age of Analysis. Had the series continued to the end of that century and into this one, the next volume would no doubt be The Age of Apology. Our postmodern ethos seems to hold that if anything can be proved to have happened, then surely someone needs to apologize for it.
We live amid a veritable tsunami of apology. The Catholic Church, which, of course, has much to apologize for, has, of late, offered mea culpas to Galileo, the Jews, the gypsies, Jan Hus, whom it burned at the stake in 1415, even to Constantinople (now Istanbul) for its sacking 800 years ago by the knights of the Fourth Crusade, an event for which the late John Paul II expressed “deep regret.” No wonder that a group in England, claiming descent from the medieval Knights Templars, is asking the Vatican to apologize for the violent suppression of the order and for torturing to death its Grand Master Jacques de Molay in 1314, an apology timed to commemorate the 700th anniversary of that fell deed.
more from The American Scholar here.

Thousands of Cubans and foreigners have been flocking to a mausoleum in central Cuba to commemorate the 40th anniversary of Che Guevara’s death. For 10 years, the Cuban government has been telling the world that the body inside the mausoleum is that of the famous guerrilla.
It’s a lie designed to bamboozle the population into worshiping the Argentine-born revolutionary as if he were a saint–and the Cuban Revolution as if it were a religion. A brilliant investigation by French journalist Bertrand de la Grange, recently published in Spain’s El Pais newspaper, demolishes the official version.
In 1995, Bolivian Gen. Mario Vargas, who had fought Che’s guerrillas in the 1960s, revealed that the revolutionary’s body was buried a few meters from the airport runway in Vallegrande, a town close to La Higuera, the village in eastern Bolivia where Guevara was killed on Oct. 9, 1967. (Guevara had been executed after the Bolivian president ordered the soldiers who ambushed and captured him to get rid of him.) Cuba sent a forensic, diplomatic and legal team to Vallegrande. On June 28, 1997, they claimed to have found the body, which was brought to Cuba a few weeks before the 30th anniversary of the guerrilla’s death.
Numerous facts belie the Cuban claim.
more from TNR here.

The meaning of Kahlo’s art comes across in reproductions, but not its full dynamic, which involves brooding subtleties of surface and color. The reproduced images are shiny and bright. The paintings are matte and grayish, drinking and withholding light. (Their display calls for intense illumination—that of the Mexican sun, say. They should not be hung on white walls, as they are at the Walker, where the contrast makes them look like holes in a snowbank.) The lovely, highly varied, blushing colors (even Kahlo’s browns and greens blush) don’t radiate. Fused with represented flesh, foliage, fabrics, and, yes, ribbons and jewelry, they turn their backs to us. The payoff of this reticence is an absorption in the artist’s touch. It’s easy to fantasize that Kahlo’s brushes were fingertips, able to mold her own more than familiar features in the dark. The tactility of certain self-portraits is, among other things, staggeringly sexy. In “Me and My Parrots” (1941), it combines with sharp tonal contrasts of warm color to convey invisible moistness, as of a summertime, full-body, delicate sweat. Elsewhere, the felt oneness of sight and touch stirs harrowing empathy, as in “The Broken Column” (1944). Kahlo’s nude body is split open to reveal a crumbling pillar, nails penetrating her flesh everywhere. Tears flow from her eyes, but her face is dispassionate, as always. Her pain is not her. It just won’t let her mind stray to anything else, for the moment. The work belongs to a category of images with which Kahlo confronted and endured episodes of agony, including heartbreak and rage. (Most piercing are laments of her disastrous pregnancies; she longed for children but physically could not bring a baby to term.) They aren’t great art, but they are moving testaments of a great artist.
more from The New Yorker here.

“There was no Herodotus before Herodotus.” This little pearl, courtesy of the historical polymath Arnaldo Momigliano (1908–1987), belongs to the class of truly illuminating tautologies. When Herodotus, in the middle of the 5th century B.C.E., composed his “history” of the Persian wars, there was simply no one around to tell him how it was done. The result, as anyone who has lost the thread amid one of Herodotus’s labyrinthine geographic detours knows, is anything but a “history” in the familiar sense of the term — that is, scrupulous, meticulous, and humorless. The project is better understood as an “inquiry” — a more accurate translation of the Greek word anyhow — into the shape of the known world, almost as if such an inquiry were necessary to understand, as Herodotus put it in his preface to the work, “the reason why the Greeks and barbarians fought one another.”
more from the NY Sun here.

The cultural understanding of mountains seems so bound up with the aesthetics of the sublime and the advent of Romanticism that it is hard to understand exactly what mountains meant before the eighteenth century. Were they simply seen as blanks, deserts, wild places?
It would be wrong to propose that there’s no refined mountain perception prior to Romanticism. You only need to look at someone like Leonardo da Vinci. He’s making extraordinary sketches of mountain phenomena in the Italian Alps: they’re beautiful, and attentive both to meteorology and geology. In so many ways—as he always does—da Vinci anticipates what’s to come by several centuries. There’s also a biblical tradition of revelation at height: Moses, obviously, on Sinai, or on Mount Pisgah, looking down into the promised land. So there are visionary traditions that precede the late eighteenth century. Petrarch claims to have climbed Mount Ventoux in April 1336; doubts have been voiced about whether he actually made the ascent, but the falsifiability of the account doesn’t really matter, because he gives us one of the first expedition journals (the book of Exodus would be another of these), and one of the first mountain descriptions in which mountain and text, or mountain and representation, become blurred almost to the point of interchangeability. So, in one sense, you can construct a tradition of the visionary and the beautiful for mountains which precedes Romanticism, going as far back as you want to go. But on the other hand, it’s quite possible to argue that mountains existed as little more than wallpaper, by and large, through the Medieval and Early Modern periods.
more from Cabinet here.
From CNN:
Kroo believes the way we fly planes may change. Currently most commercial airliners cruise at around Mach 0.85 (85 percent of the speed of sound). In the future, he thinks, planes may slow down, say from Mach 0.85 to Mach 0.75.
He also believes planes could fly at lower altitudes because of concerns that contrails affect the atmosphere. Other environmental impacts would be reduced as well: nitrogen oxide emissions, unburned hydrocarbons and water vapor all have an impact strongly related to how long they stay in the atmosphere, and flying at lower altitudes shortens that residence time.
“It’s uncertain, but people are actively planning flight paths at lower speeds and altitudes. The sky in the future may not be filled with white lines,” Kroo said.
This would mean very efficient airplanes flying at slightly slower speeds — a small change in convenience but a profound reduction in environmental impact. A reduction in fuel burn of 50 percent is not out of the question, according to Kroo.
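The “small change in convenience” is easy to put numbers on: block time scales roughly inversely with cruise speed, so the drop from Mach 0.85 to Mach 0.75 stretches a trip by about 13 percent. A back-of-envelope sketch, assuming the whole flight is spent at cruise:

```python
# Slowing from Mach 0.85 to Mach 0.75 stretches cruise time on the
# same route by a factor of 0.85 / 0.75, i.e. about 1.13.
old_mach, new_mach = 0.85, 0.75
stretch = old_mach / new_mach
for hours in (3, 6, 12):
    print(f"{hours}h cruise -> {hours * stretch:.1f}h at Mach {new_mach}")
# 3h -> 3.4h, 6h -> 6.8h, 12h -> 13.6h
```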
More here.
From The Guardian:
Harper Lee, the author of To Kill a Mockingbird, has been awarded the Presidential Medal of Freedom by George Bush. Whether or not one of the world’s most publicity-shy literary stars will relish being given America’s highest – and very public – award remains to be seen. According to the citation, the reclusive author has been honoured for “an outstanding contribution to America’s literary tradition. At a critical moment in our history, her beautiful book, To Kill a Mockingbird, helped focus the nation on the turbulent struggle for equality.”
Lee was born in Monroeville, in the deep South, in 1926, at a time of strict racial segregation. She was a voracious reader who moved to New York determined to become a writer, and succeeded with To Kill a Mockingbird. The book was an instant bestseller and won a Pulitzer Prize. It was also made into a hit film starring Gregory Peck, which quickly gained a “classic” status similar to the book’s.
More here.