Over at Comment is Free, Terry Eagleton offers a weird piece that implicitly argues that the East lacks civilization and the West lacks culture.
Ever since the early 19th century, culture or civilisation has been the opposite of barbarism. Behind this opposition lay a kind of narrative: first you had barbarism, then civilisation was dredged out of its murky depths. Radical thinkers, by contrast, have always seen barbarism and civilisation as synchronous. This is what the German Marxist Walter Benjamin had in mind when he declared that “every document of civilisation is at the same time a record of barbarism”. For every cathedral, a pit of bones; for every work of art, the mass labour that granted the artist the resources to create it. Civilisation needs to be wrested from nature by violence, but the violence lives on in the coercion used to protect civilisation – a coercion known among other things as the political state.
These days the conflict between civilisation and barbarism has taken an ominous turn. We face a conflict between civilisation and culture, which used to be on the same side. Civilisation means rational reflection, material wellbeing, individual autonomy and ironic self-doubt; culture means a form of life that is customary, collective, passionate, spontaneous, unreflective and arational. It is no surprise, then, to find that we have civilisation whereas they have culture. Culture is the new barbarism. The contrast between west and east is being mapped on a new axis.
At first glance, you might think that everything you need to know about Chris Farley could be written with a dull crayon on the back of a used paper plate—and it essentially was, in the tabloid frenzy following his death: fat, clumsy, loud guy who OD’d like his hero Belushi. Farley’s shtick, as expressed in five seasons of Saturday Night Live and three No. 1 films, was massively simple: He was the fattest of the fat, loudest of the loud, sweatiest of the sweaty, drunkest of the drunk. His comedy consisted almost exclusively of pratfalls and nudity and shouting. To many, he epitomizes arguably the worst era of SNL: the catchphrase-addicted, innovation-free, post-Myers, pre-Ferrell frat-house nadir of a once-mighty institution. The Farley canon, as he left it when he died in 1997 at age 33, is tiny and tainted: the discordant bellowing of Cindy, his fry-eating Gap Girl; his virtuosically incompetent celebrity interviews on “The Chris Farley Show”; Matt Foley, his supremely unmotivating motivational speaker who lives “in a van down by the river.” While even the most skeptical comedy snob must acknowledge, in Farley’s best work, glimmers of something great—a mastery of the algorithms of physical comedy so fresh and weird it seems to border on genius (cf. Foley’s gyroscopic belt-hitching)—every brilliant move tends to get washed out by lazy waves of thoughtless pandering.
The Chris Farley Show—a new biography by Farley’s older brother, Tom, and a former biographer of Belushi, Tanner Colby—shows that Farley’s simplicity was in fact a tremendously complex construct.
Fiction has an entrepreneurial element, akin to the inventor’s secret machine, elixir, or formula. Many novelists have had the experience of falling upon the perfect scene or situation or character, the one that will breed meaning and metaphor throughout the book. Gogol surely knew that he had invented a devastating symbolic structure when he came up with the story of a devil figure who travels around Russia buying up the names of dead serfs; he carefully guarded his secret—in a letter, he warned his correspondent not to tell anybody what “Dead Souls” was about. When we read “Herzog,” we think: how brilliant and simple, like the best of inventions, to have turned something we all do (writing letters in our heads to people we have never met) into a new way of representing consciousness. And when we read “Midnight’s Children” we feel that Salman Rushdie has found a powerful controlling image in the impending midnight of Indian partition, the clock’s hands meeting in prayer.
I don’t know whether Joseph O’Neill jumped out of his bath in Manhattan shrieking “Eureka!” when he realized that, of all the possible subjects in the world, he had to write a novel about playing cricket in New York City, but he should have. Despite cricket’s seeming irrelevance to America, the game makes his exquisitely written novel “Netherland” (Pantheon; $23.95) a large fictional achievement, and one of the most remarkable post-colonial books I have ever read.
At first the corpses in Palermo all look the same: stiff, emaciated, and vague in the features. Some of the attempts to keep them straight seem ludicrous. Monks come first, often swaddled in their habits like babies. Then priests; here ecclesiastical ranks are vigorously maintained. Bishops wear miters and more expensive fabrics. But clothes slip down on the shrunken frames and obscure the features, whatever might be left of them. One of the great lessons of the crypt is that clothes decay too; corpses decay first, and then the possessions they bring, becoming corpses of themselves in their turn. You would have to be an expert on eighteenth- and nineteenth-century costume to make much of the shredded residues. Not quite true perhaps—as in a child’s drawing, you can tell what the clothes are trying to represent and can summon up the right kind of collar or waistcoat from a Daumier sketch you vaguely remember.
The expressions remind you of Daumier too. Once they begin to turn their necks and stretch their jaws, you interpret the corpses’ strategically deployed spasms as interested looks or angry stares or cries of distress. They may be only crude imitations of character, but like a dog who can stand up or a horse who can count, these people hold your attention just because they can manage something like a smile or a pout.
Proclaimed brilliant for its portrayal of the “gene’s-eye view” of evolution, Mr. Dawkins’s book inverted the focus of natural selection, from Darwin’s emphasis on the species to Mr. Dawkins’s on the lowly gene itself: Simply put, Mr. Dawkins’s argument is that the crux of natural selection is whether a particular gene — not an individual or a group of individuals — replicates itself in future generations. Those genes that are not replicated into the future have failed at evolution, and those that produce many copies of themselves have succeeded. In Mr. Dawkins’s view, the organisms containing those genes are merely “lumbering robots” or “survival machines” that house and carry genetic information. The implication is that, in these terms, selfishness, even ruthless selfishness, pays off, and altruism does not. Some predicted that this book would be the death knell of the idea of group selection. No longer would evolutionary biologists suggest that natural selection worked to promote the good of the species (group selection) or even the individual and his close relatives who share many of his genes (kin selection, a type of group selection).
But prediction is difficult in a contingent world such as ours, where life is complex and accidents and coincidences wield so much power. Has “The Selfish Gene” in fact killed off group selection ideas? If not, why not? And what effect has the book had instead? Though selfish genes are still fashionable among evolutionary biologists, group selection and kin selection, its subset, are not dead. In 2007, David Sloan Wilson, a professor at Binghamton University, and E.O. Wilson (no relation), a professor emeritus at Harvard University and a Pulitzer Prize winner, proclaimed that Mr. Dawkins had celebrated the death of group selection prematurely. The pair asserted persuasively that altruism and cooperation can be adaptive if they are directed toward relatives who share a suite of one’s genes (kin selection) or if relationships can be established within a group in which cooperation is rewarded with future reciprocity.
This spring, the President’s Council on Bioethics released a 555-page report, titled Human Dignity and Bioethics. The Council, created in 2001 by George W. Bush, is a panel of scholars charged with advising the president and exploring policy issues related to the ethics of biomedical innovation, including drugs that would enhance cognition, genetic manipulation of animals or humans, therapies that could extend the lifespan, and embryonic stem cells and so-called “therapeutic cloning” that could furnish replacements for diseased tissue and organs. Advances like these, if translated into freely undertaken treatments, could make millions of people better off and no one worse off. So what’s not to like? The advances do not raise the traditional concerns of bioethics, which focuses on potential harm and coercion of patients or research subjects. What, then, are the ethical concerns that call for a presidential council?
Many people are vaguely disquieted by developments (real or imagined) that could alter minds and bodies in novel ways. Romantics and Greens tend to idealize the natural and demonize technology. Traditionalists and conservatives by temperament distrust radical change. Egalitarians worry about an arms race in enhancement techniques. And anyone is likely to have a “yuck” response when contemplating unprecedented manipulations of our biology. The President’s Council has become a forum for the airing of this disquiet, and the concept of “dignity” a rubric for expounding on it. This collection of essays is the culmination of a long effort by the Council to place dignity at the center of bioethics. The general feeling is that, even if a new technology would improve life and health and decrease suffering and waste, it might have to be rejected, or even outlawed, if it affronted human dignity.
A hot wind curls the leaves and chases the dogs digging deep into the dry soil. I live in the gut of the bright failure called America. I live in this hell named Nebraska. It’s one hundred and seven today and grasshoppers from outer space are dancing in my brain. The air-conditioner is broke so I run a tub of cold water and submerge every half hour. There’s a wet trail from the bath to the couch and nearby fan. The air is heavy with grain dust. The “wheaties” are up from Oklahoma with their caravan of combines. I crave winter. I want a blizzard that blinds me to my fellow man. These are my dark times. Every other day I grieve for the me that was and every man or woman I see fills me with contempt. Nine out of ten Skins in town are hang-around-the-fort welfare addicts. Every weekend their violence and drunken wretchedness fills the county jail, but I’m far beyond embarrassment because the white people are even worse. Varied branches of that inbred, toothless mountain trash in “Deliverance” settled here and now own the bank and most businesses. It’s undeniably true that these white people in Cowturdville could be hillbillies except for the fact that these are The Plains.
Drive on, rednecks, to the edge of your flat world and fall down to a better hell.
Every single thing about this town is sadly second-rate and I haven’t been laid in more than two years and there’s this fat lady with varicose veins who calls me late at night and begs me to come over to her trailer for a drink. Here, in this Panhandle town, farm kids speed desperately up and down the main drag wearing baseball caps backwards and throwing gang signs they’ve seen on the tube and their parents, glad they’re old and tired, truly believe that those pictures we’re now getting from Mars have meaning. As far as I can tell, I’m one of the few people in Cowturdville who’s gone to college and I often wish I never had, but Christ on a pogo stick . . . I think I’m starting to like it here in this American heartland.
Thunderheads are forming and the sweet-ass rain of forgiveness is in the air.
Well, possums, we have to hand it to the Ursus Major himself. In the above clip, which was shot in Puerto Rico during the finale, Tom Colicchio fillets this season of Top Chef as if it were a monkfish, and stops just short of throwing the remains in the trash.
Granted, he does it very politely—this season “is a hard one to read,” “a hard season to sort of get your hands around,” “a funny season,” “lots of ups and downs”—but the import is the same: this season has been rather meh.
We couldn’t agree more.
At Judges’ Table, and elsewhere, Tom is rather fond of saying, “This is Top Chef, not Top [Fill in the Blank].” And yet, with a few exceptions, this season has been exactly that: Top Caterer, Top Block-partyer, Top Tailgater, Top Home Cook, and Top Single Mother.
Indeed, we found it particularly revealing when, during that ghastly kids’ challenge, Gail Simmons said of Stephanie’s dish that it was typical of a restaurant chef who doesn’t cook much at home. Oh wait, that’s a problem? Because, you know, we thought this show was called Top Chef.
But that statement, we think, lays bare the ethos (and the problem) of this season.
Errol Morris steps into controversial territory over at the NYT blog Zoom [also see the depressing comments]:
The following essay shows how a photograph aided and abetted a terrible miscarriage of justice. I invite readers to offer their own interpretation of the considerable amount of material contained in the footnotes…
“How can you say she’s a good person?” I am sitting in an editing room in Cambridge, Mass., arguing with one of my editors. I reply, “Well, exactly what is it that she did that is bad?” We are arguing about Sabrina Harman, one of the notorious “seven bad apples” convicted of abuse in the Abu Ghraib scandal. My editor becomes increasingly irritable. (I have that effect on people.) He looks at me as you would a child. “What did she do that is bad? Are you joking?” And then he brings up the trump card, the photograph with the smile. “How do you get past that? The smile? Just look at it. Come on.”
The question kept coming up. How do you explain the smile? What does it mean? Not only is she smiling, she is smiling with her thumbs-up – over a dead body. The photograph suggests that she may have killed the guy, and she looks proud of it. She looks happy.
I should back up a moment.
This is one of the central images in a rogue’s gallery of snapshots taken at Abu Ghraib prison in the fall of 2003: a photograph of Sabrina Harman, taken by Chuck Graner – posed and looking into the lens of the camera.
In an interview filmed for my documentary “Standard Operating Procedure,” Sabrina explains her thumbs-up and her smile:[1]
SABRINA HARMAN: I kind of picked up the thumbs-up from the kids in Al Hilla, and so whenever I would get into a photo, I never know what to do with my hands… So any kind of photo, I probably have a thumbs-up because it’s just — I just picked it up from the kids. It’s just something that automatically happens. Like when you get into a photo, you want to smile. It’s just, I guess, something I did.
The uncanny valley comes into play here. We usually think of it in terms of robots, cartoon characters, and other pseudo-anthropomorphic figures that attempt and fail to look sufficiently human, and therefore appear creepy and scary. But with the rise of photo retouching, postproduction in film, plastic surgery, and increasingly effective makeup and skin-care products, we’re being bombarded with a growing amount of imagery featuring people who don’t appear naturally human.
On 4 October 2007 the Council of Europe Parliamentary Assembly passed Resolution 1580, which issued a stark warning: creationism, the denial of Darwinian evolution, is on the rise in Europe. The resolution focused on the way that creationists across the continent, using the model pioneered in America, have been targeting education, and warned of “a real risk of serious confusion being introduced into our children’s minds between what has to do with convictions, beliefs, ideals of all sorts and what has to do with science”. “An ‘all things are equal’ attitude,” it concludes, “may seem appealing and tolerant, but is in fact dangerous.”
The resolution urged member states to “defend and promote scientific knowledge” and “firmly oppose the teaching of creationism as a scientific discipline on an equal footing with the theory of evolution.” But what provoked this European body to issue such an uncharacteristically clear and forthright statement?
The resolution was based on a comprehensive report prepared by the Committee on Culture, Science and Education and delivered to the Assembly by the special rapporteur Guy Lengagne on 5 June 2007. This report synthesised research from across Europe, citing examples of the rise of creationism in 14 EU member states as well as in Russia, Serbia and Turkey. The examples of growing creationist influence ranged from a subtle downgrading of evolution in science education to outright attacks on the validity of Darwinism and on the personality of Darwin himself.
Seeing is easy. We open our eyes, and there the world is–in starlight or sunlight, still or in motion, as far as the Pleiades or as close as the tips of our noses. The experience of vision is so common and effortless that we rarely pause to consider what an astounding feat it is: Every time our eyes open, they encode our surroundings as a pattern of electrical signals, which the brain translates into our moving, colorful, three-dimensional perception of the world.
This everyday miracle has attracted the devotion and expertise of an unlikely individual–Alan Litke, an experimental particle physicist based at the University of California, Santa Cruz. When not in Geneva, Switzerland, where he is working on the ATLAS particle detector for the Large Hadron Collider, Litke is working with neuroscientists and engineers, adapting the technology of high-energy physics to study the visual system.
The central challenge is to understand the language the eye uses to send information to the brain. Light reflected from our surroundings enters our eyes through the transparent window of the cornea and is focused by the lens, forming an image on the retina. The retina of each eye contains about 125 million light-sensitive rods and cones, which translate light into electrical and chemical signals. These signals travel to the visual centers of the brain through a million retinal ganglion cells, or RGCs.
The retina thus encodes the activity of 125 million cells in the signals of one million output cells, which deliver to the brain a highly compressed neural code from which our entire visual experience is derived. Litke wants to understand how this neural network processes information from our surroundings and conveys it to the brain.
The relative weights of sensory and emotional influences on orgasm may differ between the sexes, perhaps because orgasm has divergent evolutionary origins in men and women. Orgasm in men is directly tied to reproduction through ejaculation, whereas female orgasm has a less obvious evolutionary role. Orgasm in a woman might physically aid in the retention of sperm, or it may play a subtler social function, such as facilitating bonding with her mate. If female orgasm evolved primarily for social reasons, it might elicit more complex thoughts and feelings in women than it does in men.
But does it? Researchers are trying to crack this riddle by probing changes in brain activity during orgasm in both men and women. Neuroscientist Gert Holstege of the University of Groningen in the Netherlands and his colleagues attempted to solve the male side of the equation by asking the female partners of 11 men to stimulate their partner’s penis until he ejaculated, while the researchers scanned his brain using positron-emission tomography (PET). During ejaculation, the team saw extraordinary activation of the ventral tegmental area (VTA), a major hub of the brain’s reward circuitry; the intensity of this response is comparable to that induced by heroin. “Because ejaculation introduces sperm into the female reproductive tract, it would be critical for reproduction of the species to favor ejaculation as a most rewarding behavior,” the researchers wrote in 2003 in The Journal of Neuroscience.
Ah, Grief, I should not treat you like a homeless dog who comes to the back door for a crust, for a meatless bone. I should trust you.
I should coax you into the house and give you your own corner, a worn mat to lie on, your own water dish.
You think I don’t know you’ve been living under my porch. You long for your real place to be readied before winter comes. You need your name, your collar and tag. You need the right to warn off intruders, to consider my house your own and me your person and yourself my own dog.
Future generations will suffer most of the harmful effects of global climate change. Yet if the world economy grows, they will be richer than we are.
The present generation must decide, with the help of expert advice from economists, whether to aggressively reduce the chances of future harm or to let our richer descendants largely fend for themselves.
Economists cannot avoid making ethical choices in formulating their advice.
Even the small chance of utter catastrophe from global warming raises special problems for ethical discussion.
What should we do about climate change? The question is an ethical one. Science, including the science of economics, can help discover the causes and effects of climate change. It can also help work out what we can do about climate change. But what we should do is an ethical question.
When older people can no longer remember names at a cocktail party, they tend to think that their brainpower is declining. But a growing number of studies suggest that this assumption is often wrong. Instead, the research finds, the aging brain is simply taking in more data and trying to sift through a clutter of information, often to its long-term benefit. The studies are analyzed in a new edition of a neurology book, “Progress in Brain Research.” Some brains do deteriorate with age. Alzheimer’s disease, for example, strikes 13 percent of Americans 65 and older. But for most aging adults, the authors say, much of what occurs is a gradually widening focus of attention that makes it more difficult to latch onto just one fact, like a name or a telephone number. Although that can be frustrating, it is often useful.
“It may be that distractibility is not, in fact, a bad thing,” said Shelley H. Carson, a psychology researcher at Harvard whose work was cited in the book. “It may increase the amount of information available to the conscious mind.” For example, in studies where subjects are asked to read passages that are interrupted with unexpected words or phrases, adults 60 and older work much more slowly than college students. Although the students plow through the texts at a consistent speed regardless of what the out-of-place words mean, older people slow down even more when the words are related to the topic at hand. That indicates that they are not just stumbling over the extra information, but are taking it in and processing it.
JEJU-DO—I’ve been meaning to respond to a reader of my post on weird Korean stuff, who suggested that I should have included kimchi. There’s a good reason I didn’t. For every item on that list, I’m sure you could find at least a few Koreans to vouch for its weirdness—someone to say, “Listen, I agree with you: It’s a little off that my kid wants to stick his finger up your ass.”
I don’t believe there is a Korean person alive or dead who would concede that kimchi is weird. Nor, having lived in Korea for more than a year, am I able to do so. (Smelly, yes; weird, no.) In Korea, kimchi is more than a foodstuff. It’s a national icon, a cultural treasure, a palpable expression of the country’s feisty spirit and determination throughout history to grow and protect its own unique soul—to resist wholesale assimilation into the more megalithic cultures of Asia, through culinary defense. It’s a cure-all, a protective shield, a magic balm and a goddess of plenty. Without kimchi, Korea would not be the same country—there might be a nation in the same place, and it might even be called the same thing, but it would not be Korea.
There is a general recognition of a ‘late style’ in music and literature – a turn to a vital asperity towards the end of a life of composition à la Beethoven or Yeats – but less so in visual art, at least among prominent Modernists. One exception is Matisse, who, in his late cutouts, returned with gusto to ‘the purity of means’ that marked his early Fauve paintings. With a temporary piece at the Grand Palais in Paris that also combines simplicity and grandeur, Richard Serra anticipates a late style of his own.
Just a year ago a retrospective at the Museum of Modern Art charted the rigorous development of Serra’s sculptural language, from a direct engagement with rubber and lead in his early pieces to an elaborate turning of steel plates in his celebrated arcs, ellipses and spirals of the last three decades. An early example of this later idiom, Clara-Clara, first exhibited in a Serra retrospective at the Centre Pompidou in 1983, has now reappeared on its original site in the Tuileries. (Alfred Pacquement, the director of the Pompidou and curator of that show, is also the curator of the two pieces presently in Paris.) Set along the grand axis from the Louvre to the Arc de Triomphe, Clara-Clara consists of two opposed curves of steel, 33 metres long and four metres high, one of which leans towards the central line, the other away. Placed near the place de la Concorde on the esplanade designed by Le Nôtre for Louis XIV, Clara-Clara is baroque in its own manner, playing boldly with the strict geometry of the grand axis. In this way it also initiates the promenade to the new piece at the Grand Palais, which Serra, in an acknowledgment of the ambulatory sociability featured in Impressionist painting as well as the directed movement of the viewer through his own work, has titled Promenade.
In earlier times, when original sin was taken more seriously than it generally is today, the suffering of animals posed a particularly difficult problem for thoughtful Christians. The 17th-century French philosopher René Descartes solved it by the drastic expedient of denying that animals can suffer. Animals, he maintained, are merely ingenious mechanisms, and we should not take their cries and struggles as a sign of pain, any more than we take the sound of an alarm clock as a sign that it has consciousness.
People who live with a dog or a cat are not likely to find that persuasive. Last month, at Biola University, a Christian college in southern California, I debated the existence of God with the conservative commentator Dinesh D’Souza. In recent months, D’Souza has made a point of debating prominent atheists, but he, too, struggled to find a convincing answer to the problem I outlined above.
He first said that, because humans can live forever in heaven, the suffering of this world is less important than it would be if our life in this world were the only life we had. That still fails to explain why an all-powerful and all-good god would permit it. Relatively insignificant as this suffering may be from the perspective of eternity, the world would be better without it, or at least without most of it.