An Excerpt from ‘Ernest Gellner: An Intellectual Biography’

From John Hall's new book, in the WSJ:

When Ernest Gellner died in December 1995, the flags of the University of Cambridge, where he had taught from 1984 to 1992, were set at half mast. This reflected the status he had achieved in the last years of his life, as a public intellectual able to comment on a very wide range of issues. It did not mean, however, that his views had lost their bite. If Gellner's name had been made during the scandal surrounding his early attack on Oxford linguistic philosophy, his late essays – not least his attack on Isaiah Berlin as a 'Savile Row postmodernist' – were capable of causing just as much outrage. Still, many felt affection for Gellner, with whose voice they had become familiar, and to whom they often turned for guidance and insight. All the same, very few people knew what to make of him. He was hard to pin down. For two decades he had the curious title of Professor of Sociology with special reference to philosophy at the London School of Economics and Political Science (LSE) – held, it should be noted, in two different departments: first Sociology, then Philosophy, Logic and Scientific Method – before taking up the William Wyse Professorship of Social Anthropology at the University of Cambridge. He had separate reputations as scholar of Islam, theorist of nationalism, philosopher of history, and historian of ideas. He ended his career in Prague, the city in which he had grown up, though in his final years he was most interested in developments in Russia. His status as a public intellectual rested on this background, that of a multilingual polymath, a modern philosophe. He was sometimes cited as one of the last great thinkers from Central Europe whose Jewish background meant a direct experience of the twentieth century's horrors.

It is possible to hint at what follows by noting the very particular way in which Gellner fits into this last category. The contours of his formative experiences are clear, and were pungently expressed by Gellner himself when discussing the work of Hannah Arendt. The rise of nationalist sentiment at the end of the nineteenth century created a dilemma for Jews, especially those who had experienced the Enlightenment and an end to anti-Jewish discrimination by the state. Gellner insisted that the return to cultural roots was always an illusion, a piece of pure romanticism he neatly illustrated by noting sardonically that 'it was the great ladies at the Budapest Opera who really went to town in peasant dresses, or dresses claimed to be such'. Illusion or no, the Jews felt the pull of belonging just as much as others did – perhaps even more. But the romantic call to belong affected the minority Jewish community and the demographic majority in two very different ways.

Emotions help animals to make choices

From PhysOrg:

To understand how animals experience the world and how they should be treated, people need to better understand their emotional lives. A new review of animal emotion suggests that, as in humans, emotions may tell animals about how dangerous or opportunity-laden their world is, and guide the choices that they make.

An animal living in a world where it is regularly threatened by predators will develop a negative emotion or 'mood', such as anxiety, whereas one in an environment with plenty of opportunities to acquire resources for survival will be in a more positive mood. The researchers argue that these emotional states not only reflect the animal's experiences but also help it decide how to make choices, especially in ambiguous situations that could have good or bad outcomes. An animal in a negative state will benefit from adopting a safety-first, 'pessimistic' response to an ambiguous event – for example, interpreting a rustle in the grass as signalling a predator – while an animal in a positive state will benefit from a more 'optimistic' response, interpreting it as signalling prey.
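
The decision rule described above is easy to make concrete. The following is a toy sketch rather than anything from the review itself: the numeric mood scale, the 0.5 baseline threshold, and the 0.3 bias constant are all illustrative assumptions about how an emotional state might shift the reading of an ambiguous cue.

```python
def interpret_cue(cue_threat: float, mood: float) -> str:
    """Toy model of mood-biased judgement of an ambiguous cue.

    cue_threat: evidence that the cue signals danger, from 0.0
        (clearly prey) to 1.0 (clearly a predator).
    mood: the animal's emotional state, from -1.0 (negative,
        e.g. anxious) to +1.0 (positive).
    """
    # A negative mood lowers the threshold for treating the cue as a
    # threat (safety-first, 'pessimistic'); a positive mood raises it
    # ('optimistic'). The constants are arbitrary choices for illustration.
    threshold = 0.5 + 0.3 * mood
    return "flee (predator)" if cue_threat >= threshold else "approach (prey)"

# The same rustle in the grass (cue_threat = 0.5) is read differently:
print(interpret_cue(0.5, mood=-0.8))  # anxious animal   -> flee (predator)
print(interpret_cue(0.5, mood=+0.8))  # contented animal -> approach (prey)
```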

More here.

Jilted salamanders lash out

From Nature:

Amphibian Casanovas beware: the ladies aren't likely to take infidelity lying down. Male salamanders returning home after a night of disloyalty can expect a beating, a new study reveals. The finding has surprised behavioural researchers: most instances of infidelity punishment place females on the receiving end of the abuse. Female red-backed salamanders are no different, but they don't just take it — they dish it out too, finds behavioural ecologist Ethan Prosen of the University of Louisiana in Lafayette, who led the research.

“This is the only species I know of where the male is intimidating the female and the female returns the favour,” says Prosen. No one is sure how common infidelity is among salamanders, but male red-backed salamanders are known to be aggressive toward female partners that have visited other males. As males and females of this species are evenly matched in size, Prosen wondered why the females were putting up with this violent treatment.

More here.

Wednesday Poem

Snowbanks North of the House

Those great sweeps of snow that stop suddenly six
feet from the house …
Thoughts that go so far.
The boy gets out of high school and reads no more
books;
the son stops calling home.
The mother puts down her rolling pin and makes no
more bread.
And the wife looks at her husband one night at a
party, and loves him no more.
The energy leaves the wine, and the minister falls
leaving the church.
It will not come closer
the one inside moves back, and the hands touch
nothing, and are safe.

The father grieves for his son, and will not leave the
room where the coffin stands.
He turns away from his wife, and she sleeps alone.

And the sea lifts and falls all night, the moon goes on
through the unattached heavens alone.

The toe of the shoe pivots
in the dust …
And the man in the black coat turns, and goes back
down the hill.
No one knows why he came, or why he turned away,
and did not climb the hill.

by Robert Bly
from The Man in the Black Coat Turns

derivatives


In “Babylon Revisited,” his 1931 short story about the aftermath of the 1929 Wall Street crash, F. Scott Fitzgerald makes the point that such collapses are slips in morality as much as financial failures. Charlie Wales, the story’s emotionally fragile hero, returns to Paris in a desperate effort to regain custody of his nine-year-old daughter. “I heard that you lost a lot in the crash,” says the Ritz bartender. Alluding to his moral lapses, Charlie replies that yes, he did, “but I lost everything I wanted in the boom.” In fact, upper-middle-class people like Charlie hesitated during the first months of the market’s run-up—until early in 1928. That was when they joined the gambling frenzy, and that was when, as John Kenneth Galbraith wrote in The Great Crash, 1929, the “mass escape into make-believe, so much a part of the true speculative orgy, started in earnest.” Eight decades later, stock-market investors like Charlie had no role in bringing on or profiting from the 2008 financial crisis. This time they stood on the sidelines as major financial institutions engaged in a speculative orgy. Guided by no moral compass, the most sophisticated financial players in the world were betting big with one another about interest rates, commodity prices, and whether companies or governments would default.

more from William J. Quirk at the American Scholar here.

the new great game


In the 19th century the British Empire and Tsarist Russia competed for hegemony in Central Asia. London fought to slow Moscow’s expansionism, fearing that, piece by piece, Russia would reach the borders of India, Britain’s most prized possession. Russia, for its part, worked to restrict British influence in a region it perceived both as its own backyard and as a buffer zone. The comings and goings of spies and wheeler-dealers, ruthless traders and officers, the ups and downs of plots and intrigue, double-crossing and diplomatic discourtesies reported between the Caspian and Kabul during the 19th century were catalogued under a single heading: the Great Game. The coinage, it is said, belongs to Arthur Conolly, an English secret agent serving with the East India Company. The novelist Rudyard Kipling took possession of the phrase, bringing it to the attention of the public. Times change, as do situations; empires die and imperial democracies are born. But Central Asia, this vast portion of the world bordered on the west by the Caspian, on the east by China, on the north by Russia and on the south by Pakistan, Afghanistan and Iran, continues to be the theatre of significant manoeuvring. It is no coincidence that geopolitical analysts call it the New Great Game.

more from Matteo Tacconi at Reset here.

the listeners


“You do not interest me. No man can say these words to another without committing a cruelty and offending against justice,” writes philosopher Simone Weil. To turn a deaf ear is an offence not only to the ignored person but also to thinking, justice and ethics. Coleridge’s Ancient Mariner is cursed because no one will listen to his story. The Italian chemist-turned-writer Primo Levi was preoccupied with this fable because of his fear that on returning from Auschwitz people like him would be either ignored or simply disbelieved. Regardless, listening gets a very mixed press amongst critics and intellectuals. There is a suspicion of “wistful optimism” or the quasi-religious appeal to “hold hands” and play priest at the confessional. These qualms miss the centrality of listening to a radical humanism which recognises that dialogue is not merely about consensus or agreement but engagement and criticism. This is something that Primo Levi understood.

more from Les Back at Eurozine here.

james franco is interesting


Not so long ago, James Franco’s life and career were fairly normal. He grew up in Palo Alto, California, where his parents had met as Stanford students. Young James was, at his father’s urging, a math whiz—he even got an internship at Lockheed Martin. As a teenager, he rebelled, got in trouble with the law (drinking, shoplifting, graffiti), and eventually migrated toward the arts. His hero was Faulkner. He fell in love with acting when he played the lead in a couple of dark and heavy high-school plays. After freshman year, he dropped out of UCLA, very much against his parents’ wishes, to try to make a career of it. He was good, lucky, and driven, and within a couple of years, he got his first big break: Judd Apatow cast him in what would become the cult TV series Freaks and Geeks. When the series was canceled after just a season, Franco landed the lead in the TNT biopic James Dean. He played the part with a slumping intensity that seemed like a reasonable replication of the real thing—or at least much closer than anyone had a right to expect from a TNT biopic—and the performance won a Golden Globe. Soon after, he was cast as Robert De Niro’s drug-addicted son in the film City by the Sea. That same year, he entered mainstream consciousness as Peter Parker’s best friend in Spider-Man. Franco had become, in other words, a working Hollywood actor. An unusual actor—he overprepared for minor roles, read Dostoyevsky and Proust between takes, and occasionally drove colleagues crazy with his intensity—but still identifiably an actor, with an actor’s career. As he climbed toward leading-man status, however, Franco had a crisis of faith. He found himself cast in a string of mediocre films—Annapolis, Flyboys, Tristan + Isolde—most of which bombed. He felt like he was funneling all his effort into glossy, big-budget entertainment over which he had no control, and of which he wasn’t proud.

more from Sam Anderson at New York Magazine here.

The Honor of Exile

Norman Manea in Project Syndicate:

The Romanian sculptor Brancusi once said that when the artist is no longer a child, he is dead. I still don't know how much of an artist I have become, but I grasp what Brancusi was saying. I can grasp – even at my age – my childish enduring self. Writing is a childish profession, even when it becomes excessively serious, as children often are.

My long road of immaturity began more than half a century ago. It was July 1945, a few months after I returned from a concentration camp called Transnistria. I lived that paradisiacal summer in a little Moldavian town, overwhelmed by the miraculous banality of a normal, secure environment. The particular afternoon was perfect, sunny and quiet, the room's semi-obscurity hospitable. I was alone in the universe, listening to a voice that was and wasn't mine. My partner was a book of Romanian fairy tales with a hard green cover, which I had been given a few days before when I turned the solemn age of 9.

That is when the wonder of words, the magic of literature, started for me. Illness and therapy began at the same time. Soon, too soon, I wanted to be part of that family of word wizards, those secret relatives of mine. It was a way of searching for “something else” beyond the triviality of everyday life, and also of searching for my real self among the many individuals who inhabited me.

What Darwin Got Wrong

John Dupré reviews Fodor and Piattelli-Palmarini's What Darwin Got Wrong in The Philosopher's Magazine:

Neo-Darwinism is, very roughly, the claim that natural selection is by far the most important explanation of biological form, the particular characteristics of particular kinds of organism. It usually includes a commitment to gradualism (the idea that evolution occurs in small steps), and often involves attributing central importance to genes as the units that natural selection selects, or at any rate as the objective measure of evolutionary change. Versions have been prominently defended in recent years by such authors as Richard Dawkins, Daniel Dennett and Jerry Coyne.

Neo-Darwinism is, however, a perspective under ever-growing pressure, not (or not only) from the antiscientific assaults of the religious, but from the advancement of science. The decline of this intellectual monolith is generally to be welcomed, not least because it may be expected to bring down with it some of its less appetising academic fellow travellers, most notably Evolutionary Psychology. At the same time those contributing to the demise of neo-Darwinism must be aware of the risk, especially in the United States, that they will provide succour for fundamentalist Creationists and aficionados of so-called Intelligent Design.

Fodor and Piattelli-Palmarini’s (henceforth FPP) book is intended as a contribution to the critical task just mentioned, and they are well aware of the potential hazards. Sadly, however, the book is an almost tragic failure: it is unlikely to be taken seriously as a contribution to the dismantling of neo-Darwinism and it has been, and will continue to be, picked up by the fundamentalist enemies of science.

The first half of the book does a decent job of summarising the recent scientific insights responsible for the growing difficulties facing neo-Darwinism. Neo-Darwinism, by virtue of its emphasis on natural selection, sees evolution as driven from outside, by the environment. Central among the difficulties that FPP emphasise are crucial respects in which evolution is constrained, or even driven, by internal features of the organism. This realisation has been promoted by evolutionary developmental biology (“evo-devo”), which has also highlighted the unacceptable black-boxing of development in mainstream evolutionary theory, a concomitant of the exclusive focus on external determinants of change. Also crucial has been a gradual move away from excessively atomistic views of organisms and an appreciation of the necessity of treating them as integrated wholes, illustrated by the impossibility of analysing the genome into a unique set of discrete elements, “genes”. And equally important has been the disclosure of the complexity of the relations between genomes and phenotypes.

While much material is presented that does indeed reveal the dire straits in which neo-Darwinism finds itself, the overall argument is generally elusive. I speculate that this is because there are two quite different conclusions in the offing.

Hard to Find

Samuel Arbesman in The Boston Globe:

If you look back on history, you get the sense that scientific discoveries used to be easy. Galileo rolled objects down slopes. Robert Hooke played with a spring to learn about elasticity; Isaac Newton poked around his own eye with a darning needle to understand color perception. It took creativity and knowledge to ask the right questions, but the experiments themselves could be almost trivial.

Today, if you want to make a discovery in physics, it helps to be part of a 10,000-member team that runs a multibillion-dollar atom smasher. It takes ever more money, more effort, and more people to find out new things.

But until recently, no one actually tried to measure the increasing difficulty of discovery. It certainly seems to be getting harder, but how much harder? How fast does it change?

This type of research, studying the science of science, is in fact a field of science itself, and is known as scientometrics. Scientometrics may sound self-absorbed, a kind of inside baseball for scientists, but it matters: We spend billions of dollars annually on research, and count on science to do such things as cure cancer and master space travel, so it’s good to know what really works.

From its early days of charting the number of yearly articles published in physics, scientometrics has broadened to yield all sorts of insights about how we generate knowledge. A study of the age at which scientists receive grants from the National Institutes of Health found that over the past decades, older scientists have become far more likely to receive grants than younger ones, suggesting that perhaps younger scientists are being given fewer chances to be innovative. In another study, researchers at Northwestern University found that high-impact research results are more likely to come from collaborative teams — often spanning multiple universities — rather than from a single scientist. In other words, the days of the lone hero scientist are vanishing, and you can measure it.

Birth of a Salesman

Amitava Kumar in Guernica:

So, sadly, the dreamers and the haters are not two groups. They are often one and the same persons. —Arjun Appadurai, Fear of Small Numbers

The U.S. government’s exhibit 1002 in United States of America v. Hemant Lakhani was a document from the commissary in the Corrections Bureau in Passaic County, New Jersey. It indicated that on March 16, 2005, in the “early afternoon hours the defendant went to the commissary and notwithstanding his medical condition ordered four bags of hot buffalo chips.” That same afternoon, the defendant also purchased one bag of crunchy cheese chips. Assistant U.S. Attorney Stuart Rabner flipped through the rest of the pages of exhibit 1002. On March 21, Rabner told the U.S. Court of Appeals for the Third Circuit, the defendant had received five bags of hot buffalo chips, five bags of salty peanuts, and five bags of crunchy cheese chips. On March 28, he received one cheese pizza, and again, five bags each of hot buffalo chips, salted peanuts, and crunchy cheese chips—and five apple pies. Turning to another page, Rabner said that on April 8 the defendant had ordered five bags of hot buffalo chips, five bags of salted peanuts, and two bags of crunchy cheese chips. And then on April 11, the food items ordered were five bags of hot buffalo chips, five bags of salted peanuts, three apple pies, two honey buns, and a cheese pizza.

“The defendant’s conduct,” the prosecutor argued, “can indeed be determined to be a contributing factor to the swollen legs that he now complains about and on which basis he seeks an adjournment of this trial. He should not be allowed.”

Hemant Lakhani’s diet was under scrutiny because he had undergone three surgeries in three weeks. The trial had begun in early January, but only ten days later the defendant had needed to be hospitalized. On the morning of January 14, a deputy marshal informed the court that the defendant had been admitted the previous evening to the St. Barnabas Medical Center in New Jersey with a variety of problems: a hernia, a congenital heart condition, and renal failure. Speaking on the record four days later, Lakhani’s doctor reminded the court that his patient was nearly seventy. He was probably suffering from hypertension. And it was possible that his heart needed surgical treatment. Later that week, Lakhani underwent an angioplasty and a pacemaker was inserted into his body. He was having problems with one of his knees and a rheumatologist had been pressed into service. The court couldn’t meet for three weeks because the defendant had needed time to recuperate.

Henry Klingeman, the defendant’s lawyer, stated that his client had described the jail food as “inedible,” and had complained that he wasn’t given rice, which had “been a staple of his diet for his entire life.” The commissary food was used as a “supplement” and, because he had a “sweet tooth,” he used to order apple pies.

The judge in the case, Katharine Hayden, took a considered view of the medical opinion she had been provided about the defendant. She declared that Lakhani was “ready to go” and commented with some concern that the diet the defendant had chosen was “loaded with salt” and “loaded with sugar.” She noted that Lakhani had more than once refused nutritious meals consisting of salad, bread, beans, apples, cookies, and hard-boiled eggs. With adequate reason, the judge denied the appeal to adjourn.

Judicial trials by their very nature are about acts. They concern themselves with what has actually been done by an individual or group. But the Lakhani trial from the very beginning had seemed to be about who he was rather than what he had done.

It Wasn’t a War

Kate Perkins interviews Norman Finkelstein in Guernica:

The career of radical political scholar Norman Finkelstein might be described as a sort of heroic painting-into-a-corner. The son of Holocaust survivors, he has dedicated his life’s work to exposing the hypocrisy, ideology, and violence that sustain the Israeli occupation of Palestine. The dimensions of his emphatic anti-Zionism, expounded over the course of six meticulously researched and often polemical books on Israel, Palestine, and the legacy of the Holocaust, have made him a pariah in the mainstream and a hero amongst supporters of Palestinian liberation. The high controversy around Finkelstein’s politics has penetrated university walls on more than one occasion, making his academic career fraught with defensive, uphill battles. I first met Finkelstein in 2007, in the eye of a storm of controversy surrounding his academic status at DePaul University. Despite his prolific and highly influential body of critical scholarship—and after first having been approved for tenure at DePaul by both department and faculty committees—Finkelstein’s tenure had ultimately been denied after minority dissenters campaigned successfully against his appointment. Flanked by a supporting cast of speakers including Tariq Ali, Tony Judt, and Noam Chomsky (via satellite), Finkelstein stood before some one thousand six hundred people in the University of Chicago’s packed Rockefeller Chapel to make the case for academic freedom. Contrary to his reputedly prickly demeanor, he appeared extraordinarily collected and calm, his heavy brow furrowing only slightly over sharp, dark eyes as he prepared to publicly address the charges against him. (The university’s final word on the matter was that Dr. Finkelstein’s reputation for outspoken criticism of Israel and of Israeli apologists like Harvard Law Professor Alan Dershowitz made Finkelstein unfit for tenure at DePaul, a school of “Vincentian values.”)

It was the culmination of a long struggle to advance his radical political critique of Israel and of the American Israeli lobby from within the academy. Now an independent scholar, Dr. Finkelstein remains a leading voice of dissent against the pro-Israel policies that underwrite an apartheid regime enforced by egregious war crimes and human rights violations. In This Time We Went Too Far: Truth and Consequences of the Gaza Invasion, his first book since departing from DePaul, he argues that Israel’s December 2008 invasion of Gaza, which decisively ended a fragile ceasefire brokered by Egypt that June, marked the beginning of an unprecedented decline in public support for Israel. The book’s epilogue is devoted to the Goldstone Report, a document authored by renowned South African jurist Richard Goldstone that describes the damning conclusions of a U.N.-commissioned investigation into the Gaza invasion, including charges of war crimes against Israel.

More here.

A System for Connecting Brains to the Outside World

From The New York Times:

About four years ago, John Donoghue’s son, Jacob, then 18, took his father aside and declared, “Dad, I now understand what you do — you’re ‘The Matrix’!” Dr. Donoghue, 61, is a professor of engineering and neuroscience at Brown University, studying how human brain signals could combine with modern electronics to help paralyzed people gain greater control over their environments. He’s designed a machine, the BrainGate, that uses thought to move objects. We spoke for two hours in his Brown University offices in Providence, R.I., and then again by telephone. An edited version of the two conversations follows:

Q. WHAT EXACTLY IS BRAINGATE?

A. It’s a way for people who’ve been paralyzed by strokes, spinal cord injuries or A.L.S. to connect their brains to the outside world. The system uses a tiny sensor that’s been implanted into the part of a person’s brain that generates movement commands. This sensor picks up brain signals and transmits them to a plug attached to the person’s scalp. The signals then go to a computer, which is programmed to translate them into simple actions.

Q. WHY MOVE THE SIGNALS OUT OF THE BODY?

A. Because for many paralyzed people, there’s been a break between their brain and the rest of their nervous system. Their brains may be fully functional, but their thoughts don’t go anywhere. What BrainGate does is bypass the broken connection. Free of the body, the signal is directed to machines that will turn thoughts into action.
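
As a rough illustration of the translation step described in these answers, here is a minimal sketch of a linear decoder turning simulated multichannel firing rates into cursor movement. Everything in it is an assumption for illustration, not a detail of the actual BrainGate system: the channel count, the Poisson-simulated signals, and the linear mapping (a common approach in the motor-decoding literature) are all hypothetical.

```python
import numpy as np

N_CHANNELS = 96  # hypothetical electrode count, chosen for illustration

rng = np.random.default_rng(0)
# Hypothetical decoder weights mapping per-channel firing rates to a
# 2-D cursor velocity; a real system would fit these during calibration,
# e.g. while the user imagines moving a cursor along known paths.
W = rng.normal(scale=0.01, size=(2, N_CHANNELS))

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Translate one time bin of neural activity into a cursor velocity."""
    return W @ firing_rates  # shape (2,): (vx, vy)

# Simulate the stream of signals arriving at the computer and move a cursor:
cursor = np.zeros(2)
for _ in range(100):  # 100 time bins of simulated activity
    rates = rng.poisson(lam=10.0, size=N_CHANNELS).astype(float)
    cursor += 0.1 * decode_velocity(rates)  # integrate velocity into position

print("final cursor position:", cursor)
```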

More here.

to be tête-à-tête with things

[Image: André Kertész, The Fork, 1928]

Why photograph inanimate objects, which neither move nor change? Set aside for the moment explorations of abstract form (Paul Strand’s flower pots, Edward Weston’s peppers) and glamorous advertisements for material luxuries (Edward Steichen’s cigarette lighters, Irving Penn’s melted brie). Many of the earliest photographs were still lifes of necessity: only statues, books, and urns could hold still long enough to leave their images on salted paper. But with the still lifes of Roger Fenton, sharpness of detail and richness of texture introduce a new note: the dusty skin of a grape puckers around the stem, a flower petal curls and darkens at the edge. Photographic still life, like painted still life, is about our sensual experience of everyday objects, and the inevitability of decay. Penn famously photographed cigarette butts and trash collected from the gutter, rotting fruit and vegetables, discarded clothes, and other examples of dead nature. The nineteenth-century art critic Théophile Thoré objected to the French term for still life, nature morte, proclaiming, “Everything is alive and moves, everything breathes in and exhales, everything is in a constant state of metamorphosis… There is no dead nature!” The Czech photographer Josef Sudek tersely echoed this thought when he said that to the photographer’s eye, “a seemingly dead object comes to life through light or by its surroundings.”

more from Imogen Sara Smith at Threepenny Review here.

righteous and wrong


Obsessed as they are with their model of a “totalitarian threat” to Enlightenment liberalism, both Berman and Hirsi Ali fail to take account of well-documented facts that would challenge their presuppositions. Berman muddles kin-patronage politics, a constant in Arab societies, with fascism. Hirsi Ali—oblivious of changes in gender roles that are occurring within more developed Muslim polities, and ignoring the way that traditional systems of authority tend to oppress women in cultures as different as China, Japan, and India—confuses Islam (a malleable religious tradition) with patriarchy (a specific set of social relationships built around masculine power). As Julien Benda himself might acknowledge, a failure to look at all the facts, however complex they may be, is a kind of intellectual betrayal, a trahison des clercs.

more from Malise Ruthven at the NYRB here.

man is not a reasoning animal


Newman was aware that he was regarded in some circles as a saint, but thought he was quite unworthy of the honour. This is just the kind of humility one needs to be canonised, though that is not why he said it. To be canonised, one has among other things to perform a posthumous miracle, and the geographical distribution of miracles (they are less common in the unbelieving north of the globe) tends to work against Anglo-Saxon candidates. One, however, has been reported in the US. One reason Newman doubted he would be canonised was that he thought ‘literary men’ like himself were not the stuff of sainthood. In this splendidly readable biography, which seems to get everything right except the first name of Archbishop McHale of Tuam, Cornwell recognises, as so many others have not, that Newman was first and foremost a writer – that his genius lay in ‘creating new ways of imagining and writing about religion’. It is a rather more illuminating approach to the cardinal than wondering whether he ever got into bed with Ambrose St John.

more from Terry Eagleton at the LRB here.

chan


Chan’s Hollywood career was launched in 1926, with a film adaptation of “The House Without a Key,” starring the Japanese actor George Kuwa, after which Chan went on to appear in forty-six more movies; he was most memorably played, in the nineteen-thirties, by a Swede named Warner Oland. He also appeared in countless comic strips and, in the nineteen-seventies, in sixteen episodes of Hanna-Barbera’s “The Amazing Chan and the Chan Clan,” which aired on CBS television on Saturday mornings and featured a dog named Chu Chu, Jodie Foster’s voice as one of Chan’s ten children, and the cri de coeur “Wham bam, we’re in a jam!” Charlie Chan is also one of the most hated characters in American popular culture. In the nineteen-eighties and nineties, distinguished American writers, including Frank Chin and Gish Jen, argued for laying Chan to rest, a yellow Uncle Tom, best buried. In trenchant essays, Chin condemned the Warner Oland movies as “parables of racial order”; Jen called Chan “the original Asian whiz kid.” In 1993, the literary scholar Elaine Kim bid Chan good riddance—“Gone for good his yellowface asexual bulk, his fortune-cookie English”—in an anthology of contemporary Asian-American fiction titled “Charlie Chan Is Dead,” which is not to be confused with the beautiful and fantastically clever 1982 Wayne Wang film, “Chan Is Missing,” and in which traces of a man named Chan are all over the place, it’s just that no one can find him anymore.

more from Jill Lepore at The New Yorker here.

life without incandescence


The hot filament of the incandescent bulb has illuminated our loved ones, our books, our rooms for so long that its glow has come to feel as natural as daylight — maybe more so, since most of us spend the majority of our waking hours indoors and accompanied by its light. But now the days of Edison’s bulb are numbered. The Energy Independence and Security Act, passed by Congress and signed into law by President George W. Bush in 2007, set strict efficiency standards for lighting that most incandescent bulbs will never be able to meet. The standards will be phased in over several years, beginning in 2012. Our familiar 100-watt bulbs will disappear from store shelves first, then 75-watt, then 60…. By 2014, almost all the incandescent light we have traditionally used in our homes will be unavailable. Similar restrictions have already begun to take effect in Europe, Australia, Brazil, and Venezuela. China has begun its phase-out of incandescence, and Canada will begin its own in 2012.

more from Jane Brox at the Boston Globe here.