‘Wake the people and make them think big’

From The Guardian:

Of the two greatest dramatists of the 19th century, Chekhov and Ibsen, it is the infinitely lovable Dr Chekhov who holds the highest place in our affections, both as man and as author. But Ibsen, the forbidding man of the north – accusatory eyes fiercely staring out at us from behind steel-rimmed spectacles, thin, severe lips tightly pursed amid the bizarre facial topiary – may be the one who speaks most urgently to us today. At the time of his death, almost 100 years ago, Henrik Ibsen’s significance as a leader of thought was overwhelming. In 1900, the young James Joyce, still a student, wrote of him: “It must be questioned whether any man has held so firm an empire over the thinking world in modern times … his genius as an artist faces all, shirks nothing … the long roll of drama, ancient or modern, has few better things to show.” Joyce (and later Wittgenstein) learned Norwegian specifically in order to read Ibsen’s plays in the original.

More here.



The government’s dubious bioterror case has sent a dangerous message

William B. Greenough III in the Bulletin of the Atomic Scientists:

The U.S. government’s post-9/11 effort to make citizens more “secure” has had some frighteningly destructive consequences. One of the most egregious examples is the government’s prosecution – using new powers of the Department of Homeland Security and the PATRIOT Act – of Thomas C. Butler, a distinguished scientist and doctor.

By the time you read this, Butler will have been in prison for well over a year. It’s a curious place for the U.S. government to put the man who is credited by the World Health Organization with saving the lives of more than two million children every year through a cholera treatment he helped to develop. Moreover, at the time of his arrest in 2003, Butler was researching ways to protect Americans from plague, a weaponizable pathogen.

Human plague, endemic in the United States, in Texas and other parts of the Southwest, is caused by the bacterium Yersinia pestis and carried by rodents. From time to time, plague spreads to Americans who hunt or handle the infected rodents. But across the globe in Tanzania, plague is far more common and kills people regularly.

Butler, then a professor of medicine, chief of the Infectious Diseases Division of Texas Tech University, and one of the most knowledgeable and experienced clinical scientists in plague infections, was worried that if the disease were to arise in the United States–either naturally or through terrorism–the country would be ill-equipped to treat the victims. Both streptomycin and chloromycetin, the two antibiotics currently recommended to treat plague and prevent death, are old and not readily available at U.S. health centers. Butler felt that it was urgent to test the efficacy of two other readily available antibiotics–gentamicin and doxycycline. Researchers at the Centers for Disease Control and Prevention, Fort Detrick, Fort Collins, and at the Food and Drug Administration agreed and were happy to collaborate in his research.

More here.

Why do we believe in God?

“Faith in a higher being is as old as humanity itself. But what sparked the Divine Idea? Did our earliest ancestors gain some evolutionary advantage through their shared religious feelings? In these extracts from his latest book, Robert Winston ponders the biggest question of them all.”

From The Guardian:

…it is easy to suggest a mechanism by which religious beliefs could help us to pass on our genes. Greater cohesion and stricter moral codes would tend to produce more cooperation, and more cooperation means that hunting and gathering are likely to bring in more food. In turn, full bellies mean greater strength and alertness, greater immunity against infection, and offspring who develop and become independent more swiftly. Members of the group would also be more likely to take care of each other, especially those who are sick or injured. Therefore – in the long run – a shared religion appears to be evolutionarily advantageous, and natural selection might favour those groups with stronger religious beliefs.

But this is not the whole story. Although religion might be useful in developing a solid moral framework – and enforcing it – we can quite easily develop moral intuitions without relying on religion. Psychologist Elliot Turiel observed that even three- and four-year-olds could distinguish between moral rules (for example, not hitting someone) and conventional rules (such as not talking when the teacher is talking). Furthermore, they could understand that a moral breach, such as hitting someone, was wrong whether you had been told not to do it or not, whereas a conventional breach, such as talking in class, was wrong only if it had been expressly forbidden. They were also clearly able to distinguish between prudential rules (such as not leaving your notebook next to the fireplace) and moral rules.

This would suggest that there is a sort of “morality module” in the brain that is activated at an early age. Evidence from neuroscience would back this up, to a degree.

More here.

birnbaum, lethem

Robert Birnbaum interviews Jonathan Lethem (recent MacArthur genius).

Lethem: Now we’re arriving at the bug [that was] in my ear when I said we should talk again. It’s all coming back. Certainly, yes, there’s a kind of relentless bad faith expressed when reviewers or critics remark on one element in a novel as though it’s a remarkable piece of metaphor or surrealism, as though they’ve never encountered such a thing before. They’re shocked, just shocked that something is being proposed—they act as though it is utterly unfamiliar to them, when what they really mean is that they object to it on principle, on class or political grounds like those I just described. So, by reacting as though the incursion were new, instead of familiar, it permits a kind of disingenuous head-scratching: “Hmm, perhaps this new method is of interest, or could be, in the hands of the most serious of writers. We’ll have to watch closely and see.” You saw this happening when Roth’s new book was reviewed. Roth’s use of the “alternate history” was treated, in certain quarters, as though, first of all, Roth himself had never written a book that challenged mimetic propriety—suddenly The Breast didn’t exist, suddenly The Great American Novel didn’t exist. Suddenly Counterlife didn’t exist. To write about this thing with a 10-foot pole, and say, “What’s this strange method? What have we got here? One of the great pillars of strictly realist fiction has inserted something very odd into his book. We’ll puzzle over this as though it’s unprecedented.” It was as though there had been no Thomas Pynchon. As though Donald Barthelme, Kurt Vonnegut, Angela Carter, Robert Coover had been thrown into the memory hole. Was there never a book called The Public Burning? Do we really have to retrace our steps so utterly in order to reinscribe our class anxieties? Not to mention, of course, the absolute ignorance of international writing implicit in the stance: where’s Cortazar, Abe, Murakami, Calvino, and so very many others?
Well, the status quo might argue, patronizingly, those cute magical-realist methods—how I despise that term—are fine for translated books, but we here writing in English hew to another standard of ‘seriousness.’ Not to mention, of course, the quarantine that’s been implicitly and silently installed around genre writing that uses the same method as Roth’s with utmost familiarity. Well, the status quo might argue, sounding now like an uncle in a P.G. Wodehouse novel: Ah, yes, well, we all know that stuff is, how do you say it, old boy? Rather grubby. No, I say, no. This isn’t good enough, not for the New York Times Book Review and the New York Review of Books, in 2004. Let me say it simply: there is nothing that was proposed in Roth’s book that could be genuinely unfamiliar to a serious reader of literary fiction of the last 25 years, 30 years, 50 years. To treat it as unfamiliar is a bogus naiveté—one that disguises an attack on modernism itself, in the guise of suspiciousness about what are being called post-modern techniques. It actually reflects a discomfort with the entire century.

more at The Morning News here.

early american democracy

Jill Lepore on the origins of American democracy in the New Yorker.

Readers may weary at the length of Wilentz’s book, but, as a model for integrating social and political history, it’s hard to dispute. That it will be disputed is, however, certain, if only because Wilentz has been such a vigorous critic of his colleagues. He has had little use for historians who defend Federalists like Noah Webster. To those who celebrate Federalists for their opposition to slavery, Wilentz counters, “Rarely has any group of Americans done so little to deserve such praise.” In his New Republic reviews, Wilentz has been particularly indignant about historians who place Federalists in a better light than Republicans or who dismiss Jefferson’s entire career because he owned slaves (including some who were almost certainly his own children). David McCullough’s “John Adams” was, in his view, “popular history as passive nostalgic spectacle.” Garry Wills’s book about Jefferson’s election, “Negro President,” he deemed “misadventurous.” In another essay, Wilentz concluded, “Were he alive today, Jefferson would probably regard modern American historians as a rascally bunch.”

But one thing that Federalists understood—for all their failings, for all their unmitigated snobbery—was the fragility of democracy. I’d be willing to consider you an angel, Webster told Jefferson, if you could show me a democracy that isn’t corrupt, or if you could protect the United States from “the instruments with which vicious and unqualified men destroy the freedom of elections, and exalt themselves into power, trampling first on the great and good, and afterwards on the very people to whom they owe their elevation.” Webster may have been a prig, but he wasn’t a duffer.

negativland

Sandwiched between ‘70s agitators like Ant Farm and more recent groups like the Yes Men, Negativland has been going strong for twenty-five years, an anniversary commemorated by this retrospective of their work. Among their best-known culture-jamming exploits is their album U2, which includes liberal sampling from U2’s album The Joshua Tree (prompting a landmark 1991 intellectual property case in which U2’s record label, Island Records, sued Negativland and SST).

more from Artforum here.

Tuesday, October 18, 2005

‘Dr. Atomic’: Unthinkable Yet Immortal

From The New York Times:

There is physics. And then there is physics with music.

And so, when a new opera about the atomic bomb, of all things, opened to acclaim this month in San Francisco, I traveled across the bay from a conference in Berkeley. It has been 60 years since the atomic bomb emerged from its cradle at the Trinity test site in Alamogordo, N.M., to punctuate the end of World War II with a blinding flash and the cries of the burned in Hiroshima and Nagasaki. I am old enough to have emerged from that same cradle, fussing and complaining, old enough to have seen the Beatles and to have watched and felt the desert shake one cold loud morning in 1968 as one of those beasts went off and a radioactive dust ball drifted off toward Canada.

As an ex-physics major, sci-fi addict, science writer and lover of apocalypse, I long ago concluded that there was not much new to say about the atomic bomb. But I was wrong.

More here.

Imagine a World Without Copyright

Joost Smiers and Marieke van Schijndel in the International Herald Tribune:

Copyright was once a means to guarantee artists a decent income. Aside from the question as to whether it ever actually functioned as such – most artists never made a penny from the copyright system – we have to admit that copyright serves an altogether different purpose in the contemporary world. It now is the tool that conglomerates in the music, publishing, imaging and movie industries use to control their markets.

These industries decide whether the materials they have laid their hands on may be used by others – and, if they allow it, under what conditions and for what price. European and American legislation extends them that privilege for a window of no less than 70 years after the passing of the original author. The consequences? The privatization of an ever-increasing share of our cultural expressions, because this is precisely what copyright does. Our democratic right to freedom of cultural and artistic exchange is slowly but surely being taken away from us.

More here.

Mega-cities: Crowded and Environmentally Stressed

Divya Abhat, Shauna Dineen, Tamsyn Jones, Jim Motavalli, Rebecca Sanborn, and Kate Slomkowski in Emagazine.com:

We take big cities for granted today, but they are a relatively recent phenomenon. Most of human history concerns rural people making a living from the land. But the world is rapidly urbanizing, and it’s not at all clear that our planet has the resources to cope with this relentless trend. And, unfortunately, most of the growth is occurring in urban centers ill-equipped for the pace of change. You’ve heard of the “birth dearth”? It’s bypassing Dhaka, Mumbai, Mexico City and Lagos, cities that are adding population as many of their western counterparts contract.

The world’s first cities grew up in what is now Iraq, on the plains of Mesopotamia near the banks of the Tigris and Euphrates Rivers. The first city in the world to have more than one million people was Rome at the height of its Empire in 5 A.D. At that time, world population was only 170 million. But Rome was something new in the world. It had developed its own sophisticated sanitation and traffic management systems, as well as aqueducts, multi-story low-income housing and even suburbs, but after it fell in 410 A.D. it would be 17 centuries before any metropolitan area had that many people.

The first large city in the modern era was Beijing, which surpassed one million population around 1800, followed soon after by New York and London. But at that time city life was the exception; only three percent of the world’s population lived in urban areas in 1800.

More here.

Isaac Julien’s Fantôme Créol

Malcolm Le Grice reviews Isaac Julien’s new video installation, Fantôme Créol, at the Centre Pompidou in Paris.

Isaac Julien’s video installation Fantôme Créol (Creole Phantom, 2005) was shown on four very large adjacent screens: two on either side of the gallery; Baltimore (2003) had three equally large screens, but placed in an arc so that they could be viewed as a whole. The visual quality was fine, the carpeted gallery provided excellent acoustics, and comfortable seating encouraged spectators to watch the full cycle of each work – sadly a rare experience with film and video installations.

One quality underlying Julien’s work is his exceptional command of the aesthetics of cinema. His images are often sumptuous, and the film construction shows a great control of visual rhythm. These qualities, evident in his single-screen films such as Frantz Fanon: Black Skin, White Mask (1996), take on another, spectacular dimension when two or more screens are edited to harmonize or counterpoint image content, colour and movement.

isaiah berlin: 1989

Granta makes available Isaiah Berlin’s thoughts on the revolutions of ’89. Kind of interesting to read now.

You ask me for a response to the events in Europe. I have nothing new to say: my reactions are similar to those of virtually everyone I know, or know of—astonishment, exhilaration, happiness. When men and women imprisoned for a long time by oppressive and brutal regimes are able to break free, at any rate from some of their chains, and after many years know even the beginnings of genuine freedom, how can anyone with the smallest spark of human feeling not be profoundly moved? One can only add, as Madame Bonaparte said when congratulated on the historically unique distinction of being mother to an emperor, three kings and a queen, ‘Oui, pourvu que ça dure.’ If only we could be sure that there will not be a relapse, particularly in the Soviet Union, as some observers fear.

Relations as Art

From Art Papers:

Call Jillian Mcdonald a relations artist. Relationships are her medium, fleeting encounters her material. Mcdonald’s works often exist on the margins of art institutions and activate public space in disquieting ways. Yet, they are all about intimacy.

If her performance projects—in Mile Share, 2004, she invited strangers to run a mile with her; in Advice Lounge, 2003, she provided free, non-professional advice to passersby; for Houseplant, 2002-2003, she offered to deliver a houseplant to strangers’ residences; and in Shampoo, 2001, she posted a message in a local newspaper, inviting strangers to come for a free shampoo—seem light years away from her media works, the focus on relations provides an interpretive continuum.

For Me and Billy Bob, Mcdonald inserted carefully constructed footage of her image into scenes from Thornton’s movies, thus creating a compendium of flirtation—a wink, a light touch, a brief kiss, and a heartbreak.

unshocking

There is a refreshing eclecticism about this year’s Turner prize exhibition at Tate Britain. The four short-listed artists represent a diverse range of practices: there is sculpture, video, conceptual work, and even some traditional painting. This is a comforting rather than a confrontational show. Stephen Deuchar, director of Tate Britain, hinted as much on Monday when he described how contemporary art had become “less scary” over the past few years, due in no small part to the Turner’s persistent demystification of recondite art forms. But how to find the balance between intimidating and just plain timid?

more from the Financial Times here.

A Passage to India: A Nobel Prize-winning economist explores his homeland’s rich and quarrelsome heritage.

From The Washington Post:

If you laid all the economists in the world end to end, the old joke goes, you would never reach a conclusion. So it’s all the more remarkable that it is as a practitioner of the “dismal science” that Amartya Sen won the Nobel Prize in 1998. Sen is a man of conclusions; he is also brilliant at marshalling, with both extensive research and empirical evidence, the arguments that justify his conclusions. The Argumentative Indian — a collection of 16 essays, many reworked and expanded from lectures and previously published articles — is an intellectual tour de force from an economist who can lay equal claim to the designations of sociologist, historian, political analyst and moral philosopher. It is a magisterial work, except that the adjective is not one of which Sen would approve.

That is because Sen uses it, along with “exoticist” and “curatorial,” to describe the three perspectives from which the West has tended to view India (each of which he dissects and discredits with precision and finesse). He is particularly critical of the Western overemphasis on India’s religiosity at the expense of any recognition of the country’s equally impressive rationalist, scientific, mathematical and secular heritage, fields treated by Orientalists as “Western spheres of success.”

More here.

Hope over stem cell ethical fears

From BBC:

The Massachusetts-based Advanced Cell Technology removed embryonic stem cells from mice embryos with no apparent damage, the journal Nature said. Scientists believe it would sidestep some of the ethical objections to stem cell research if repeated in humans. But campaigners said they still had concerns and urged scientists to concentrate on other areas. Stem cells are “master” cells that can become many kinds of tissue. Those harvested from early-stage human embryos are thought to hold the most potential for research, as they have the ability to become almost any kind of adult cell in the body.

More here.

Monday, October 17, 2005

PERCEPTIONS: Victor Hugo’s Other Art

Victor Hugo: Mysterious Ruin

Hatching from a nameless gleam of light I see
Monstrous flowers and frightening roses
I feel that out of duty I write all these things
That seem, on the lurid, trembling parchment,
To issue sinisterly from the shadow of my hand.
Is it by chance, great senseless breath
Of the Prophets, that you perturb my thoughts?
So where am I being drawn in this nocturnal azure?
Is it sky I see? Am I in command?
Darkness, am I fleeing? Or am I in pursuit?
Everything gives way. At times I do not know if I am
The proud horseman or the fierce horse;
I have the scepter in my hand and the bit in my mouth.
Open up and let me pass, abysses, blue gulf,
Black gulf! Be silent, thunder! God, where are you leading me?
I am the will, but I am the delirium.
Oh, flight into the infinite! Vainly I sometimes say,
Like Jesus calling out “Lamma Sabacthani,”
Is the way still long? Is it finished,
Lord? Will you soon let me sleep?
The Spirit does what it will. I feel the gusting breath
That Elisha felt, that lifted him;
And in the night I hear someone commanding me to go!

VICTOR HUGO

From ‘Le bien germe parfois…’ (Good Sometimes Germinates…),
from the collection Toute la lyre, first published 1888.

Sunday, October 16, 2005

Mrs President: Clinton vs Rice

Dick Morris in The Guardian:

On 20 January 2009, at precisely noon, the world will witness the inauguration of the 44th President of the United States. As the chief justice administers the oath of office on the flag-draped podium in front of the US Capitol, the first woman President, Hillary Rodham Clinton, will be sworn into office. By her side, smiling broadly and holding the family Bible, will be her chief strategist, husband, and co-President, William Jefferson Clinton.

If the thought of another Clinton presidency excites you, then the future indeed looks bright. Because, as of this moment, there is no doubt that Hillary Clinton is on a virtually uncontested trajectory to win the Democratic nomination and, very likely, the 2008 election. She has no serious opposition in her party. The order of presidential succession from 1992 through 2008, in other words, may well become Bush, Clinton, Bush, Clinton. But her victory is not inevitable. There is one, and only one, figure in America who can stop Hillary Clinton: Secretary of State Condoleezza ‘Condi’ Rice. Among all of the possible Republican candidates for President, Condi alone could win the nomination, defeat Hillary and derail a third Clinton administration.

More here.

One-Fifth of Human Genes Have Been Patented

From The National Geographic:

The study, which is reported this week in the journal Science, is the first time that a detailed map has been created to match patents to specific physical locations on the human genome. Researchers can patent genes because they are potentially valuable research tools, useful in diagnostic tests or to discover and produce new drugs. “It might come as a surprise to many people that in the U.S. patent system human DNA is treated like other natural chemical products,” said Fiona Murray, a business and science professor at the Massachusetts Institute of Technology in Cambridge, and a co-author of the study. Gene patents were central to the biotech boom of the 1980s and 1990s. The earliest gene patents were obtained around 1978 on the gene for human growth hormone. The new study reveals that more than 4,000 genes, or 20 percent of the almost 24,000 human genes, have been claimed in U.S. patents. Of the patented genes, about 63 percent are assigned to private firms and 28 percent are assigned to universities. The top patent assignee is Incyte, a Palo Alto, California-based drug company whose patents cover 2,000 human genes.

More here.

A Culture of Rapture

From The New York Times:

Fascination with the end of days is seemingly everywhere, in popular television ministries (like Pat Robertson’s), on best-seller lists (the “Left Behind” series) and even on bumper stickers (“In case of rapture, this car will be unmanned”).

What could be behind this fascination? Many church leaders and theologians, including evangelicals, give little effort to trying to interpret natural disasters and other events that might portend the end of history. The preoccupation these days stems mainly from the outsized influence of a specific, literalistic approach to biblical prophecy, called dispensationalism, which only came to occupy a dominant place in American evangelicalism relatively recently.

“Dispensationalists have never had the kind of public exposure and the kind of political power that they have now,” Mr. Weber said. As a whole, evangelical Christians are united in their belief that Jesus will come back in human form at some point in history. Where they, as well as members of other Christian groups, have differed is precisely how this will occur, depending on how each interprets a single verse in the 20th chapter of the Book of Revelation and its allusion to a 1,000-year reign by Christ.

More here.

U.S. Losing Its Competitive Edge In Science

From News.com:

A panel of experts convened by the National Academies, the nation’s leading science advisory group, called yesterday for an urgent and wide-ranging effort to strengthen scientific competitiveness.

The 20-member panel, reporting at the request of a bipartisan group in Congress, said that without such an effort the United States “could soon lose its privileged position.” It cited many examples of emerging scientific and industrial power abroad and listed 20 steps the United States should take to maintain its global lead.

“Decisive action is needed now,” the report warned, adding that the nation’s old advantages “are eroding at a time when many other nations are gathering strength.”

More here.