The Honor of Exile

Norman Manea in Project Syndicate:

The Romanian sculptor Brancusi once said that when the artist is no longer a child, he is dead. I still don't know how much of an artist I have become, but I grasp what Brancusi was saying. I can grasp – even at my age – my childish enduring self. Writing is a childish profession, even when it becomes excessively serious, as children often are.

My long road of immaturity began more than half a century ago. It was July 1945, a few months after I returned from a concentration camp called Transnistria. I lived that paradisiacal summer in a little Moldavian town, overwhelmed by the miraculous banality of a normal, secure environment. The particular afternoon was perfect, sunny and quiet, the room's semi-obscurity hospitable. I was alone in the universe, listening to a voice that was and wasn't mine. My partner was a book of Romanian fairy tales with a hard green cover, which I had been given a few days before when I turned the solemn age of 9.

That is when the wonder of words, the magic of literature started for me. Illness and therapy began at the same time. Soon, too soon, I wanted myself to be part of that family of word wizards, those secret relatives of mine. It was a way of searching for “something else” beyond the triviality of everyday life, and also of searching for my real self among the many individuals who inhabited me.

What Darwin Got Wrong

John Dupré reviews Fodor and Piattelli-Palmarini's What Darwin Got Wrong, in The Philosopher's Magazine:

Neo-Darwinism is, very roughly, the claim that natural selection is by far the most important explanation of biological form, the particular characteristics of particular kinds of organism. It usually includes a commitment to gradualism (the idea that evolution occurs in small steps), and often involves attributing central importance to genes as the units that natural selection selects, or at any rate as the objective measure of evolutionary change. Versions have been prominently defended in recent years by such authors as Richard Dawkins, Daniel Dennett and Jerry Coyne.

Neo-Darwinism is, however, a perspective under ever-growing pressure, not (or not only) from the antiscientific assaults of the religious, but from the advancement of science. The decline of this intellectual monolith is generally to be welcomed, not least because it may be expected to bring down with it some of its less appetising academic fellow travellers, most notably Evolutionary Psychology. At the same time those contributing to the demise of neo-Darwinism must be aware of the risk, especially in the United States, that they will provide succour for fundamentalist Creationists and aficionados of so-called Intelligent Design.

Fodor and Piattelli-Palmarini’s (henceforth FPP) book is intended as a contribution to the critical task just mentioned, and they are well aware of the potential hazards. Sadly, however, the book is an almost tragic failure: it is unlikely to be taken seriously as a contribution to the dismantling of neo-Darwinism and it has been, and will continue to be, picked up by the fundamentalist enemies of science.

The first half of the book does a decent job of summarising the recent scientific insights responsible for the growing difficulties facing neo-Darwinism. Neo-Darwinism, by virtue of its emphasis on natural selection, sees evolution as driven from outside, by the environment. Central among the difficulties that FPP emphasise are crucial respects in which evolution is constrained, or even driven, by internal features of the organism. This realisation has been promoted by evolutionary developmental biology (“evo-devo”), which has also highlighted the unacceptable black-boxing of development in mainstream evolutionary theory, a concomitant of the exclusive focus on external determinants of change. Also crucial has been a gradual move away from excessively atomistic views of organisms and an appreciation of the necessity of treating them as integrated wholes, illustrated by the impossibility of analysing the genome into a unique set of discrete elements, “genes”. And equally important has been the disclosure of the complexity of the relations between genomes and phenotypes.

While much material is presented that does indeed reveal the dire straits in which neo-Darwinism finds itself, the overall argument is generally elusive. I speculate that this is because there are two quite different conclusions in the offing.

Hard to Find

Samuel Arbesman in The Boston Globe:

If you look back on history, you get the sense that scientific discoveries used to be easy. Galileo rolled objects down slopes. Robert Hooke played with a spring to learn about elasticity; Isaac Newton poked around his own eye with a darning needle to understand color perception. It took creativity and knowledge to ask the right questions, but the experiments themselves could be almost trivial.

Today, if you want to make a discovery in physics, it helps to be part of a 10,000-member team that runs a multibillion dollar atom smasher. It takes ever more money, more effort, and more people to find out new things.

But until recently, no one actually tried to measure the increasing difficulty of discovery. It certainly seems to be getting harder, but how much harder? How fast does it change?

This type of research, studying the science of science, is in fact a field of science itself, and is known as scientometrics. Scientometrics may sound self-absorbed, a kind of inside baseball for scientists, but it matters: We spend billions of dollars annually on research, and count on science to do such things as cure cancer and master space travel, so it’s good to know what really works.

From its early days of charting the number of yearly articles published in physics, scientometrics has broadened to yield all sorts of insights about how we generate knowledge. A study of the age at which scientists receive grants from the National Institutes of Health found that over the past decades, older scientists have become far more likely to receive grants than younger ones, suggesting that perhaps younger scientists are being given fewer chances to be innovative. In another study, researchers at Northwestern University found that high-impact research results are more likely to come from collaborative teams — often spanning multiple universities — rather than from a single scientist. In other words, the days of the lone hero scientist are vanishing, and you can measure it.

Birth of a Salesman

Amitava Kumar in Guernica:

So, sadly, the dreamers and the haters are not two groups. They are often one and the same persons. —Arjun Appadurai, Fear of Small Numbers

The U.S. government’s exhibit 1002 in United States of America v. Hemant Lakhani was a document from the commissary in the Corrections Bureau in Passaic County, New Jersey. It indicated that on March 16, 2005, in the “early afternoon hours the defendant went to the commissary and notwithstanding his medical condition ordered four bags of hot buffalo chips.” That same afternoon, the defendant also purchased one bag of crunchy cheese chips. Assistant U.S. Attorney Stuart Rabner flipped through the rest of the pages of exhibit 1002. On March 21, Rabner told the U.S. Court of Appeals for the Third Circuit, the defendant had received five bags of hot buffalo chips, five bags of salty peanuts, and five bags of crunchy cheese chips. On March 28, he received one cheese pizza, and again, five bags each of hot buffalo chips, salted peanuts, and crunchy cheese chips—and five apple pies. Turning to another page, Rabner said that on April 8 the defendant had ordered five bags of hot buffalo chips, five bags of salted peanuts, and two bags of crunchy cheese chips. And then on April 11, the food items ordered were five bags of hot buffalo chips, five bags of salted peanuts, three apple pies, two honey buns, and a cheese pizza.

“The defendant’s conduct,” the prosecutor argued, “can indeed be determined to be a contributing factor to the swollen legs that he now complains about and on which basis he seeks an adjournment of this trial. He should not be allowed.”

Hemant Lakhani’s diet was under scrutiny because he had undergone three surgeries in three weeks. The trial had begun in early January, but only ten days later the defendant had needed to be hospitalized. On the morning of January 14, a deputy marshal informed the court that the defendant had been admitted the previous evening at the St. Barnabas Medical Center in New Jersey with a variety of problems: a hernia, a congenital heart condition, and renal failure. Speaking on record four days later, Lakhani’s doctor reminded the court that his patient was nearly seventy. He was probably suffering from hypertension. And it was possible that his heart needed surgical treatment. Later that week, Lakhani underwent an angioplasty and a pacemaker was inserted into his body. He was having problems with one of his knees and a rheumatologist had been pressed into service. The court couldn’t meet for three weeks because the defendant had needed time to recuperate.

Henry Klingeman, the defendant’s lawyer, stated that his client had described the jail food as “inedible,” and had complained that he wasn’t given rice, which had “been a staple of his diet for his entire life.” The commissary food was used as a “supplement” and, because he had a “sweet tooth,” he used to order apple pies.

The judge in the case, Katharine Hayden, took a considered view of the medical opinion provided to her about the defendant. She declared that Lakhani was “ready to go” and commented with some concern that the diet the defendant had chosen was “loaded with salt” and “loaded with sugar.” She noted that Lakhani had more than once refused nutritious meals consisting of salad, bread, beans, apples, cookies, and hard-boiled eggs. With good reason, the judge denied the request for an adjournment.

Judicial trials by their very nature are about acts. They concern themselves with what has actually been done by an individual or group. But the Lakhani trial from the very beginning had seemed to be about who he was rather than what he had done.

It Wasn’t a War

Kate Perkins interviews Norman Finkelstein in Guernica:

The career of radical political scholar Norman Finkelstein might be described as a sort of heroic painting-into-a-corner. The son of Holocaust survivors, Finkelstein has dedicated his life’s work to exposing the hypocrisy, ideology, and violence that sustain the Israeli occupation of Palestine. The dimensions of his emphatic anti-Zionism, expounded over the course of six meticulously researched and often polemical books on Israel, Palestine, and the legacy of the Holocaust, have made him a pariah in the mainstream and a hero amongst supporters of Palestinian liberation. The high controversy around Finkelstein’s politics has penetrated university walls on more than one occasion, making his academic career fraught with defensive, uphill battles. I first met Finkelstein in 2007, in the eye of a storm of controversy surrounding his academic status at DePaul University. Despite his prolific and highly influential body of critical scholarship—and after first having been approved for tenure at DePaul by both department and faculty committees—Finkelstein’s tenure had ultimately been denied; minority dissenters had campaigned successfully against his appointment. Flanked by a supporting cast of speakers including Tariq Ali, Tony Judt, and Noam Chomsky (via satellite), Finkelstein stood before some one thousand six hundred people in the University of Chicago’s packed Rockefeller Chapel to make the case for academic freedom. Contrary to his reputedly prickly demeanor, he appeared extraordinarily collected and calm, his heavy brow furrowing only slightly over sharp, dark eyes as he prepared to publicly address the charges against him. (The university’s final word on the matter was that Dr. Finkelstein’s reputation for outspoken criticism of Israel and of Israeli apologists like Harvard Law Professor Alan Dershowitz made Finkelstein unfit for tenure at DePaul, a school of “Vincentian values.”)

It was the culmination of a long struggle to advance his radical political critique of Israel and of the American Israeli lobby from within the academy. Now an independent scholar, Dr. Finkelstein remains a leading voice of dissent against the pro-Israel policies that underwrite an apartheid regime enforced by egregious war crimes and human rights violations. In This Time We Went Too Far: Truth and Consequences of the Gaza Invasion, his first book since departing from DePaul, he argues that Israel’s November 2008 invasion of Gaza, which decisively ended a fragile ceasefire brokered by Egypt that June, marked the beginning of an unprecedented decline in public support for Israel. The book’s epilogue is devoted to the Goldstone Report, a document authored by renowned South African jurist Richard Goldstone that describes the damning conclusions of a U.N.-commissioned investigation into the Gaza invasion, including charges of war crimes against Israel.

More here.

A System for Connecting Brains to the Outside World

From The New York Times:

About four years ago, John Donoghue’s son, Jacob, then 18, took his father aside and declared, “Dad, I now understand what you do — you’re ‘The Matrix’!” Dr. Donoghue, 61, is a professor of engineering and neuroscience at Brown University, studying how human brain signals could combine with modern electronics to help paralyzed people gain greater control over their environments. He’s designed a machine, the BrainGate, that uses thought to move objects. We spoke for two hours in his Brown University offices in Providence, R.I., and then again by telephone. An edited version of the two conversations follows:

Q. WHAT EXACTLY IS BRAINGATE?

A. It’s a way for people who’ve been paralyzed by strokes, spinal cord injuries or A.L.S. to connect their brains to the outside world. The system uses a tiny sensor that’s been implanted into the part of a person’s brain that generates movement commands. This sensor picks up brain signals and transmits them to a plug attached to the person’s scalp. The signals then go to a computer, which is programmed to translate them into simple actions.

Q. WHY MOVE THE SIGNALS OUT OF THE BODY?

A. Because for many paralyzed people, there’s been a break between their brain and the rest of their nervous system. Their brains may be fully functional, but their thoughts don’t go anywhere. What BrainGate does is bypass the broken connection. Free of the body, the signal is directed to machines that will turn thoughts into action.

More here.

to be tête-à-tête with things

André Kertész, The Fork, 1928

Why photograph inanimate objects, which neither move nor change? Set aside for the moment explorations of abstract form (Paul Strand’s flower pots, Edward Weston’s peppers) and glamorous advertisements for material luxuries (Edward Steichen’s cigarette lighters, Irving Penn’s melted brie). Many of the earliest photographs were still lifes of necessity: only statues, books, and urns could hold still long enough to leave their images on salted paper. But with the still lifes of Roger Fenton, sharpness of detail and richness of texture introduce a new note: the dusty skin of a grape puckers around the stem, a flower petal curls and darkens at the edge. Photographic still life, like painted still life, is about our sensual experience of everyday objects, and the inevitability of decay. Penn famously photographed cigarette butts and trash collected from the gutter, rotting fruit and vegetables, discarded clothes, and other examples of dead nature. The nineteenth-century art critic Théophile Thoré objected to the French term for still life, nature morte, proclaiming, “Everything is alive and moves, everything breathes in and exhales, everything is in a constant state of metamorphosis… There is no dead nature!” The Czech photographer Josef Sudek tersely echoed this thought when he said that to the photographer’s eye, “a seemingly dead object comes to life through light or by its surroundings.”

more from Imogen Sara Smith at Threepenny Review here.

righteous and wrong


Obsessed as they are with their model of a “totalitarian threat” to Enlightenment liberalism, both Berman and Hirsi Ali fail to take account of well-documented facts that would challenge their presuppositions. Berman muddles kin-patronage politics, a constant in Arab societies, with fascism. Hirsi Ali—oblivious of changes in gender roles that are occurring within more developed Muslim polities, and ignoring the way that traditional systems of authority tend to oppress women in cultures as different as China, Japan, and India—confuses Islam (a malleable religious tradition) with patriarchy (a specific set of social relationships built around masculine power). As Julien Benda himself might acknowledge, a failure to look at all the facts, however complex they may be, is a kind of intellectual betrayal, a trahison des clercs.

more from Malise Ruthven at the NYRB here.

man is not a reasoning animal


Newman was aware that he was regarded in some circles as a saint, but thought he was quite unworthy of the honour. This is just the kind of humility one needs to be canonised, though that is not why he said it. To be canonised, one has among other things to perform a posthumous miracle, and the geographical distribution of miracles (they are less common in the unbelieving north of the globe) tends to work against Anglo-Saxon candidates. One, however, has been reported in the US. One reason Newman doubted he would be canonised was that he thought ‘literary men’ like himself were not the stuff of sainthood. In this splendidly readable biography, which seems to get everything right except the first name of Archbishop McHale of Tuam, Cornwell recognises, as so many others have not, that Newman was first and foremost a writer – that his genius lay in ‘creating new ways of imagining and writing about religion’. It is a rather more illuminating approach to the cardinal than wondering whether he ever got into bed with Ambrose St John.

more from Terry Eagleton at the LRB here.

chan


Chan’s Hollywood career was launched in 1926, with a film adaptation of “The House Without a Key,” starring the Japanese actor George Kuwa, after which Chan went on to appear in forty-six more movies; he was most memorably played, in the nineteen-thirties, by a Swede named Warner Oland. He also appeared in countless comic strips and, in the nineteen-seventies, in sixteen episodes of Hanna-Barbera’s “The Amazing Chan and the Chan Clan,” which aired on CBS television on Saturday mornings and featured a dog named Chu Chu, Jodie Foster’s voice as one of Chan’s ten children, and the cri de coeur “Wham bam, we’re in a jam!” Charlie Chan is also one of the most hated characters in American popular culture. In the nineteen-eighties and nineties, distinguished American writers, including Frank Chin and Gish Jen, argued for laying Chan to rest, a yellow Uncle Tom, best buried. In trenchant essays, Chin condemned the Warner Oland movies as “parables of racial order”; Jen called Chan “the original Asian whiz kid.” In 1993, the literary scholar Elaine Kim bid Chan good riddance—“Gone for good his yellowface asexual bulk, his fortune-cookie English”—in an anthology of contemporary Asian-American fiction titled “Charlie Chan Is Dead,” which is not to be confused with the beautiful and fantastically clever 1982 Wayne Wang film, “Chan Is Missing,” and in which traces of a man named Chan are all over the place, it’s just that no one can find him anymore.

more from Jill Lepore at The New Yorker here.

life without incandescence


The hot filament of the incandescent bulb has illuminated our loved ones, our books, our rooms for so long that its glow has come to feel as natural as daylight — maybe more so, since most of us spend the majority of our waking hours indoors and accompanied by its light. But now the days of Edison’s bulb are numbered. The Energy Independence and Security Act, passed by Congress and signed into law by President George W. Bush in 2007, set strict efficiency standards for lighting that most incandescent bulbs will never be able to meet. The standards will be phased in over several years, beginning in 2012. Our familiar 100-watt bulbs will disappear from store shelves first, then 75-watt, then 60…. By 2014, almost all the incandescent light we have traditionally used in our homes will be unavailable. Similar restrictions have already begun to take effect in Europe, Australia, Brazil, and Venezuela. China has begun its phase-out of incandescence, and Canada will begin its own in 2012.

more from Jane Brox at the Boston Globe here.

Monday, August 2, 2010

Sunday, August 1, 2010

The Evolution of Cooperation

Dave Munger in Seed:

Suppose you were imprisoned in a room with no food supply except for a huge trough of maple syrup. How long do you think you could survive? Sure, the syrup would provide plenty of energy for basic bodily functions, but it would perhaps be only a few months until scurvy or other nasty diseases of malnutrition ravaged your body. Without the ability to somehow produce vitamins and amino acids necessary for survival, consuming a food composed of just sugar and a few minerals likely wouldn’t sustain you for even a year.

Yet many animals do survive on very limited diets, and they have no more ability than you do to produce the basic building blocks of life. Last week, microbiology researcher Ryan Kitko pointed out that the candy-stripe leafhopper thrives while consuming only the xylem and phloem of plants—sap. So how do sap-sucking insects like leafhoppers and aphids survive? Kitko points to two studies on a type of leafhopper commonly known as sharpshooters. Researchers found cells in sharpshooters that were jam-packed with bacteria, which converted the raw materials from sap into the vitamins and amino acids the insects need to survive.

The glassy-winged sharpshooter has two different resident bacteria, each of which creates different nutrients for the host insect from its base diet of plant sap. The bacteria are transmitted directly from the mother to her eggs, so young insects hatch with all the apparatus they need to live on plant sap alone. The bacteria, in turn, have very limited genomes. They wouldn’t be able to survive without the host insects to provide protection and a ready supply of food. In fact, the two bacteria that provide nutrients for the sharpshooter themselves have complementary genomes, each having lost formerly essential sections of their genome now found in the other. The bacteria not only produce nutrients for the host, but also depend on each other’s presence to get the nutrients they themselves need.

Rescuing the Enlightenment from its Exploiters

Tim Black reviews Tzvetan Todorov's In Defence of the Enlightenment, in Spiked:

While the Enlightenment, ‘one of the most important shifts in the history of man’ as one recent account put it, has certainly had its detractors, who blame it for anything from the Holocaust to soulless consumerism, it now also has a veritable army of self-styled heirs. Militant secularists, New Atheists, advocates of evidence-based policy, human rights champions… each constituency in their turn will draw justification from the intellectual emanations of that period beginning roughly towards the end of the seventeenth century and culminating – some say ending – in the 1789 French Revolution and its aftermath. And each in their turn will betray it.

It is not deliberate treachery. This is no reactionary dissimulation – it is more impulsive than that. Still, in the hands of the neo-Enlightened, from the zealously anti-religious to the zealously pro-science, something strange has happened. Principles that were central – albeit contested – to the Enlightenment have been reversed, turned in on themselves. Secularism, as we have seen recently in the French government’s decision to ban the burqa, has been transformed from state toleration of religious beliefs into their selective persecution; scientific knowledge, having been emancipated from theology, has now become the politician’s article of faith; even freedom itself, that integral Enlightenment impulse, has been reconceived as the enemy of the people. As the Enlightened critics of Enlightenment naivete would have it, in the symbolic shapes of our ever distending guts and CO2-belching cars, we may be a little too free.

Published in France in 2006, but only recently translated into English, philosopher Tzvetan Todorov’s In Defence of the Enlightenment is, in short, a corrective. And insofar as it offers a polite but stern rebuke to those who distort the Enlightenment project, often in its own specious name, it is a welcome corrective at that.

Not the Messiah

Alain de Botton on Auguste Comte in New Statesman:

One of the most fruitless questions that can be asked of religions is whether or not they are “true”. For the sake of argument and the flow of this article, let us simply assume from the start that they aren't true in the supernatural sense. For a certain kind of atheist, this is the end of the story; but for those of a more ethnographic bent, it is clearly only a beginning. If we made up our gods to serve psychological needs, a study of these deities will tell us a crucial amount about what we require to preserve our sanity and balance, and will raise intriguing questions about how we are fulfilling the needs to which religions once catered.

Although we tend to think of atheists as not only unbelieving but also hostile to religion, there is a minor tradition of atheistic thinkers who have attempted to reconcile suspicion of religion with a sympathy for its ritualistic aspects. The most important and inspirational of these investigations was by the visionary, eccentric and only intermittently sane French 19th-century sociologist Auguste Comte.

Comte's thinking on religion had as its starting point a characteristically blunt observation that, in the modern world, thanks to the discoveries of science, it would no longer be possible for anyone intelligent or robust to believe in God. Faith would henceforth be limited to the uneducated, the fanatical, women, children and those in the final months of incurable diseases. At the same time Comte recognised, as many of his more rational contemporaries did not, that a secular society devoted solely to financial accumulation and romantic love and devoid of any sources of consolation, transcendent awe or solidarity would be prey to untenable social and emotional ills.

Comte's solution was neither to cling blindly to sacred traditions, nor to cast them collectively and belligerently aside, but rather to pick out their more relevant and secular aspects and fuse them with certain insights drawn from philosophy, art and science. The result, the outcome of decades of thought and the summit of Comte's intellectual achievement, was a new religion: a religion for atheists, or, as he termed it, a religion of humanity.

Changing Places

D.D. Guttenplan on Christopher Hitchens, in the Nation:

Permit me, as the English say, to declare an interest. I was first told the story of the death of Yvonne Hitchens by her oldest son on the weekend of April 8, 1989. Christopher and his wife, Eleni, put us up at their house in Washington on our way to an abortion rights march. Abortion was a touchy subject with the Hitchenses, and not just because Eleni was pregnant with their second child. There had been a party in the afternoon, but the atmosphere was hardly festive. Our hosts seemed to be attempting, with limited success, to suppress a long-running quarrel. (It can't have been much more than a month later that Christopher left Eleni for Carol Blue, whom he eventually married.) As the house slowly emptied I found myself alone with Christopher, who, either because he noticed my distracted air or wanted to change the subject, soon elicited the fact that I'd spent an earlier part of the day visiting my mother in the hospital where she was undergoing treatment for cancer.

I was feeling both anxious and guilty. Christopher's response was to sit me down, fill our glasses and tell me about being summoned to Athens too late to talk his mother out of taking her life. I wasn't making notes—his apotheosis as a world-historical figure and scourge of the believers was many years in the future—so I can't recall exactly how he introduced the topic. Nor can I recall all the sordid details, though I did come away knowing that his mother's suicide in 1973 had marked him in ways he generally preferred not to consider. What I can recall was my sense of a man whose life seemed, on many levels, to be a kind of performance, allowing himself to be “off,” and to offer the only consolation he could: not cheerfulness, not competitive misery, but an acknowledgment that sometimes life just sucks. If any more evidence on that question were needed, in recent weeks the Internet has buzzed with the news that Hitchens is undergoing treatment for cancer of the esophagus, a disease, as ABC announced with barely restrained glee, “associated with smoking and drinking, habits Hitchens extolled as virtues.”

The pathetic circumstances of Yvonne Hitchens's last days have been told many times, and to many journalists. After a long, passionless marriage to a midranking officer in the Royal Navy, himself forcibly retired and working as a bookkeeper in a boys' boarding school, Yvonne fell in love with a former Anglican priest, only to have both their lives end in a suicide pact far from home. When I say that those last days have never been told so movingly, or with such filial tenderness, as in the pages of Hitch-22, you may think I am hardly an impartial witness. Fair enough. But where Hitchens is concerned, neutrality is liable to be in short supply.

How Puritans became capitalists

From The Boston Globe:

Even in down times like these, America’s economy remains remarkably productive, by far the world’s largest. At its base is a distinctive form of market-driven capitalism that was championed and shaped in Puritan-era Boston. But the rise of Boston’s economy contains a deep contradiction: The Puritans whose ethic dominated New England hated worldly things. Market pricing was considered sinful, and church communities kept a watchful, often vengeful eye on merchants. How could people who loathed market principles birth a modern market economy? That question captivated Mark Valeri after he read sermons by the fiery revivalist Jonathan Edwards that included detailed discussions of economic policy. Edwards turned out to be part of a progression of ministers who led their dour and frugal flocks down a road that would bring fabulous riches, and ultimately give rise to a culture seen as a symbol of material excess.

In his new book, “Heavenly Merchandize,” Valeri, professor of church history at Union Theological Seminary in Richmond, finds that the American economy as we know it emerged from a series of important shifts in the relationship between the Colonies and England, fomented by church leaders in both London and early Boston. In the 1630s, religious leaders often condemned basic moneymaking practices like lending money at interest; but by the 1720s, Valeri found, church leaders themselves were lauding market economics. Valeri says the shift wasn’t a case of clergymen adapting to societal changes–he found society changed after the ministers did, sometimes even decades later. Even the more open-minded ministers, however, would have been scandalized by some aspects of the modern system they helped create–particularly the idea that investors would allow their desire for profits to make decisions that would harm the broader economy.

More here.

McSweeney’s mix CD for the Obama era

From Salon:

My uncle Steve hates Barack Obama. There, I’ve said it: I’ve relayed in public the secret that we hush at family gatherings, the reason our family cannot openly celebrate and discuss the Obamas at Christmastime the way other black families do. Let me be explicit about what I am saying. When I use the word “hate,” I mean that my uncle — an African American man in his 50s who grew up in the segregated South, in Arkansas, a hundred miles from the National Guard’s 1957 standoff with nine black students outside an all-white school — this man, who ate at segregated diners, played in all-black athletic leagues, and went to all-black schools — despises the first black president of the United States. The reasons are varied: Sometimes he seems simply jealous, envious that a brother has come around in his lifetime who is — how can I put it? — superbadder than he will ever be. But my uncle, who works in Springfield, Ill., believes that Obama is just another politician with questionable ethics. He claims if the walls could talk about the real goings-on behind closed doors, Barack Obama would be in jail, and not in the White House. I must admit that I see most of the mysterious alliances or inconsistencies that pundits, scholars and my uncle cite as Obama’s failures as signs that Obama decided to go to Washington to get things done. I have no delusions about American politics. I need Obama to be a complex freedom fighter, not a saint.

That said, black folks everywhere are still figuring out what to make of this new era. In the midst of all this, I set out to compile a musical State of the Union address for the 2010 Believer music issue that embodies the spirit of these times we’re living in. We’re huddled around the TV, watching “The Boondocks” and wondering what to make of a song (from Season 3) called “Dick Riding Obama.” Some of us certainly laugh, and afterward we talk. Some of us really do feel that gross sections of the black community, and black artists in particular, are ill-informed and exploiting Obama’s platform — they are, in essence, dick-riding Obama — while others in the community are pissed-off, wondering what white folks think, and imagine they're happily whistling that little ditty. Perhaps, most important, some of us find it totally irresponsible for a black artist to make art that insinuates anything bad, dark or untoward about Obama and his legacy, while others feel it’s the black artist’s role to share his true feelings, to tell the truth to the world — right now! — precisely as he sees it, politics and niceties be damned.

More here.

Sunday Poem

Adolescence II

Although it is night, I sit in the bathroom, waiting.
Sweat prickles behind my knees, the baby-breasts are alert.
Venetian blinds slice up the moon; the tiles quiver in pale strips.

Then they come, the three seal men with eyes as round
As dinner plates and eyelashes like sharpened tines.
They bring the scent of licorice. One sits in the wash bowl,

One on the bathtub edge; one leans against the door.
“Can you feel it yet?” they whisper.
I don't know what to say, again. They chuckle,

Patting their sleek bodies with their hands.
“Well, maybe next time.” And they rise,
Glittering like pools of ink under moonlight,

And vanish. I clutch at the ragged holes
They leave behind, here at the edge of darkness.
Night rests like a ball of fur on my tongue.

by Rita Dove

the hunchback did it


Bolaño wrote a preface to Antwerp in 2002 when he found out it was finally being published. He called the preface, “Total Anarchy: Twenty-Two Years Later.” The “total anarchy” is a reference to a piece of paper tacked over Bolaño’s bed in those days, the late 70s. He’d asked a Polish friend to write ‘total anarchy’ on the scrap of paper in Polish. Maybe there is another connection to our Sophie Podolski here, our suicidal Belgian muse? This preface is like a little drink of water for the dying men who read Antwerp, I suppose. Bolaño seems to tell you a thing or two in the preface, explain the context within which he wrote his opaque novel. I read the preface three times before I realized it was a trick. He says, “I wrote this book for the ghosts, who, because they’re outside of time, are the only ones with time.” That’s a joke, man, it’s just a joke. Dead people are the only ones with time enough to sort this novel out. You’d have to be dead, and in possession of infinite time, to figure out if the hunchback really did it and what movie they are watching on that sheet hung between the trees at the campground. I think the hunchback did do it. I’m just not sure what he did.

more from me at The Owls here.