The Library in the New Age

Robert Darnton in the New York Review of Books:

Information is exploding so furiously around us and information technology is changing at such bewildering speed that we face a fundamental problem: How to orient ourselves in the new landscape? What, for example, will become of research libraries in the face of technological marvels such as Google?

How to make sense of it all? I have no answer to that problem, but I can suggest an approach to it: look at the history of the ways information has been communicated. Simplifying things radically, you could say that there have been four fundamental changes in information technology since humans learned to speak.

Sometime around 4000 BC, humans learned to write. Egyptian hieroglyphs go back to about 3200 BC, alphabetical writing to 1000 BC. According to scholars like Jack Goody, the invention of writing was the most important technological breakthrough in the history of humanity. It transformed mankind’s relation to the past and opened a way for the emergence of the book as a force in history.

More here.

Saturday, May 24, 2008

Orderly Universe: Evidence of God?

John Allen Paulos at ABC News:

Since writing my book “Irreligion” and some of my recent Who’s Counting columns, I’ve received a large number of e-mails from subscribers to creation science (who have recently christened themselves intelligent design theorists). Some of the notes have been polite, some vituperative, but almost all question “how order and complexity can arise out of nothing.”

…Because the seemingly inexplicable arising of order seems to be so critical to so many, however, I’ve decided to list here a few other sources for naturally occurring order in physics, math, and biology. Of course, order, complexity, entropy, randomness and related notions are clearly and utterly impossible to describe and disentangle in a column like this, but the examples below from “Irreligion” hint at some of the abstract ideas relevant to the arising of what has been called “order for free.”

Necessarily Some Order

Let me begin by noting that even about the seemingly completely disordered, we can always say something. No universe could be completely random at all levels of analysis.

In physics, this idea is illustrated by the kinetic theory of gases. There an assumption of disorder on one formal level of analysis, the random movement of gas molecules, leads to a kind of order on a higher level, the relations among variables such as temperature, pressure and volume known as the gas laws. The law-like relations follow from the lower-level randomness and a few other minimal assumptions. (This bit of physics does not mean that life has evolved simply by chance, a common mischaracterization of evolution.)
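The gas-law example can even be run as a toy simulation. The sketch below is my illustration, not Paulos’s: it samples random molecular velocities (the “disorder”) and recovers the ideal gas relation PV = NkT (the “order”) purely from the statistics; the molecule mass, particle count, volume, and temperature are assumed values chosen for realism.

```python
# Minimal sketch: lawful macroscopic order emerging from molecular randomness.
import numpy as np

k = 1.380649e-23  # Boltzmann constant, J/K
m = 4.65e-26      # mass of one N2 molecule, kg (assumed gas)
N = 1_000_000     # number of simulated molecules
V = 1.0e-3        # container volume, m^3
T = 300.0         # temperature, K

rng = np.random.default_rng(0)
# Maxwell-Boltzmann: each velocity component is Gaussian with variance kT/m.
vx = rng.normal(0.0, np.sqrt(k * T / m), size=N)

# Kinetic theory: pressure from momentum transfer is N * m * <vx^2> / V.
pressure = N * m * np.mean(vx**2) / V
print(f"simulated  P*V = {pressure * V:.4e} J")
print(f"ideal gas  NkT = {N * k * T:.4e} J")  # agrees to ~0.1%
```

Nothing in the sampling “knows” the gas laws; the relation among pressure, volume, and temperature falls out of the lower-level randomness, which is exactly Paulos’s point.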

More here.

Want to Be My Boyfriend? Please Define

Marguerite Fields in the New York Times:

Just before Valentine’s Day this year, Sunday Styles did something very unromantic: we asked college students nationwide to tell the plain truth about what love is like for them. We weren’t sure what to expect, but we thought we wouldn’t receive many essays about red roses and white tablecloths…

…Five of these essays will appear as the Modern Love column, starting today with Marguerite Fields’s winning entry, “Want to Be My Boyfriend? Please Define,” an eloquent, clear-eyed account of her generation’s often noncommittal dating scene. On the Sundays between Mother’s Day (May 11) and Father’s Day (June 15), we will publish the four runner-up essays.

Want to Be My Boyfriend? Please Define

By MARGUERITE FIELDS

RECENTLY my mother asked me to clarify what I meant when I said I was dating someone, versus when I was hooking up with someone, versus when I was seeing someone. And I had trouble answering her because the many options overlap and blur in my mind. But at one point, four years ago, I had a boyfriend. And I know he was my boyfriend because he said, “I want you to be my girlfriend,” and I said, “O.K.”

More here.  [Thanks to Margit Oberrauch.]

Israelis and Arabs Walk Into a Film …

Dave Itzkoff in the New York Times:

About eight years ago Mr. Sandler conceived of the Zohan character, an Israeli assassin who has been trained to hate and kill Arabs; exhausted by the ceaseless bloodshed, he fakes his own death and flees to New York to become a hairdresser. There he finds Jews and Arabs living together in grudging if not quite harmonious tolerance.

At the time, Mr. Sandler (who rarely if ever gives interviews to the print news media) delegated the script to Mr. Smigel, who had frequently written for him on “Saturday Night Live,” and Judd Apatow, a former roommate of Mr. Sandler’s, who was not yet the one-man comedy juggernaut of “Knocked Up” and “The 40-Year-Old Virgin” fame.

Both writers found “Zohan” a subversive, somewhat improbable assignment. “There was always this question of, can you make this movie?” Mr. Apatow said in a phone interview. “Because it is making fun of the fact that people are so mad at each other.”

After the 2001 terrorist attacks, Mr. Sandler and his screenwriters shelved the first draft of the script, which so mercilessly sent up stereotypes of its characters that, Mr. Apatow said, “it’s like a Don Rickles routine: no one doesn’t get hurt.”

More here.

Measure for Measure

From The Boston Globe:

IT’S NOT SUCH a good time to be a literary scholar.

For generations, the study of literature has been a pillar of liberal education, a prime forum for cultural self-examination, and a favorite major for students seeking deeper understanding of the human experience. But over the last decade or so, more and more literary scholars have agreed that the field has become moribund, aimless, and increasingly irrelevant not only to the concerns of the “outside world” but also to those of the world inside the ivory tower. Class enrollments and funding are down, morale is sagging, huge numbers of PhDs can’t find jobs, and books languish unpublished or unpurchased because almost no one, not even other literary scholars, wants to read them. The latest author to take the flagging pulse of the field is Yale’s William Deresiewicz. Writing recently in The Nation, he described a discipline suffering “an epochal loss of confidence” and “losing its will to live.” Deresiewicz’s alarming conclusion: “The real story of academic literary criticism today is that the profession is, however slowly, dying.”

Though the causes of the crisis are multiple and complex, I believe the dominant factor is easily identified: We literary scholars have mostly failed to generate surer and firmer knowledge about the things we study. While most other fields gradually accumulate new and durable understanding about the world, the great minds of literary studies have, over the past few decades, chiefly produced theories and speculation with little relevance to anyone but the scholars themselves. So instead of steadily building a body of solid knowledge about literature, culture, and the human condition, the field wanders in continuous circles, bending with fashions and the pronouncements of its charismatic leaders. I think there is a clear solution to this problem. Literary studies should become more like the sciences. Literature professors should apply science’s research methods, its theories, its statistical tools, and its insistence on hypothesis and proof. Instead of philosophical despair about the possibility of knowledge, they should embrace science’s spirit of intellectual optimism. If they do, literary studies can be transformed into a discipline in which real understanding of literature and the human experience builds up along with all of the words.

More here.

Jihadi Suicide Bombers: The New Wave

Ahmed Rashid in the New York Review of Books:

When Osama bin Laden decided to launch a jihad against the US and the West from his new base in Afghanistan in 1996, few took him seriously. Several developments at that time got little attention from Western governments as Afghanistan became the incubator of a new, Arab-led “global jihad” against the West. The fifteen-year-long insurrection against the Indian government in Kashmir introduced the skills of suicide bombing to South Asia. The endless civil war in Somalia eliminated any clear center of power there and freewheeling jihadist groups emerged in the chaos. The Israeli–Palestinian conflict was seen as becoming increasingly insoluble. President Clinton’s failed attempt to foster peace at the end of his administration came just as many Palestinians were beginning to embrace more extremist Islamic ideas.

Two of the books under review are so illuminating about this twilight period in the 1990s that I even wonder if September 11 could have been averted if they had been published a decade earlier. One is Omar Nasiri’s Inside the Jihad, a first-person account by a Moroccan-born spy who infiltrated Islamist groups on behalf of European intelligence organizations in the 1990s; the other is Brynjar Lia’s Architect of Global Jihad, a Norwegian scholar’s account of a top al-Qaeda strategist named Abu Mus’ab al-Suri, who was arrested in Pakistan in 2005 and handed over to the US. He is now one of the “rendered” or disappeared prisoners. Both books are about men who were trained in terrorist camps in Afghanistan in the early 1990s—when bin Laden was not even there—and who then traveled across Europe to mobilize Muslims for the emerging global jihad. The Afghan camps were providing military and technical training, ideological education, and new global networks well before al-Qaeda arrived on the scene.

Yet the young men who trained in these camps were not educated in the Islamic schools called madrasas and they were inspired less by extremist Islamic ideology than by their desires to see the world, handle weapons, and have a youthful adventure. It was a boy’s world of reality games. “I realized that I had dreamed of this moment for years,” writes Nasiri—a nom de plume.

More here.

Jefferson, Buffon and the Moose

Keith Stewart Thompson in American Scientist:

In 1781, in the midst of the American Revolutionary War, the British army rousted Thomas Jefferson from his home at Monticello. Politically unpopular, he retired from the governorship of Virginia and threw himself into writing. The result was his only book, Notes on the State of Virginia (1785), in which, among other topics, he famously defended his country against those Europeans who said that the Americas (North, South and Central) were unhealthy places populated by lesser animals and plants, compared with those of the Old World, and inhabited by peoples who were similarly weak and degenerate.

The immediate source of these libels was Histoire naturelle, générale et particulière, written in 44 quarto volumes between 1749 and 1809 by Georges-Louis Leclerc, Comte de Buffon, and his associates. A product of the age of French encyclopedists, Histoire naturelle pulled together a vast array of facts about natural history around the world. It was also the vehicle for Buffon’s many ideas about the history of the Earth and the organisms that inhabit it. Buffon had never been to the New World, but that did not prevent him from damning it. His critique carried an importance far greater than its questionable scientific value. Anything that lessened the public opinion of America—awash in foreign debt, at war with a global superpower, supplicant to the thrones of France and Spain—had political significance. Buffon had to be answered.

More here.

Does Time Run Backward in Other Universes?

From Scientific American:

The universe does not look right. That may seem like a strange thing to say, given that cosmologists have very little standard for comparison. How do we know what the universe is supposed to look like? Nevertheless, over the years we have developed a strong intuition for what counts as “natural”—and the universe we see does not qualify.

Make no mistake: cosmologists have put together an incredibly successful picture of what the universe is made of and how it has evolved. Some 14 billion years ago the cosmos was hotter and denser than the interior of a star, and since then it has been cooling off and thinning out as the fabric of space expands. This picture accounts for just about every observation we have made, but a number of unusual features, especially in the early universe, suggest that there is more to the story than we understand.

Among the unnatural aspects of the universe, one stands out: time asymmetry. The microscopic laws of physics that underlie the behavior of the universe do not distinguish between past and future, yet the early universe—hot, dense, homogeneous—is completely different from today’s—cool, dilute, lumpy. The universe started off orderly and has been getting increasingly disorderly ever since.
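Put as equations (my gloss, not the article’s notation), the tension looks like this: the microscopic laws are symmetric under reversing time, while entropy picks out a direction.

```latex
% Newtonian dynamics is time-reversal invariant:
% if x(t) solves the equation of motion, so does x(-t).
m\,\ddot{x}(t) = F\bigl(x(t)\bigr)
\quad\Longrightarrow\quad
m\,\ddot{x}(-t) = F\bigl(x(-t)\bigr)

% Yet the Boltzmann entropy of the universe only grows:
S = k_B \ln W, \qquad \frac{dS}{dt} \ge 0
```

The puzzle the article pursues is why W, the number of available microstates, was so small at the beginning.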

More here.

The Dogs of War

Raymond Bonner reviews Standard Operating Procedure by Philip Gourevitch and Errol Morris, in the New York Times Book Review:

After the abuse of the prisoners at Abu Ghraib was exposed in April 2004 by The New Yorker and “60 Minutes,” the Bush administration sought to portray the reprehensible misconduct as the work of a few bad apples. Seeming to underscore that verdict was the fact that soldiers took pictures of themselves, smiling, holding thumbs up, with the naked, dead, abused and humiliated prisoners.

Unfortunately, the truth, which emerges with painful clarity from “Standard Operating Procedure,” is that what happened at Abu Ghraib was not only tolerated but condoned and encouraged. Harsh treatment wasn’t punished; it was rewarded. When First Lt. Carolyn Wood of the Army was in charge of the interrogation center at Bagram Air Force base in Afghanistan in 2003, she established a policy that allowed prisoners to be held in solitary confinement for a month, to be stripped, shackled in painful positions, kept without sleep, bombarded with sound and light. Three prisoners were beaten to death on her watch. She was awarded a Bronze Star, one of the armed forces’ highest combat medals, promoted to captain and sent to Iraq.

More here.

Saturday Poem

It is Deep
—(don't never forget the bridge you crossed over on)
Carolyn Rodgers

Having tried to use the
witch cord
that erases the stretch of
thirty-three blocks
and tuning in the voice which
woodenly stated that the
talk box was “disconnected”

My mother, religiously girdled in
her god, slipped on some love, and
laid on my bell like a truck,
blew through my door warm wind from the south
concern making her gruff and tight-lipped
and scared
that her “baby” was starving.
she, having learned, that disconnection results from
non-payment of bill (s).

She did not
recognize the poster of the
grand le-roi (al) cat on the wall
had never even seen the books of
Black poems that I have written
thinks that I am under the influence of
“communists”
when I talk about Black as anything
other than something ugly to kill it befo it grows
in any impression she would not be
considered “relevant” or “Black”
but
there she was, standing in my room
not loudly condemning that day and
not remembering that I grew hearing her
curse the factory where she “cut uh slave”
and the cheap j-boss wouldn’t allow a union,
not remembering that I heard the tears when
they told her a high school diploma was not enough,
and here now, not able to understand, what she had
been forced to deny, still–

she pushed into my kitchen so
she could open my refrigerator to see
what I had to eat, and pressed fifty
bills in my hand saying “pay the talk bill and buy
some food; you got folks who care about you . . .”

My mother, religious-negro, proud of
having waded through a storm, is very obviously,
a sturdy Black bridge that I
crossed over, on.


Friday, May 23, 2008

Why I had to lie to my dying mother

American writer Susan Sontag was terrified of death. She beat cancer in the 1970s, and again in the 1990s, but third time around she wasn’t so lucky. In a tender account of her final illness, her son David Rieff recalls how he colluded with his mother’s fantasy that she wasn’t dying – and what this ultimately cost him after she had gone.

From The Guardian:

When my mother Susan Sontag was diagnosed in 2004 with myelodysplastic syndrome, a precursor to a rapidly progressive leukaemia, she had already survived stage IV breast cancer in 1975 that had spread into the lymph system, despite her doctors having held out little hope of her doing so, and a uterine sarcoma in 1998. ‘There are some survivors, even in the worst cancers,’ she would often say during the nearly two years she received what even for the time was an extremely harsh regime of chemotherapy for the breast cancer. ‘Why shouldn’t I be one of them?’

After that first cancer, mutilated but alive (the operation she underwent not only removed one of her breasts but the muscles of the chest wall and part of an armpit), she wrote her defiant book Illness as Metaphor. Part literary study, part polemic, it was a fervent plea to treat illness as illness, the luck of the genetic draw, and not the result of sexual inhibition, the repression of feeling, and the rest – that torrid brew of low-rent Wilhelm Reich and that mix of masochism and hubris that says that somehow people who got ill had brought it on themselves.

In the book, my mother contrasted the perennial stigma attached to cancer with the romanticising of tuberculosis in 19th-century literature (La bohème and all that). In the notebooks for the book that I found after her death, I discovered one entry that stopped me cold. ‘Leukaemia,’ it read, ‘the only “clean” cancer.’ Clean illness, indeed. My poor mother: to think of what awaited her.

More here.

Grand Canyon tourist’s death-defying leap

From The Telegraph:

A crowd of onlookers gasped as the man – wearing just a pair of flipflops – risked serious injury to capture a stunning shot of the Arizona sunset.

Just minutes earlier the unknown man had been sunbathing on a rock column and had even downed a six-pack of beer.

He then tucked his camera and tripod under his arm and leapt across the 8ft gap – grasping hold of the opposing rock face with just one hand.

More here.

Why we curse

Very interesting (at least to me) article from last year by Steven Pinker in The New Republic (if the sight of curse words offends you, do NOT read on):

Fucking became the subject of congressional debate in 2003, after NBC broadcast the Golden Globe Awards. Bono, lead singer of the mega-band U2, was accepting a prize on behalf of the group and in his euphoria exclaimed, “This is really, really, fucking brilliant” on the air. The Federal Communications Commission (FCC), which is charged with monitoring the nation’s airwaves for indecency, decided somewhat surprisingly not to sanction the network for failing to bleep out the word. Explaining its decision, the FCC noted that its guidelines define “indecency” as “material that describes or depicts sexual or excretory organs or activities” and Bono had used fucking as “an adjective or expletive to emphasize an exclamation.”

Cultural conservatives were outraged. California Representative Doug Ose tried to close the loophole in the FCC’s regulations with the filthiest piece of legislation ever considered by Congress…

…The Clean Airwaves Act assumed that fucking is a participial adjective. But this is not correct. With a true adjective like lazy, you can alternate between Drown the lazy cat and Drown the cat which is lazy. But Drown the fucking cat is certainly not interchangeable with Drown the cat which is fucking.

If the fucking in fucking brilliant is to be assigned a traditional part of speech, it would be adverb, because it modifies an adjective and only adverbs can do that, as in truly bad, very nice, and really big. Yet “adverb” is the one grammatical category that Ose forgot to include in his list! As it happens, most expletives aren’t genuine adverbs, either. One study notes that, while you can say That’s too fucking bad, you can’t say That’s too very bad. Also, as linguist Geoffrey Nunberg pointed out, while you can imagine the dialogue How brilliant was it? Very, you would never hear the dialogue How brilliant was it? Fucking.

More here.

Obama and Affirmative Action

Richard D. Kahlenberg in Inside Higher Ed:

Even as Barack Obama became the presumptive Democratic presidential nominee last Tuesday, his continuing failure to win white working-class voters clouds his prospects for November. The inability to connect with noncollege educated whites also undercuts his claim to being a truly transformative candidate — a Robert F. Kennedy figure — who could significantly change the direction of the country. In the fall campaign, however, Obama’s suggestion that he may be ready to change the focus of affirmative action policies in higher education — away from race to economic class — could prove pivotal in his efforts to reach working-class whites, and revive the great hopes of Bobby Kennedy’s candidacy.

Affirmative action is a highly charged issue, which most politicians stay away from. But nothing could carry more potent symbolic value with Reagan Democrats than for Obama to end the Democratic Party’s 40 years of support for racial preferences and to argue, instead, for preferences — in college admissions and elsewhere — based on economic status. Obama needs to do something dramatic. Right now, while people inside and outside the Obama campaign are making the RFK comparison, working-class whites aren’t buying it. The results in Tuesday’s Indiana primary are particularly poignant. Obama won handily among black Hoosiers, but lost the non-college educated white vote to Hillary Clinton by 66-34 percent. Forty years earlier, by contrast, Kennedy astonished observers by forging a coalition of blacks and working class whites, the likes of which we have rarely seen since then.

Moral Blame and Memory

Dave Munger at Cognitive Daily:

Anton races home at speeds well in excess of the speed limit. He’s rushing to beat his parents home so that he can hide their anniversary present and keep it a surprise. Suddenly, he hits a slick patch and runs his car off the road and into a tree. He’s okay, but the car is totaled and his parents’ anniversary surprise is ruined.

How much is Anton to blame for the accident? If you had to rate it on a scale of 1 to 10, maybe you’d give him a 7. After all, he was just trying to do something special for his parents.

But what if instead of hiding an anniversary present, Anton was rushing home to hide his cocaine stash? Would you now say he’s more to blame for the accident? You might not when the two alternatives are placed side-by-side, but when Mark Alicke told the two versions of this story to different groups, the cocaine group rated Anton as more blameworthy than the anniversary present group.
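For the curious, the comparison Alicke ran is a standard between-subjects design. Here is a toy sketch in Python with invented ratings, not his data: each group hears one version of the story, rates Anton’s blame from 1 to 10, and a Welch t statistic tests whether the group means differ.

```python
# Hypothetical illustration of a two-group blame-rating comparison.
from statistics import mean
from math import sqrt

present_group = [5, 6, 7, 6, 5, 7, 6, 5]  # heard "anniversary present"
cocaine_group = [8, 7, 9, 8, 7, 8, 9, 8]  # heard "cocaine stash"

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va = sum((x - mean(a)) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mean(b)) ** 2 for x in b) / (len(b) - 1)
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

print(f"mean blame, present group: {mean(present_group):.2f}")
print(f"mean blame, cocaine group: {mean(cocaine_group):.2f}")
print(f"Welch t = {welch_t(cocaine_group, present_group):.2f}")
```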

Alicke’s study provided the foundation for an array of studies on the effects of social evaluations of individuals on apparently unrelated events, and even factual recollections about episodes.

But when a team led by David Pizarro addressed this question, no study had yet shown that unrelated details about a person could literally affect witnesses’ accuracy in recalling that person’s actions.

Reconsidering the One Dollar a Day Poverty Line

In the Economist:

The dollar-a-day definition of global destitution made its debut in the World Bank’s 1990 World Development Report. It was largely the discovery of Martin Ravallion, a researcher at the bank, and two co-authors, who noticed that the national poverty lines of half-a-dozen developing countries clustered around that amount. In two working papers* published this week, Mr Ravallion and two colleagues, Shaohua Chen and Prem Sangraula, revisit the dollar-a-day line in light of the bank’s new estimates of purchasing power. They also provide a new count of China’s poor.

Thanks to American inflation, $1.08 in 1993 was worth about $1.45 in 2005 money. In principle, the researchers could count the number of people living on less than this amount, converted into local money using the bank’s new PPP rates. But $1.45 a day strikes the authors as a bit high. Rather than update their poverty line, they propose to abandon it. It is time, they say, to return to first principles, repeating the exercise Mr Ravallion performed almost two decades ago, using the better, more abundant data available now.
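The two conversions in that paragraph are easy to reproduce. A sketch with assumed US CPI figures and a hypothetical PPP rate, since the excerpt supplies neither:

```python
# Updating the 1993 poverty line to 2005 dollars, then to local currency.
CPI_1993 = 144.5  # assumed US CPI-U annual average, 1993
CPI_2005 = 195.3  # assumed US CPI-U annual average, 2005

line_1993 = 1.08                             # the old $1.08-a-day line
line_2005 = line_1993 * CPI_2005 / CPI_1993  # ~$1.46, the article's "about $1.45"
print(f"${line_2005:.2f} per day in 2005 dollars")

# Expressing the line in a country's own currency uses a PPP conversion
# rate, e.g. a hypothetical 3.4 local units per PPP dollar:
ppp_rate = 3.4
print(f"{line_2005 * ppp_rate:.2f} local units per day")
```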

Between Church and State

Jeff Sharlet reviews Liberty of Conscience by Martha Nussbaum and Founding Faith Providence, Politics, and the Birth of Religious Freedom in America by Steven Waldman, in The Nation:

Waldman wins his centrist peace by dismissing Christian conservatives’ majoritarian bullying and secularists’ insistence on separation of church and state as “extremes” that can be reconciled by the former acknowledging pluralism and the latter accepting that separation is neither strict nor meant to be universal. Doing so, however, would require fundamentalists to give up the most important claim of their faith–its exclusivity–and secularists to ignore history. Significantly, Waldman pays only brief lip service to an essential development in American law, the principle of incorporation–the Fourteenth Amendment’s extension of the Bill of Rights to the states. Incorporation is the tidiest rebuttal to Justice Thomas’s antebellum legal dreams and Waldman’s contention that the protection of minority views as an essential function of separation is a “liberal fallacy.”

Incorporation, notes Nussbaum, is “settled law.” What’s still in dispute is the meaning of freedom, the value of equality, the ends that can be justified in attempting to achieve both and just what separation is good for, anyway. In other words, it’s all up for grabs. Waldman’s centrism may appear to support a mildly liberal resolution; his book is, in the end, a defense of separation of church and state, very narrowly defined. But by slighting the enduring strength of religious conservatism, suggesting that the right’s partisans and the left’s separationists are evenly matched and assuming that his relatively liberal views are the happy mean, Waldman undermines the case for real religious freedom and liberty of conscience. Founding Faith is one of those books that find friends and enemies on both the left and the right and thus declare themselves balanced, as if freedom and equality were sandwich meats to be weighed on a scale.

Friday Poem

Morning e-Validation
—how the digital age contributes to self-esteem
Anonymous

I startup my computer,
my hard drive whirs,

Vista lights and loads,
my cursor stirs,

I call up Outlook
and find I’m in the groove,

‘cause ten subjects in my junk-mail
say, “You are approved!”…

Tiger Burning Bright

Gail Tsukiyama reviews Love Marriage by V. V. Ganeshananthan in Ms. Magazine:

In spare, lyrical prose, V. V. Ganeshananthan’s debut novel tells the story of two Sri Lankan Tamil families over four generations who, despite civil war and displacement, are irrevocably joined by marriage and tradition. At the heart of the story is American-born Yalini, 22, the only child of Tamil immigrants. Her father eventually becomes a doctor, her mother a teacher; they make their new life in the United States. Even so, Yalini feels bound to “the laws of ancestry and society.”

Born during “Black July” of 1983, the beginning of the civil war between the Tamil and Sinhalese, Yalini is haunted by Sri Lanka’s political turmoil, caught between the political and social traditions of her ancestors and the modern world in which she lives. She can’t forget that in a Sri Lankan family there are only two ways to wed, in an Arranged Marriage or a Love Marriage, even though she knows that “in reality, there is a whole spectrum in between, but most of us spend years running away from the first toward the second.”

Uncertain what to do with her life, Yalini takes time off from school and travels to Toronto to help her parents care for her dying Uncle Kumaran, her mother’s older brother, who immigrated to Canada.

More here.