REGARDING A NEW HUMANISM

Salvador Pániker at Edge.org:

In 1959, C. P. Snow gave a famous lecture at Cambridge entitled “The Two Cultures and the Scientific Revolution”, lamenting the academic and professional split between the field of science and that of letters. In 1991, the literary agent John Brockman popularized the concept of the third culture to refer to the dawning of the scientist-writer and, hence, the birth of a new humanism: a humanism no longer bound to the classical sense of the term, but instead a new hybridization of the sciences and the humanities.

As far as philosophy is concerned, this new humanism should be attentive not only to the latest science, but also to as many tendencies of contemporary thought as possible. Meaning that philosophy should not remain shut up in a professional academic department, but instead participate in an interdisciplinary intersection, “in conversation”—as the late Richard Rorty would say—with all the other sciences. Philosophy needs to trace the maps of reality. The philosopher is, in the words of Plato, “he who possesses a vision of the whole (synoptikos)”: the one who organizes what is most relevant in the “stored information” (culture) and sketches out new world views (provisional, but coherent). Moreover, the initial intuition of the analytic philosophers—who were the first to point out the importance of avoiding the traps set by language—should not be thrown out altogether.

More here.

Food Science

In Cornell’s food science department, chemists, engineers, and microbiologists are working on tomorrow’s hot products, coming soon to a supermarket near you.

Beth Saulnier in Cornell Alumni Magazine:

With nutrition guidelines in constant flux, products jockeying for space on supermarket shelves, and the supply chain now thoroughly global – Hotchkiss notes that today’s undergrads can’t imagine not having Chilean grapes in January – food science is big business. The Institute of Food Technologists, the industry’s professional organization, has 22,000 members, many of whom show up for its popular Annual Meeting & Food Expo. “The application of science to food is a huge thing,” Hotchkiss says. “People don’t advertise this very much, in part because the food industry wants you to think that elves make cookies. But I’ll tell you: I’ve been to a cookie factory, and cookies are made so fast you can’t even see them go by.”

Not everyone is enthusiastic about what happens in food science labs, or in the production facilities that employ so few Keebler elves. In a January cover story in the New York Times Magazine entitled “Unhappy Meals,” best-selling author Michael Pollan condemned food scientists as creators of nutritionally unsound products that have uncoupled Americans from the benefits and pleasures of natural food. (Pollan’s most recent book, The Omnivore’s Dilemma, laments “our national eating disorder” and is highly critical of industrial food production.) “Scientists operating with the best of intentions, using the best tools at their disposal, have taught us to look at food in a way that has diminished our pleasure in eating it,” Pollan writes, “while doing little or nothing to improve our health.” Among Pollan’s credos: avoid anything highly processed, and don’t eat anything your great-great-grandmother wouldn’t recognize.

More here.

sehnsuchtsland


Does the Orient dream of the Orient too? Or is this a speciality of an Occident weary of civilisation and reason, which has long projected its needs for mysterious and meditatively abstemious serenity onto the eastern regions of the world from North Africa to China? In the Romantic period, India was singled out as the ultimate “Fernwehland” [a land of Fernweh, the longing for faraway places – the antonym of Heimweh, or homesickness], the opposite pole to the goal-oriented rationality and early capitalism of Europe, although – or perhaps because – its admirers never went there. Novalis, Jean Paul and Goethe needed no more than the Bhagavad Gita and the Upanishads to believe in and feed the myth of Arcadian Indian wholeness. Today the huge success of the Ayurveda and yoga travel industry suggests that the myth lives on.

But what does an aristocratic Muslim woman from an Emirate dynasty, a Sheikha, think about such European dreams and longings? This question arose when I found out that the Arabic translation of my first novel, “Sister and Brother”, which deals with a journey to India and its psychological consequences, had landed in the hands of the Sheikha Shamma.

more from Sign and Sight here.

pynchon’s paranoia


There’s a longing at the heart of Against the Day, a tortured desire to redeem and amend—the theme is taken up as vengeance but played out as nostalgia. Order is never restored in Pynchon’s universe, though things change: an old enemy dies ignominiously at the hands of his bodyguard, an assassin is taken unawares, third parties do away with a traitorous spy. No one takes much pleasure in these messy ends—death comes too quickly to afford the living any satisfaction. The final pages of the novel offer a frazzled sentimental tale of coupling and growing old, where antique outlaws are domesticated and matters come more or less right only in the way they go more or less wrong. The idea of time travel, though lugged in for laughs, suggests a hankering to go back and fix things (in science fiction, the theme usually turns into tragic farce—tragedy if you like science fiction, farce if you don’t). Yet when men arrive from some indefinite future, fleeing some unimaginable global catastrophe, they seem only to want to be left alone, the most pitiable of refugees.

more from VQR here.

shelley: alone, being, in utter Being


No poet was more maligned in death. To Matthew Arnold, Percy Bysshe Shelley was an “ineffectual angel, beating his luminous wings in vain”, while, in his hideous sculpture at University College, Oxford, Edward Onslow Ford portrayed him as a naked corpse on the shores of Viareggio. Shelley would have hated both depictions: no poet of the Romantic period was more intent on altering the political and physical structure of the world through an active engagement with it.

There have been some excellent biographies of Shelley, each of its time and with its own agenda: Shelley as belle-lettrist, Shelley as revolutionary, Shelley as hippy. As a result, he remains elusive – the haziest, most evanescent of the Romantics. Even those who tried to paint him from life found that his face “could never be fixed on paper”.

more from the Telegraph UK here.

A Wave of Neologisms Accepted by Webster’s

In the NYT:

No matter how odd some of the words might seem, the dictionary editors say each has the promise of sticking around in the American vocabulary.

“There will be linguistic conservatives who will turn their nose up at a word like ‘ginormous,’” said John Morse, Merriam-Webster’s president. “But it’s become a part of our language. It’s used by professional writers in mainstream publications. It clearly has staying power.”

One of those naysayers is Allan Metcalf, a professor of English at MacMurray College in Jacksonville, Ill., and the executive secretary of the American Dialect Society.

“A new word that stands out and is ostentatious is going to sink like a lead balloon,” he said. “It might enjoy a fringe existence.”

But Merriam-Webster traces ginormous back to 1948, when it appeared in a British dictionary of military slang. And in the past several years, its use has become, well, ginormous.

Window of Possibility: Why one particular photograph should be in every classroom in the world

From Orion Magazine:

We call our galaxy the Milky Way. There are at least 100 billion stars in it and our sun is one of those. A hundred billion is a big number, and humans are not evolved to appreciate numbers like that, but here’s a try: If you had a bucket with a thousand marbles in it, you would need to procure 999,999 more of those buckets to get a billion marbles. Then you’d have to repeat the process a hundred times to get as many marbles as there are stars in our galaxy.

That’s a lot of marbles.

So. The Earth is massive enough to hold all of our cities and oceans and creatures in the sway of its gravity. And the sun is massive enough to hold the Earth in the sway of its gravity. But the sun itself is merely a mote in the sway of the gravity of the Milky Way, at the center of which is a vast, concentrated bar of stars, around which the sun swings (carrying along Earth, Mars, Jupiter, Saturn, etc.) every 230 million years or so. Our sun isn’t anywhere near the center; it’s way out on one of the galaxy’s minor arms. We live beyond the suburbs of the Milky Way. We live in Nowheresville.

But still, we are in the Milky Way. And that’s a big deal, right? The Milky Way is at least a major galaxy, right?

Not really. Spiral-shaped, toothpick-shaped, sombrero-shaped—in the visible universe, at any given moment, there are hundreds of thousands of millions of galaxies. Maybe as many as 125 billion. There very well may be more galaxies in the universe than there are stars in the Milky Way.

So. Let’s say there are 100 billion stars in our galaxy. And let’s say there are 100 billion galaxies in our universe. At any given moment, then, assuming ultra-massive and dwarf galaxies average each other out, there might be 10,000,000,000,000,000,000,000 stars in the universe.  That’s 1.0 × 10 to the twenty-second power.  That’s 10 sextillion. Here’s a way of looking at it: there are enough stars in the universe that if everybody on Earth were charged with naming his or her share, we’d each get to name a trillion and a half of them.
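The excerpt’s arithmetic checks out, and can be verified in a few lines of Python (a sketch; the 100-billion figures come from the passage, while the roughly 6.6 billion world population is an assumed round number for 2007):

```python
# Back-of-the-envelope check of the excerpt's star arithmetic.

# Marbles: 999,999 more thousand-marble buckets plus the one you have = 1,000,000 buckets.
buckets = 1_000_000
assert buckets * 1_000 == 1_000_000_000          # a billion marbles
assert buckets * 1_000 * 100 == 100_000_000_000  # 100 billion, the stars in the Milky Way

# Stars: 100 billion stars per galaxy times 100 billion galaxies.
stars_per_galaxy = 100_000_000_000
galaxies = 100_000_000_000
total_stars = stars_per_galaxy * galaxies
assert total_stars == 10**22                     # 10 sextillion

# Share per person, assuming a round 2007 world population of about 6.6 billion.
world_population = 6_600_000_000
per_person = total_stars / world_population
print(f"{per_person:.2e}")                       # 1.52e+12 – about a trillion and a half each
```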

More here.

Wimbledon 2007

On Saturday, Venus Williams, last year’s leading spokeswoman for equal prize money, fittingly won the first Wimbledon women’s singles title since the success of that campaign.  The match was a lopsided one, with Williams too powerful and too fast for Frenchwoman Marion Bartoli to make an impression other than one of courage against all hope of winning.  Afterwards, Venus paid tribute to Billie Jean King for her efforts in making tennis the most visible and well-remunerated women’s sport in the world.  All to the good, and hopefully her graciousness will quiet some of the detractors who seem to surround the Williams sisters, who are, after all, perhaps the most exceptional story in sports, even a decade after their emergence on the pro tour.

Sunday, Roger Federer earned his (equal) slice of the pie, winning a stunningly well-played and epic five-set Wimbledon final against Rafael Nadal.  Together, these two men occupy a plateau far, far above the rest, and Nadal is the only man alive who can test Federer.  This picture (you may want to click on it to enlarge) should tell you how draining the effort of fending him off once again was for the Rajah, as his fans call him.  Notice anything strange about it?

Answer after the jump…

Read more »

Tuesday, July 10, 2007

Love on Campus

Why we should understand, and even encourage, a certain sort of erotic intensity between student and professor.

William Deresiewicz in The American Scholar:

The absentminded professor, that kindly old figure, is long gone. A new image has taken his place, one that bespeaks not only our culture’s hostility to the mind, but also its desperate confusion about the nature of love.

Look at recent movies about academics, and a remarkably consistent pattern emerges. In The Squid and the Whale (2005), Jeff Daniels plays an English professor and failed writer who sleeps with his students, neglects his wife, and bullies his children. In One True Thing (1998), William Hurt plays an English professor and failed writer who sleeps with his students, neglects his wife, and bullies his children. In Wonder Boys (2000), Michael Douglas plays an English professor and failed writer who sleeps with his students, has just been left by his third wife, and can’t commit to the child he’s conceived in an adulterous affair with his chancellor. Daniels’s character is vain, selfish, resentful, and immature. Hurt’s is vain, selfish, pompous, and self-pitying. Douglas’s is vain, selfish, resentful, and self-pitying. Hurt’s character drinks. Douglas’s drinks, smokes pot, and takes pills. All three men measure themselves against successful writers (two of them, in Douglas’s case; his own wife, in Daniels’s) whose presence diminishes them further. In We Don’t Live Here Anymore (2004), Mark Ruffalo and Peter Krause divide the central role: both are English professors, and both neglect and cheat on their wives, but Krause plays the arrogant, priapic writer who seduces his students, Ruffalo the passive, self-pitying failure. A Love Song For Bobby Long (2004) divides the stereotype a different way, with John Travolta as the washed-up, alcoholic English professor, Gabriel Macht as the blocked, alcoholic writer.

More here.

ON SUICIDE BOMBING

G. Sampath reviews Talal Asad's book on suicide bombing in The Hindu:

This is one book you may want to avoid reading on a plane. Its title is On Suicide Bombing. And the author is a Muslim, with an Arab name: Talal Asad.

I came to it via a lecture by the American philosopher, Judith Butler. Her subject was ‘the human condition’. She talks about the questions Asad poses in his book: Can suicide bombing be thought? What resources do we need in order to think it? I was intrigued enough by Butler’s remarks to get a copy of the book.

Asad is an anthropologist by training. As an Arab Muslim in American academia, he is uniquely placed to offer an anthropological perspective on the discourse of terrorism in liberal democracies. ‘On Suicide Bombing’ is a collection of lectures he delivered in 2006. It has three chapters: ‘Terrorism’, ‘Suicide terrorism’ and ‘Horror at suicide terrorism’.

Asad begins with the most spectacular instance of suicide terrorism in recent history, the September 11, 2001, attack in the U.S., which sparked worldwide outrage, and rightly so. The mass killing of innocents is simply wrong and condemnable. There is nothing to debate here.

Nonetheless, Asad wants us to reserve our judgment temporarily, so that we can arrive at an understanding of the moral ground from which we pass judgment.

More here.

Parting the Veil

Now is no time to give up on supporting democracy in the Middle East. But to keep supporting it, the United States must embrace Islamist moderates.

Shadi Hamid in Democracy: A Journal of Ideas:

America’s post-September 11 project to promote democracy in the Middle East has proven a spectacular failure. Today, Arab autocrats are as emboldened as ever. Egypt, Jordan, Tunisia, and others are backsliding on reform. Opposition forces are being crushed. Three of the most democratic polities in the region, Lebanon, Iraq, and the Palestinian territories, are being torn apart by violence and sectarian conflict.

Not long ago, it seemed an entirely different outcome was in the offing. As recently as late 2005, observers were hailing the “Arab spring,” an “autumn for autocrats,” and other seasonal formulations. They had cause for such optimism. On January 31, 2005, the world stood in collective awe as Iraqis braved terrorist threats to cast their ballots for the first time. That February, Egyptian President Hosni Mubarak announced multi-candidate presidential elections, another first. And that same month, after former Lebanese Prime Minister Rafiq Hariri was killed, Lebanon erupted in grief and then anger as nearly one million Lebanese took to the streets of their war-torn capital, demanding self-determination. Not long afterward, 50,000 Bahrainis – one-eighth of the country’s population – rallied for constitutional reform. The opposition was finally coming alive.

But when the Arab spring really did come, the American response provided ample evidence that while Arabs were ready for democracy, the United States most certainly was not.

More here.

Berlin’s art scene, once wild and free, is increasingly commercialized

Jeffrey Fleishman in the Los Angeles Times:

There’s a Starbucks near where the orgies used to be, and although the aura of Bohemia is distinct, things aren’t as unhinged as they were 17 years ago when punkers, pornographers, anarchists, squatters and artists of all persuasions landed amid the rust and drizzle of this liberated city.

It seems an era from a scrapbook, a time of cheap rents when everyone with a brush and a bit of brio claimed a garret. Some were talented; many were not. But they roamed the east side of a fallen wall, scavenging ideas and materials to make art and revive a naughty, creative spirit that resided here before decades of fascism and the Cold War.

The zeitgeist these days is more commercial. Galleries serve sushi amid prattle about hedge funds and economic indexes. Berlin has become a production center for works sold from Portugal to Dubai. Rents are going up. The dilettantes have departed. The foreign purveyors have nestled in. What remains is less the innocent verve of the past than an atmosphere that — although aesthetically adventurous and more open to experimentation than in most cities — has matured with a shrewd eye toward marketing.

More here.

The History Boys

In the twilight of his presidency, George W. Bush and his inner circle have been feeding the press with historical parallels: he is Harry Truman—unpopular, besieged, yet ultimately to be vindicated—while Iraq under Saddam was Europe held by Hitler. To a serious student of the past, that’s preposterous. Writing just before his untimely death, David Halberstam asserts that Bush’s “history,” like his war, is based on wishful thinking, arrogance, and a total disdain for the facts.

David Halberstam in Vanity Fair:

We are a long way from the glory days of Mission Accomplished, when the Iraq war was over before it was over—indeed before it really began—and the president could dress up like a fighter pilot and land on an aircraft carrier, and the nation, led by a pliable media, would applaud. Now, late in this sad, terribly diminished presidency, mired in an unwinnable war of their own making, and increasingly on the defensive about events which, to their surprise, they do not control, the president and his men have turned, with some degree of desperation, to history. In their view Iraq under Saddam was like Europe dominated by Hitler, and the Democrats and critics in the media are likened to the appeasers of the 1930s. The Iraqi people, shorn of their immensely complicated history, become either the people of Europe eager to be liberated from the Germans, or a little nation that great powerful nations ought to protect. Most recently in this history rummage sale—and perhaps most surprisingly—Bush has become Harry Truman.

More here.  [Thanks to Akbi Khan.]

the nose


Federico da Montefeltro has one of the most memorable noses in Western art. Thanks to the Renaissance master Piero della Francesca, whose portrait of Federico is a prize of the Uffizi Gallery in Florence, the abrupt crook of the duke’s profile is a staple of art-history texts the world over. Only the disfigured nose of the grandfather in Ghirlandaio’s Old Man with a Young Boy (ca. 1490) and, perhaps, Rembrandt’s tuberous proboscis can vie with that of Federico.

A different side view of the duke can be seen in Federico da Montefeltro and His Library, an exhibition at the Morgan Library and Museum. Double Portrait of Federico da Montefeltro and His Son Guidobaldo (ca. 1475) is the show’s centerpiece. There’s still no definitive attribution for the painting, but whoever created the picture did justice to the nobleman’s nose, making it part and parcel of Federico’s regal bearing. Sitting upright in his armor, he reads a tome by Pope Gregory and wears an expression that is equal parts erudition, refinement and arrogant power. The painting may be adulatory, but it does expose the conscious contrivance behind Federico’s image.

more from the NY Observer here.

Good and Bad Hair


Whether they realize it or not, Jolie and Pitt have wandered into the fraught zone of black hair care, particularly as it concerns black women. For centuries, the identities of African-American women have been bound up in what they’ve chosen to do with their hair: straighten it, get extensions, get a press ‘n’ curl, get a Jheri curl (yes, it’s still an option), get cornrows, grow dreadlocks, twist it, wear a weave, wear a wig, or just leave it natural. It’s a prideful question asked in the poorest homes and the toniest houses — a question from which no black female living in America is immune. Oprah Winfrey might be able to do anything she wants with her hair today, but when she first started out, she had to face the same dilemma as a lot of black women breaking into TV: whether or not to get rid of the kinks.

more from Boston Globe Ideas here.

Tranströmer: Is it true, or have I dreamt it?


In the 1989 poem “Golden Wasp,” Tomas Tranströmer provides a telling remark about his project: “We’re in the church of keeping-silence, of piety according to no letter.” Tranströmer’s particular piety requires only receptivity as an active principle of personal engagement with the world. It places images together in unexpected and beautiful ways and holds them steady enough to create unmistakable tension, even if it doesn’t always tell the reader what that tension is for.

Tranströmer, a psychotherapist as well as a poet, remains one of Sweden’s most widely translated and discussed living poets. His shortest poems are his most characteristic, and they may be his best. He has perfected a particular kind of epiphanic lyric, often in quatrains, in which nature is the active, energizing subject, and the self (if the self is present at all) is the object. Off-kilter and mystical, many of these poems approach the surreal and have an American parallel with Emily Dickinson’s slant of light: “There’s a tree walking around in the rain, / it rushes past us in the pouring grey. / It has an errand. It gathers life / out of the rain like a blackbird in an orchard” (from “The Tree and the Sky”).

more from Boston Review here.

Why most suicide bombers are Muslim and beautiful people have more daughters: Ten Politically Incorrect Truths About Human Nature

From Psychology Today:

Excerpted from Why Beautiful People Have More Daughters, by Alan S. Miller and Satoshi Kanazawa, to be published by Perigee in September 2007.

Most suicide bombers are Muslim

Suicide missions are not always religiously motivated, but according to Oxford University sociologist Diego Gambetta, editor of Making Sense of Suicide Missions, when religion is involved, the attackers are always Muslim. Why? The surprising answer is that Muslim suicide bombing has nothing to do with Islam or the Quran (except for two lines). It has a lot to do with sex, or, in this case, the absence of sex. What distinguishes Islam from other major religions is that it tolerates polygyny. By allowing some men to monopolize all women and altogether excluding many men from reproductive opportunities, polygyny creates shortages of available women. If 50 percent of men have two wives each, then the other 50 percent don’t get any wives at all. So polygyny increases competitive pressure on men, especially young men of low status. It therefore increases the likelihood that young men resort to violent means to gain access to mates.

Men like blond bombshells (and women want to look like them)

Long before TV—in 15th- and 16th-century Italy, and possibly two millennia ago—women were dyeing their hair blond. A recent study shows that in Iran, where exposure to Western media and culture is limited, women are actually more concerned with their body image, and want to lose more weight, than their American counterparts. It is difficult to ascribe the preferences and desires of women in 15th-century Italy and 21st-century Iran to socialization by media. Women’s desire to look like Barbie—young with small waist, large breasts, long blond hair, and blue eyes—is a direct, realistic, and sensible response to the desire of men to mate with women who look like her. There is evolutionary logic behind each of these features.

More here.

Instead of Making Films About the Civil Rights Era, Hollywood Has Made Excuses

From The Washington Post:

While familiar images of King are commonplace in 1960s montage sequences, Hollywood has yet to make the definitive King biopic. Indeed, of all the social, cultural and political touchstones of the baby boom generation — World War II, the Kennedy assassinations, the Vietnam War, Watergate, feminism, gay rights, AIDS and all manner of political coverups — the civil rights movement has yet to be the subject of a pivotal, defining feature film.

That the story of the most important social and political moment in this country’s history has gone untold in its dominant narrative art form is shocking on any number of levels (one being that among the movement’s most effective tactics was creating media images). Here is a chapter of American life whose legacy and ramifications — from Don Imus’s idea of humor to the decisions of the current Supreme Court — are still deeply, if painfully, felt. It’s a chapter filled with charismatic characters and compelling stories. It’s a chapter that — considering the ever-increasing number of bankable African American stars — seems not just worthy of Hollywood’s attention but positively ideal for a major movie event.

Ask studio executives why this is, and this is what you’ll hear: Black-themed films don’t play overseas. African American actors can’t open movies. American filmgoers don’t like dramas. Multi-character historical dramas are just too expensive.

More here.

Monday, July 9, 2007

Sunday, July 8, 2007

Land of Saints and Morons

Max McGuinness at The Dubliner:

There was a young man who said “God
Must think it exceedingly odd
If he finds that this tree
Continues to be
When there’s no one about in the Quad.”

“Dear Sir, your astonishment’s odd;
I am always about in the Quad
And that’s why this tree
Will continue to be
Since observed by Yours faithfully, God.”

So goes the one lasting Irish contribution to the history of philosophy. This ditty by Ronald Knox is a paraphrase of the bizarre thoughts of Bishop Berkeley, who held court in Trinity during the early 18th century. Berkeley was an idealist, more specifically an immaterialist, who denied the existence of the material world. All that truly existed for the Bishop were the contents of our own tiny minds – our perceptions. This view is summarised in the maxim esse est percipi – to be is to be perceived. Thence the bewilderment of the young man in Knox’s doggerel, anxious no doubt that were he to take his eye off his wallet, it would indeed disappear. Fear not. As long as God is around to keep an eye on things, they’ll stay right where they are. So you’d better believe in God, right? Or else He might just stop watching over that pad of yours in Ranelagh…and puff! It vanishes when you trot out to buy a pint of milk.

So next time some moon-faced spelt-chewer murmurs, “If a tree falls in the forest and no-one’s around, does it make a sound?” (‘Deepshit’ Chopra pseudo-spirituality), you can retort, “‘Twas a Mick who thought o’ that one, so ‘twas.” And there, alas, is the end; no Irishman has been so clever since. Idealism may be crackers but it is still frightfully hard to refute.

More here.