Recollections of Rorty

Via Crooked Timber, Raymond Geuss remembers Richard Rorty, in Arion:

When I arrived in Princeton during the 1970s my addiction to tea was already long-standing and very well entrenched, but I was so concerned about the quality of the water in town that I used to buy large containers of allegedly “pure” water at Davidson’s—the local supermarket, which seems now to have gone out of business. I didn’t, of course, have a car, and given the amount of tea I consumed, the transport of adequate supplies of water was a highly labor-intensive and inconvenient matter. Dick and Mary Rorty must have noticed me lugging canisters of water home, because, with characteristic generosity, they developed the habit of calling around at my rooms in 120 Prospect, often on Sunday mornings, offering to take me by car to fill my water-bottles at a hugely primitive and highly suspicious-looking outdoor water-tap on the side of a pumphouse which was operated by the Elizabethtown Water Company on a piece of waste land near the Institute Woods. This pumphouse with its copiously dripping tap was like something out of Tarkovsky’s film about Russia after a nuclear accident, Stalker, and the surrounding area was a place so sinister one half expected to be attacked by packs of dogs in the final stages of radiation sickness or by troops of feral children who had been left by their parents to fend for themselves while the parents went off to the library to finish their dissertations. On one of those Sunday mornings in that insalubrious, but somehow appropriate, landscape, Dick happened to mention that he had just finished reading Gadamer’s Truth and Method. My heart sank at this news because the way he reported it seemed to me to indicate, correctly as it turned out, that he had been positively impressed by this book. I had a premonition, which also turned out to be correct, that it would not be possible for me to disabuse him of his admiration for the work of a man whom I knew rather well as a former colleague at Heidelberg and whom I held to be a reactionary, distended wind-bag.

Spike Lee gets ready to do battle with Miracle at St Anna

From The Telegraph:

The story is Miracle at St Anna, drawn from the novel of the same name by American author James McBride. Recounting the deeds of four “Buffalo Soldiers” from the US Army’s Negro 92nd Division, who are trapped behind enemy lines in Tuscany, the book is like a Roman mosaic, piecing together different narratives to reveal the complex moral landscape of war. Lee is using native actors, speaking their native tongues, and in the scene we have just witnessed English, German and Italian rattled around the room. Of the American actors, the best known is Derek Luke, who was excellent in Phillip Noyce’s apartheid film Catch a Fire and also starred in Robert Redford’s Lions for Lambs.

At first glance, a war film might seem an unusual departure for Lee. The 51-year-old director has worked in a number of genres in his 20-year career, but he forged his reputation by tackling issues – many of them controversial – that affected modern-day African-American communities. With films such as School Daze, Do the Right Thing, Jungle Fever and Clockers, he studied conflict within the black community, interracial tension, interracial sex and the horrors of the drug trade. “Actually, people don’t realise that it was the return of the black soldiers from the Second World War that laid the foundations for the civil rights movement,” he says. “There was a new militancy happening, and at the various training bases around the country you had violent outbreaks. And these negro soldiers had guns, too, so they weren’t going to take too much!”

More here.

Regulating Evolution: How Gene Switches Make Life

From Scientific American:

For a long time, scientists certainly expected the anatomical differences among animals to be reflected in clear differences among the contents of their genomes. When we compare mammalian genomes such as those of the mouse, rat, dog, human and chimpanzee, however, we see that their respective gene catalogues are remarkably similar. The approximate number of genes in each animal’s genome (about 20,000 or so) and the relative positions of many genes have been fairly well maintained over 100 million years of evolution. That is not to say there are no differences in gene number and location. But at first glance, nothing in these gene inventories shouts out “mouse” or “dog” or “human.” When comparing the mouse and human genomes, for example, biologists are able to identify a mouse counterpart for at least 99 percent of all our genes.

In other words, we humans do not, as some once assumed, have more genes than our pets, pests, livestock or even a puffer fish. Disappointing, perhaps, but we’ll have to get over it. If humans want to understand what distinguishes animals, including ourselves, from one another, we have to look beyond genes.
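An aside on method: counterpart tallies like the 99 percent figure above are typically produced by ortholog mapping, and the simplest criterion is the “reciprocal best hit”: a gene in one genome and a gene in the other are called counterparts when each is the other’s single best match. Here is a minimal sketch in Python, offered as a generic illustration rather than the article’s actual pipeline; the similarity function is a stand-in for a real alignment score such as one from BLAST.

    def reciprocal_best_hits(similarity, genes_a, genes_b):
        """Pair gene a with gene b only when each is the other's single
        best match across the two genomes -- the classic reciprocal-
        best-hit test for orthology. `similarity(a, b)` is a stand-in
        for a real alignment score."""
        best_in_b = {a: max(genes_b, key=lambda b: similarity(a, b)) for a in genes_a}
        best_in_a = {b: max(genes_a, key=lambda a: similarity(a, b)) for b in genes_b}
        return {(a, b) for a, b in best_in_b.items() if best_in_a[b] == a}

Real pipelines add score thresholds and handle duplicated gene families, but this one-to-one criterion is the core of such cross-genome counts.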

More here.

Wednesday Poem

///
The Mushroom
Robert Bly

This white mushroom comes up through the duffy
lith on a granite cliff, in a crack that ice has widened.
The most delicate light tan, it has the texture of a rubber
ball left in the sun too long. To the fingers it feels a
little like the tough heel of a foot.

One split has gone deep into it, dividing it into two
half-spheres, and through the cut one can peek inside,
where the flesh is white and gently naive.

The mushroom has a traveller’s face. We know there
are men and women in Old People’s Homes whose souls
prepare now for a trip, which will also be a marriage.
There must be travellers all around us supporting us whom
we do not recognize. This granite cliff also travels. Do we
know more about our wife’s journey or our dearest friends’
than the journey of this rock? Can we be sure which
traveller will arrive first, or when the wedding will be?
Everything is passing away except the day of this wedding.

///

Tuesday, May 13, 2008

Fafblog Interviews Hillary Clinton

For those of you who haven’t noticed, Fafblog, perhaps the greatest blog in the history of the blogosphere, returned on April 1st after a long hiatus. Fafnir:

FAFBLOG: Wow, Hillary Clinton, right here on our little blog! Well, we don’t want to waste your time so let’s cut to the chase! Why should we vote for you for president?
HILLARY CLINTON: One word, Fafnir: experience. I have thirty-five years of experience working for change, building a list of accomplishments so lengthy and impressive no one else even knows what they are. Why, I could go on for hours just about the policies I advanced as First Lady, from critical legislation like the Mumble-Something Act to my efforts to bring peace to the troubled region of Upper McDonaldland.
FB: And millions of Americans still enjoy the benefits of your successful health care plan in some distant parallel universe!
CLINTON: That’s right, Fafnir. No one has more experience failing to fix health care than me. I worked in the White House for eight years failing to fix health care, and as president I’ll make failing to fix health care my number one priority.
FB: Well that sounds pretty good, Hillary Clinton, but what if I wanna vote for someone with even more experience, like John McCain or Zombie Strom Thurmond or Andrew Jackson’s collection of antique spittoons? Those spittoons have been in the White House for a long time an I hear they got a formidable command of foreign policy.
CLINTON: Ha haaa! Well you know, anyone off the street with a scary black pastor can talk about change, but it takes a fighter to fight for change. And I’m a fighter. I’m tough. And if you lived my life you’d be pretty darn tough too. I mean, I had to go to Wellesley. That was my safety school. But I was strong anyway and I endured. And as president I’ll fight the insurance industry and the pharmaceutical industry and the health care industry, just as soon as they stop giving me millions of dollars!
FB: That’s that no-nonsense down-to-business style I like about you, Hillary Clinton! You don’t just talk about change. You talk about how much you don’t just talk about change!

Fusion 2.0

Over at Cosmos, Robin McKie looks at the International Thermonuclear Experimental Reactor:

Recreating the fusion process clearly offers great rewards, but it is not an easy task – to say the least. In particular, the business of containing a cloud of plasma that has been heated to around 100 million degrees Celsius has taxed the imaginations of a great many scientists. You can’t hold super-hot matter in an old bathtub, after all. In the end, it took the ingenuity of Russian scientists Igor Tamm and Andrei Sakharov, working at Moscow’s Kurchatov Institute in the late 1950s, to come up with the answer: the tokamak.

The key feature of a tokamak is its central chamber which is shaped like a giant, hollow doughnut or torus, and which gives the device its name. Abbreviating the Russian ‘TOroidalnaya KAmera v MAgnitnykh Katushkakh’ results in ‘tokamak’, or its similar English equivalent, TOroidal CHAmber in MAgnetic Coils (tochamac). Powerful electric currents are passed through coils that wind round the doughnut-shaped chamber and through the plasma inside it, creating a twisting magnetic field that holds the super-hot plasma in a tight, invisible grip.
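To see roughly where that magnetic “grip” comes from: for an ideal torus, the field on the magnetic axis is B = mu0 * N * I / (2 * pi * R), where N * I is the total ampere-turns wound around the chamber and R is the major radius. A minimal sketch in Python, using publicly quoted ITER-scale figures (our assumption, not numbers from the article):

    from math import pi

    MU0 = 4 * pi * 1e-7  # vacuum permeability, in T*m/A

    def toroidal_field(ampere_turns, major_radius_m):
        """Field on the axis of an ideal torus: B = mu0 * N * I / (2 * pi * R)."""
        return MU0 * ampere_turns / (2 * pi * major_radius_m)

    # Assumed ITER-scale values (not from the article): 18 toroidal-field
    # coils of 134 turns each carrying 68 kA, around a 6.2 m major radius.
    n_i = 18 * 134 * 68_000
    print(f"B = {toroidal_field(n_i, 6.2):.1f} tesla")  # roughly 5.3 T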

However, massive amounts of electricity are needed to create this unseen container, and to date, far more energy has been spent powering up tokamaks than has been released through the resulting fusion of atoms. For example, JET soaks up 25 megawatts of electrical power to generate only 16 megawatts of fusion power. ITER, however – which will be the biggest tokamak reactor ever built when completed – is scheduled to have an output of 500 megawatts for an input of only 50 megawatts of electricity.
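The figure of merit hiding in those numbers is the fusion gain Q, simply fusion power out divided by heating power in, with Q = 1 marking breakeven. A two-line check of the article’s figures in Python:

    def fusion_gain(p_fusion_mw, p_input_mw):
        """Fusion gain Q: fusion power out divided by heating power in."""
        return p_fusion_mw / p_input_mw

    print(f"JET:  Q = {fusion_gain(16, 25):.2f}")   # 0.64 -- short of breakeven
    print(f"ITER: Q = {fusion_gain(500, 50):.0f}")  # 10 -- the design target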

Hillary Clinton and the Undoing of a Stereotype

Barbara Ehrenreich in The Nation:

A mere decade ago Francis Fukuyama fretted in Foreign Affairs that the world was too dangerous for the West to be entrusted to graying female leaders, whose aversion to violence was, as he established with numerous examples from chimpanzee society, “rooted in biology.” The counter-example of Margaret Thatcher, perhaps the first head of state to start a war for the sole purpose of pumping up her approval ratings, led him to concede that “biology is not destiny.” But it was still a good reason to vote for a prehistoric-style club-wielding male.

Not to worry though, Francis. Far from being the stereotypical feminist-pacifist of your imagination, the woman to get closest to the Oval Office has promised to “obliterate” the toddlers of Tehran–along, of course, with the bomb-builders and Hezbollah supporters. Earlier on, Clinton forswore even talking to presumptive bad guys, although women are supposed to be the talk addicts of the species. Watch out–was her distinctly unladylike message to Hugo Chávez, Kim Jong-Il and the rest of them–or I’ll rip you a new one.

There’s a reason it’s been so easy for men to overlook women’s capacity for aggression. As every student of Women’s Studies 101 knows, what’s called aggression in men is usually trivialized as “bitchiness” in women: men get angry; women suffer from bouts of inexplicable, hormonally driven hostility. So give Clinton credit for defying the belittling stereotype: she’s been visibly angry for months, if not decades, and it can’t all have been PMS.

But did we really need another lesson in the female capacity for ruthless aggression?

Robert Rauschenberg, 1925-2008

In the NYT:

A painter, photographer, printmaker, choreographer, onstage performer, set designer and, in later years, even a composer, Mr. Rauschenberg defied the traditional idea that an artist stick to one medium or style. He pushed, prodded and sometimes reconceived all the mediums in which he worked.

Building on the legacies of Marcel Duchamp, Kurt Schwitters, Joseph Cornell and others, he thereby helped to obscure the lines between painting and sculpture, painting and photography, photography and printmaking, sculpture and photography, sculpture and dance, sculpture and technology, technology and performance art — not to mention between art and life.

Mr. Rauschenberg was also instrumental in pushing American art onward from Abstract Expressionism, the dominant movement when he emerged during the early 1950s. He became a transformative link between artists like Jackson Pollock and Willem de Kooning and those who came next, artists identified with Pop, Conceptualism, Happenings, Process Art and other new kinds of art in which he played a signal role.

Warp Processors

Via Cosmic Variance, over at UC Riverside (also see here):

Imagine owning an automobile that can change its engine to suit your driving needs – when you’re tooling about town, it works like a super-fast sports car; when you’re hauling a heavy load, it operates like a strong, durable truck engine. While this turn-on-a-dime flexibility is impossible for cars to achieve, it is now possible for today’s computer chips.

A new, patent-pending technology called “Warp processing,” developed over the last five years by UCR’s Frank Vahid, Professor of Computer Science and Engineering, gives a computer chip the ability to improve its performance over time. The benefits of Warp processing are just being discovered by the computing industry. A range of companies including IBM, Intel and Motorola’s Freescale have already pursued licenses for the technology through UCR’s funding source, the Semiconductor Research Corporation.

Here’s how Warp processing works: When a program first runs on a microprocessor chip (such as a Pentium), the chip monitors the program to detect its most frequently executed parts. The microprocessor then automatically tries to move those parts to a special kind of chip called a field-programmable gate array, or FPGA. “An FPGA can execute some (but not all) programs much faster than a microprocessor – 10 times, 100 times, even 1,000 times faster,” explains Vahid.
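The press release describes the mechanism only at a high level, but the first step, spotting the hot region of a running program, is easy to sketch. The Python toy below is our illustration, not Vahid’s implementation; a real warp processor would go on to synthesize the hot region into FPGA logic rather than merely report it.

    from collections import Counter

    def hottest_region(trace, threshold=0.5):
        """Given a trace of executed code-region names, return the region
        that dominates the profile, if any. That region is the candidate
        a warp processor would move onto the FPGA."""
        counts = Counter(trace)
        region, hits = counts.most_common(1)[0]
        return region if hits / len(trace) >= threshold else None

    # Toy profile: an inner loop dominates execution.
    trace = ["setup"] + ["dot_product"] * 900 + ["teardown"] * 99
    print(hottest_region(trace))  # -> dot_product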

A Discussion on Modes of Philosophizing

“Should philosophy have something to say to non-philosophers? Should philosophy be pursued only by those trained in philosophy? Should academic teachers of philosophy consider themselves philosophers in virtue of the fact that they teach philosophy? And should analytic philosophers deny that continental philosophers are philosophers at all, or acknowledge that they represent different modes of philosophizing?” Jonathan Barnes, Myles Fredric Burnyeat, Raymond Geuss, and Barry Stroud debate, over at Eurozine:

[Raymond Geuss]: On what grounds is it reasonable to say that someone should not do X, e.g. should not study philosophy? In contemporary Western European societies people are, by and large, assumed to be free to engage in any activity not explicitly forbidden, and in general for an activity to be forbidden it is thought to be necessary to show that it is in some way harmful. No one else is harmed if I paint an uninteresting picture, and if an aesthetically obtuse person buys my painting, caveat emptor. On the other hand, if the building I construct falls down, indeterminately many people at some later time may well suffer, and a surgical error can be fatal to a person who is in no position to make an informed antecedent judgment about the skill of someone who offers to perform a certain operation. This gives a clear sense to the “should” in “surgery should be performed only by those with appropriate medical training”. The “should” here depends on two distinct features of this situation, first that bad surgery imposes material harm on others, and second that by giving prospective surgeons medical training one can reduce the risk that they will perform poorly. The second feature is as important as the first. If medical training really had no effect on surgical results, there would be no grounds for requiring it. So is studying philosophy really like performing surgery or practicing as a civil engineer?

Before the Revolution

In Artforum, Arthur Danto remembers the protests of 1968 at Columbia:

As I left the building, I was told by several students that I didn’t understand what was happening, that this was the revolution! Well, revolution was much in the air. How was I to know? How was anyone?

Early the next morning, the phone rang. Someone said, with great urgency, that I had to get over to campus immediately, that the black students had taken over Hamilton Hall. I asked what he thought I could do, and he answered: “Negotiate!” It was still pretty dark, and I remember seeing Mark Rudd, the leader of the Columbia chapter of Students for a Democratic Society (SDS), loping across the campus. He was heading toward Low Library—the university administration building, home to the president’s office—which I was shortly to find had been occupied by the white students who had been thrown out of Hamilton. “Are the blacks still in Hamilton?” I asked. Rudd answered, “I wish I were in there with them!” From that point on, the event becomes a blur to me. I remember a meeting at Lionel Trilling’s apartment, the gist of which was, What could we do to save the university? That was the first meeting of what came to be the Ad Hoc Faculty Group, which met throughout the crisis in the Graduate Students’ Lounge in Philosophy Hall. Living in history has, in retrospect, something of the form of a partially restored mural, in which irregular islands of painted incident are all that remain, set into a wall of blank white plaster. There is no better example of what I mean than Fabrizio’s disconnected battlefield experiences, in Stendhal’s Charterhouse of Parma, in what he afterward learns was the Battle of Waterloo.

What I did learn from the meetings of the Ad Hoc Faculty Group was how such groups move in increasingly radical directions. It was like it must have been in the French Revolution. Initially, you have moderates making impassioned but rational speeches to one another. But then the Jacobins move in and discourse takes a more and more vehement tone. At Columbia in 1968, at least, this phenomenon was the consequence of external uncertainties.

Brooks on Neural Buddhism

Robert Boyle once described the natural world as “brute and stupid.” This view gained prominence in institutions like the Royal Society and helped to disenchant the world: the question of whether there are values out there in the world, a question science cannot itself settle, was nonetheless answered by science in the negative. This criticism of science’s ostensible overreach has been made not simply by philosophers. Lawrence Krauss, for example, has recently embraced something like this view. (This issue is separate from the question of the existence of god or gods.) It seems to be part of the zeitgeist, having now made it even into the hands of David Brooks, who contorts it in his David Brooksian way, in the NYT:

This new wave of research [on the neural instantiation of transcendent experiences] will not seep into the public realm in the form of militant atheism. Instead it will lead to what you might call neural Buddhism.

If you survey the literature (and I’d recommend books by Newberg, Daniel J. Siegel, Michael S. Gazzaniga, Jonathan Haidt, Antonio Damasio and Marc D. Hauser if you want to get up to speed), you can see that certain beliefs will spread into the wider discussion.

First, the self is not a fixed entity but a dynamic process of relationships. Second, underneath the patina of different religions, people around the world have common moral intuitions. Third, people are equipped to experience the sacred, to have moments of elevated experience when they transcend boundaries and overflow with love. Fourth, God can best be conceived as the nature one experiences at those moments, the unknowable total of all there is.

In their arguments with Christopher Hitchens and Richard Dawkins, the faithful have been defending the existence of God. That was the easy debate. The real challenge is going to come from people who feel the existence of the sacred, but who think that particular religions are just cultural artifacts built on top of universal human traits. It’s going to come from scientists whose beliefs overlap a bit with Buddhism.

The human brain is a less-than-perfect device

From Newsweek:

Despite the fact that humans have been known to be eaten by bears, sharks and assorted other carnivores, we love to place ourselves at the top of the food chain. And, despite our unwavering conviction that we are smarter than the computers we invented, members of our species still rob banks with their faces wrapped in duct tape and leave copies of their resumes at the scene of the crime. Six percent of sky-diving fatalities occur due to a failure to remember to pull the ripcord, hundreds of millions of dollars are sent abroad in response to shockingly unbelievable e-mails from displaced African royalty and nobody knows what Eliot Spitzer was thinking.

Are these simply examples of a few subpar minds amongst our general brilliance? Or do all human minds work not so much like computers but as Rube Goldberg machines capable of both brilliance and unbelievable stupidity? In his new book, “Kluge: The Haphazard Construction of the Human Mind,” New York University professor Gary Marcus uses evolutionary psychology to explore the development of that “clumsy, cobbled-together contraption” we call a brain and to answer such puzzling questions as, “Why do half of all Americans believe in ghosts?” and “How can 4 million people believe they were once abducted by aliens?”

According to Marcus, while we once used our brains simply to stay alive and procreate, the modern world and its technological advances have forced evolution to keep up by adapting ancient skills for modern uses–in effect simply placing our relatively new frontal lobes (the home of memory, language, speech and error recognition) on top of our more ancient hindbrain (in charge of survival, breathing, instinct and emotion). It is Marcus’s hypothesis that evolution has resulted in a series of “good enough” but not ideal adaptations that allow us to be smart enough to invent quantum physics but not clever enough to remember where we put our wallet from one day to the next or to change our minds in the face of overwhelming evidence that our beliefs are wrong.

More here.

Two New Ways to Explore the Virtual Universe, in Vivid 3-D

From The New York Times:

The skies may be the next frontier in travel, yet not even the wealthiest space tourist can zoom out to, say, the Crab Nebula, the Trapezium Cluster or Eta Carinae, a star 100 times more massive than the Sun and 7,500 light-years away.

But those galactic destinations and thousands of others can now be toured and explored at the controls of a computer mouse, with the constellations, stars and space dust displayed in vivid detail and animated imagery across the screen. The project, the WorldWide Telescope, is the culmination of years of work by researchers at Microsoft, and the Web site and free downloadable software are available starting on Tuesday, at www.WorldWideTelescope.org.

More here.

Monday, May 12, 2008

Video Dispatch: Teatro UNAM

Teatro UNAM is a mobile theater company based in Mexico D.F. Hauling a specially engineered trailer that transforms into a stage complete with prop and wardrobe storage, the company travels through Mexico. At each stop, the company members themselves set up the stage and perform, often in rural areas where stage drama is unheard of, before driving on. The video shows one such transformation, in a high school in Mexico City, for a performance of Synge’s “The Playboy of the Western World,” under the direction of Alonso Ruizpalacios.

Sunday, May 11, 2008

todorov on 68

While a great wind of change was blowing in the social realm, political speeches breathed dogmatism and preached (often unwittingly) the imposition of dictatorship. For those who, like me, came from a land of “real socialism,” all this was a chimera.

At first glance, this heritage has almost entirely disappeared (with the exception of the peculiar popularity of French Trotskyite leaders in presidential elections). But, a few years later, the project of a violent social transformation reappeared in the doctrines dubbed neoconservative. The neoconservatives entered the corridors of power in the US and they now have influence in France, too. The permanent revolution that the 68ers used to preach has changed in its objectives but not in its nature: the eradication of the enemy is still what is called for. And often by the same people as in 1968! This is a heritage that truly does deserve to be abandoned.

more from Prospect Magazine here.

Looking at Your Brain on Ethics

Greg Miller in ScienceNOW:

Say you have a load of donated food to deliver to an orphanage in Uganda. But due to circumstances beyond your control, you’re forced to make a hard choice: give some of the children enough meals to stave off hunger for several days and let the rest go hungry, or evenly distribute a smaller amount of food so that each child feels full for just a few hours. A study published online today in Science is one of the first to investigate how the brain wrestles with such morally charged tradeoffs.

Ming Hsu, a behavioral economist now at the University of Illinois, Urbana-Champaign, and colleagues Cédric Anen and Steven Quartz at the California Institute of Technology in Pasadena, California, used functional magnetic resonance imaging (fMRI) to monitor brain activity in 26 volunteers as they grappled with a version of the orphanage conundrum.
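To get a feel for how such a tradeoff can be quantified, here is one common formalization, offered as our illustration rather than the paper’s exact measures: efficiency as the total amount of food delivered, inequity as the Gini coefficient of the allocation.

    def gini(allocation):
        """Gini coefficient: 0 means a perfectly equal split; values
        near 1 mean one recipient gets nearly everything."""
        xs = sorted(allocation)
        n, total = len(xs), sum(xs)
        weighted = sum((i + 1) * x for i, x in enumerate(xs))
        return (2 * weighted) / (n * total) - (n + 1) / n

    # A toy version of the orphanage choice, for ten children:
    concentrated = [6] * 5 + [0] * 5  # feed half the children well
    spread = [2] * 10                 # feed everyone a little, less food in total

    for name, alloc in (("concentrated", concentrated), ("spread", spread)):
        print(f"{name}: meals = {sum(alloc)}, gini = {gini(alloc):.2f}")

The concentrated split wins on efficiency (30 meals against 20) but scores a Gini of 0.5 against 0.0 for the even split, which is precisely the tension the volunteers in the scanner had to resolve.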

gondry enjoys box

NEW YORK—Director Michel Gondry has spent nearly a week developing his latest flight of artistic fancy by playing make-believe in a large corrugated cardboard box, sources close to the daring filmmaker announced Tuesday.

The 45-year-old Gondry, who directed the critically acclaimed films Eternal Sunshine Of The Spotless Mind and Be Kind Rewind, reportedly dragged the washing-machine box into the foyer of his $2.1 million Upper West Side apartment after it was discarded by a neighbor Saturday morning. Using only a crayon and his imagination, Gondry was able to effortlessly transform the box into a submarine, a spaceship, and a castle.

He also reportedly turned the box into a super-secret fort.

more from The Onion here.

Psychological Sources of the Self

Karen Wright in Psychology Today:

It starts innocently enough, perhaps the first time you recognize your own reflection.

You’re not yet 2 years old, brushing your teeth, standing on your steppy stool by the bathroom sink, when suddenly it dawns on you: That foam-flecked face beaming back from the mirror is you.

You. Yourself. Your very own self.

It’s a revelation—and an affliction. Human infants have no capacity for self-awareness. Then, between 18 and 24 months of age, they become conscious of their own thoughts, feelings, and sensations—thereby embarking on a quest that will consume much of their lives. For many modern selves, the first shock of self-recognition marks the beginning of a lifelong search for the one “true” self and for a feeling of behaving in accordance with that self that can be called authenticity.

A hunger for authenticity guides us in every age and aspect of life. It drives our explorations of work, relationships, play, and prayer. Teens and twentysomethings try out friends, fashions, hobbies, jobs, lovers, locations, and living arrangements to see what fits and what’s “just not me.” Midlifers deepen commitments to career, community, faith, and family that match their self-images, or feel trapped in existences that seem not their own. Elders regard life choices with regret or satisfaction based largely on whether they were “true” to themselves.

Questions of authenticity determine our regard for others, as well. They dominated the presidential primaries: Was Hillary authentic when she shed a tear in New Hampshire? Was Obama earnest when his speechwriters cribbed lines from a friend’s oration?