the avedon era


In 1960 Richard Avedon photographed the poet W. H. Auden on St. Mark’s Place, New York, in the middle of a snow storm. A few passers-by and buildings are visible to the left of the frame but the blizzard is in the process of freezing Auden in the midst of what is meteorologically termed “a white-out.” Avedon had by then already patented his signature approach to portraiture, so it is tempting to see this picture as a God-given endorsement of his habit of isolating people against a sheer expanse of white, as evidence that his famously severe technique is less a denial of naturalism than its apotheosis. Auden is shown full-length, bundled up in something that seems a cross between an old-fashioned English duffle coat and a prototype of the American anorak. Avedon, in this image, keeps his distance.

more from Geoff Dyer at Threepenny Review here.



“Am I unreal?” she shouted. “Am I a character who can’t possibly exist?”


Whenever Ayn Rand met someone new—an acolyte who’d traveled cross-country to study at her feet, an editor hoping to publish her next novel—she would open the conversation with a line that seems destined to go down as one of history’s all-time classic icebreakers: “Tell me your premises.” Once you’d managed to mumble something halfhearted about loving your family, say, or the Golden Rule, Rand would set about systematically exposing all of your logical contradictions, then steer you toward her own inviolable set of premises: that man is a heroic being, achievement is the aim of life, existence exists, A is A, and so forth—the whole Objectivist catechism. And once you conceded any part of that basic platform, the game was pretty much over. She’d start piecing together her rationalist Tinkertoys until the mighty Randian edifice towered over you: a rigidly logical Art Deco skyscraper, 30 or 40 feet tall, with little plastic industrialists peeking out the windows—a shining monument to the glories of individualism, the virtues of selfishness, and the deep morality of laissez-faire capitalism. Grant Ayn Rand a premise and you’d leave with a lifestyle.

more from Sam Anderson at New York Magazine here.

the best year


In truth, the essence of 1989 lies in the multiple interactions not merely of a single society and party-state, but of many societies and states, in a series of interconnected three-dimensional chess games. While the French Revolution of 1789 always had foreign dimensions and repercussions, and became an international event with the revolutionary wars, it originated as a domestic development in one large country. The European revolution of 1989 was, from the outset, an international event—and by international I mean not just the diplomatic relations between states but also the interactions of both states and societies across borders. So the lines of causation include the influence of individual states on their own societies, societies on their own states, states on other states, societies on other societies, states on other societies (for example, Gorbachev’s direct impact on East-Central Europeans), and societies on other states (for example, the knock-on effect on the Soviet Union of popular protest in East-Central Europe). These portmanteau notions of state and society have themselves to be disaggregated into groups, factions, and individuals, including unique actors such as Pope John Paul II.

more from Timothy Garton Ash at the NYRB here.

Thursday Poem

Café

In that café in a foreign town bearing a French writer’s
name I read Under the Volcano
but with diminishing interest. You should heal yourself,
I thought. I’d become a philistine.
Mexico was distant, and its vast stars
no longer shone for me. The day of the dead continued.
A feast of metaphors and light. Death played the lead.
Alongside a few patrons at the tables, assorted fates:
Prudence, Sorrow, Common Sense. The Consul, Yvonne.
Rain fell. I felt a little happiness. Someone entered,
someone left, someone finally discovered the perpetuum mobile.
I was in a free country. A lonely country.
Nothing happened, the heavy artillery lay still.
The music was indiscriminate: pop seeped
from the speakers, lazily repeating: many things will happen.
No one knew what to do, where to go, why.
I thought of you, our closeness, the scent
of your hair in early autumn.
A plane ascended from the runway
like an earnest student who believes
the ancient masters’ sayings.
Soviet cosmonauts insisted that they didn’t find
God in space, but did they look?

by Adam Zagajewski

translation: Clare Cavanagh
from Anteny (Antennas)
publisher: Znak, Kraków, 2005

Obama’s Delusion

David Bromwich in the London Review of Books:

Long before he became president, there were signs in Barack Obama of a tendency to promise things easily and compromise often. He broke a campaign vow to filibuster a bill that immunised telecom outfits against prosecution for the assistance they gave to domestic spying. He kept his promise from October 2007 until July 2008, then voted for the compromise that spared the telecoms. As president, he has continued to support their amnesty. It was always clear that Obama, a moderate by temperament, would move to the middle once elected. But there was something odd about the quickness with which his website mounted a slogan to the effect that his administration would look to the future and not the past. We all do. Then again, we don’t: the past is part of the present. Reduced to a practice, the slogan meant that Obama would rather not bring to light many illegal actions of the Bush administration. The value of conciliation outweighed the imperative of truth. He stood for ‘the things that unite not divide us’. An unpleasant righting of wrongs could be portrayed as retribution, and Obama would not allow such a misunderstanding to get in the way of his ecumenical goals.

The message about uniting not dividing was not new. It was spoken in almost the same words by Bill Clinton in 1993; and after his midterm defeat in 1994, Clinton borrowed Republican policies in softened form – school dress codes, the repeal of welfare. The Republican response was unappreciative: they launched a three-year march towards impeachment. Obama’s appeals for comity and his many conciliatory gestures have met with a uniform negative. If anything, the Republicans are treating him more roughly than Clinton.

More here.

It’s decorative gourd season, motherfuckers.

Colin Nissan in McSweeney's:

I don't know about you, but I can't wait to get my hands on some fucking gourds and arrange them in a horn-shaped basket on my dining room table. That shit is going to look so seasonal. I'm about to head up to the attic right now to find that wicker fucker, dust it off, and jam it with an insanely ornate assortment of shellacked vegetables. When my guests come over it's gonna be like, BLAMMO! Check out my shellacked decorative vegetables, assholes. Guess what season it is—fucking fall. There's a nip in the air and my house is full of mutant fucking squash.

I may even throw some multi-colored leaves into the mix, all haphazard like a crisp October breeze just blew through and fucked that shit up. Then I'm going to get to work on making a beautiful fucking gourd necklace for myself. People are going to be like, “Aren't those gourds straining your neck?” And I'm just going to thread another gourd onto my necklace without breaking their gaze and quietly reply, “It's fall, fuckfaces. You're either ready to reap this freaky-assed harvest or you're not.”

Carving orange pumpkins sounds like a pretty fitting way to ring in the season. You know what else does? Performing an all-gourd reenactment of an episode of Diff'rent Strokes—specifically the one when Arnold and Dudley experience a disturbing brush with sexual molestation.

More here. [Thanks to Anjuli Kolb.]

Democracy and Moral Conflict

Excerpt from Robert B. Talisse's new book, at the Cambridge University Press website:

Democracy is in crisis. So we are told by nearly every outlet of political comment, from politicians and pundits to academicians and ordinary citizens. This is not surprising, given that the new millennium seems to be off to a disconcerting and violent start: terrorism, genocide, torture, assassination, suicide bombings, civil war, human rights abuse, nuclear proliferation, religious extremism, poverty, climate change, environmental disaster, and strained international relations all forebode an uncertain tomorrow for democracy. Some hold that democracy is faltering because it has lost the moral clarity necessary to lead in a complicated world. Others hold that “moral clarity” means little more than moral blindness to the complexity of the contemporary world, and thus that what is needed is more reflection, self-criticism, and humility. Neither side thinks much of the other. Consequently our popular democratic politics is driven by insults, scandal, name-calling, fear-mongering, mistrust, charges of hypocrisy, and worse.

Political theorists who otherwise agree on very little share the sense that inherited categories of political analysis are no longer apt. Principles and premises that were widely accepted only a few years ago are now disparaged as part of a Cold War model that is wholly irrelevant to our post-9/11 context. An assortment of new paradigms for analysis are on offer, each promising to set matters straight and thus to ease the cognitive discomfort that comes with tumultuous times.

Despite their diversity, these approaches and methodologies tend to employ one of two general narrative strategies. On the one hand, there is the clash of civilizations account, which holds that the world is on the brink of, perhaps engaged in the early stages of, a global conflict between distinct and incompatible ways of life. On the other hand, there is the democracy deficit narrative, according to which democracy is in decline and steadily unraveling around us. Despite appearances, both narratives come in local and global versions.

More here.

What the Dog Saw

From The Guardian:

In 1984, a history graduate at the University of Toronto upped sticks and moved to Indiana. His grades weren't good enough to stay on for postgraduate work, he'd been rejected by more than a dozen advertising agencies, and his application for a fellowship “somewhere exotic” went nowhere. The only thing left was writing – but it turned out that Malcolm Gladwell knows how to write. Gladwell's journalistic trajectory from junior writer on the Indiana-based American Spectator to the doors of the New Yorker makes for a story in itself, but only after arriving at the magazine did he become established as one of the most imaginative non-fiction writers of his generation. As of last year, he had three bestsellers under his belt and was named one of Time magazine's 100 most influential people. Gladwell owes his success to the trademark brand of social psychology he honed over a decade at the magazine. His confident, optimistic pieces on the essence of genius, the flaws of multinational corporations and the quirks of human behaviour have been devoured by businessmen in search of a new guru. His skill lies in turning dry academic hunches into compelling tales of everyday life: why we buy this or that; why we place trust in flakey ideas; why we are hopeless at joining the dots between cause and effect. He is the master of pointing out the truths under our noses (even if they aren't always the whole truth).

Gladwell's latest book, What the Dog Saw, bundles together his favourite articles from the New Yorker since he joined as a staff writer in 1996. It makes for a handy crash course in the world according to Gladwell: this is the bedrock on which his rise to popularity is built. A warning, though: it's hard to read the book without the sneaking suspicion that you're unwittingly taking part in a social experiment he's masterminded to provide grist for his next book. Times are hard, good ideas are scarce: it may just be true. But more about that later.

More here.

Why Sleepyheads Forget

From Science:

Red-eye flights, all-night study sessions, and extra-inning playoff games all deprive us of sleep and can leave us forgetful the next day. Now scientists have discovered that lost sleep disrupts a specific molecule in the brain's memory circuitry, possibly leading to treatments for tired brains. Neuroscientists studying rodents and humans have found that sleep deprivation interrupts the storage of episodic memories: information about who, what, when, and where. To lay down these memories, neurons in our brains form new connections with other neurons or strengthen old ones. This rewiring process, which occurs over a period of hours, requires a rat's nest of intertwined molecular pathways within neurons that turn genes on and off and fine-tune how proteins behave.

Neuroscientist Ted Abel of the University of Pennsylvania and colleagues wanted to untangle these molecular circuits and pinpoint which one sleep deprivation disrupts. The researchers started by studying electrical signals in slices of the hippocampus–the brain's memory center–from sleep-deprived mice. They tested for long-term potentiation (LTP), a strengthening of connections between neurons that neuroscientists think underlies memory. When the scientists tried to trigger LTP in these brain slices with electrical stimulation or chemicals, they found that methods that fired up cellular pathways involving the molecule cyclic adenosine monophosphate (cAMP) didn't work. Brain cells from sleep-deprived mice also held about 50% less cAMP than did cells from well-rested mice. In the brain, cAMP acts as a molecular messenger, passing signals between proteins that regulate activity of genes responsible for memory formation.

More here.

Wednesday, October 21, 2009

what should a robot look like?


The robots are coming. We’ve heard this claim frequently over the past 30 years: that someday soon robots will be ironing our clothes, washing our windows, and serving our morning coffee. In fact, the nearest we’ve come to achieving this vision of domestic automation is embodied by the iRobot Roomba, a puck-shaped robotic vacuum cleaner that does decent work on tile and hardwood, but won’t venture near pile. As a working roboticist, however, I can attest that the vision of domestic robotics is finally, if incrementally, becoming a reality. Robots will not be serving our coffee any time soon, but they will be entertaining our children and caring for our – hopefully not my – elderly relatives. And the likely form of these robots is decidedly humanoid. But what should a humanoid robot look like?

more from Karl Iagnemma at Frieze here.

Peepin’ Ain’t Easy


Alas, peak is fleeting. In its wake comes past peak: “Brightness and depth of color have begun to fade. Leaf drop has begun and will accelerate from this point.” Past peak is indicated by a that’s-all-she-wrote, see-you-in-the-spring burgundy that slowly creeps down from Canada and spreads out across the country. This shouldn’t come as a shock, I suppose. What does is how this depiction creates a sense of pressure one doesn’t normally encounter when thinking about leisure time spent outdoors. Looking at the landscape, our eyes are drawn to the trees that make peak such an appealing moment. Looking at these maps, we also see that peak is here, but we simultaneously notice that past peak is following close behind. There is, of course, a melancholy to the sight of changing foliage. We know the transformation reveals the trees’ hunkering down for a long winter. We face those same cold, dark months. The fall foliage map charts this somber process, but it has a melancholy all its own. Looking at foliage, you’re watching summer disappear. Looking at the maps, you’re watching the same thing happen to fall.

more from Jesse Smith at The Smart Set here.

Wrapping Up


Weighing in early on what academics call “periodization” is a dicey proposition. If you try to locate the moment of a major paradigm shift, in the moment, perhaps by calling your album “Hip Hop Is Dead,” as Nas did in 2006, you’re slipping into weatherman territory. Will it rain tomorrow? Will another great rap album pop up? The life spans of genres and art forms are best perceived from the distance of ten or twenty years, if not more. With that in mind, I still suspect that Nas—along with a thousand bloggers—was not fretting needlessly. If I had to pick a year for hip-hop’s demise, though, I would choose 2009, not 2006. Jay-Z’s new album, “The Blueprint 3,” and some self-released mixtapes by Freddie Gibbs are demonstrating, in almost opposite ways, that hip-hop is no longer the avant-garde, or even the timekeeper, for pop music. Hip-hop has relinquished the controls and splintered into a variety of forms. The top spot is not a particularly safe perch, and every vital genre eventually finds shelter lower down, with an organic audience, or moves horizontally into combination with other, sturdier forms. Disco, it turns out, is always a good default move.

more from Sasha Frere-Jones at The New Yorker here.

Lazy male spiders avoid dinner date

From Nature:

Male spiders that saunter onto a female's web after a rival has spent hours wooing her can quickly copulate without being prematurely eaten by the female. This tactic could lead to small spider suitors seeking out competition with larger rival spiders rather than avoiding it, Canadian researchers say.

The Australian redback spider (Latrodectus hasselti), a member of the black widow family, has a particularly deadly mating ritual. It is one of only a handful of spider species in which the males willingly and actively assist the females with sexual cannibalism — in which the female consumes the male after copulation. In the process of mating, the tiny redback male, whose 4-millimetre body is dwarfed by that of the centimetre-long female, inserts one of his two penis-like organs into one of the female's two sperm-storage sacs. The male then somersaults to place his abdomen over the female's mouthparts, and the female starts eating him as they mate.

More here.

In Shift, Cancer Society Has Concerns on Screenings

Gina Kolata in The New York Times:

The American Cancer Society, which has long been a staunch defender of most cancer screening, is now saying that the benefits of detecting many cancers, especially breast and prostate, have been overstated. It is quietly working on a message, to put on its Web site early next year, to emphasize that screening for breast and prostate cancer and certain other cancers can come with a real risk of overtreating many small cancers while missing cancers that are deadly. “We don’t want people to panic,” said Dr. Otis Brawley, chief medical officer of the cancer society. “But I’m admitting that American medicine has overpromised when it comes to screening. The advantages to screening have been exaggerated.”

Prostate cancer screening has long been problematic. The cancer society, which with more than two million volunteers is one of the nation’s largest voluntary health agencies, does not advocate testing for all men. And many researchers point out that the PSA prostate cancer screening test has not been shown to prevent prostate cancer deaths. There has been much less public debate about mammograms. Studies from the 1960s to the 1980s found that they reduced the death rate from breast cancer by up to 20 percent. The cancer society’s decision to reconsider its message about the risks as well as potential benefits of screening was spurred in part by an analysis published Wednesday in The Journal of the American Medical Association, Dr. Brawley said.

More here.

Wednesday Poem

Red Stands Out

Think of the tropisms of hyacinth blooms
canted toward morning in Florida. Think

of slouching pole barns in August in Ohio.
Think of the terrain of exhausted summer

and certain varieties of Rome apple, rust—
the seasonally blistered hands of laborers.

Think of a pair of male cardinals preening
in shade, safe from the blur of knife-blades.

Think of ribboned trees marked for clearing
and the lipstick shivers of crimson brushfires.

Think of blood come to absolve the world
of its chief sin: loving all the wrong things.

Think of neural flarings in the cranial dark
mapped as signature moments on an MRI.

Red stands out. Who can trust bland white
when purity has fallen so far out of fashion?

And blue: so inseparable from sky as to be
ceded to the celestial clockwork of the literal.

Think of red as a definition of transcendence:
that intemperate slit-skirt she made magnificent

teaching you to tango in a bar in Buenos Aires,
the aria of her lies as sweet as won money.

by Roy Bentley

from Magnolia, Oct. 20, 2009

An Open Letter to Bill Maher on Vaccinations

Michael Shermer in Skepticblog:

Dear Bill,

Years ago you invited me to appear as a fellow skeptic several times on your ABC show Politically Incorrect, and I have ever since shared your skepticism on so many matters important to both of us: creationism and intelligent design, religious supernaturalism and New Age paranormal piffle, 9/11 “truthers”, Obama “birthers”, and all manner of conspiratorial codswallop. On these matters, and many others, you rightly deserved the Richard Dawkins Award from Richard’s foundation, which promotes reason and science.

However, I believe that when it comes to alternative medicine in general and vaccinations in particular you have fallen prey to the same cognitive biases and conspiratorial thinking that you have so astutely identified in others. In fact, the very principle of how vaccinations work is additional proof (as if we needed more) against the creationists that evolution happened and that natural selection is real: vaccinations work by tricking the body’s immune system into thinking that it has already had the disease for which the vaccination was given. Our immune system “adapts” to the invading pathogens and “evolves” to fight them, such that when it encounters a biologically similar pathogen (which itself may have evolved) it has in its armory the weapons needed to fight it. This is why many of us born in the 1950s and before may already have some immunity against the H1N1 flu because of its genetic similarity to earlier influenza viruses, and why many of those born after really should get vaccinated.

Vaccinations are not 100% effective, nor are they risk free. But the benefits far outweigh the risks, and when communities in the U.S. and the U.K. in recent years have foregone vaccinations in large numbers, herd immunity is lost and communicable diseases have come roaring back. This is yet another example of evolution at work, but in this case it is working against us.

More here.

The Music Genome Project

Rob Walker in the New York Times Magazine:

On first listen, some things grab you for their off-kilter novelty. Like the story of a company that has hired a bunch of “musicologists,” who sit at computers and listen to songs, one at a time, rating them element by element, separating out what sometimes comes to hundreds of data points for a three-minute tune. The company, an Internet radio service called Pandora, is convinced that by pouring this information through a computer into an algorithm, it can guide you, the listener, to music that you like. The premise is that your favorite songs can be stripped to parts and reverse-engineered.

Some elements that these musicologists (who, really, are musicians with day jobs) codify are technical, like beats per minute, or the presence of parallel octaves or block chords. Someone taking apart Gnarls Barkley’s “Crazy” documents the prevalence of harmony, chordal patterning, swung 16ths and the like. But their analysis goes beyond such objectively observable metrics. To what extent, on a scale of 1 to 5, does melody dominate the composition of “Hey Jude”? How “joyful” are the lyrics? How much does the music reflect a gospel influence? And how “busy” is Stan Getz’s solo in his recording of “These Foolish Things”? How emotional? How “motion-inducing”? On the continuum of accessible to avant-garde, where does this particular Getz recording fall?

There are more questions for every voice, every instrument, every intrinsic element of the music. And there are always answers, specific numerical ones. It can take 20 minutes to amass the data for a single tune. This has been done for more than 700,000 songs, by 80,000 artists. “The Music Genome Project,” as this undertaking is called, is the back end of Pandora.
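Walker's description of Pandora's back end, songs hand-scored attribute by attribute and then matched by an algorithm, lends itself to a tiny sketch. The snippet below is a minimal illustration under invented assumptions, not Pandora's actual code or scoring scheme: the attribute names, the scores, and the cosine-similarity ranking are all stand-ins for whatever the Music Genome Project really uses.

```python
# A hypothetical sketch of the idea behind the Music Genome Project:
# each song becomes a vector of hand-rated attributes (1-5 scales), and
# recommendations are the songs whose vectors lie closest to a seed song.
# The attribute names and scores below are invented for illustration.

import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Invented "genome" scores: melody dominance, joyfulness, gospel influence, busyness.
songs = {
    "Hey Jude":             [5, 4, 2, 2],
    "Crazy":                [4, 3, 4, 3],
    "These Foolish Things": [3, 2, 1, 4],
}

def recommend(seed, library, top_n=2):
    """Rank every other song in the library by similarity to the seed song."""
    seed_vec = library[seed]
    scored = [(cosine_similarity(seed_vec, vec), title)
              for title, vec in library.items() if title != seed]
    return sorted(scored, reverse=True)[:top_n]

print(recommend("Crazy", songs))
```

Cosine similarity is just one plausible distance measure here; any nearest-neighbour scheme over the hand-rated vectors would illustrate the same reverse-engineering premise.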

More here.

Nearly universal literacy is a defining characteristic of today’s modern civilization; nearly universal authorship will shape tomorrow’s

Denis G. Pelli and Charles Bigelow in Seed:

Nearly everyone reads. Soon, nearly everyone will publish. Before 1455, books were handwritten, and it took a scribe a year to produce a Bible. Today, it takes only a minute to send a tweet or update a blog. Rates of authorship are increasing by historic orders of magnitude. Nearly universal authorship, like universal literacy before it, stands to reshape society by hastening the flow of information and making individuals more influential.

To quantify our changing reading and writing habits, we plotted the number of published authors per year, since 1400, for books and more recent social media (blogs, Facebook, and Twitter). This is the first published graph of the history of authorship. We found that the number of published authors per year increased nearly tenfold every century for six centuries. By 2000, there were 1 million book authors per year. One million authors is a lot, but they are only a tiny fraction, 0.01 percent, of the nearly 7 billion people on Earth. Since 1400, book authorship has grown nearly tenfold in each century. Currently, authorship, including books and new media, is growing nearly tenfold each year. That’s 100 times faster. Authors, once a select minority, will soon be a majority.
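The growth rates quoted above invite a quick back-of-the-envelope check. The snippet below is only a rough sketch built on the excerpt's round numbers (a million book authors per year in 2000, tenfold growth per century for books, tenfold per year once new media are included); the baseline and projection it prints are arithmetic implications of those figures, not data from the Seed article.

```python
# Back-of-the-envelope check of the growth rates quoted in the excerpt.
# Assumed round numbers, taken from the excerpt: 1,000,000 book authors per
# year in 2000, growing tenfold per century since 1400; authorship including
# new media growing tenfold per year today. Everything else is illustrative.

BOOK_AUTHORS_2000 = 1_000_000
CENTURY_GROWTH = 10   # tenfold per century (books, since 1400)
YEAR_GROWTH = 10      # tenfold per year (books plus new media, today)

# Working backwards: tenfold per century over six centuries implies the
# 1400 baseline was a millionth of the 2000 figure.
authors_1400 = BOOK_AUTHORS_2000 / CENTURY_GROWTH ** 6
print(f"Implied authors per year in 1400: {authors_1400:.0f}")

# Working forwards: at tenfold per year, authorship would cross a majority
# of a 7-billion population within a handful of years of compounding.
population = 7_000_000_000
authors = BOOK_AUTHORS_2000
years = 0
while authors < population / 2:
    authors *= YEAR_GROWTH
    years += 1
print(f"Years of tenfold-per-year growth to reach a majority: {years}")
```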

[Figure: published authors per year, 1400 to the present]

But does increasing authorship matter? And is this increase a blip or a signpost? Authorship has risen steeply before. The period of the first steep rise, near 1500, coincides with the discovery of the New World and Protestantism, which saw the publication of the first vernacular Bible, translated by Martin Luther. The second, near 1800, includes the Industrial Revolution and its backlash, Romanticism. The current rise is much steeper.

More here.