Hüsker Dü

It was early 1983, probably, after the “Everything Falls Apart” EP presaged Hüsker Dü’s departure from hard-core punk and before the “Metal Circus” EP made it official. Just a gig at a crummy club near CBGB, and late — after 1. There weren’t a dozen onlookers, but Hüsker Dü’s two early records were knockouts, and that Minneapolis trio never came east, so there we were. From our booth in back the music sounded terrific: headlong and enormous, the guitar unfashionably full, expressive and unending, with two raving vocalists alternating leads on songs whose words were hard to understand and whose tunes weren’t. Another half-dozen curious fans drifted in. And then, halfway through, the guitarist passed into some other dimension. When he stepped yowling off the low stage, most of us gravitated closer, glancing around and shaking our heads. The climax was the band’s now legendary cover of “Eight Miles High,” which transformed the Byrds’ gentle paean to the chemical-technological sublime into a roller coaster lifted screaming off its tracks — bruising and exhilarating, leaving the rider both very and barely alive.

more from Robert Christgau at the NYT here.

being human

What is human nature? A biologist might see it like this: humans are animals and, like all animals, consist mostly of a digestive tract into which they relentlessly stuff other organisms – whether animal or vegetable, pot-roasted or raw – in order to fuel their attempts to reproduce yet more such insatiable, self-replicating omnivores. The fundamentals of human nature, therefore, are the pursuit of food and sex. But that, the biologist would add, is only half the story. What makes human nature distinctive is the particular attribute that Homo sapiens uses to hunt down prey and attract potential mates. Tigers have strength, cheetahs have speed – that, if you like, is tiger nature and cheetah nature. Humans have something less obviously useful: freakishly large brains. This has made them terrifyingly inventive in acquiring other organisms to consume – and, indeed, in preparing them (what other animal serves up its prey cordon bleu?) – if also more roundabout in their reproductive strategies (composing sonnets, for example, or breakdancing). Human nature – the predilection for politics and war, industry and art – is, therefore, just the particularly brainy way that humans have evolved to solve the problems of eating and reproducing. Thus biologists believe that once they understand the human brain and the evolutionary history behind it, they will know all they need to about this ubiquitous brand of ape.

more from Stephen Cave at the FT here.

Rereading: Mildred Pierce

From Guardian:

Edmund Wilson once called James M Cain (1892-1977) one of America's “poets of the tabloid murder”. After Dashiell Hammett and Raymond Chandler, Cain is the writer most often credited with defining the “hard-boiled”, the tough-talking, fast-moving urban stories of violence, sex and money that characterised so much popular film and fiction in America during the 1930s and 40s. Unlike Hammett and Chandler, however, Cain did not focus his fiction on the consoling figure of the detective bringing a semblance of order to all that urban chaos. His novels are told from the perspective of the confused, usually ignorant, all-too-corruptible central actors in his lurid dramas of betrayal and murder. His first two novels, The Postman Always Rings Twice and Double Indemnity, were narrated by men destroyed by femmes fatales; both were made into enormously successful films, especially Billy Wilder's now-classic Double Indemnity, starring Fred MacMurray and Barbara Stanwyck in an improbable blonde wig.

In 1941, Cain published Mildred Pierce, his first novel to focus on a female protagonist; in 1945, it was duly made into a film, starring Joan Crawford in her only Oscar-winning performance, as an overprotective mother trying to cover up for her homicidal daughter. That version of Mildred Pierce is now a classic piece of stylish film noir; but its plot and tone diverge sharply from the novel, a more ostensibly “realistic” story about a divorced woman trying to raise her daughters in depression-era California. Now the film-maker Todd Haynes has returned to Cain's original text to bring us a mini-series of Mildred Pierce, with a cast including Kate Winslet in the title role, Evan Rachel Wood as the treacherous daughter and Guy Pearce. This new Mildred Pierce, produced for HBO with an apparently unlimited budget, may well be the most faithful adaptation of a book ever made: the dialogue is nearly verbatim, and the film moves painstakingly through a virtual transcription of Cain's novel. The attention to historical detail is astonishing, the performances outstanding, and the finished product is visually gorgeous, steeped in a golden sepia tone. But by the end some viewers may well be wondering what, exactly, about this story merited such reverential treatment: Cain's characterisation is uneven, to say the least, and the narrative is resolved only by means of contorted turns of the plot.

More here.

The Trouble With Common Sense

From The New York Times:

The popularity of the Mona Lisa is an illusion. As Duncan J. Watts explains: “We claim to be saying that the Mona Lisa is the most famous painting in the world because it has attributes X, Y and Z. But really what we’re saying is that the Mona Lisa is famous because it’s more like the Mona Lisa than anything else.” In other words, we are trapped inside a hall of mirrors of our own devising. We think the Mona Lisa is famous because of its traits, but we think those traits are significant only because they belong to the Mona Lisa, which we know to be famous. Ditto Shakespeare? Yes. When an incredulous English professor asked him whether he believed “Shakespeare might just be a fluke of history,” Watts indicated that he meant exactly that.

Watts doesn’t tell us how that conversation ended, but common sense does. Either the literature professor sputtered that Watts — a sociologist, physicist and former officer of the Australian Navy — had no idea what he was talking about, and left him standing with a half-empty drink in his hand, or she was quite taken with his unorthodox views and spent the rest of the evening engrossed. That both outcomes — although incompatible — strike us as predictable is actually Watts’s point in this penetrating and engaging book. We rely on common sense to understand the world, but in fact it is an endless source of just-so stories that can be tailored to any purpose. “We can skip from day to day and observation to observation, perpetually replacing the chaos of reality with the soothing fiction of our explanations,” Watts writes. Common sense is a kind of bespoke make-believe, and we can no more use it to scientifically explain the workings of the social world than we can use a hammer to understand mollusks.

More here.

Friday, June 24, 2011

Does Islam Stand Against Science?

Steve Paulson in the Chronicle of Higher Education:

Science in Muslim societies already lags far behind the scientific achievements of the West, but what adds a fair amount of contemporary angst is that Islamic civilization was once the unrivaled center of science and philosophy. What's more, Islam's “golden age” flourished while Europe was mired in the Dark Ages.

This history raises a troubling question: What caused the decline of science in the Muslim world?

Now, a small but emerging group of scholars is taking a new look at the relationship between Islam and science. Many have personal roots in Muslim or Arab cultures. While some are observant Muslims and others are nonbelievers, they share a commitment to speak out—in books, blogs, and public lectures—in defense of science. If they have a common message, it's the conviction that there's no inherent conflict between Islam and science.

More here.

The Brain on Trial

Advances in brain science are calling into question the volition behind many criminal acts. A leading neuroscientist describes how the foundations of our criminal-justice system are beginning to crumble, and proposes a new way forward for law and order.

David Eagleman in The Atlantic:

On the steamy first day of August 1966, Charles Whitman took an elevator to the top floor of the University of Texas Tower in Austin. The 25-year-old climbed the stairs to the observation deck, lugging with him a footlocker full of guns and ammunition. At the top, he killed a receptionist with the butt of his rifle. Two families of tourists came up the stairwell; he shot at them at point-blank range. Then he began to fire indiscriminately from the deck at people below. The first woman he shot was pregnant. As her boyfriend knelt to help her, Whitman shot him as well. He shot pedestrians in the street and an ambulance driver who came to rescue them.

The evening before, Whitman had sat at his typewriter and composed a suicide note:

I don’t really understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I can’t recall when it started) I have been a victim of many unusual and irrational thoughts.

By the time the police shot him dead, Whitman had killed 13 people and wounded 32 more. The story of his rampage dominated national headlines the next day. And when police went to investigate his home for clues, the story became even stranger: in the early hours of the morning on the day of the shooting, he had murdered his mother and stabbed his wife to death in her sleep.

It was after much thought that I decided to kill my wife, Kathy, tonight … I love her dearly, and she has been as fine a wife to me as any man could ever hope to have. I cannot rationa[l]ly pinpoint any specific reason for doing this …

Along with the shock of the murders lay another, more hidden, surprise: the juxtaposition of his aberrant actions with his unremarkable personal life. Whitman was an Eagle Scout and a former marine, studied architectural engineering at the University of Texas, and briefly worked as a bank teller and volunteered as a scoutmaster for Austin’s Boy Scout Troop 5. As a child, he’d scored 138 on the Stanford-Binet IQ test, placing in the 99th percentile. So after his shooting spree from the University of Texas Tower, everyone wanted answers.

For that matter, so did Whitman. He requested in his suicide note that an autopsy be performed to determine if something had changed in his brain—because he suspected it had.

More here.

Leap Seconds May Hit a Speed Bump

Sophie Bushwick in Scientific American:

NIST-F1 is one of several international atomic clocks used to define international civil time (dubbed Coordinated Universal Time, or UTC), a job they perform a little too well. In fact, atomic clocks are actually more stable than Earth's orbit—to keep clocks here synched up with the motion of celestial bodies, timekeepers have to add leap seconds. The use of a leap year, adding a day to February every four years, locks the seasons, which result from Earth's orbit about the sun and the planet's tilt as it orbits, into set places in the civil calendar. Similarly, leap seconds ensure that the time it takes Earth to spin 360 degrees is equal to one day as defined by humans and their atomic clocks. Most recently, an extra second was tacked on to universal time on December 31, 2008.
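
The “every four years” rule mentioned in the excerpt is the simplified version; the Gregorian calendar actually in civil use also skips the leap day in century years unless they are divisible by 400. A quick sketch using only Python's standard library shows the difference:

```python
import calendar

# Gregorian rule: a year is a leap year if divisible by 4,
# except century years, unless divisible by 400.
for year in (1900, 2000, 2008, 2011):
    print(year, calendar.isleap(year))
# 1900 is not a leap year (century, not divisible by 400);
# 2000 is (divisible by 400); 2008 is; 2011 is not.
```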

However, since 1999, the Radiocommunication Sector of the International Telecommunication Union (ITU) has been proposing the elimination of leap seconds from the measurement of UTC. Although the organization did not participate in the creation of the current leap second system, the radio waves it regulates are used to transmit UTC, giving it some influence.

Getting rid of leap seconds would certainly make it easier to calculate UTC, but this measure would also decouple astronomical time from civil time: The time measured by atomic clocks would gradually diverge from the time counted out by the movement of Earth through space. Eventually, one year will no longer be the length of Earth's orbit around the sun. Instead, it will be equivalent to a certain number of cycles of radiation from the cesium-133 atom (almost a billion billion cycles, to be precise).
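
The cesium figure can be made concrete with a back-of-the-envelope calculation. The SI second is defined as 9,192,631,770 cycles of the radiation from the cesium-133 atom's hyperfine transition; multiplying by a mean Gregorian year of 365.2425 days (an approximation chosen here for illustration) gives the number of cycles that would define a "year" under a purely atomic timescale:

```python
CYCLES_PER_SECOND = 9_192_631_770      # SI definition of the second (cesium-133)
SECONDS_PER_YEAR = 365.2425 * 86_400   # mean Gregorian year, in SI seconds

cycles_per_year = CYCLES_PER_SECOND * SECONDS_PER_YEAR
print(f"{cycles_per_year:.3e}")        # on the order of 10**17 cycles per year
```

Without leap seconds, this fixed cycle count would slowly drift away from the astronomical year as Earth's rotation varies.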

More here.

Mohandas and the Unicorn

Patrick French in The National Interest:

If celebrity is a mask that eats into the face, posthumous fame is more like an accretion of silt and barnacles that can leave the face unrecognizable, or recognizable only as something it is not. We might feel we know Mohandas Gandhi, Abraham Lincoln, Albert Einstein, Joan of Arc or Martin Luther King Jr., but, rather, we know their iconic value: their portraits or statues, their famous deeds and sayings. We have trouble seeing them as their contemporaries did—as people. Jawaharlal Nehru, writing in the 1930s when he was in a British prison and some distance from becoming India’s prime minister, said that Gandhi’s views on marital relationships were “abnormal and unnatural” and “can only lead to frustration, inhibition, neurosis, and all manner of physical and nervous ills. . . . I do not know why he is so obsessed by this problem of sex.” Nehru was writing publicly, in his autobiography, but it is fair to say that few Indian politicians today would speak of the Father of the Nation in this unfettered way. Gandhi has become, in India and across the world, a simplified character: a celibate, cheerful saint who wore a white loincloth and round spectacles, ate small meals and succeeded in bringing down an empire through nonviolent civil disobedience. Barack Obama, who kept a portrait of Gandhi hanging on the wall of his Senate office, is fond of citing him.

Joseph Lelyveld has already found himself in some trouble over Great Soul, not for what he wrote, but for what other people say he wrote. In a contemporary morality tale of high-speed information transfer and deliberate misconstruction, his book has been identified as something it is not. The Daily Mail, one of London’s lively and vituperative tabloids, ran a story saying Great Soul claimed Gandhi “was bisexual and left his wife to live with a German-Jewish bodybuilder.”

More here.

Yemeles Bibesy, Illecebrous Quagswagging, Malagrugrous Sanguinolency, etc.

Heather Carreiro in Matador Abroad:

During my undergraduate studies as a Linguistics major, one of the things that struck me most is the amazing fluidity of language. New words are created; older words go out of style. Words can change meaning over time, vowel sounds shift, consonants are lost or added and one word becomes another. Living languages refuse to be static.

The following words have sadly disappeared from modern English, but it’s easy to see how they could be incorporated into everyday conversation.

Words are from Erin McKean’s two-volume series: Weird and Wonderful Words and Totally Weird and Wonderful Words. Definitions have been quoted from the Oxford English Dictionary.

1. Jargogle

Verb trans. – “To confuse, jumble” – First of all this word is just fun to say in its various forms. John Locke used the word in a 1692 publication, writing “I fear, that the jumbling of those good and plausible Words in your Head..might a little jargogle your Thoughts…” I’m planning to use it next time my husband attempts to explain complicated Physics concepts to me for fun: “Seriously, I don’t need you to further jargogle my brain.”

2. Deliciate

Verb intr. – “To take one’s pleasure, enjoy oneself, revel, luxuriate” – Often I feel the word “enjoy” just isn’t enough to describe an experience, and “revel” tends to conjure up images of people dancing and spinning around in circles – at least in my head. “Deliciate” would be a welcome addition to the modern English vocabulary, as in “After dinner, we deliciated in chocolate cream pie.”

More here. [Thanks to Gabika Bočkaj.]

The Intelligent Homosexual’s Guide to Natural Selection and Evolution, with a Key to Many Complicating Factors

Jeremy Yoder in Scientific American:

I'm looking forward to celebrating Pride for the first time in my new hometown of Minneapolis this weekend–but as an evolutionary biologist, I suspect I have a perspective on the life and history of sexual minorities that many of my fellow partiers don't. In spite of the progress that LGBT folks have made, and seem likely to continue to make, towards legal equality, there's a popular perception that we can never really achieve biological equality. This is because same-sex sexual activity is inherently not reproductive sex. To put it baldly, as the idea is usually expressed, natural selection should be against men who want to have sex with other men–because we aren't interested in the kind of sex that makes babies. An oft-cited estimate from 1981 is that gay men have about 80 percent fewer children than straight men.

Focusing on the selective benefit or detriment associated with particular human traits and behaviors gets my scientific dander up, because it's so easy for the discussion to slip from what is “selectively beneficial” to what is “right.” A superficial understanding of what natural selection favors or doesn't favor is a horrible standard for making moral judgements. A man could leave behind a lot of children by being a thief, a rapist, and a murderer–but only a sociopath would consider that such behavior was justified by high reproductive fitness.

And yet, as an evolutionary biologist, I have to admit that my sexual orientation is a puzzle.

More here.

Biologists discover how yeast cells reverse aging

From PhysOrg:

Human cells have a finite lifespan: They can only divide a certain number of times before they die. However, that lifespan is reset when reproductive cells are formed, which is why the children of a 20-year-old man have the same life expectancy as those of an 80-year-old man.

…When yeast cells reproduce, they undergo a special type of cell division called meiosis, which produces spores. The MIT team found that the signs of cellular aging disappear at the very end of meiosis. “There’s a true rejuvenation going on,” Amon says. The researchers discovered that a gene called NDT80 is activated at the same time that the rejuvenation occurs. When they turned on this gene in aged cells that were not reproducing, the cells lived twice as long as normal. “It took an old cell and made it young again,” Amon says. In aged cells with activated NDT80, the nucleolar damage was the only age-related change that disappeared. That suggests that nucleolar changes are the primary force behind the aging process, Amon says. The next challenge, says Daniel Gottschling, a member of the Fred Hutchinson Cancer Research Center in Seattle, will be to figure out the cellular mechanisms driving those changes. “Something is going on that we don’t know about,” says Gottschling, who was not involved in this research. “It opens up some new biology, in terms of how lifespan is being reset.” The protein produced by the NDT80 gene is a transcription factor, meaning that it activates other genes. The MIT researchers are now looking for the genes targeted by NDT80, which likely carry out the rejuvenation process.

More here.

A Painter Framed

From The Daily Beast:

Maqbool Fida Husain, who died on June 8, was India’s most prominent painter—but in the last year of his life, he had become a national of Qatar, and he died in London, far from the city he loved, Mumbai. That rootlessness, in essence, captures the poignancy of the artist’s life—he became controversial, but didn’t choose to be so. He was born in pre-independence India around 1915 and lived there until the 1990s, when Hindu nationalists launched a vicious campaign against him. They were upset after a magazine found some of his old paintings and sketches, some dating back to the 1970s, which showed Hindu deities in the nude. That wasn’t really controversial; in sculptures in many ancient temples, including Khajuraho and Konarak, and in some paintings and manuscripts, Hindu deities have appeared without clothes, or wearing little.

But Husain was born a Muslim, and Hindu activists saw an opportunity to lead a sustained campaign against him. This included vigilantes damaging artworks and art galleries that showed his work in India and abroad; filing lawsuits against him throughout India for offending religious sensibilities; and attacking a television station that ran a poll among viewers asking them if Husain should be given India’s highest civilian honor, besides threatening him with violence. Instead of protecting Husain’s right of free expression, authorities filed charges under colonial-era Indian laws, which restrict freedom of expression, and judges admitted cases against him. Even after higher courts ruled in his favor, the hounding continued. Husain, who only wanted to paint, lived outside India for most of the past two decades.

Husain’s decision to leave a secular, democratic India for Qatar, an authoritarian theocracy in the Middle East, was a blot on India.

More here.

Friday Poem

High-Speed Bird

At full tilt, air gleamed –
and a window-struck kingfisher,
snatched up, lay on my palm
still beating faintly.

Slowly, a tincture
of whatever consciousness is
infused its tremor, and
ram beak wide as scissors

all hurt loganberry inside,
it crept over my knuckle
and took my outstretched finger
in its wire foot-rings.

Cobalt wings, shutting on beige
body. Gold under-eye whiskers,
beak closing in recovery
it faced outward from me.

For maybe twenty minutes
we sat together, one on one,
as if staring back or
forward into prehistory.

by Les Murray
from Taller When Prone
Publisher: Black Inc., Melbourne, © 2010

Thursday, June 23, 2011

against reviews

Who reads reviews? Occasionally a lot of people. But usually just the book’s author, if she Googles herself, plus any pals, parents, exes, etc. who also search for her. Otherwise, our only readers are our friends, who feel obligated to at least skim our boring review because we liked theirs on Facebook. Why do we prioritize some imaginary “public” over people we actually know, and who read our work? Why don’t we want to write, and read, for our friends? Perhaps we fear our freedom. If we could read and write anything we wanted, what would we read and write? Probably not book reviews. Choices would have to be made. Imagine a literary culture in which the relationship between reader and writer was as intimate and direct as the relationship between poet and patron. This would not be, and never was, a recipe for health or contentment—most marriages are unhappy. But the “passion” that Arnold thought needed to be neutralized could proudly speak its name. Why should a writer be ashamed to write for someone she knows? Why should her friends and enemies feign a lack of interest in her work? Affection, attraction, admiration, rivalry, resentment: all are aphrodisiacs, and heighten our interest in what’s before us. Nobody insists we fuck strangers—why must we read them? If the privacy of pure patronage is impossible or undesirable, the traditional courtship can be replaced by the orgy.

more from Elizabeth Gumport at n+1 here.

conjure beings and invoke spirits

Mark Grotjahn’s large new paintings abound with torrents of ropy impasto, laid down in thickets, cascading waves, and bundles that swell, braid around, or overlap one another. Noses and mouths appear in kaleidoscopic furrows. Eyes, too—sometimes in clusters, other times alone. Often these eyes are gouged out, opaque, blank, like those of some simian being or blind oracle. There are echoes of Cubism here and Vlaminck’s Fauvism, of mid-century abstraction, German and neo-Expressionism, rock painting, folk art, and fabric design. I’m tantalized by the facture and physicality of these paintings. What Grotjahn (pronounced groat-john) paints doesn’t stay put on these variegated surfaces; instead, it shifts around the involuting centerless space. You can discern the ways in which this work is made, yet no formal system appears. (I surmise that the artist himself is sometimes caught off guard by what he’s produced.) His strangely shamanic art gives me a remnant of the pow I get from those ancient eternal faces in Picasso’s Les Demoiselles d’Avignon. The winding rows of oil paint have been carefully laid on, wet-on-wet. Sometimes these lines look like colored grubs or raffia, in tones that are rich and saturated, ranging from mauve and apple to emerald and blood red. I think of magic carpets and magnetic fields. I spy networks of Martian canals and landscapes folding over themselves. I glimpse one of painting’s oldest purposes: the uncanny ability to conjure beings and invoke spirits.

more from Jerry Saltz at New York Magazine here.

who’s afraid of the late essays?

As an essayist she had cut her teeth on the outermost crusts of literary minority, reviewing the long-forgotten work of the pointlessly pseudonymous. In 1907 she was sent five dramas in verse to review: “my mind feels as though a torrent of weak tea has been poured over it”, she complained in a letter. Here we see her learning the ropes: forging an argument out of unprepossessing materials (“these stories were meant to be read swiftly on a train, and to preserve them in a book is to imprison them unkindly”); experimenting with the avuncular and the urbane and the knowingly fogeyish (Rose Macaulay’s characters “say a great many very clever things”); and mastering the delicate art of praise never quite so faint as to descend to a sneer: “It would seem inexcusably bad taste to pull such innocent work to pieces; it seems to confide in you”. The apprenticeship gave her a respect for “ingenuity and good workmanship”, however mistakenly applied. (In later life she wrote of her friend Roger Fry that, because he was a painter himself, his criticism was “full of respect and admiration for the artist who has used his gift honourably and honestly even though it is a small one”.) It also gave her a lifelong appreciation of the variegatedness of the experience of reading, and of the fact that sometimes, as she said of Frederick Marryat, “our critical faculties enjoy whetting themselves upon a book which is not among the classics”.

more from Trev Broughton at the TLS here.

Barking Island

Colin Dayan in the Boston Review:

A sound of gulls, a sunlit port, human voices, barking dogs. In a city market, dogs are sitting, lying down, walking past. Dogs gather in the center of the screen. Night falls. A dog gives birth; she nurses her babies. A constable in sharp silhouette comes and looks on as, growling, she huddles over her young.

So begins Serge Avedikian’s fifteen-minute animated film Barking Island (originally Chienne d’histoire in French), which, in 2010, won the Palme d’Or as the best short film at Cannes. The images are paintings by Thomas Azuélos, made deep and weighty, contoured yet dissolving at the edges, almost palpable.

Once the music changes, the scene shifts to humans at a long table discussing how to eliminate the dogs. Newspapers announce that there are more than 60,000 dogs on the streets of Constantinople. The Turkish authorities appeal for an end to them. After exploring various options—gassing, incineration, turning corpses into meat for human consumption—offered by the Pasteur Institute in Paris and other European experts, the Turks decide to round the dogs up and abandon them on a deserted island in the Bosporus.

More here.

Paul Krugman on Inspiration for a Liberal Economist

From The Browser:

I wanted to start by saying how pleased I am you call yourself a liberal, because there are a lot of people – politicians – who are reluctant to be associated with the word.

As I see it, there has been a lot of effective propaganda. As a result, a lot of people adopted the term “progressive” as a somehow less charged way of saying the same thing, which I don’t think works. I consider myself both – liberal and progressive. It’s not too different from what would be called a social democrat in Europe – you believe in a decent-sized welfare state, you believe that we are our brothers’ keepers. Of course I’m not a politician so I can afford to label myself in a way that might lose some votes…

The first book you’ve chosen isn’t about economics at all; it’s a work of science-fiction, Isaac Asimov’s Foundation trilogy. But was it part of what inspired you to become an economist?

Yes. This is a very unusual set of novels from Isaac Asimov, but a classic. It’s not about gadgets. Although it’s supposed to be about a galactic civilisation, the technology is virtually invisible and it’s not about space battles or anything like that. The story is about these people, psychohistorians, who are mathematical social scientists and have a theory about how society works. The theory tells them that the galactic empire is failing, and they then use that knowledge to save civilisation. It’s a great image. I was probably 16 when I read it and I thought, “I want to be one of those guys!” Unfortunately we don’t have anything like that and economics is the closest I could get.

More here.

The Scientific Case for Masturbation

Sharon Begley in Newsweek:

Since Christine “I’m Not a Witch” O’Donnell is campaigning for the U.S. Senate and not the directorship of the Kinsey Institute, maybe we should give her a pass when it comes to her views on sex and, specifically, masturbation. But that would be a mistake: the stakes are simply too high, going all the way up to the very survival of our species. For while O’Donnell crusaded against masturbation in the mid-1990s, denouncing it as “toying” with the organs of procreation and generally undermining baby making, the facts are to the contrary. Evidence from elephants to rodents to humans shows that masturbating is—counterintuitively—an excellent way to make healthy babies, and lots of them. No one who believes in the “family” part of family values can let her claims stand.

The science is straightforward. Whenever a behavior is common in the animal kingdom, biologists suspect it has an adaptive function. That is, the behavior enabled individual animals to survive better and leave more offspring than animals that did not engage in the behavior. As a result, genes for the behavior spread throughout that population until it became essentially ubiquitous. And so it is with autoeroticism, which is common—really common. As the Science in Seconds blog noted this week, what with “spanking the monkey,” “charming the snake,” and “freeing willy,” a remarkable number of the slang terms for pleasuring oneself refer to animals. That reflects reality: the practice has been documented in Japanese macaques, gibbons, baboons, chimps, elephants, dogs, cats, horses, lions, donkeys, “and walruses that manage to flog the bishop with their fins.” (Bonus for clicking on the blog link above: excellent photo of an elephant in flagrante delicto.)

What, then, might be its adaptive function?

More here.