From Lensculture:

The idea of stopping time and preserving fleeting moments is, in many ways, at the heart of photography and central to human desire. Psychologically and emotionally, humans have always dreaded growing old, losing youthful vigor, and falling prey to the weaknesses of old age, faulty memories, morbidity, decay and death. These fears and anxieties are deeply rooted in our psyches, and have been played out in ancient myths, classical painting, modern poetry, theater and cinema, as well as in practically every other art form in the history of humanity, including digital gaming. We don’t want to grow old and weak, nor do we want our children and loved ones to grow old. We would like to stay forever young. We want to be beautiful, desirable, powerful and perfect like gods and goddesses. Or at least we think we do.

Vee Speers’ latest body of photo-based art, aptly titled Immortal, plays to these age-old sensibilities and timeless longings while riffing on the very contemporary convergence of similar ideas, ideals, and forms that have invaded our consciousness in our media-driven, technology-rich consumer cultures.

More here.

How animals made us human

From The Boston Globe:

Who among us is invulnerable to the puppy in the pet store window? Not everyone is a dog person, of course; some people are cat people or horse people or parakeet people or albino ferret people. But human beings are a distinctly pet-loving bunch. In no other species do adults regularly and knowingly rear the young of other species and support them into old age; in our species it is commonplace. In almost every human culture, people own pets. In the United States, there are more households with pets than with children. On the face of it, this doesn’t make sense: Pets take up resources that we would otherwise spend on ourselves or our own progeny. Some pets, it’s true, do work for their owners, or are eventually eaten by them, but many simply live with us, eating the food we give them, interrupting our sleep, dictating our schedules, occasionally soiling the carpet, and giving nothing in return but companionship and often desultory affection.

What explains this yen to have animals in our lives?

An anthropologist named Pat Shipman believes she’s found the answer: Animals make us human. She means this not in a metaphorical way — that animals teach us about loyalty or nurturing or the fragility of life or anything like that — but that the unique ability to observe and control the behavior of other animals is what allowed one particular set of Pleistocene era primates to evolve into modern man. The hunting of animals and the processing of their corpses drove the creation of tools, and the need to record and relate information about animals was so important that it gave rise to the creation of language and art. Our bond with nonhuman animals has shaped us at the level of our genes, giving us the ability to drink milk into adulthood and even, Shipman argues, promoting the set of finely honed relational antennae that allowed us to create the complex societies most of us live in today. Our love of pets is an artifact of that evolutionary interdependence.

More here.

Big Sometimes Friendly Giant

Sam Anderson reviews Donald Sturrock’s new biography of Roald Dahl, Storyteller, in NY Magazine:

Many of Roald Dahl’s book covers today come stamped with an official-looking logo proclaiming him “The World’s No. 1 Storyteller.” The declaration is, like Dahl’s fiction itself, simultaneously thrilling and absurd and puzzling and oddly disturbing. How, one has to wonder, was the ranking determined? Was there some kind of single-elimination global storytelling showdown, in which the creator of Willy Wonka, presumably as an eighth-seeded underdog, managed to out-yarn a bracket of, say, Jack London, Salman Rushdie, Isak Dinesen, Victor Hugo, Lewis Carroll, Mark Twain, and—in what must have been a squeaker of a final—the mighty Dickens? And even if we do accept that result, isn’t the title somehow slightly patronizing? After all, we don’t celebrate Faulkner or Conrad or Shakespeare primarily as “storytellers.” It would be like calling a master chef “The World’s No. 1 Pan-Fryer”—a great compliment, but also one that immediately raises questions about his ability to bake, braise, roast, grill, stew, poach, and flambé.

Dahl was, indeed, a great storyteller: Anyone who doubts that can pull aside a random child on the street and start reading her James and the Giant Peach or Fantastic Mr. Fox. If an adult comes up to object, you can start reading him one of the short stories: maybe “Taste” (in which a dinner-party bet among wine connoisseurs spirals out of control) or “The Sound Machine” (in which a man can hear plants screaming). If a policeman intervenes, read him “Lamb to the Slaughter,” in which a woman kills her husband with a frozen lamb chop, then cooks and feeds it to the detectives who come to investigate. You could probably go on like that forever.

Dahl’s own favorite of his yarns was The BFG, a children’s book in which the eponymous hero, the Big Friendly Giant, walks around city streets at night blowing dreams through a long tube into kids’ bedroom windows. The giant keeps thousands of dreams stored in neatly labeled glass jars in his cave—with the good ones (what he calls “phizzwizards”) carefully segregated from the bad (“trogglehumpers”). “I IS ONLY AN EIGHT YEAR OLD LITTLE BOY,” runs one of the good dreams, “BUT I IS GROWING A SPLENDID BUSHY BEARD AND ALL THE OTHER BOYS IS JALOUS.” (The BFG is a self-taught writer: He learned to read from a borrowed copy of Nicholas Nickleby, whose author he identifies as “Dahl’s Chickens.”) One of the giant’s best dreams reads like a mission statement for Dahl’s career:


Democracy After Citizens United

Lawrence Lessig in the Boston Review:

Institutional corruption does not refer to the knowing violation of any law or ethical rule. This is not the problem of Rod Blagojevich, or, more generally, of bad souls acting badly. It instead describes an influence, financial or otherwise, within an economy of influence, that weakens the effectiveness of an institution, especially by weakening public trust in that institution. (An “economy of influence” rather than the simpler “system of influence” to emphasize the reciprocal character of such influence, often requiring little or no direct coordination.)

Congress is a paradigm case. Members of Congress run privately financed campaigns. The contributions that fund those campaigns are not illegal, or even unethical. To the contrary, they are protected speech under the First Amendment.

Yet arguably—or maybe obviously—those contributions are (1) an influence (2) within an economy of influence that has (3) (quite likely) weakened the ability of Congress to do its work, by (4) (certainly) weakening public trust in Congress. The vast majority of Americans believe money buys results in Congress; less than a quarter of Americans believe the institution worthy of their trust. When “free-market” Republicans vote to support milk subsidies or sugar tariffs, or when “pro-consumer” Democrats vote to exempt used-car dealers from consumer financial-protection legislation, it is easy to understand the mistrust and hard to believe that the influence of money hasn’t weakened the ability of members to serve the principles, or even the interests, they were elected to represent.

This is “corruption” not because it describes the acts of evil or corrupted individuals. Members of Congress are insanely hard-working, decent souls, the overwhelming majority of whom entered public service to do good, as they see it. And indeed, the traditional corruption of politics—bribery—is likely at its lowest point in American history. From the perspective of criminal law, this is among the cleanest Congresses ever.

Instead, this is “corruption” because it weakens the integrity of the institution, of Congress itself.

Responses from Will Wilkinson, Marvin Ammori, and David Donnelly, with others to come, can be found here.

Berlusconi’s Machiavellian Moment

Ingrid D. Rowland in the NYRB blog:

Several remarkable things have happened here in Italy in the past week.

One: Prime Minister Silvio Berlusconi, that self-styled man for all seasons—tycoon, soccer team owner, politician, crooner, swain—the perennial fixer who not too long ago said, in Milanese dialect, ghe pensi mi, “I’ll take care of it”—“il premier,” il Cavaliere (that is, Sir Silvio), has apparently been driven by the present political situation to say, “I don’t know what to do.” And he doesn’t, though a few days of reading Machiavelli and Thucydides might provide him with a clue or two. (Beginning with Chapter 23 of The Prince: “How Flatterers are to be Shunned.”)

Two: The independent television channel La7 (Channel Seven) is stealing viewers from all the other channels—all but one of which are controlled directly or indirectly by Berlusconi—by providing real news, and forcing the speakers on its talk shows to observe the basics of civil behavior. (The norm on the other channels is for everyone to shout at once, except Umberto Bossi, head of the Northern League, whose speech is impaired by a stroke—but his middle finger is fat and fit). Its evening newscast is conducted by Enrico Mentana, who has worked in the past for both the RAI state TV and Berlusconi’s Channel 5.

Three: Gianfranco Fini, the President of the lower house of Parliament, used a speech on September 4 to break publicly with Berlusconi in a way that brooks no return.

Busted: Stories of the Financial Crisis

Joshua Clover in The Nation:

For all the fame surrounding Milton Friedman, Ayn Rand and Alan Greenspan, their contributions to a political economy of modern capitalism are minor in relation to those of Friedrich von Hayek, a founder of the Mont Pelerin Society and prophet of the “price signal.” A striking and original intellect, Hayek argued that something's market price is not simply what it would cost you but a kind of information used to allocate goods and services most efficiently within the social matrix. Because centrally planned economies lack a mechanism to price commodities correctly, they are unable to put things where they need to go. Individuals wouldn't get what they desired; the larger economy would be unable to balance production and consumption, supply and demand. Shortages would appear cheek by jowl with surpluses. This would have disastrous and eventually fatal consequences, not only for the market but for the lives of its subjects.

The free market, contrarily, able to revalue every object with supple velocity according not to some ideological program but the aggregate will of the people—not just the invisible hand but the invisible spirit, as it were—was more suited not simply to survival but to individual freedom. Hayek's case, best known from The Road to Serfdom (1944), remains the most rigorously persuasive brief for twentieth-century capitalism in its long, acrimonious and cordite-scented war against every other form of life. At the time, the 1989 collapse of the Soviet bloc, and the discrediting of its economic hypotheses, seemed to confer on Hayek's insight the aura of truth.

And yet, having triumphed more or less absolutely, the American model of capitalism has proved itself to be catastrophically lacking in the very balance that Hayek suggested was its singular virtue.

‘Must we dream our dreams?’

From The Guardian:

Elizabeth Bishop's status as one of the greatest American poets of the 20th century is based on the smallest of oeuvres. Some 70 poems were published in her lifetime in four very slim volumes. She died in 1979, aware that her reputation was steadily increasing, eclipsing that of her close friend and fellow poet Robert Lowell. Since her death and the publication of two superb volumes of her correspondence, One Art and Words in Air (letters between her and Lowell), it has grown ever more secure. In the select pantheon of 20th-century poets writing in English, she is placed with TS Eliot, WB Yeats, Wallace Stevens and WH Auden. Her poems often took her years to write and complete, and their formal perfection and the simple, limpid accuracy of their language have always drawn the admiration of other poets. John Ashbery called her “the writer's writer's writer.”

More here.

Code World

From The New York Times:

There are many stories Tom McCarthy chooses not to tell in “C,” his tour de force new novel encompassing the short life of one Serge Carrefax, born at the turn of the 20th century on a rural English estate. Serge’s father, a manic tinkerer with early wireless technology, runs a school for the deaf but seems oblivious to his own deaf wife, Serge’s mother, who’s so blinkered on opium (supplied by a mute gardener who grows the poppies himself) that she nearly lets Serge drown in a creek at age 2. Serge’s beloved older sister, Sophie, becomes sexually involved with a friend of their father’s and winds up committing suicide at 17 — possibly after having an abortion. Serge’s relationship to Sophie is preternaturally close, with incestuous overtones, and her death severs his only real human connection. But these dramas are merely suggested, their shadowy outlines ignored, sublimated or flat-out denied by those involved; Sophie’s self-poisoning is deemed an accident.

McCarthy, author of the ingenious 2006 novel “Remainder,” withstands the temptations of emotional plotting and holds out instead for something bigger, deeper, more universal and elemental. “C” is a rigorous inquiry into the meaning of meaning: our need to find it in the world around us and communicate it to one another; our methods for doing so; the hubs and networks and skeins of interaction that result. Gone is the minimalist restraint he employed in “Remainder”; here, he fuses a Pynchonesque revelry in signs and codes with the lush psychedelics of William Burroughs to create an intellectually provocative novel that unfurls like a brooding, phosphorescent dream.

More here.

Cherchez la femme


James Ellroy’s breakthrough 1987 crime novel The Black Dahlia opens with a rookie cop being given words of advice about how to solve cases. “Cherchez la femme,” his mentor says. The rookie cop’s French isn’t up to the task. “What?” “Look for a woman,” translates the mentor. Or as Ellroy’s life and work obsessively reiterate: “Look for the woman.” The Black Dahlia elevated Ellroy to the pantheon of US crime writers alongside Raymond Chandler and Dashiell Hammett. It was dedicated to his mother with the inscription: “Twenty-nine Years Later, This Valediction in Blood.” The reference was to Jean Ellroy’s murder in 1958 when Ellroy was 10. The crime went unsolved. Ellroy, both a canny self-publicist and a genuinely extreme personality, has deployed this original trauma to devastatingly powerful effect in his writing and public persona. In his case, “Cherchez la femme” leads with gothic inevitability back to the same woman: his dead mother.

more from Ludovic Hunter-Tilney at the FT here.

I mean, it’s American music


“We’re all wondering, what’s he gonna do, what’s he gonna do? It was a strange performance. It wasn’t technically brilliant. But he played a festival’s array of styles. It was like ‘Here’s American music, guys. You want a festival? I’m going to give you a festival.’ And then the ghosts began to me to become corporeal. You know, already there were everybody from Tennessee Ernie Ford and God knows who, not all Newport people by any means, but Son House and Muddy Waters; they were all kind of assuming shape again.” Wilentz says that by the time Dylan encored with the Grateful Dead’s arrangement of “Not Fade Away,” the spirits of Buddy Holly and Jerry Garcia had joined the mix. That show, the discussion of which comes near the end of “Bob Dylan in America,” can stand for just some of the historical strands that come together in Dylan. “People think of it as Americana,” Wilentz says, “which is a term I can’t stand. Or roots music. Or all those labels. It’s not, it’s none of that. I mean, it’s American music. It’s the music this country has created out of blood, sweat and tears, and he picks all of that up and raises it, lowers it, he takes it and makes it his own.”

more from Charles Taylor at the LAT here.

Something has gone away


In an essay about his friendship with Norman Mailer, written in early 1961 at the peak of his eloquence, James Baldwin recalled his reaction to the news that Mailer was running for mayor of New York. Baldwin initially dismissed the rumor as a joke, until “it became hideously clear that it was not a joke at all. I was furious. I thought . . . you’re copping out . . . It’s not your job.” Within a year or two, Baldwin himself had accepted a new job. Having attained prominence over the course of the 1950s as a novelist, with “Go Tell It on the Mountain” and “Giovanni’s Room,” and as a reporter issuing passionately perceptive dispatches from Paris, Harlem and the disintegrating South, Baldwin found himself increasingly in demand as a speaker on behalf of the civil rights movement. After publication of “The Fire Next Time” in 1963, he became a celebrity presence at events — a “face.” At the end of the decade, however, demoralized by the assassinations of Medgar Evers, Malcolm X and Martin Luther King, he suffered a form of nervous collapse and retreated to the French hilltop village of St.-Paul-de-Vence, near Nice, where he lived in subdued peace and where he died in 1987. “Since Martin’s death . . . something has altered in me,” he wrote in his account of the tumultuous period, “No Name in the Street.” “Something has gone away.”

more from James Campbell at the NYT here.

The Neurobiology of Evil

John Cookson at Big Think:

Is a person's propensity toward evil a matter of malfunctioning synapses and neurons?

Michael Stone, professor of clinical psychiatry at Columbia University and author of “The Anatomy of Evil,” says it is. Ever-more-detailed brain scans are revealing the biological origins of psychological issues in “evil” people, from those who are mildly antisocial to serial murderers.

Under each brain’s wrinkly cortex lies the limbic system, an evolutionary heirloom controlling emotion and motivation, among other functions. Within this limbic system is the amygdala, an almond-shaped cluster of nuclei that processes our feelings of fear and pleasure.

Murderers and other violent criminals have been shown to have amygdalae that are smaller or that don’t function properly, explains Stone. One recent study concluded that individuals who exhibit a marker of “limbic neural maldevelopment” have “significantly higher levels of antisocial personality, psychopathy, arrests and convictions compared with controls.”

The amygdala is important because, among its other functions, it allows an individual to respond to the facial expressions of others. When a person has an abnormal amygdala—one that doesn't process the facial expressions of emotion—they can have an inability to register the fear and suffering of a victim, says Stone. This lack of response to the emotions of others predisposes an individual to antisocial, even criminal, behavior.

More here.

Pakistani Rock Star Declares ‘Rock & Roll Jihad’ Against Extremists

From Radio Free Europe:

Salman Ahmad was a 19-year-old medical student in 1982 when he performed music on stage for the first time in his native Pakistan. Having just returned from six years in the United States, where he'd earned enough money clearing restaurant tables and delivering newspapers to buy an electric guitar, the future Pakistani rock star began to play a song by the rock group Van Halen at a talent show in Lahore. Suddenly, Ahmad heard cries of rage in Urdu from a gang of bearded young men who stormed toward the stage. They were an early manifestation of the Taliban: Islamic student extremists affiliated with a local religious party, acting as self-appointed music police.

While some of the extremists threw burqas and chadors over the women in the audience, Ahmad says one bearded student jumped on stage and grabbed his electric guitar — “his eyes filled with a madness that has nothing to do with God” as he smashed the precious instrument beyond repair. Ahmad tells RFE/RL it was a transformational moment in his life — the moment when he declared “rock and roll jihad” against the “ideology of hate.” “The Taliban and their brand of Islam is not Islam at all. Islam doesn't teach you to kill innocent women, children, and men. Islam doesn't teach you to commit suicide,” Ahmad says. “That's haram,” or forbidden. Ahmad says that “as long as the Taliban pursue a strategy of violence, subjugation of women, destroying girls schools, killing musicians,” he doesn't see how anyone can “reconcile with that sort of mentality and ideology. The ideology of hate, the ideology of terrorism, has no place in Islam, or anywhere else in the world, and I will continue saying that.”

More here. (Note: Salman is a dear friend, and above everything else, one of the most decent human beings I have ever met. Salman Zindabad!)

Friday Poem

Who could have predicted that a … campaign (of) anti-Muslim hate speech (by) … the country's most prominent politicians … could have led to (Koran burning) in Florida? —Josh Marshall, TPM

The Balloon of the Mind

Hands, do what you're bid:
Bring the balloon of the mind
That bellies and drags in the wind
Into its narrow shed.

by William Butler Yeats

Why some memories stick

From Nature:

Practice makes perfect when it comes to remembering things, but exactly how that works has long been a mystery. A study published in Science this week indicates that reactivating neural patterns over and over again may etch items into the memory.

People find it easier to recall things if material is presented repeatedly at well-spaced intervals rather than all at once. For example, you're more likely to remember a face that you've seen on multiple occasions over a few days than one that you've seen once in one long period. One reason that a face linked to many different contexts — such as school, work and home — is easier to recognize than one that is associated with just one setting, such as a party, could be that there are multiple ways to access the memory. This idea, called the encoding variability hypothesis, was proposed by psychologists about 40 years ago.

More here.

I’m Sorry, Michael Pollan

Rowan Jacobsen in Eating Well:

I am a granola-eating, free-range-chicken-chasing, broccoli-hugging foodie. I recoil from junk food like vampires shun sunlight. You’d think me the ideal audience for Food Rules, Michael Pollan’s latest megaseller, which consists of 64 “straightforward, memorable rules for eating wisely.” But Food Rules awakened strange feelings in me. As I paged through the book, being advised “Don’t eat anything your great-grandmother wouldn’t recognize as food” (#2), I felt a big finger wagging at me. This was not having the intended effect. Instead, I’m sorry to say, it was awakening my inner Bart Simpson. Somewhere between “The whiter the bread, the sooner you’ll be dead” (#37) and “Eat when you are hungry, not when you are bored” (#47), I became consumed by the notion of breaking all 64 food rules in one day of gloriously irresponsible eating.

It wouldn’t be easy. It would take significant planning and discipline, as well as digestive fortitude, but I might just be able to do it. I would eat whatever I saw advertised on television (#11). I would eat it alone and bored (#59, #47). I would eat breakfast cereal that changed the color of my milk (#36), I would eat way beyond full (#46) and I would scramble to go back for seconds (#53) of a food that was incapable of rotting (#13).

More here. [Thanks to Elatia Harris.]