a desperate nomadic quality


“I am submitting the enclosed short story ‘LIFE-LINE’ for either ‘Astounding’ or ‘Unknown,’” Robert A. Heinlein wrote to editor John Campbell in 1939, “because I am not sure which policy it fits the better.” The former magazine published science fiction, the latter fantasy. Heinlein’s short story — the first he had attempted professionally, at age 31 — concerns a machine that can predict when a person will die. That he sold this neophyte production, on first submission, to a top pulp editor (kicking off an intense friendship and correspondence) is exciting in and of itself. Heinlein’s uncertainty about which slice of genre this story belonged to is an ironic and humanizing detail, given what a titan Heinlein would become as the author of everything from juvenile SF in character-building Horatio Alger mode to the counterculture touchstone “Stranger in a Strange Land” (1961).

more from Ed Park at the LA Times here.

Iftar at Isabelle’s

Ian Bassingthwaighte in Guernica:

All that remains are pitted dates. Everything else has been eaten. The plates have been licked and the glasses emptied. Residue lingers on the table: dirty napkins, forgotten forks, leftover crumbs and morsels, plastic water bottles. “Would you pray more if I left you?” I ask Isabelle, a Cairo native and the girl of my dreams. She pretends to be religious, but I don’t mind because she’s pretty when her head’s pressed to the floor.

“Probably,” she says. “Are you going to?”

“Not yet,” I say.

Wendell laughs, but I don’t think it’s funny.

Wendell is a fat man with a twisted wit and he wants to fuck my lover more than I do. He is the best friend I have and I would loathe to lose him. He has a glass in his hand and he raises it. I raise mine. Isabelle raises the entire bottle and we drink until the room spins faster than the Earth does, which makes us dizzy and prone to tipping. We go outside and into the city, which is a messy conglomerate of heat and waste. We would breathe air if there were any, but instead there are varieties of emissions and so we breathe those instead. We dodge speeding vehicles as we meander blindly across highways and side streets. Isabelle whispers a prayer at each crossing. She says it for all of us so that we won’t get splattered. One street short of the Nile, a cat scrambles across the road and is squished halfway to the other side by a bus. Then I mimic the cat and scramble too, but I make it. Divine intervention or slop-fed luck, I don’t know which. But I’m the winner this time and I celebrate by cheering.

More here.

Working on the Ending

Gail Godwin in The New York Times:

When you’re a young writer, you subtract the birth dates of authors from their publication dates and feel panic or hope. When you’re an old writer, you observe the death dates of your favorite writers and you reflect on their works and their lives. This past year I outlived Henry James, who died two months short of his 73rd birthday. In his final years, he wrote an autobiography of his childhood, befriended badly wounded World War I soldiers and changed his citizenship. I have catapulted myself out of many writing setbacks and humiliations with the rallying cry of the dying novelist Dencombe, in James’s story “The Middle Years”: “We work in the dark — we do what we can — we give what we have. Our doubt is our passion, and our passion is our task.” The words have the stride of a march and the echo of a mantra. Already I have missed being able to ask James, “When you were my age, what did you do when . . . ?”

“How does what you want out of writing change with age?” Terry Gross asked Philip Roth on NPR’s “Fresh Air” in October. Roth, 77, told her it hadn’t changed much for him. He wanted to be as alert and energetic as ever at the keyboard, he wanted to be taken seriously, and he wanted to make a work of art out of his subject. You want to be taken seriously; that doesn’t change. What has changed for me is the degree of compromise I am willing to inflict on my work in order to see it in print. As a young writer, I was told by the fiction editor at Esquire that he’d publish my story if I took out the woman’s dreams. I took them out. “It will make her more inscrutable,” he promised, chuckling. It certainly did. Forty years later, “A Sorrowful Woman” is my most anthologized story, and I get regular e-mails from bewildered high school and college students asking why this woman did what she did.

More here.

Friday, December 10, 2010

WikiLeaky Power, or The Dark Side of Internet Freedom

First, Pierre Buhler in Project Syndicate:

In the late 1980’s, glasnost – transparency – was one of the nails in the coffin of the Soviet Union. While WikiLeaks has certainly not had a similar effect, it epitomizes the extent of the individual’s empowerment in a networked world. All that was necessary to challenge the world’s mightiest power, after all, was a disgruntled US Army intelligence analyst, some hacking knowledge, a few computers, and a handful of determined activists enrolled under the contested banner of transparency.

At the time she was named Director of Policy Planning at the US State Department, Anne-Marie Slaughter, a respected scholar of international affairs, boldly heralded the advent of a networked world. “War, diplomacy, business, media, society…are networked,” she wrote in Foreign Affairs in January 2009, and “in this world, the measure of power is connectedness.” Having the greatest potential for connectivity, America has the edge in a “networked century.”

This drive prompted US Secretary of State Hillary Clinton in January 2010 to proclaim the “freedom to connect” as the cyber-equivalent of the more familiar freedoms of assembly or expression. Of course, Clinton added, these technologies are not an unmitigated blessing, and can be misused for darker purposes. But her list of the potential abuses of the connected world contained nothing similar to the WikiLeaks storm.

That storm will leave behind no trace of understanding if it is assessed in isolation, rather than as part of a broader pattern. WikiLeaks’ latest release demonstrates that the transformation of power by the “digital revolution” could be as far-reaching as that brought about by the fifteenth-century printing revolution. In this game, where new players invite themselves, the edge goes to agility and innovation.

Second, Evgeny Morozov in The Christian Science Monitor:

Read more »

The Crisis of the American Intellectual

Walter Russell Mead in The National Interest (h/t: Boston Review):

[T]he biggest roadblock today is that so many of America’s best-educated, best-placed people are too invested in old social models and old visions of history to do their real job and help society transition to the next level. Instead of opportunities they see threats; instead of hope they see danger; instead of the possibility of progress they see the unraveling of everything beautiful and true.

Too many of the very people who should be leading the country into a process of renewal that would allow us to harness the full power of the technological revolution and make the average person incomparably better off and more in control of his or her own destiny than ever before are devoting their considerable talent and energy to fighting the future.

I’m overgeneralizing wildly, of course, but there seem to be three big reasons why so many intellectuals today are so backward looking and reactionary.

First, there’s ideology. Since the late nineteenth century most intellectuals have identified progress with the advance of the bureaucratic, redistributionist and administrative state. The government, guided by credentialed intellectuals with scientific training and values, would lead society through the economic and political perils of the day. An ever more powerful state would play an ever larger role in achieving ever greater degrees of affluence and stability for the population at large, redistributing wealth to provide basic sustenance and justice to the poor. The social mission of intellectuals was to build political support for the development of the new order, to provide enlightened guidance based on rational and scientific thought to policymakers, to administer the state through a merit based civil service, and to train new generations of managers and administrators. The modern corporation was supposed to evolve in a similar way, with business becoming more stable, more predictable and more bureaucratic.

Most American intellectuals today are still shaped by this worldview and genuinely cannot imagine an alternative vision of progress. It is extremely difficult for such people to understand the economic forces that are making this model unsustainable and to see why so many Americans are in rebellion against this kind of state and society – but if our society is going to develop we have to move beyond the ideas and the institutions of twentieth century progressivism.

Tea’d Off

Christopher Hitchens in Vanity Fair:

It is often in the excuses and in the apologies that one finds the real offense. Looking back on the domestic political “surge” which the populist right has been celebrating since last month, I found myself most dispirited by the manner in which the more sophisticated conservatives attempted to conjure the nasty bits away.

Here, for example, was Ross Douthat, the voice of moderate conservatism on the New York Times op-ed page. He was replying to a number of critics who had pointed out that Glenn Beck, in his rallies and broadcasts, had been channeling the forgotten voice of the John Birch Society, megaphone of Strangelovian paranoia from the 1950s and 1960s. His soothing message:

These parallels are real. But there’s a crucial difference. The Birchers only had a crackpot message; they never had a mainstream one. The Tea Party marries fringe concerns (repeal the 17th Amendment!) to a timely, responsible-seeming message about spending and deficits.

The more one looks at this, the more wrong it becomes (as does that giveaway phrase “responsible-seeming”). The John Birch Society possessed such a mainstream message—the existence of a Communist world system with tentacles in the United States—that it had a potent influence over whole sections of the Republican Party. It managed this even after its leader and founder, Robert Welch, had denounced President Dwight D. Eisenhower as a “dedicated, conscious agent” of that same Communist apparatus. Right up to the defeat of Barry Goldwater in 1964, and despite the efforts of such conservatives as William F. Buckley Jr. to dislodge them, the Birchers were a feature of conservative politics well beyond the crackpot fringe.

Now, here is the difference. Glenn Beck has not even been encouraging his audiences to reread Robert Welch. No, he has been inciting them to read the work of W. Cleon Skousen, a man more insane and nasty than Welch and a figure so extreme that ultimately even the Birch-supporting leadership of the Mormon Church had to distance itself from him. It’s from Skousen’s demented screed The Five Thousand Year Leap (to a new edition of which Beck wrote a foreword, and which he shoved to the position of No. 1 on Amazon) that he takes all his fantasies about a divinely written Constitution, a conspiratorial secret government, and a future apocalypse. To give you a further idea of the man: Skousen’s posthumously published book on the “end times” and the coming day of rapture was charmingly called The Cleansing of America. A book of his with a less repulsive title, The Making of America, turned out to justify slavery and to refer to slave children as “pickaninnies.” And, writing at a time when the Mormon Church was under attack for denying full membership to black people, Skousen defended it from what he described as this “Communist” assault.

All Of Humanity

Alan A. Stone in The Boston Review:

For centuries after Shakespeare wrote King Lear, interpreters refused to accept the play’s desolation and lack of redemption. Nahum Tate gave it a happy ending in 1681, and for 200 years a redeemed Lear and the Earl of Gloucester would peacefully retire while their good children, Cordelia and Edgar, marry and rule a unified Britain. As late as the start of the twentieth century, preeminent Shakespeare scholar A. C. Bradley lectured that Lear had reached transcendence through his suffering and died happy. Even though the play contains the bleakest line in all of Shakespeare—Lear’s “Never, never, never, never, never,” as he holds his daughter’s dead body in his arms—Bradley insisted on a Christian moral to the story.

Today King Lear is recognized as the greatest tragedy in the English language, less brilliant than Hamlet but more profound and prophetic: “Humanity,” the Duke of Albany laments, “must perforce prey on itself, Like monsters of the deep.” There is no god or justice in the pre-Christian world that Shakespeare invented for Lear. Stanley Cavell’s justly famous essay “The Avoidance of Love” captures the paradox of Lear for modern audiences. “We can only learn through suffering” but have “nothing to learn from it,” he writes.

Stalin’s reign of terror, Hitler’s concentration camps, and the atomic bombs that fell on Hiroshima and Nagasaki gave philosophers, literary critics, and theater directors a context for understanding Shakespeare’s grim text. Even so, actors and directors remained puzzled by Lear. Lear himself asks in Act I, “Does any here know me?” Was he already senile before he divided up his kingdom in exchange for public professions of love from his three daughters, or was he driven mad by the consequences of his rash decision as he realized that the two daughters who professed so much love and devotion—Goneril and Regan—now ruled over him? Regan and her husband, the Duke of Cornwall, lock Lear out of Gloucester’s home, and leave him in a terrible storm. “Blow winds and crack your cheeks! Rage, blow! You cataracts and hurricanes, spout till you have drenched our steeples,” he cries. But there is also pathos, “Here I stand your slave, a poor, infirm, weak and despised old man.” Lear is at once emotionally transparent and unable to acknowledge what he has done—“Who is it that can tell me who I am?” he wonders.

Over the past four decades some of the world’s most eminent film directors and actors have attempted an answer to Lear’s question. Among them are the Soviet Russian Grigori Kozintsev with the Estonian actor Jüri Järvet as Lear, and Britain’s Peter Brook with Paul Scofield—both productions from 1971. On television, Michael Elliott and Laurence Olivier tackled Lear in 1983, and Trevor Nunn and Ian McKellen tried their hand in 2008. And there are many others, including Akira Kurosawa, whose Ran is an Edo-period interpretation in which three sons replace Lear’s daughters.

None of these films give us the definitive answer to Lear’s question.

Consciousness on the Page: A Primer on the Novels of Nicholson Baker

Colin Marshall in The Millions:

1. Telepathy on a budget

If you don’t know Nicholson Baker as an intensive describer of everyday minutiae, surely you know him as an intensive describer of goofy sexual fantasy. At the very least, you might hold the broad notion that he’s very, very detail-oriented. None of those images capture the novelist in full, but if you twist them into a feedback loop by their common roots, you’ll get closer to the reality. Whatever the themes at hand, Baker adheres with utter faith to his narrators’ internal monologues, carefully following every turn, loop, and kink (as it were) in their trains of thought. He understands how often people think about sex, but he also understands that, oftentimes, they just think about shoelaces — and he understands those thoughts of sex and shoelaces aren’t as far apart, in form or in content, as they might at first seem.

This is why some find Baker’s novels uniquely dull, irritating, or repulsive, and why others place them in the small league of books that make sense. Not “sense” in that they comprise understandable sentences, paragraphs, and chapters; the existential kind of sense. So many novels exude indifference to their medium, as though they could just as easily have been — or are merely slouching around before being turned into — movies, comics, or interpretive dances. The Baker novel is long-form text on the page as well, but it’s also long-form text at its core, and on every level in between. Adapting it into anything else would be a ludicrous project at best and an inconceivable one at worst; you might as well “adapt” a boat into a goat.

Baker lays out certain clues to the effectiveness — or if you’re on the other side, ineffectiveness — of his concept of the novel in the texts themselves. Brazen, perhaps, but awfully convenient.

Friday Poem

When the Tower of Babel was built we began to speak in tongues, which was confusing
at first and set us at odds with each other. But then we remembered that behind the
babble everything remained one, though now
we had so many more ways to say “thing”!
But we're still confused.

–Roshi Bob

How many?

Australia, a group of girls at a corroboree
Lapland, reindeer herdgirls

China, the “yaktail”

Greece, the seven daughters, sisters,
or “the sailing stars”

a cluster of faint stars in Taurus,
the Pleiades,

name of a car in Japan —
Subaru

in Mayan —A fistful of boys —

by Gary Snyder
from Danger on Peaks

Mice created from two dads

From MSNBC:

Reproductive scientists have used stem cell technology to create mice from two dads. The breakthrough could be a boon to efforts to save endangered species — and the procedure could make it possible for same-sex couples to have their own genetic children. The scientists, led by Richard Behringer at the M.D. Anderson Cancer Center in Texas, describe the process in a study posted Wednesday in the journal Biology of Reproduction. Here's how it works:

Cells from a male mouse fetus were manipulated to produce an induced pluripotent stem cell line. These iPS cells are ordinary cells that have been reprogrammed to take on a state similar to that of an embryonic stem cell, which can develop into virtually any kind of tissue in the body. About 1 percent of the iPS cell colonies spontaneously lost their Y chromosome, turning them into “XO” cells. These cells were injected into embryos from donor female mice, and transplanted into surrogate mothers. The mommy mice gave birth to babies carrying one X chromosome from the original male mouse. Once these mice matured, the females were mated with normal male mice. Some of their offspring had genetic contributions from both fathers.
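The steps above amount to a simple inheritance scheme, which can be sketched as a toy simulation. This is purely illustrative: the chromosome labels are invented here, and a real chimeric female transmits the iPS-derived X in only a fraction of her eggs, whereas this sketch assumes all of them do.

```python
import random

def pup(father_gametes, mother_gametes):
    """Each parent contributes one randomly chosen sex chromosome."""
    return (random.choice(father_gametes), random.choice(mother_gametes))

# Steps 1-2: an XY cell line from father A loses its Y chromosome
# (~1% of iPS colonies do so spontaneously), and the resulting XO cells
# seed a female's germline, so her eggs carry father A's X chromosome.
mother_via_father_a = ("X_A",)

# Step 3: mate that XO-derived female with an ordinary male, father B.
father_b = ("X_B", "Y_B")

litter = [pup(father_b, mother_via_father_a) for _ in range(8)]

# Every pup inherits X_A from father A plus either X_B or Y_B from
# father B -- genetic material from two dads.
assert all("X_A" in p for p in litter)
print(litter)
```

The simplification to watch is the last step: in the actual experiment only *some* offspring carried contributions from both fathers, because only some of the chimera's eggs derived from the iPS line.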

More here.

To Eat Less, Imagine Eating More

From Science:

Before dipping your hand into that bowl of M&Ms at the holiday party, think about what you're about to do. A lot. A new study finds that people who imagine themselves consuming many pieces of candy eat less of the real thing when given the chance. The finding, say experts, could lead to the development of better weight-loss strategies. Picturing a delicious food—like a juicy steak or an ice cream sundae—generally whets the appetite. But what about visualizing yourself eating the entire sundae, spoonful by spoonful? There's reason to think that might have the opposite effect, says Carey Morewedge, an experimental psychologist at Carnegie Mellon University in Pittsburgh, Pennsylvania. Researchers have found that repeated exposure to a particular food—as in taking bite after bite of it—decreases the desire to consume more. This process, which psychologists call habituation, dampens appetite independently of physiological signals like rising blood sugar levels or an expanding stomach. But no one had looked to see whether merely imagining eating has the same effect.

To investigate, Morewedge and colleagues Young Eun Huh and Joachim Vosgerau fed M&Ms and cheese cubes to 51 undergraduate students. In one experiment, the participants first imagined performing 33 repetitive motions: Half of them imagined eating 30 M&Ms and inserting three quarters into the slot of a laundry machine. The other half envisioned eating three M&Ms and inserting 30 quarters. Then everyone was allowed to eat their fill from a bowl of M&Ms. Those who'd envisioned eating more candy ate about three M&Ms on average (or about 2.2 grams), whereas the others ate about five M&Ms (or about 4.2 grams), the researchers report in the 10 December issue of Science.
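As a quick sanity check on those figures — my own back-of-envelope arithmetic, not a calculation from the paper — the counts and masses reported imply a consistent per-candy weight and roughly a halving of consumption:

```python
# Counts and masses for each group, as reported in the article.
habituated_count, habituated_grams = 3, 2.2   # imagined eating 30 M&Ms
control_count, control_grams = 5, 4.2         # imagined eating only 3

# Implied mass per candy -- roughly consistent across the two groups.
per_candy_habituated = habituated_grams / habituated_count  # ~0.73 g
per_candy_control = control_grams / control_count           # 0.84 g

# Fractional drop in actual consumption after imagined eating, by mass.
reduction = 1 - habituated_grams / control_grams
print(f"{reduction:.0%} less by mass")  # prints "48% less by mass"
```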

More here.

Thursday, December 9, 2010

Who Gets to Keep Secrets

Over at Edge, Aalam Wassef, Clay Shirky, Gloria Origgi, George Church, Noga Arikha, Douglas Rushkoff, George Dyson, and Simona Morini answer W. Daniel Hillis's Question:

The question of secrecy in the information age is clearly a deep social (and mathematical) problem, and well worth paying attention to.

When does my right to privacy trump your need for security? Should a democratic government be allowed to practice secret diplomacy? Would we rather live in a world with guaranteed privacy or a world in which there are no secrets? If the answer is somewhere in between, how do we draw the line?

I am interested in hearing what the Edge community has to say in this regard that's new and original, and goes beyond the political. Here's my question:

WHO GETS TO KEEP SECRETS?

Clay Shirky:

Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP); Author, Here Comes Everybody

“When does my right to privacy trump your need for security?” seems analogous to the question “What is fair use?”, to which the answer is “Whatever activities don't get people convicted for violating fair use.” Put another way, fair use is a legal defense rather than a recipe.

Like fair use, the tension between a right to privacy and a right to know isn't one-dimensional — there are many independent variables involved. For instance, does the matter at hand involve elected or appointed officials? Working in the public name? The presumption should run to public review.

Does it involve private citizens? Would its release subject someone to harassment or other forms of attack? The presumption should run to privacy. And so on.

There are a set of clear cases. Congress should debate matters openly. The President should have a private channel with which to communicate with other world leaders. Restaurants must publish the results of their latest health inspection. You shouldn't have your lifetime browsing history published by Google. These are not complicated cases.

The complicated cases are where our deeply held beliefs clash. Should all diplomats have the right to all communications being classified as secret? Should all private individuals be asked, one at a time, how they feel about having their house photographed by Google?

The Hipster in the Mirror

Mark Greif in the NYT:

A year ago, my colleagues and I started to investigate the contemporary hipster. What was the “hipster,” and what did it mean to be one? It was a puzzle. No one, it seemed, thought of himself as a hipster, and when someone called you a hipster, the term was an insult. Paradoxically, those who used the insult were themselves often said to resemble hipsters — they wore the skinny jeans and big eyeglasses, gathered in tiny enclaves in big cities, and looked down on mainstream fashions and “tourists.” Most puzzling was how rattled sensible, down-to-earth people became when we posed hipster-themed questions. When we announced a public debate on hipsterism, I received e-mail messages both furious and plaintive. Normally inquisitive people protested that there could be no answer and no definition. Maybe hipsters didn’t exist! The responses were more impassioned than those we’d had in our discussions on health care, young conservatives and feminism. And perfectly blameless individuals began flagellating themselves: “Am I a hipster?”

I wondered if I could guess the root of their pain. It’s a superficial topic, yet it seemed that so much was at stake. Why? Because struggles over taste (and “taste” is the hipster’s primary currency) are never only about taste. I began to wish that everyone I talked to had read just one book to give these fraught debates a frame: “Distinction: A Social Critique of the Judgement of Taste,” by Pierre Bourdieu.

A French sociologist who died in 2002 at age 71, Bourdieu is sometimes wrongly associated with postmodern philosophers. But he did share with other post-1968 French thinkers a wish to show that lofty philosophical ideals couldn’t be separated from the conflicts of everyday life. Subculture had not been his area, precisely, but neither would hipsters have been beneath his notice.

Does the Slut Gene Exist?

Casey Schwartz in Daily Beast:

Maybe there’s a gene for the belief that genes can explain everything.

If so, I’m missing it.

In the last seven days, we’ve been hearing a lot about the DRD4 gene, dubbed by the media as the “slut gene,” that supposedly explains why certain people are likely to have lots of sex, especially of the adulterous variety.

In a study published last week in the journal PLoS One, a group of researchers, led by Justin Garcia at Binghamton University in New York, took 181 undergraduate-aged subjects, asked them about their sex lives, and ran a DNA test to see which version of the DRD4 gene they had: the 7R+ or the 7R- kind. The DRD4 gene has made headlines before. In fact, it’s a goldmine of scandalous behaviors, linked to everything from alcoholism to impulsive financial decisions. It influences how our brains respond to dopamine, a feel-good neurotransmitter unleashed by new and rewarding experiences. So the Binghamton group had good reason to think that they’d find something if they looked at its role in sexual behavior. And they did find something. But first, here’s what they didn’t find:

They didn’t find that those with one version of the gene had more sex than those with the other. And they didn’t find that the people with the so-called slut gene had more sexual partners, or that they're more likely to cheat.

What they found is that the group who had the 7R+ version was more likely to have had, at one point or another, “a one-night stand,” and that if someone with a 7R+ did cheat in a relationship, they were likely to have done so with more people than their 7R- counterparts.

The study leaves several questions unanswered. Was this 7R+ group really more likely to have had a one-night stand, or just more likely to report it? Did they actually cheat with more partners, or were they simply more willing to reveal the full extent of their adultery? You could just as easily interpret the study’s results this way and declare DRD4 the “candor gene.”

The DRD4 study isn’t an isolated case of shaky genetic science. In fact, it joins a cadre of questionable scientific assertions that link single genes to much broader patterns of behavior.

bubbles of comparative orderliness


Why is there a universe, not a blank? The Grand Design and Cycles of Time suggest very different answers. Stephen Hawking and Leonard Mlodinow make The Grand Design reader-friendly. Its physics and cosmology are enlivened by myth (“In the Mayan legend the Maker, unhappy because there was no one to praise him, decided to create humans”). You’ll find colourful artwork, jokes, a quick history of science, no mathematics. And the book can seem astonishingly open-minded. Even Archbishop Ussher’s view that things began in 4004 BC appears to get considerable respect. Suppose that Ussher’s modern disciples taught that in 4004 BC God created the universe exactly as if it had existed for billions of years, inclusive of fossils in the rocks: Hawking–Mlodinow’s “model-dependent realism” wouldn’t call their teaching mistaken, or its imagined facts “less real” than those you presumably believe in. “Philosophy”, the book declares, “is dead. Philosophy has not kept up with modern developments in science.” The authors then make bold philosophical claims. For example they aren’t attracted by the idea, perhaps it has never occurred to them, that even chess-playing computers “make choices” in a sense. So they theorize that “though we feel that we can choose what we do”, we are in fact “governed by the laws of physics and chemistry”, which at once proves we can’t. Presumably, they hope that after weighing the alternatives we will select their theory without actually choosing it.

more from John Leslie at the TLS here.

kung fu pragmatism


One might well consider the Chinese kung fu perspective a form of pragmatism. The proximity between the two is probably why the latter was well received in China early last century when John Dewey toured the country. What the kung fu perspective adds to the pragmatic approach, however, is its clear emphasis on the cultivation and transformation of the person, a dimension that is already in Dewey and William James but that often gets neglected. A kung fu master does not simply make good choices and use effective instruments to satisfy whatever preferences a person happens to have. In fact the subject is never simply accepted as a given. While an efficacious action may be the result of a sound rational decision, a good action that demonstrates kung fu has to be rooted in the entire person, including one’s bodily dispositions and sentiments, and its goodness is displayed not only through its consequences but also in the artistic style with which one does it. It also brings forward what Charles Taylor calls the “background” — elements such as tradition and community — in our understanding of the formation of a person’s beliefs and attitudes. Through the kung fu approach, classic Chinese philosophy displays a holistic vision that brings together these marginalized dimensions and thereby forces one to pay close attention to the ways they affect each other.

more from Peimin Ni at The Opinionater here.

Thursday Poem

Water/Zero

After the garden, the man and woman
squatted in a field of thorns.

See, they had become like us,
although they didn’t know it yet,
knowing good and evil, which meant also

a whole bestiary of pain,
which was new to them, and so
in the infancy of their wanting

thirst and hunger, famine and drought
lacked at first their proper names,
settling slowly on their tongues
like sand blown through the teeth.

It took some time before they saw
that certain things were missing:
the beasts of the field fled from them,
the evening thrush was still.

And something else as well, a thing
that had no name was missing too,
which had been everywhere before.

They began to speak of a before,
arranging stones to track the days,
circling them in the dry grass,
counting backward in their grief
to the first stone, day after day,
until there were too many stones to count,
and they built a house out of their grief.

What had gone? It was not in the sky,
or in the root, or in the wilted throat.
Before, it had been everywhere.

Above, the sky was blue and hard.
The leaves cracked in the wind.
They searched and searched the field in vain.

Only by digging an O in the earth,
carving and carving the shape of their grief,

did they find at last what they had lost,
and draw it up, and call it by its name.

Before, it had been everywhere.
Like nothing, it could not be
conceived. Now, in the sterile earth,
the man and woman made it into a thing.

And they saw that it was useful
for calling back the world,
the wild ass, the ox.
Later they found it could call forth
the green plants of the field as well.

But as with all the things that are
both intimate and necessary, they saw
how it could swallow and withhold:
the gourd dropped in the well.
The sea which never speaks.

We can imagine how the first echo
must have terrified them,
their own voices in the well
calling back to them, their words
the only things that would return.

And so they kept the words,
and made themselves a song about the whole,
their small, round world, held out to hold
a place for everything that’s lost.

by Leon Weinman
from Blackbird, Spring 2010

A SENSE OF CLEANLINESS

Simone Schnall in Edge:

I am a social psychologist, and study judgments and decisions from the perspective that emotions, and all kinds of feelings, including physical sensations, play a really important role. For example, such simple things as a sense of cleanliness can make a difference to how people decide whether something is right or wrong. We've been looking at, in general, how people make decisions, and how they arrive at judgments. In particular we've been studying moral judgments, that is, how do people tell right from wrong? It used to be thought for the longest time, going back for thousands of years of philosophical investigation, that people think of why a certain behavior might be wrong. They think of all the rational reasons, all the things they can come up with, they go through all the pros and cons, and then arrive at the judgment, and say, “Behavior X is either wrong, or very wrong, or not so wrong, it's fine”, and so on. So it used to be thought that people think long and hard, and then figure out the answer.

Now it turns out that actually this does not seem to be the case because first of all, people don't always think that much, and many thought processes are not really conscious, but rather, they happen outside of consciousness. Many thoughts just happen incidentally, and people aren't even aware of them. Therefore, instead of all these sophisticated thoughts and reasons, accidental factors enter the picture such as feelings and intuitions, for example, a sense of, “Well, I just have an intuition that this is the case”, and such factors can be much more powerful than rational thought. For morality this idea first became popular in 2001 when Jonathan Haidt published his paper on the social intuitionist model, which has been a really influential idea.

More here.

Look: What your reaction to someone’s eye movements says about your politics

From PhysOrg:

In a new study, UNL researchers measured both liberals' and conservatives' reaction to “gaze cues” – a person's tendency to shift attention in a direction consistent with another person's eye movements, even if it's irrelevant to their current task – and found big differences between the two groups. Liberals responded strongly to the prompts, consistently moving their attention in the direction suggested to them by a face on a computer screen. Conservatives, on the other hand, did not. Why? Researchers suggested that conservatives' value on personal autonomy might make them less likely to be influenced by others, and therefore less responsive to the visual prompts.

“We thought that political temperament may moderate the magnitude of gaze-cuing effects, but we did not expect conservatives to be completely immune to these cues,” said Michael Dodd, a UNL assistant professor of psychology and the lead author of the study. Liberals may have followed the “gaze cues,” meanwhile, because they tend to be more responsive to others, the study suggests. “This study basically provides one more piece of evidence that liberals and conservatives perceive the world, and process information taken in from that world, in different ways,” said Kevin Smith, UNL professor of political science and one of the study's authors.

More here.