An Antimatter Breakthrough

From Liz Mermin's documentary in progress: “On 7 March, the journal Nature published the latest results from the ALPHA experiment at CERN. The findings were called ‘historic.’ ALPHA first made science history in 2010, when they created atoms of anti-hydrogen; in 2011 they succeeded in trapping and holding these atoms for an astonishing 1000 seconds. In these three short films, members of the ALPHA collaboration explain their latest triumph, revealing the excitement behind this extraordinary scientific process.”

Cynthia Nixon, Joseph Massad, and Not Being an American Gigolo

Scott Long in A Paper Bird:

In the politics of identity, bisexuals are hated because they stand for choice. The game is set up so as to exclude the middle; bisexuals get squeezed out. In the “LGBT” word, the “B” is silent. John Aravosis, for instance, says that if you’re into both genders, “that’s fine” — great! — but “most people” aren’t. First off, that rather defies Freud and the theory of universal infantile bisexuality. But never mind that. The business of “outing,” of which Aravosis has been an eloquent proponent, also revolves around the excluded middle. It’s not a matter of what you think of outing’s ethics, on which there’s plenty of debate. It’s that the underlying presumption is that one gay sex act makes you “gay” — not errant, not bisexual, not confused or questioning: gay, gay, gay. I saw you in that bathroom, for God’s sake! You’re named for life! It’s also that the stigma goes one way only: a lifetime of heterosexual sex acts can’t make up for that one, illicit, overpowering pleasure. As I’ve argued, this both corresponds to our own buried sense, as gays, that it is a stigma, and gives us perverse power. In the scissors, paper, rock game of sexuality, gay is a hand grenade. It beats them all.

And this fundamentalism infects other ways of thinking about sexuality, too. Salon today carries an article about multiple sex-and-love partners: “The right wants to use the ‘slippery slope’ of polyamory to discredit gay marriage. Here’s how to stop them.” I’ll leave you to study the author’s solution. He doesn’t want to disrespect the polyamorists:

I reject the tactic of distinguishing the good gays from the “bad” poly people. Further marginalizing the marginalized is just the wrong trajectory for any liberation movement to take.

That’s true — although whether we’re still really a liberation movement, when we deny the liberty of self-description, is a bit doubtful. But he goes on, contemplating how polyamory might in future be added to the roster of rights:

Really, there are a host of questions that arise in the case of polyamory to which we just don’t know the answer. Is polyamory like sexual orientation, a deep trait felt to be at the core of one’s being? Would a polyamorous person feel as incomplete without multiple partners as a lesbian or gay person might feel without one? How many “truly polyamorous” people are there?

Well, what if it’s not? What if you just choose to be polyamorous? God, how horrible! You beast! What can be done for the poor things? Should some researcher start looking for a gene for polyamory, so it can finally become respectable, not as a practice, but as an inescapable doom? (I shudder to think there’s one gene I might share with Newt Gingrich.)

What, moreover, if sexual orientation itself is not “a deep trait felt to be at the core of one’s being,” one that people miraculously started feeling in 1869, when the word “homosexual” was coined? What if it’s sometimes that, sometimes a transient desire, sometimes a segment of growth or adolescent exploration, sometimes a recourse from the isolations of middle age, sometimes a Saturday night lark, sometimes a years-long passion? What if some people really do experience it as … a choice?

What if our model for defending LGBT people’s rights were not race, but religion? What if we claimed our identities were not something impossible to change, but a decision so profoundly a part of one’s elected and constructed selfhood that one should never be forced to change it?

Hey Dude

Robert Lane Greene in More Intelligent Life (for Sophie Schulte-Hillen):

Slang rarely has staying power. That is part of its charm; the young create it, and discard it as soon as it becomes too common. Slang is a subset of in-group language, and once that gets taken up by the out-group, it’s time for the in-crowd to come up with something new. So the long life of one piece of American slang, albeit in many different guises, is striking. Or as the kids would say, “Dude!”

Though the term seems distinctly American, it had an interesting birth: one of its first written appearances came in 1883, in the American magazine, which referred to “the social ‘dude’ who affects English dress and the English drawl”. The teenage American republic was already a growing power, with the economy booming and the conquest of the West well under way. But Americans in cities often aped the dress and ways of Europe, especially Britain. Hence dude as a dismissive term: a dandy, someone so insecure in his Americanness that he felt the need to act British. It’s not clear where the word’s origins lay. Perhaps its mouth-feel was enough to make it sound dismissive.

From the specific sense of dandy, dude spread out to mean an easterner, a city slicker, especially one visiting the West. Many westerners resented the dude, but some catered to him. Entrepreneurial ranchers set up ranches for tourists to visit and stay and pretend to be cowboys themselves, giving rise to the “dude ranch”.

By the 1950s or 1960s, dude had been bleached of specific meaning. In black culture, it meant almost any male; one sociologist wrote in 1967 of a group of urban blacks he was studying that “these were the local ‘dudes’, their term meaning not the fancy city slickers but simply ‘the boys’, ‘fellas’, the ‘cool people’.”

From the black world it moved to hip whites, and so on to its enduring associations today—California, youth, cool. In “Easy Rider” (1969) Peter Fonda explains it to the square Jack Nicholson: “Dude means nice guy. Dude means a regular sort of person.” And from this new, broader, gentler meaning, dude went vocative.

The Return of Mad Men and the End of TV’s Golden Age

Andy Greenwald in Grantland:

[L]ike the Komodo dragon or Kirk Cameron, a few Golden Age shows remain in production even if their evolutionary time has passed. Larry David will keep kvetching as long as there's bile in his body, and the brilliant Breaking Bad has one more batch of crystal to cook. But with three full seasons stretching out before us like the red carpet at the Clios, Mad Men will be the last of the Golden Age shows to grace our flat-screens. With a typically outstanding new episode, the first in 17 months, due to premiere on Sunday, it's worth asking: Is it also the best?

The line of inheritance from first to last is almost too neat: David Chase hired Matt Weiner to the Sopranos off of the cigarette-stained spec of Mad Men, a script originally written by Weiner in an aspirational frenzy while toiling on the Bronze Age Ted Danson sitcom Becker. Weiner's infamous penchant for micromanaging and rewriting was learned at the foot of Chase, and Don Draper is a direct descendant of Tony Soprano; the two share a charismatic corruption, the last of the troubled titans. But this is where the comparisons end. The Sopranos, in all its digressive genius, was a show dedicated to the impossibility of change. Season by season, Chase built a red-sauce-spattered shrine to a lifetime of lessons learned on Dr. Melfi-esque couches: that people are who they are, no matter what. At its core, The Sopranos was Chase's grand F.U. to all the hard-worn stereotypes of Television 1.0, the boring brontosaur he'd finally managed to dump in the Meadowlands. There was no hugging in Tony's New Jersey. No learning or smoothing or straightening. Tony Soprano was Tony Soprano: an amiable monster. In the end, Chase argued with nihilistic aplomb, it doesn't much matter how the Satriale sausage was made, just whether it was spicy or sweet. And when he began to feel revulsion toward his audience's bloodlust, he denied them even that: The finale's fade to black ensured Tony would be stuck with himself for eternity. To Chase it was a fate worse than prison or a slug to the head from a mook in a Members Only jacket; a karmic feedback loop in the shape of an onion ring.

Mad Men is different. It's less dark and more expansive than its ancestor because, unlike Chase, Weiner isn't asking questions that he's already convinced himself can't be answered. Where The Sopranos was angry, Mad Men is curious. Even at his grief-wracked, whiskey-bloated nadir last season, being Don Draper wasn't a life sentence because Don Draper doesn't exist. He's merely a particularly dapper suit that Dick Whitman is trying on for size. On Mad Men, identity is what's fungible, not nature.

The Birangana and the birth of Bangladesh

From Himal Southasian:

The year 1971 was a landmark in Southasian history for many reasons. It saw not only the birth of Bangladesh but also the war fought between Pakistan and India. It was perhaps the only such conflict involving the three most populous Southasian countries, clashing for the first time since the end of colonial rule. High-level politics and the tumultuous times spawned a number of books on war, international relations and human rights. However, an uncanny silence has remained about one aspect of the war – the sexual crimes committed by the Pakistan Army and its collaborators, the Razakar militia, against Bangladeshi women. It is only now, 40 years on, that some of that silence is being broken.

Bina D’Costa’s new book, Nationbuilding, Gender and War Crimes in South Asia, takes on the mammoth task of placing violence against women during the war in a larger political context. While what D’Costa calls the ‘original cartographic trauma’ of the Subcontinent has been well researched, gendered nation-building narratives have been given little consideration. Yet D’Costa proposes that any theorisation of nation-building in post-Partition India and Pakistan, or post-Liberation Bangladesh, is incomplete without a gendered analysis. Recognising that women have largely been silenced by state historiography, feminist scholars and activists in Southasia – Veena Das, Kamla Bhasin, Ritu Menon, Urvashi Butalia – have attempted to explore this sordid aspect of war. That rape has been used as a weapon of war has been well documented. One of the more famous examples is American feminist Susan Brownmiller’s investigation of rapes committed during the two World Wars, in Vietnam and then in Bangladesh, which emerged as the 1975 classic Against Our Will: Men, Women and Rape. The idea of defiling the enemy population by raping its women and impregnating them, often while their helpless and ‘feminised’ menfolk watch, is based on notions of honour, purity and emasculating the opposition. These notions of defilement also led to the sacrificial killing, sometimes by their own families, of women who had either been raped or even simply exposed to the potential of sexual violence.

More here.

The Chemistry of Tears

From The Telegraph:

It is not an exaggeration to say that Peter Carey has given new meaning to the term “historical fiction”. Nowadays novels set in the past are the norm; they seem likely to outnumber those set in the present. In the Eighties, when Carey started writing them, they constituted a separate genre. His early novels were genuinely innovative, and played a large part in that transformation. Impressively, he continues to produce another masterclass every couple of years. His modus operandi is to intertwine his unique fictions with historical documents – from Edmund Gosse’s autobiography in Oscar and Lucinda (1988), to the work of Alexis de Tocqueville in Parrot and Olivier in America 20 years later, most audaciously Great Expectations in Jack Maggs, most spectacularly Ned Kelly’s letters in True History of the Kelly Gang. His reshaping of history, particularly Australian history, arriving at assertive postcolonial versions of Australian national identity, is central to his technique.

In this, his 12th novel, imperial patronage takes a bashing and Victoria and Albert are glimpsed in their nighties, but the seed of historical truth is the 18th-century inventor Jacques de Vaucanson’s mechanical duck. This famed automaton supposedly ate, digested and excreted grain in front of an audience, but was something of a fraud, because its droppings were made in advance. In The Chemistry of Tears, Catherine Gehrig, a conservator at London’s Swinburne Museum, learns of the death of her married lover and colleague. It is 2010, and in the midst of her secret grief Catherine’s boss gives her a mysterious object to reconstruct. It is a copy of the famous duck, commissioned by one Henry Brandling. His notebooks, written in 1854, detail his intention to build Vaucanson’s duck to enliven the spirits of his dangerously ill son, by arousing his “magnetic agitation”, as if the boy himself were an automaton.

More here.

A Boy to Be Sacrificed

Abdellah Taïa in the New York Times:

In the Morocco of the 1980s, where homosexuality did not, of course, exist, I was an effeminate little boy, a boy to be sacrificed, a humiliated body who bore upon himself every hypocrisy, everything left unsaid. By the time I was 10, though no one spoke of it, I knew what happened to boys like me in our impoverished society; they were designated victims, to be used, with everyone’s blessing, as easy sexual objects by frustrated men. And I knew that no one would save me — not even my parents, who surely loved me. For them too, I was shame, filth. A “zamel.”

Like everyone else, they urged me into a terrible, definitive silence, there to die a little more each day.

How is a child who loves his parents, his many siblings, his working-class culture, his religion — Islam — how is he to survive this trauma? To be hurt and harassed because of something others saw in me — something in the way I moved my hands, my inflections. A way of walking, my carriage. An easy intimacy with women, my mother and my many sisters. To be categorized for victimhood like those “emo” boys with long hair and skinny jeans who have recently been turning up dead in the streets of Iraq, their skulls crushed in.

The truth is, I don’t know how I survived. All I have left is a taste for silence. And the dream, never to be realized, that someone would save me. Now I am 38 years old, and I can state without fanfare: no one saved me.

More here.

Saturday, March 24, 2012

Robert Reich on Saving Capitalism and Democracy

From The Browser:

In a recent post on your website, you said there was “moral rot” in America. And you say: “It’s located in the public behaviour of people who control our economy and are turning our democracy into a financial slush pump.” Can you expand on this?

An economy depends fundamentally on public morality: some shared standards about what sorts of activities are impermissible because they so fundamentally violate trust that they threaten to undermine the social fabric. Without trust it has to depend upon such complex contracts and such weighty enforcement systems that it would crumble under its own weight. What we’ve seen over the last two decades in the United States is a steady decline in the willingness of people in leading positions in the private sector – on Wall Street and in large corporations especially – to maintain those minimum standards. The new rule has become making the highest profits possible regardless of the social consequences.

In the first three decades after World War II – partly because America went through that terrible war and also experienced before that the Great Depression – there was a sense in the business community and on Wall Street of some degree of social responsibility. It wasn’t talked about as social responsibility, because it was assumed to be a bedrock of how people with great economic power should behave. CEOs did not earn more than 40 times what the typical worker earned. Rarely were there mass layoffs by profitable firms. The marginal income tax on the highest income earners in the 1950s was 91%. Even the effective rate, after all deductions and tax credits, was still well above 50%. The game was not played in a cutthroat way. In fact, consumers, workers, the community, were all considered stakeholders of almost equal entitlement as shareholders.

Around about the late 1970s and early 1980s, all of this changed quite dramatically.

More here.

Is Free Will an Illusion?

Over at The Chronicle of Higher Education, Jerry A. Coyne, Alfred R. Mele, Michael S. Gazzaniga, Hilary Bok, Owen D. Jones and Paul Bloom address the question. Bloom:

Common sense tells us that we exist outside of the material world—we are connected to our bodies and our brains, but we are not ourselves material beings, and so we can act in ways that are exempt from physical law. For every decision we make—from leaning over for a first kiss, to saying “no” when asked if we want fries with that—our actions are not determined and not random, but something else, something we describe as chosen.

This is what many call free will, and most scientists and philosophers agree that it is an illusion. Our actions are in fact literally predestined, determined by the laws of physics, the state of the universe, long before we were born, and, perhaps, by random events at the quantum level. We chose none of this, and so free will does not exist.

I agree with the consensus, but it's not the big news that many of my colleagues seem to think it is. For one thing, it isn't news at all. Determinism has been part of Philosophy 101 for quite a while now, and arguments against free will were around centuries before we knew anything about genes or neurons. It's long been a concern in theology; Moses Maimonides, in the 1100s, phrased the problem in terms of divine omniscience: If God already knows what you will do, how could you be free to choose?

More important, it's not clear what difference it makes. Many scholars do draw profound implications from the rejection of free will. Some neuroscientists claim that it entails giving up on the notion of moral responsibility. There is no actual distinction, they argue, between someone who is violent because of a large tumor in his brain and a neurologically normal premeditated killer—both are influenced by forces beyond their control, after all—and we should revise the criminal system accordingly. Other researchers connect the denial of free will with the view that conscious deliberation is impotent. We are mindless robots, influenced by unconscious motivations from within and subtle environmental cues from without; these entirely determine what we think and do. To claim that people consciously mull over decisions and think about arguments is to be in the grips of a prescientific conception of human nature.

I think those claims are mistaken. In any case, none of them follow from determinism. Most of all, the deterministic nature of the universe is fully compatible with the existence of conscious deliberation and rational thought.

On the Origin of Everything

David Albert reviews Lawrence M. Krauss's A Universe From Nothing, in the NYT:

The fact that some arrangements of fields happen to correspond to the existence of particles and some don’t is not a whit more mysterious than the fact that some of the possible arrangements of my fingers happen to correspond to the existence of a fist and some don’t. And the fact that particles can pop in and out of existence, over time, as those fields rearrange themselves, is not a whit more mysterious than the fact that fists can pop in and out of existence, over time, as my fingers rearrange themselves. And none of these poppings — if you look at them aright — amount to anything even remotely in the neighborhood of a creation from nothing.

Krauss, mind you, has heard this kind of talk before, and it makes him crazy. A century ago, it seems to him, nobody would have made so much as a peep about referring to a stretch of space without any material particles in it as “nothing.” And now that he and his colleagues think they have a way of showing how everything there is could imaginably have emerged from a stretch of space like that, the nut cases are moving the goal posts. He complains that “some philosophers and many theologians define and redefine ‘nothing’ as not being any of the versions of nothing that scientists currently describe,” and that “now, I am told by religious critics that I cannot refer to empty space as ‘nothing,’ but rather as a ‘quantum vacuum,’ to distinguish it from the philosopher’s or theologian’s idealized ‘nothing,’ ” and he does a good deal of railing about “the intellectual bankruptcy of much of theology and some of modern philosophy.” But all there is to say about this, as far as I can see, is that Krauss is dead wrong and his religious and philosophical critics are absolutely right. Who cares what we would or would not have made a peep about a hundred years ago? We were wrong a hundred years ago. We know more now. And if what we formerly took for nothing turns out, on closer examination, to have the makings of protons and neutrons and tables and chairs and planets and solar systems and galaxies and universes in it, then it wasn’t nothing, and it couldn’t have been nothing, in the first place. And the history of science — if we understand it correctly — gives us no hint of how it might be possible to imagine otherwise.

And I guess it ought to be mentioned, quite apart from the question of whether anything Krauss says turns out to be true or false, that the whole business of approaching the struggle with religion as if it were a card game, or a horse race, or some kind of battle of wits, just feels all wrong — or it does, at any rate, to me. When I was growing up, where I was growing up, there was a critique of religion according to which religion was cruel, and a lie, and a mechanism of enslavement, and something full of loathing and contempt for everything essentially human. Maybe that was true and maybe it wasn’t, but it had to do with important things — it had to do, that is, with history, and with suffering, and with the hope of a better world — and it seems like a pity, and more than a pity, and worse than a pity, with all that in the back of one’s head, to think that all that gets offered to us now, by guys like these, in books like this, is the pale, small, silly, nerdy accusation that religion is, I don’t know, dumb.

A Kennedy for Pakistan?

Mohsin Hamid in the New York Review of Books:

Most likely to be cast as heroes are the media, the country’s independent-minded Supreme Court, which has recently indicted the Prime Minister on contempt of court charges (related to the corruption investigation of Zardari), and the Pakistani “people.” There is much talk of democratic ideals, but little love for the country’s current crop of politicians, and so there seems to be a yearning for a new kind of leader able to break the cycle of weakness and mediocrity.

Into this situation has surged the former cricket superstar Imran Khan, who in recent months has suddenly become the country’s most popular political figure. My first intimation that people might be taking Khan seriously as a politician came in February 2011, in Karachi, when I asked the driver of a car belonging to my publisher whom he’d vote for if elections were held today.

“Imran Khan,” he replied without hesitation.

I was surprised. Khan’s fifteen-year-old party, the Pakistan Tehreek-e-Insaf (PTI), or Pakistan Movement for Justice, had never managed to win more than a single seat in the country’s 272-member parliament. Yet my publisher’s driver was on to something. By October, well over 100,000 people were thronging a Khan-led PTI rally in Lahore, an event that seemed to change Pakistan’s political landscape. It had been billed as a make-or-break chance for Khan to show, finally, whether he was capable of building a true mass movement.

The size of the support it generated clearly shook Punjab’s traditional power-brokers, the brothers Nawaz and Shahbaz Sharif, leaders of the Pakistan Muslim League – Nawaz (PML-N). I know a university professor who went, and he said it was the largest such gathering he had ever seen. He was particularly struck by the socio-economic diversity of those present, by the large numbers of women as well as men, and by the orderliness and unforced enthusiasm of the crowd, in contrast to the rent-a-mob environment typical of big political gatherings.

More here.

Saturday Poem

Blueberries

I am in California. The moon –
colour of grandmother’s Irish butter – is lifting
over the Mount Diablo hills and the sky
is tinged a ripening strawberry. You sleep
thousands of miles from me and I pray your dreams
are a tranquil sea. Eight hours back
you watched this moon, our love-, our marriage-moon,
rise silently over our Dublin suburb, and you
phoned to tell me of it. I sit in stillness
though I am called where death is by; I am eating
night and grief in the sweet-bitter flesh
of blueberries, coating tongue and lips with juice
that this my kiss across unconscionable distances
touch to your lips with the fullness of our loving.

by John F. Deane
Publisher: PIW, 2012

Mummy Dearest

From The New York Times:

Perhaps every autobiographical first novel serves its author as Jeanette Winterson’s did — as “a story I could live with.” But “Oranges Are Not the Only Fruit,” buoyant and irrepressible, was published in 1985, for its author half a lifetime ago, and what one can live with changes over time. “Why Be Happy When You Could Be Normal?” is a memoir as unconventional and winning as the rollicking bildungsroman Winterson assembled from the less malignant aspects of her eccentric Pentecostal upbringing, a novel that instantly established her distinctive voice. This new book wrings humor from adversity, as did the fictionalized version of Winterson’s youth, but the ghastly childhood transfigured there is not the same as the one vivisected here in search of truth and its promise of setting the cleareyed free. At the center of both narratives is “Mrs. Winterson,” as the author often calls her mother in “Why Be Happy.” It would be easy to dismiss this formality as an attempt to establish retroactively something that never existed between Winterson and her adoptive mother: a respectful distance governed by commonly accepted standards of decency and reason. But, even more, the form of address suggests the terrible grandeur of a character who transcends the strictly mortal in her dimensions and her power, a monolith to whom any version of “mother” cannot do justice.

“Tallish and weighing around 20 stone” (in other words, about 280 pounds), Mrs. Winterson, now deceased, was “out of scale, larger than life,” “now and again exploding to her full 300 feet,” a force that eclipsed Winterson’s self-effacing father, who couldn’t protect himself, let alone his child, from the woman he had married. It wasn’t her physical size that tipped Mrs. Winterson from mere gravity toward the psychic equivalent of a black hole, vacuuming all the light into her hysterical fundamentalism, so much as it was her monumental derangement. “A flamboyant depressive . . . who kept a revolver in the duster drawer, and the bullets in a tin of Pledge,” Mrs. Winterson waited not in joyful so much as smug anticipation for the apocalypse that would destroy the neighbors and deliver her to the exalted status piety had earned her. Opposed to sexual intercourse, as she was to all forms of intimacy, Winterson’s mother adopted her in hopes of raising a friend, the author speculates, for her mother had no other. But the trouble Mrs. Winterson found in reading a book, “that you never know what’s in it until it’s too late,” is the same trouble that complicates parenthood. Or, as Mrs. Winterson explained it: “The Devil led us to the wrong crib.”

More here.

Memories reside in specific brain cells

From PhysOrg:

Our fond or fearful memories — that first kiss or a bump in the night — leave memory traces that we may conjure up in the remembrance of things past, complete with time, place and all the sensations of the experience. Neuroscientists call these traces memory engrams. But are engrams conceptual, or are they a physical network of neurons in the brain? In a new MIT study, researchers used optogenetics to show that memories really do reside in very specific brain cells, and that simply activating a tiny fraction of brain cells can recall an entire memory — explaining, for example, how Marcel Proust could recapitulate his childhood from the aroma of a once-beloved madeleine cookie.

“We demonstrate that behavior based on high-level cognition, such as the expression of a specific memory, can be generated in a mammal by highly specific physical activation of a specific small subpopulation of brain cells, in this case by light,” says Susumu Tonegawa, the Picower Professor of Biology and Neuroscience at MIT and lead author of the study reported online today in the journal Nature. “This is the rigorously designed 21st-century test of Canadian neurosurgeon Wilder Penfield’s early-1900s accidental observation suggesting that mind is based on matter.” In that famous surgery, Penfield treated epilepsy patients by scooping out parts of the brain where seizures originated. To ensure that he destroyed only the problematic neurons, Penfield stimulated the brain with tiny jolts of electricity while patients, who were under local anesthesia, reported what they were experiencing. Remarkably, some vividly recalled entire complex events when Penfield stimulated just a few neurons in the hippocampus, a region now considered essential to the formation and recall of episodic memories.

More here.

Against austerity

In an essay in her new non-fiction collection, Marilynne Robinson marvels at the enormous number of English words that describe the behaviour of light. “Glimmer, glitter, glister, glisten, gleam, glow, glare, shimmer, sparkle, shine, and so on,” she writes. “These old words … reflect an aesthetic attention to experience that has made, and calls us to make, pleasing distinctions among, say, a candle flame, the sun at its zenith, and the refraction of light by a drop of rain.” “Imagination and Community” is ostensibly about language, but, like all the essays in When I Was A Child I Read Books, it is written through the prism of Robinson’s Christian Calvinist faith. Rather than focusing on mystical awakenings, she calls the everyday material of life a “sacred mystery”, and argues that religion should illuminate, rather than be separated from, science, politics and literature. “My point is that lacking the terms of religion,” she writes, “essential things cannot be said.” As a novelist, Robinson imbues ordinary, concrete details with grace, in the manner of the authors she most admires – Emerson, Whitman, Melville, William James, Emily Dickinson – “for whom creeds fall away and consciousness has the character of revelation.”

more from Emily Stokes at the FT here.

Why Won’t They Listen?

You’re smart. You’re liberal. You’re well informed. You think conservatives are narrow-minded. You can’t understand why working-class Americans vote Republican. You figure they’re being duped. You’re wrong. This isn’t an accusation from the right. It’s a friendly warning from Jonathan Haidt, a social psychologist at the University of Virginia who, until 2009, considered himself a partisan liberal. In “The Righteous Mind,” Haidt seeks to enrich liberalism, and political discourse generally, with a deeper awareness of human nature. Like other psychologists who have ventured into political coaching, such as George Lakoff and Drew Westen, Haidt argues that people are fundamentally intuitive, not rational. If you want to persuade others, you have to appeal to their sentiments. But Haidt is looking for more than victory. He’s looking for wisdom. That’s what makes “The Righteous Mind” well worth reading. Politics isn’t just about manipulating people who disagree with you. It’s about learning from them.

more from William Saletan at the New York Times here.

the continent of concrete abstraction

In this season of marking South Pole centennials, March is the last and cruelest month. On March 17, 1912, a starved, injured and frostbitten Lawrence “Titus” Oates famously crawled out through the tube door of Robert Falcon Scott’s tent to die deliberately in a blizzard. His last words, “I am just going outside and may be some time,” were transcribed two days later by a storm-bound Scott, making notes as his own death closed in, ice crystals already claiming his insensate right foot. Two months earlier, Scott, Oates, Edgar “Taff” Evans, Edward Wilson, and Henry “Birdie” Bowers had reached the South Pole, but instead of a blank nexus of latitude and longitude in an unmapped wilderness of ice, they found Roald Amundsen’s tent and Norwegian flag. The British team’s return was dismal, a trudging descent from the polar plateau into crippling starvation, dehydration, and nutritional deprivation. Having become a limping hindrance to his team’s already slow progress, Titus Oates hoped in his self-sacrifice to give the trio of Scott, Wilson, and Bowers (Taff Evans was already dead) a chance to reach their next depot of fuel and food. They would not, dying instead at month’s end as Oates had, prone and frozen solid under the snows of the blizzard-swept Ross Ice Shelf.

more from Jason Anthony at The Smart Set here.

Friday, March 23, 2012

The lesser-known atrocities of the 1971 India-Pakistan-Bangladesh War

Batool Zehra in The Express Tribune:

History is always written by the victors, and in the case of the 1971 war, the dominant narrative has been that of atrocities committed against the Bengali population. But in her upcoming novel, Of Martyrs and Marigolds, Aquila Ismail dredges up the memories of her traumatic past in order to shine a light on the lesser-known atrocities of that conflict.

“My mother forgot how to speak Bengali after the trauma of 1971. It just went out of her head. She cannot speak it to this day,” says Aquila Ismail, as we sip tea in her sitting room on a winter’s evening in Karachi. One of the few Biharis who managed to flee Bangladesh after what is known in that country as the War of Liberation, Aquila now lives in the UAE. But over 250,000 of her fellow Biharis still live in squalid conditions in Bangladesh today, as a stateless minority.

While the atrocities of the Pakistan Army against the Bengali population during the war are well-documented, little is known about the plight of the Biharis who were left stranded when East Pakistan seceded in 1972, and what they suffered during and after the conflict. According to some estimates, 750,000 Biharis were left in Bangladesh in 1972, and not only did they face persecution at the hands of Bengalis, they were also disowned by Pakistan and became stateless overnight…

More here.

Giant Silkmoths: Colour, Mimicry and Camouflage

Anna Lena Phillips in American Scientist:

Last spring, the periodical cicadas emerged across eastern North America. Their vast numbers and short above-ground life spans inspired awe and irritation in humans—and made for good meals for birds and small mammals. Such snacks do not come without cost, however: Cicadas emit extremely loud shrieks when captured. Perhaps the pattern of the giant silk moth Citheronia azteca (right) evolved to resemble a cicada as a form of Batesian mimicry—imitation by a nonpoisonous species of a poisonous or unappetizing one. So Philip Howse speculates in Giant Silkmoths: Colour, Mimicry and Camouflage (Papadakis, $40 paper), in which he and photographer Kirby Wolfe showcase these members of the Saturniidae family.

Wolfe offers notes on collecting and raising silk moths. But the book’s wealth of photographs serves as collection enough: Viewing page after page of stunning moths from all over the world, I felt the guilty pleasure of seeing more of these creatures than one would ever normally encounter. Caterpillars are well represented also.

More here.