One can do worse than adopt a flip view of life

John Allen Paulos in his Who's Counting column at ABC News:

To obtain a fair result from a biased coin, the mathematician John von Neumann devised the following trick. He advised the two parties involved to flip the coin twice. If it comes up heads both times or tails both times, they are to flip the coin two more times.

If it comes up H-T, the first party will be declared the winner, while if it comes up T-H, the second party is declared the winner. The probabilities of both these latter events (H-T and T-H) are the same because the coin flips are independent even if the coin is biased.

For example, if the coin lands heads 70 percent of the time and tails 30 percent of the time, an H-T sequence has probability .7 x .3 = .21 while a T-H sequence has probability .3 x .7 = .21. So 21 percent of the time the first party wins, 21 percent of the time the second party wins, and the other 58 percent of the time when H-H or T-T comes up, the coin is flipped two more times.
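The trick is easy to check numerically. The sketch below (my own illustration, not von Neumann's code, using the 70/30 bias from the example) simulates the procedure and confirms each party wins about half the decided rounds:

```python
import random

def biased_flip(p_heads=0.7):
    """Simulate a coin that lands heads with probability p_heads."""
    return 'H' if random.random() < p_heads else 'T'

def von_neumann_fair_flip(p_heads=0.7):
    """Flip the biased coin in pairs, discarding H-H and T-T.
    H-T declares the first party the winner (returns 1);
    T-H declares the second party the winner (returns 2)."""
    while True:
        first, second = biased_flip(p_heads), biased_flip(p_heads)
        if first != second:
            return 1 if first == 'H' else 2

# Even with a heavily biased coin, the two outcomes are equally likely.
trials = 100_000
wins_first = sum(von_neumann_fair_flip(0.7) == 1 for _ in range(trials))
print(wins_first / trials)  # close to 0.5
```

Note that the expected number of pair-flips grows as the bias worsens: with a 70/30 coin, 58 percent of pairs are discarded, so on average a decision takes about 1/0.42 ≈ 2.4 pairs.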

More here.

The Soul-Sucking Suckiness of B.R. Myers

Garth Risk Hallberg in The Millions, via Andrew Sullivan:

One opens The Atlantic Monthly and is promptly introduced to a burst of joyless contrarianism. Tiring of it, one skims ahead to the book reviews, only to realize: this is the book review. A common experience for even the occasional reader of B.R. Myers, it never fails to make the heart sink. The problem is not only one of craft and execution. Myers writes as if the purpose of criticism were to obliterate its object. He scores his little points, but so what? Do reviewers really believe that isolating a few unlovely lines in a five hundred page novel, ignoring the context for that unloveliness, and then pooh-poohing what remains constitutes a reading? Is this what passes for judgment these days?

If so, Myers would have a lot to answer for. But in the real world, instances don’t yield general truths with anything like the haste of a typical Myers paragraph (of which the foregoing is a parody). And so, even as he grasps for lofty universalism, Brian Reynolds Myers remains sui generis, the bad boy of reviewers, lit-crit’s Dennis Rodman.

Myers came to prominence, or what passes for it in the media microcosmos, via “A Reader’s Manifesto,” a long jeremiad against “the modern ‘literary’ best seller” and “the growing pretentiousness of American literary prose.” It earned notice primarily for its attack on the work and reputation of novelists lauded for their style – Cormac McCarthy, Don DeLillo, and E. Annie Proulx, among others. Many of these writers were ripe for reevaluation, and “A Reader’s Manifesto” was read widely enough to land Myers a contributing editor gig at The Atlantic. It was subsequently published as a stand-alone book. Yet the essay was itself little more than an exercise in style, and not a very persuasive one at that. It was hard to say which was more irritating: Myers’ scorched-earth certainties; his method, a kind of myopic travesty of New Criticism; or his own prose, a donnish pastiche of high-minded affectation and dreary cliché.

I can’t be the only reader who wanted to cry out against the manifesto being promulgated on my behalf, but Myers had insulated himself in several ways. First, he had been so thoroughgoingly tendentious, and at such length, that to rebut his 13,000 words required 13,000 of one’s own. Second: his jadedness was infectious. It made one weary of reading, weary of writing, weary of life.

The Myth of Scientific Literacy

Alice Bell over at her blog:

Every now and again, the term “scientific literacy” gets wheeled out and I roll my eyes. This post is an attempt to explain why.

The argument for greater scientific literacy is that to meaningfully participate in, appreciate and even survive our modern lives, we all need certain knowledge and skills about science and technology. Ok. But what will this look like exactly, how will you know what we all need to know in advance, and how on earth do you expect to get people trained up? These are serious problems.

Back in the early 1990s, Jon Durant very usefully outlined the three main types of scientific literacy. This is probably as good a place to start as any:

* Knowing some science – For example, having A-level biology, or simply knowing the laws of thermodynamics, the boiling point of water, what surface tension is, that the Earth goes around the Sun, etc.

* Knowing how science works – This is more a matter of knowing a little of the philosophy of science (e.g. ‘The Scientific Method’, a matter of studying the work of Popper, Lakatos or Bacon).

* Knowing how science really works – In many respects this agrees with the previous point – that the public need tools to be able to judge science, but does not agree that science works to a singular method. This approach is often inspired by the social studies of science and stresses that scientists are human. It covers the political and institutional arrangement of science, including topics like peer review (including all the problems with this), a recent history of policy and ethical debates and the way funding is structured.

The problem with the first approach is what IB Cohen, writing in the 1950s, called “The fallacy of miscellaneous information”: that a set of often unrelated nuggets of information pulled from the vast wealth of human knowledge is likely to be useful in everyday life (or that you'll remember it when it happens to be needed). That's not to say that these bits of knowledge aren't useful on occasion. Indeed, I remember my undergraduate science communication tutor telling us about how she drowned a spider in the toilet with a bit of basic knowledge of lipids and surface tension. However, it's unrealistic to list all the things a modern member of society might need to know at some point in their life, get everyone to learn them off in advance and then wash our hands of the whole business. This is especially problematic when it comes to science, as such information has the capacity to change (or at least develop). Instead, we all need access to useful information when it is needed.

[H/t: Jennifer Ouellette]

Obama in India

Jaswant Singh in Project Syndicate:

Barack Obama, the sixth American president to visit India since it gained independence, arrives at a trying time, both for the United States and for India. Some of Obama’s closest advisers have just resigned, opening an awkward gap on national security and the economy – the focus of his meetings with India’s government.

For India, the issues on the agenda for Obama’s visit are immense and complex, and the options for resolving them are extremely limited. Those related to security in Afghanistan and Pakistan are as treacherous as they have ever been. Bilateral economic, trade, and currency disagreements may not be as bitter as they are between the US and China, but they are thorny, and lack of resolution is making them more intractable.

Nuclear non-proliferation remains one of Obama’s priorities, as does the sale of US civilian nuclear technology to India, for which former President George W. Bush cleared the way. And Obama will be keen to know what help India can provide with Iran, a country with which India has smooth relations, owing to their shared worries over Afghanistan and Pakistan.

Given this potent list of challenges, what are the prospects for Obama’s passage to India? Some years ago, I was queried by then US Deputy Secretary of State Strobe Talbott, who was helping to prepare President Bill Clinton’s visit. As India’s foreign minister at the time, I told him: “Why make the visit destinational? Be content with the directional,” or some such words. That response retains its flavor today: as new directions in India-US relations are set, new destinations will follow.

Religion as a Catalyst of Rationalization

Eduardo Mendieta in The Immanent Frame:

The centrality of religion to social theory in general and philosophy in particular explains why Jürgen Habermas has dealt with it, in both substantive and creative ways, in all of his work. Indeed, religion can be used as a lens through which to glimpse both the coherence and the transformation of his distinctive theories of social development and his rethinking of the philosophy of reason as a theory of social rationalization.

For Habermas, religion has been a continuous concern precisely because it is related to both the emergence of reason and the development of a public space of reason-giving. Religious ideas, according to Habermas, are never mere irrational speculation. Rather, they possess a form, a grammar or syntax, that unleashes rational insights, even arguments; they contain, not just specific semantic contents about God, but also a particular structure that catalyzes rational argumentation.

We could say that in his earliest, anthropological-philosophical stage, Habermas approaches religion from a predominantly philosophical perspective. But as he undertakes the task of “transforming historical materialism” that will culminate in his magnum opus, The Theory of Communicative Action, there is a shift from philosophy to sociology and, more generally, social theory. With this shift, religion is treated, not as a germinal for philosophical concepts, but instead as the source of the social order. This approach is of course shaped by the work of the classics of sociology: Weber, Durkheim, and even Freud. What is noteworthy about this juncture in Habermas’s writings is that secularization is explained as “pressure for rationalization” from “above,” which meets the force of rationalization from below, from the realm of technical and practical action oriented to instrumentalization. Additionally, secularization here is not simply the process of the profanation of the world—that is, the withdrawal of religious perspectives as worldviews and the privatization of belief—but, perhaps most importantly, religion itself becomes the means for the translation and appropriation of the rational impetus released by its secularization. Here, religion becomes its own secular catalyst, or, rather, secularization itself is the result of religion.

On How To Mourn: Roland Barthes’ Beautiful, Private Meditation on His Mother’s Death

Meghan O'Rourke in Slate:

There are few writers as suited to writing insightfully about loss as the mature Barthes was. Grief is at once a public and a private experience. One's inner, inexpressible disruption cannot be fully realized in one's public persona. As a brilliant explicator of how French culture shapes its self-understanding through shared “signs,” Barthes was primed to notice the social dynamics at play among friends and colleagues responding to his bereavement. As an adult son whose grief for his beloved mother—he lived with her and said she provided his “internalized law”—was unusually acute, he was subject to grief's most disorienting intensities. The result is a book that powerfully captures, among other things, the shiver of strangeness that a private person experiences in the midst of friends trying to comfort or sustain him in an era that lacks clear-cut rituals or language for loss.

In one of Mourning Diary's first entries, Barthes describes a friend worrying that he has been “depressed for six months.” (His mother was ill before her death.) It was “said discreetly, as always,” Barthes notes. Yet the statement irritates him: “No, bereavement (depression) is different from sickness. What should I be cured of? To find what condition, what life? If someone is to be born, that person will not be blank, but a moral being, a subject of value—not of integration.” Noting that “signs” fail to convey the private depths of mourning, he comments on the tension between others' expectant curiosity and the mourner's own suffering: “Everyone guesses—I feel this—the degree of a bereavement's intensity. But it's impossible (meaningless, contradictory signs) to measure how much someone is afflicted.”

Guilty pleasures #1: James Bond

Hussein Ibish over at his blog:

Some months ago my dear friend the great critical theorist R. Radhakrishnan suggested I pay some attention in writing to the phenomenon I discussed with him on several occasions whereby we respond emotionally, aesthetically or intellectually to cultural artifacts that we nonetheless do not, at a certain level, respect. In fact, we may know very well that a cultural product is inferior if not fundamentally absurd, and yet it may have a profound impact and even an irresistible draw to us. How and why does that operate? What's going on when we respond so powerfully at all kinds of levels to something we feel, whether on reflection or viscerally, is either completely or in some senses beneath contempt? How do we account for such “guilty pleasures?” Of course, this version of guilty pleasure is a subset of the deeper existential problem of why we want things that we know very well are bad for us: why we cling to, or mourn the loss of, dysfunctional relationships with toxic people; persist with, or pine for, self-destructive behavior of one kind or another; or find ourselves in the grip of a political or religious ideology we know very well, at a certain level at any rate, is indefensible and possibly loathsome. But for the meanwhile, let's stick to the subject of bad art.

I'm going to begin looking at this problem by taking on what has been, in my life at any rate, one of its more gruesome manifestations: films featuring the character James Bond and the Ian Fleming novels on which they are based. Let's be clear at the outset: on the whole and in most senses they are without question garbage, and toxic garbage at that. The films are militantly stupid and implausible, often insultingly so, distinctly racist and irredeemably sexist, and the novels even more racist and sexist (more about the dismal ideology at work in them a little later). And yet some of us are drawn to some of them in spite of having no respect for them whatsoever, and even finding them offensive. In particular the early Bond movies starring Sean Connery have a real pull on my imagination and I'd like to begin my exploration of the morphology of guilty pleasures by considering how on earth that could possibly be the case.

The Bond films are useful as a starting point because, for me at any rate, they point directly to one of the most important and powerful forces behind guilty pleasures of this kind: nostalgia.

Socrates, Athens and the Search for the Good Life

From The Telegraph:

There’s a charming poem by Seamus Heaney about Socrates’ last day. It expresses a brief surprise that Socrates could believe in dreams. But the poet quickly acknowledges that the philosopher did live in a dream world. Bettany Hughes’s book leaves us in no doubt. The Hemlock Cup is a biography of Socrates, and also a lot more than that. Yes, it speculates on the walks he would have taken around the Agora in Athens (admittedly with bundles of suggestive evidence); it suggests just what the hemlock would have done to him; and it attributes Socrates’ habit of standing stock still for hours to cataleptic seizures. For all that, Hughes is more concerned with the philosopher’s time and place. As she unfolds the tale, she brings us an edited history of fifth-century BC Athens, too.

This isn’t padding, or even scene-setting (atmospheric though it always is). Without overstating the case, she shows how the city’s life runs alongside the philosopher’s, and then takes a different course. Socrates would always warn that an acquisitive life was not worth living and that the pursuit of gold is vacuous; meanwhile Athens revelled in becoming an empire, so it conquered more and mined more and showed off more. And then there was an attempt to colonise Sicily. Out of Athens and Socrates, the former emerges as the more tragic character, with its greed and its failure to learn from its wisest citizen until in the throes of its downfall.

More here.

Femme Fatale

Kathryn Harrison in The New York Times:

Papyri crumble away. What remains of her home is 20 feet underwater. She died before Jesus was born. Her first biographers never met her, and she deliberately hid her real self behind vulgar display. A cautious writer would never consider her as a subject. Stacy Schiff, however, has risen to the bait, with deserved confidence. “Saint-Exupéry: A Biography” and “Véra (Mrs. Vladimir Nabokov)” demonstrated her mastery of the form. “The Great Improvisation,” Schiff’s analysis of Benjamin Franklin’s years in Paris, revealed a different genius: the intellectual stamina required to untangle the endlessly tricky snarls created by the intersection of human personalities and international relations. “Mostly,” Schiff says of “Cleopatra: A Life,” “I have restored context.”

The claim stops sounding humble when we understand what it entails. Although it’s not Schiff’s purpose to present us with a feminist revision of a life plucked from antiquity, in order to “restore” Cleopatra — to see her at all — one must strip away an “encrusted myth” created by those for whom “citing her sexual prowess was evidently less discomfiting than acknowledging her intellectual gifts.” Lucan, Appian, Josephus, Dio, Suetonius, Plutarch — the poets, historians and biographers who initially depicted Cleopatra were mostly Roman and all male, writing, for the most part, a century or more after her death with the intent to portray her reign as little more than a sustained striptease.

More here.

Friday, November 5, 2010

The tumultuous trial of ‘Lady Chatterley’s Lover’

From The Telegraph:

The latest edition of Lady Chatterley’s Lover has been published with little fanfare, no salacious press reports and no questions raised in Parliament. Copies are unlikely to be burned in public, there will be no prosecution, and no one is likely to be depraved or corrupted or even slightly outraged. Exactly 50 years ago, it was a different story. Kenneth Tynan, touting unsuccessfully to report on the forthcoming trial, promised the New Yorker ‘the most marvellous circus for ages’. And so it would prove, as Lady Chatterley’s Lover, first published in Italy in 1928 and subsequently banned in Britain, became the first novel to be tested under the 1959 Obscene Publications Act. D H Lawrence, relentlessly hounded by the British establishment throughout his career, was back in court 30 years after his death, accidentally inspiring a seismic cultural shift – much of which encapsulated the exact antithesis of his ethos, presented in this extended sermon on the horrors of war, industrialisation and, most of all, the beauty and purity of loving sexual relations. In 1959, with the 30th anniversary of Lawrence’s death approaching, Penguin was planning a further collection of his works.

Penguin had always boasted (though not entirely accurately) that it published ‘complete and unabridged’ texts and the successful passage of the Obscene Publications Act seemed to clear the way for the unabridged Lady Chatterley’s Lover. Indeed, Penguin editors could not have countenanced the various expurgated versions, where ‘penis’ was rendered as ‘liver’ and ‘purple’ passages deleted, to the undoubted confusion and disappointment of readers. The Act was designed to make it easier for the police to root out exploitative pornography, while protecting works of literary merit. However, by August 1960, prosecution of Penguin had become inevitable. ‘I don’t think this novel is one of Lawrence’s best, or a great work of art,’ wrote Doris Lessing when approached by the defence as a possible witness. ‘I’m sorry, if there is to be a test case, that it will be fought over this particular book.’

More here.

The Skeptic’s Skeptic: In the battle for ideas, scientists could learn from Christopher Hitchens

From Scientific American:

Science values data and statistics and champions the virtues of evidence and experimentation. Those of us “viewing the world with a rational eye” (as the new descriptor for this column reads) also have another, underutilized tool at our disposal: rapier logic like that of Christopher Hitchens, a practiced logician trained in rhetoric. Hitchens—who is “leaving the party a bit earlier than I’d like” because of esophageal cancer, as he lamented to Charlie Rose in a recent PBS interview—has something deeply important to offer on how to think about unscientific claims. Although he has no formal training in science, I would pit Hitchens against any of the purveyors of pseudoscientific claptrap because of his unique and enviable skill at peeling back the layers of an argument and cutting to its core.

We would all do well to observe and emulate his power to detect and dissect baloney through pure thought. To wit, after watching a quack medicine man fleecing India’s poor one Sunday afternoon, the belletrist scowled in a 2003 Slate column, “What can be asserted without evidence can also be dismissed without evidence.” The observation is worthy of elevation to a dictum. Of course, as scientists we prefer to tether evidence, when it is available, to logical analysis in support of a claim or to proffer counterevidence that disputes a claim. A radiant example of Hitchens’s insightful thinking, coupled to the effective employment of counterevidence, is his reaction to an episode of the television series Planet Earth. As he watched, he had a revelation of creationism’s profound flaws. The episode was on life underground, during which Hitchens noticed that the blind salamander had “eyes” that “were denoted only by little concavities or indentations,” as he recounted in a 2008 Slate commentary. “Even as I was grasping the implications of this, the fine voice of Sir David Attenborough was telling me how many millions of years it had taken for these denizens of the underworld to lose the eyes they had once possessed.”

More here.

Simon Keller and Valerie Tiberius Discuss Well-Being and Social Psychology

Over at Philosophy TV:

The nature and conditions of well-being have long held philosophers’ attention, but well-being is also now a major focus of psychological research. In this conversation, Keller and Tiberius discuss the possibilities for cooperation in this area between psychologists and philosophers. Are philosophers able to reveal conceptual truths about well-being that are of value to psychologists? Can psychological research usefully correct philosophers’ naive intuitions about what makes us better off?

Outrage, Misguided

Noam Chomsky in In These Times:

Ridiculing Tea Party shenanigans is a serious error, however. It is far more appropriate to understand what lies behind the movement’s popular appeal, and to ask ourselves why justly angry people are being mobilized by the extreme right and not by the kind of constructive activism that rose during the Depression, like the CIO (Congress of Industrial Organizations).

Now Tea Party sympathizers are hearing that every institution—government, corporations and the professions—is rotten, and that nothing works.

Amid the joblessness and foreclosures, the Democrats can’t complain about the policies that led to the disaster. President Ronald Reagan and his Republican successors may have been the worst culprits, but the policies began with President Jimmy Carter and accelerated under President Bill Clinton. During the presidential election, Barack Obama’s primary constituency was financial institutions, which have gained remarkable dominance over the economy in the past generation.

That incorrigible 18th-century radical Adam Smith, speaking of England, observed that the principal architects of power were the owners of the society—in his day the merchants and manufacturers—and they made sure that government policy would attend scrupulously to their interests, however “grievous” the impact on the people of England; and worse, on the victims of “the savage injustice of the Europeans” abroad.

A modern and more sophisticated version of Smith’s maxim is political economist Thomas Ferguson’s “investment theory of politics,” which sees elections as occasions when groups of investors coalesce in order to control the state by selecting the architects of policies who will serve their interests.

Ferguson’s theory turns out to be a very good predictor of policy over long periods. That should hardly be surprising. Concentrations of economic power will naturally seek to extend their sway over any political process. The dynamic happens to be extreme in the U.S.

Yet it can be said that the corporate high rollers have a valid defense against charges of “greed” and disregard for the health of the society.

The Strange Behavior of Positronium

Alasdair Wilkins in io9:

Positronium is a particle created when you bind together an electron and its antimatter counterpart, the positron. It doesn't interact with other atoms in the way we would expect, and this discovery could help us solve the universe's biggest mysteries.

Positronium is sort of like a hydrogen atom, except with the lone proton in the nucleus replaced by a positron. Because electrons – and, by extension, positrons – are only 1/1836 the mass of a proton, positronium particles are much less massive than their hydrogen counterparts. The particle is a common byproduct of the interaction between regular matter and positrons. It's an unstable particle, only remaining together for an average of 142 nanoseconds before decaying into two gamma-ray photons.
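The mass comparison is quick back-of-the-envelope arithmetic using only the 1/1836 ratio quoted above (a rough sketch that ignores binding energies):

```python
# An electron (or positron) is roughly 1/1836 the mass of a proton.
ELECTRON_TO_PROTON = 1 / 1836

# Positronium = electron + positron; hydrogen ~ proton + electron.
positronium_mass = 2 * ELECTRON_TO_PROTON   # in units of the proton mass
hydrogen_mass = 1 + ELECTRON_TO_PROTON      # in units of the proton mass

# Positronium weighs in at roughly a thousandth of a hydrogen atom.
print(positronium_mass / hydrogen_mass)  # ~0.00109, i.e. about 2/1837
```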

During their very short lifetimes, however, it's possible to probe some of their properties and characteristics, and that's what researchers at University College London recently attempted. They tried a scattering experiment, in which they sent streams of positronium particles at different atoms and molecules and measured how they interacted. Because positronium is a hybrid of an electron and positron, they expected the particle to act in a way that was some sort of average of these two.

But that isn't what they found. Instead, positronium acted precisely the same as an electron would. That doesn't make much sense – electrons are negatively charged, while positronium is neutral, and it's obviously twice the mass of a lone electron. In some weird way, the effects of the positron's presence seem to be cloaked so that only the electron half of the positronium interacts with other matter.

Reger said

Michael Hofmann on Thomas Bernhard's Old Masters: A Comedy:

The Austrian novelist and playwright Thomas Bernhard (1931-89) once said: ‘You have to understand that in my writing the musical component comes first, and the subject matter is secondary.’ It’s a strange thing for this professional controversialist and Austropathic ranter to have said – that we should attend to the form, balance and measure in his work, when everything in it would seem to lead to the giggle and gasp of hurt given or received, or the hush and squeal of scandal – but it is sound advice. Before we talk about the quality of the opinions, or the kilotonnage of the diatribes, or the relentlessness of the assault (is anything exempt?), we ought to talk about the patterns of repetition and variation in the unspooling sentences of the unparagraphed prose. If Bernhard is anything, he is a stuck harpsichord record, knocking out its trapped and staggered shards of shrilly hammered phrases.

Old Masters, first published in Germany in 1985 and recently reissued, is Bernhard’s penultimate novel. It comes before Extinction and after Cutting Timber (also translated as Woodcutting), which was seized on publication because a couple who thought they recognised themselves in it, the Lampersbergs, old friends of Bernhard, had an injunction taken out against it. (Publicity not being an advantage to them in their circumstances, they eventually relented.) Old Masters is typical of Bernhard in that it is both a parodically eccentric version – one isn’t sure, or it’s not sure, as often in Bernhard, if it’s a skit or a rarefied, laboratory version – of life, but at the same time it is almost reassuringly normal. A Bernhard novel is a bizarrely skewed but immediately familiar planet, whose rules and concerns we grasp as readily as those of Le Petit Prince. Old Masters takes place in a single location, more or less in real time, and yet is able to include in its purview most things under the sun. Come to think of it, even the sun: ‘He avoids the sun, there is nothing he shuns more than the sun,’ it says in Ewald Osers’s terrific and calm and thoughtful translation. Nothing happens and little is revealed; it is mostly talk and remembered talk, and thought and remembered thought.

(Image from Wikipedia, showing the front cover art of the book by Thomas Bernhard.)

And yet, and yet

This may be the great drawback, the gnawing defect, that has made Shawn’s work less visible than it ought to be. In “Myself and How I Got into the Theater,” he writes that he was drawn to theater in the late Sixties because “the whole field…was a strange sort of non-field, in which the whole business of ‘standards’ just didn’t apply.” But that anarchic spirit—the rediscovery of the communal, ritualistic, dangerous energies of performance, in Artaud, in the Living Theatre, in the anthropological work of Victor Turner—has long been subsumed and synthesized into the bourgeois world of subscription-based regional theaters and Performing Arts Centers, Broadway and fringe festivals. Theater’s very irreproducibility, the fact that it can’t be televised, archived, linked, or DVR-ed, makes it both expensive and remote, especially to young and curious audiences. Shawn’s target is also his target audience, and this presents a problem, because the well-off, well-educated, middle-aged-to-elderly theatergoing public still wants to be flattered and reassured, not taken on a tour of its frailties presented in the harshest possible light. And yet, and yet: we still live in a large country with pockets of insurgent energy, and Shawn’s least actable play, A Thought in Three Parts, finally premiered in Austin in 2007, three decades after it was written, courtesy of the Rubber Repertory Company.

more from Jess Row at the Threepenny Review here.

Can you provide humanitarian aid without facilitating conflicts?

Philip Gourevitch in The New Yorker:

In Biafra in 1968, a generation of children was starving to death. This was a year after oil-rich Biafra had seceded from Nigeria, and, in return, Nigeria had attacked and laid siege to Biafra. Foreign correspondents in the blockaded enclave spotted the first signs of famine that spring, and by early summer there were reports that thousands of the youngest Biafrans were dying each day. Hardly anybody in the rest of the world paid attention until a reporter from the Sun, the London tabloid, visited Biafra with a photographer and encountered the wasting children: eerie, withered little wraiths. The paper ran the pictures alongside harrowing reportage for days on end. Soon, the story got picked up by newspapers all over the world. More photographers made their way to Biafra, and television crews, too. The civil war in Nigeria was the first African war to be televised. Suddenly, Biafra’s hunger was one of the defining stories of the age—the graphic suffering of innocents made an inescapable appeal to conscience—and the humanitarian-aid business as we know it today came into being.

More here.

Friday Poem

The Nature of Words

Except for the order to which they belong,
I do not know the names of the butterflies I follow.
Some words like Lepidoptera delight by their flutter
Demulcent, evanescent, ephemeral, fenestration, moraine.
I do not pin them down or make demands

If I find a word to love, I explore with drunken wonder
Under the covers the convolutions of its typography.
If I am a lover of words, are words my lovers?
Lenticular, gravisphere, arête, paternoster lake.

Who am I with this one? Do I associate with another?
With some, I procreate, turning tender thoughts their way,
Giving birth in springwater. But words can be undisciplined.
Perhaps not innately bad but brought up to slip one by.
Split estate, water rights, the elk harvest in the fall.
Blame their fathers, who act without shame.

They engineer the tongue to click, to claim
through contrivance, to make it stick. Question these words,
call them into account, bust the teeth of despotic jaw
Eminent domain, terrorist threat, hegemony, original sin.
Codified perception is the jail of tiny laws.

by Chavawn Kelly
from You Are Here, 2010