sneaky unpretentiousness


About Robert Frost, it should be borne in mind that he was fashioning his distinctive style and voice long after Walt Whitman’s “barbaric yawp” had sounded to raze the old structures of meter and rhyme, when poets like Robinson Jeffers, say, were building up from that rubble. Perhaps partly for this reason, the several neat rows of poems Frost left behind look now, in posterity, to be weathering well, some maybe tilting in the turf these days, but deeply inscribed with a traditional, formal simplicity to keep them legible through many more ages’ storms of fashion. Frost always did say it was his goal “just to lodge a few poems where they’ll be hard to get rid of,” and in that effort he apparently sacrificed innovation. (If he was tempted at all by innovation.) Right up until his death in 1963, he went on offering out the same four trusty iambs, mostly—or sometimes three iambs, sometimes five—while the world’s podiums were occupied by such free-verse innovators as T. S. Eliot and Wallace Stevens and even the first raiding parties of so-called “Beats.” All these phenomena Frost ignored or openly deplored. Wallace Stevens he called “the bric-a-brac poet.” Eliot he snubbed most churlishly on a number of awkward public occasions, always in the face of Eliot’s more debonair grace and forgiveness. He met with Pound in London before he’d ever published a book, and the great mentor did try to educate him in the new manner, though it didn’t stick.

more from Louis B. Jones at the Threepenny Review here.

an autobiographical game creator


When Shigeru Miyamoto was a child, he didn’t really have any toys, so he made his own, out of wood and string. He put on performances with homemade puppets and made cartoon flip-books. He pretended that there were magical realms hidden behind the sliding shoji screens in his family’s little house. There was no television. His parents were of modest means but hardly poor. This was in the late nineteen-fifties and early nineteen-sixties, in the rural village of Sonobe, about thirty miles northwest of Kyoto, in a river valley surrounded by wooded mountains. As he got older, he wandered farther afield, on foot or by bike. He explored a bamboo forest behind the town’s ancient Shinto shrine and bushwhacked through the cedars and pines on a small mountain near the junior high school. One day, when he was seven or eight, he came across a hole in the ground. He peered inside and saw nothing but darkness. He came back the next day with a lantern and shimmied through the hole and found himself in a small cavern. He could see that passageways led to other chambers. Over the summer, he kept returning to the cave to marvel at the dance of the shadows on the walls. Miyamoto has told variations on the cave story a few times over the years, in order to emphasize the extent to which he was surrounded by nature, as a child, and also to claim his youthful explorations as a source of his aptitude and enthusiasm for inventing and designing video games. The cave has become a misty but indispensable part of his legend, to Miyamoto what the cherry tree was to George Washington, or what LSD is to Steve Jobs. It is also a prototype, an analogue, and an apology—an illuminating and propitious way to consider his games, or, for that matter, anyone else’s.

more from Nick Paumgarten at The New Yorker here.

Wednesday Poem


Up the white road through the olive groves,
past the stretched red string, plumb line for the mason.

At the last bend below the high-tension wires
a three-coned trullo someone has let go.

No lights inside―shuttered windows turn back the stars.
Just as in her last year my mother’s eyes flattened.

Her starlessness caught in the camera at the registry of motor vehicles.
Record of her letting go.

Our guide tells us, “In Puglia, olive trees and stone.”
(And so, white faces peek from shadows.)

In my mind I have made this house my mother.
I have opened her gate; righted her tumbled walls;

painted the petroglyph for Joy on every cone.
What we do not see we cannot change.

by Miriam O'Neal
from Blackbird, Spring 2010

The Bed of Procrustes by Nassim Nicholas Taleb

From The Guardian:

Procrustes, in Greek myth, was the cruel owner of an estate in Attica who abducted travellers and stretched them or cut off their legs to ensure they fitted his bed perfectly. Every aphorism here is about a Procrustean bed of sorts: faced with the imperfection of the unknown and the unobserved, we humans tend to backfit the world into reductive categories such that only someone of my immense intellect is able to point out the inherent futility of modern life. Since aphorisms lose their charm whenever explained – especially when they are as banal as the ones that follow – I pompously relegate further discussion to the postface, though they all revolve around matters more deeply dealt with in my extremely significant and influential book, The Black Swan.

▶ If your anger decreases with time, you did injustice; if it increases, you suffered injustice.

▶ The opposite of manliness isn't cowardice; it's technology.

▶ Most of what they call humility is successfully disguised arrogance.

▶ The more a writer thinks himself to be serious, the less serious his writing becomes.

▶ A man who is labelled a guru for his last book will think himself a philosopher in the next.

▶ You can be once, twice, three times a lady; but only once a man.

▶ People used to wear ordinary clothes weekdays and formal attire on Sunday. Today it is the exact reverse.

More here.

Group IQ

From The Boston Globe:

For a century, people have been devising tests that aim to capture a person’s mental abilities in a score, whether it is an IQ test or the SAT. In just an hour or an afternoon, a slate of multiple choice questions or visual puzzles helps sift out the superstars — people whose critical thinking skills suggest they have potent intellectual abilities that could one day help solve real-world problems. But separating the spectacularly bright from the merely average may not be quite as important as everyone believes. A striking study led by an MIT Sloan School of Management professor shows that teams of people display a collective intelligence that has surprisingly little to do with the intelligence of the team’s individual members. Group intelligence, the researchers discovered, is not strongly tied to either the average intelligence of the members or the team’s smartest member. And this collective intelligence was more than just an arbitrary score: When the group grappled with a complex task, the researchers found it was an excellent predictor of how well the team performed.

More here.

The Winners of the 3 Quarks Daily 2010 Politics Prize


Lewis H. Lapham has picked the three winners:

  1. Top Quark, $1000: Stephen Walt, Why America is going to regret the Cordoba House controversy
  2. Strange Quark, $300: Huffington Post, The Two Most Essential, Abhorrent, Intolerable Lies Of George W. Bush's Memoir
  3. Charm Quark, $200: The Philosopher's Beard, Politics: Can't Someone Else Do It?

Here is what Mr. Lapham had to say about them:

As an editor, I take seriously the craft of writing. The political blog at its best accounts for the editorial process—for the checking of facts as well as the redrafting of manuscripts; the political blog at its worst disregards even the semblance of considered judgment, leaving the internet user to portion out the wheat of informed opinion from the chaff of paranoid rant. The readers and editors of 3QD nominated more than forty pieces of internet writing, then narrowed the field by a public vote to nine finalists. I cannot say these are the best pieces of political blog writing on the internet, but they are representative of the impassioned rhetoric that has engulfed the public political discourse.

In accordance with the rules of the 3QD political prize, I have chosen three entries that exemplify the spirit of the times.

1st Place: “Why America is Going to Regret the Cordoba House Controversy” by Stephen M. Walt

A measured, thoughtful response to the religious intolerance that surrounded the “mosque” at ground zero.

2nd Place: “The Two Most Essential, Abhorrent, Intolerable Lies of George W. Bush's Memoir” by Dan Froomkin

This post uses the publication of George Bush's memoir Decision Points as the occasion to reexamine the Bush years and remind us of the facts embedded in the history of his presidency.

3rd Place: “Politics: Can't Someone Else Do It?” by Thomas Rodham

A philosophical look at the constraints imposed upon liberal politicians during their terms in office. They expect too much from their time in government and their constituents expect too much from them.

Congratulations also from 3QD to the winners (I will send the prize money later today–and remember, you must claim the money within one month from today–just send me an email). And feel free to leave your acceptance speech as a comment here! And thanks to everyone who participated. Thanks also, of course, to Lewis Lapham for doing the final judging.

The three prize logos at the top of this post were designed, respectively, by Sughra Raza, Carla Goller, and me. I hope the winners will display them with pride on their own blogs!

Details about the prize here.

Endgame Capitalism

Nathan Schneider interviews Simon During over at The Immanent Frame:

NS: Why is capitalism the focal point of your recent book? And what about capitalism is “postsecular”?

SD: Can I begin with the “postsecular”? It’s a rather confusing term. Mainly it points to a ceasefire—or, anyway, a slowdown—in the long battle between secular reason and religion. That’s ultimately what it implies in the recent work of Habermas, for instance. And that’s also what it means in the kind of intellectual history that uncovers the religious prehistory of secular concepts. But I suspect such work can usually be understood as secularism proceeding under the flag of its own decease. I am more interested in two other possibilities that occur when we think about a zone that is neither secular nor non-secular. The first appears when the limits of the (secular) world become apparent in everyday or mundane life, outside of religion. The second appears when we are compelled to radical leaps of faith—again, outside religion.

Both of these have a direct relation to democratic state capitalism. That’s because democracy and capitalism have each become compulsory and fundamental. They ground everything we do, including religious practice—so we can only get outside them through the kind of postsecular leap of faith that I am talking about. That realization is one of the things that is important about Alain Badiou’s thought. Such leaps may also be relevant to situations in which we encounter secularism’s limits—when secularism can’t contain the ethical and epistemological demands we make of it.

NS: Why can’t secularism itself contain leaps of faith? Why do we need to move past it, to the postsecular?

SD: Of course, there are all kinds of secular leaps of faith. But the will to get outside democratic state capitalism requires something else. It’s true that secular reason is useful in adjudicating upon the current system. You can at least attempt to measure its benefits—the joys, capacities, wealth, and opportunities that it does indeed provide us, and the way that it makes so much seem “interesting,” for instance—against the insecurities, inequalities, restrictions, and controls that it also imposes.


Kalidasa in the 4th or 5th century C.E. (translated by W. S. Merwin & J. Moussaieff Masson):

Even the man who is happy
glimpses something
or a hair of sound touches him

and his heart overflows with a longing
he does not recognize

then it must be that he is remembering
in a place out of reach
shapes he has loved

in a life before this

the print of them still there in him waiting

Of Course the Civil War Was About Slavery

Emily Badger in Miller-McCune:

The Sons of Confederate Veterans are holding a gala this week in Charleston, S.C., a hundred-dollar-a-ticket affair celebrating the state’s secession from the Union 150 years ago. It’s the first of countless commemorations planned for the coming four years — lectures, conferences, parades, re-enactments, museum exhibitions and government proclamations — to mark the sesquicentennial of the Civil War.

The Charleston “Secession Ball” — advertised as “an event of a lifetime” — includes a theatrical re-enactment of the signing of the Ordinance of Secession (the original version of which will be on display), as well as dinner and a dance.

“We’re celebrating that those 170 people risked their lives and fortunes to stand for what they believed in, which is self-government,” one of the event’s organizers told The New York Times. “Many people in the South still believe that is a just and honorable cause.”

“Of course, when South Carolina did secede, there was enormous celebration, dancing in the streets and so on,” said James McPherson, a Princeton Civil War historian and author of the Pulitzer Prize-winning history Battle Cry of Freedom.

But something about the Charleston event of 2010 strikes an odd tone.

“They didn’t know what was going to happen to them,” McPherson said of the original revelers. “Now we do know what happened to them, and maybe a celebratory note is not very appropriate.”

Scholars today are mostly of one mind about why South Carolina seceded and what caused the war. But Americans, even a century and a half later, still deeply disagree with each other and historians, many of them embracing a Civil War story about self-government and “states’ rights” that reveals more about America in 2010 than what actually occurred in the 1860s.

Discoverer of Arsenic Bacteria, in the Eye of the Storm

In Science magazine, Elizabeth Pennisi interviews Felisa Wolfe-Simon:

Three weeks ago, Felisa Wolfe-Simon, 33, a former performance oboist with a doctorate in oceanography and a NASA fellowship in astrobiology, published a paper online in Science about bacteria that can use arsenic instead of phosphorus in DNA and other biomolecules. Four days before the publication, NASA sent out a media advisory that it would hold a press briefing “to discuss an astrobiology finding that will impact the search for evidence of extraterrestrial life.” That led to wild speculations on the Web about extraterrestrial life, and when the paper was published, many headlines made the most of the “alien” nature of the discovery by Wolfe-Simon and her colleagues at the U.S. Geological Survey in Menlo Park, California.

Then came a torrent of criticism by scientists. A highly critical blog post by Rosie Redfield, a microbiologist at the University of British Columbia, Vancouver, quickly drew hundreds of comments, many also finding fault with the study. Wolfe-Simon and her co-author Ronald Oremland then came under attack by journalists when they declined to respond to media calls for a response to these comments. On 16 December, the authors posted responses to some of the issues, and Science will publish technical comments and responses in early 2011. In the meantime, Wolfe-Simon agreed to share some of her thoughts in an interview with Science’s news department, which covered the original finding in early December. The following has been edited for brevity.

Q: How would you characterize your life since the press conference?

F.W.-S.: Since the press conference, my life has been really busy and stressful. When the paper was accepted for publication, we told the Astrobiology Program and NASA, … and when they asked me to come in and talk about the paper, I said, “Sure.” I was obliged. It had been 2 months or so, and the paper had been accepted for a while, so I thought this would be great, I’ll bring the information to the public.

In the Time of Not Yet: On Edward Said’s Imaginary

Marina Warner in the LRB (photo from Wikipedia):

Edward Said first met Daniel Barenboim by chance, at the reception desk of the Hyde Park Hotel in June 1993; Said mentioned he had tickets for a concert Barenboim was playing that week. They began to talk. Six years later, in Weimar, they dreamed up the idea of a summer school in which young musicians from the Arab world and from Israel could play together. They hoped, Said remembered in Parallels and Paradoxes, that it ‘might be an alternative way of making peace’. It was in Weimar, he noted, that Goethe had composed ‘a fantastic collection of poems based on his enthusiasm for Islam … He started to learn Arabic, although he didn’t get very far. Then he discovered Persian poetry and produced this extraordinary set of poems about the “other”, West-östlicher Divan, which is, I think, unique in the history of European culture.’ The West-Eastern Divan: the orchestra had a name; it was never discussed again.

It seems odd that Said, the fierce critic of European Orientalism, chose to use the title of a work that, on the face of it, belongs in the Orientalist tradition. Goethe’s poems are filled with roses and nightingales, boys beautiful as the full moon, wine, women and song. Yet as Said saw it, Goethe’s lyric cycle is animated by a spirit of open inquiry towards the East, grounded in a sense of the past in art and culture, not in dogma or military and state apparatuses. He read it as calling for an understanding of individuality as a process of becoming and therefore fluid. He also believed that poetry can have the metaphorical power to proclaim a visionary politics. The cycle represented for him an alternative history and epistemology, concerned with the cross-pollination between East and West. It seemed to confirm the orchestra’s principle that ‘ignorance of the other is not a strategy for survival.’

Said’s approach was always historical; his work as a critic and intellectual was rooted in an examination of context, both cultural and political, and the orchestra, which this summer toured South America, embodies his commitment to the work of art as an actor in its time. The word theoria, he liked to remind us, means ‘the action of observing’; for him, theory was a dynamic, engaged activity, not a matter of passive reception. The theorist-critic should be a committed participant in the works he observes, and the works themselves aren’t self-created or autonomous but precipitated in the crucible of society and history. ‘My position is that texts are worldly,’ he writes in The World, the Text and the Critic. ‘To some degree they are events, and, even when they appear to deny it, they are nevertheless a part of the social world, human life, and of course the historical moments in which they are located and interpreted.’ The making of music is an event in this sense too.

The Wheels of Injustice Grind Slowly

Julia Ioffe in Foreign Policy:

When journalists showed up to hear the judge read the long-awaited verdict in the case of jailed oil tycoon Mikhail Khodorkovsky, they found a note on the courthouse door. The reading of the verdict, it said, would be postponed. It was still early in the morning, though, and the note — unsigned and typewritten — seemed like it could easily be fake. This was, after all, the denouement of a highly politicized, hyper-publicized trial, both in Russia and abroad. So one of the puzzled journalists called Khodorkovsky's lawyer, Genrikh Padva, who had not yet heard of the note's existence. “I might have expected this,” he said. “But no one warned me about it ahead of time.”

By the time Padva got to the courthouse, there was a scrum of reporters and elderly Khodorkovsky supporters by the door. They swarmed him, demanding an explanation. “Apparently the court just didn't have enough time to write the verdict,” the lawyer explained. He also had not gotten an official explanation (just an official version of the note on the door) but Padva and the rest of the legal team tried to play it down. This happens all the time, they said. Only Khodorkovsky's father, Boris, had a more probing — and Russian — explanation: After the delay, he said, “a lot fewer people will come” for the actual verdict.

The date was April 27, 2005.

Five and a half years later, on December 15, journalists awaited another Khodorkovsky verdict; the scene was almost identical, with a few names and details changed around. It was a different Moscow courthouse and a different case in question, this one brought in 2007 when Khodorkovsky and his partner Platon Lebedev were just about to be up for parole. The new charges alleged that the two stole all the oil their company Yukos ever produced and then laundered the ill-gotten proceeds. (The first case was that they neglected to pay taxes on this laundered oil money. The apparent contradiction between these two cases has yet to be explained.)

Finding the Facts About Mao’s Victims

Ian Johnson in the NYRB blog:

Yang Jisheng is an editor of Annals of the Yellow Emperor, one of the few reform-oriented political magazines in China. Before that, the 70-year-old native of Hubei province was a national correspondent with the government-run Xinhua news service for over thirty years. But he is best known now as the author of Tombstone (Mubei), a groundbreaking new book on the Great Famine (1958–1961), which, though imprecisely known in the West, ranks as one of the worst human disasters in history. I spoke with Yang in Beijing in late November about his book, the political atmosphere in Beijing, and the awarding of the Nobel Peace Prize to Liu Xiaobo.

Tombstone, which Yang began working on when he retired from Xinhua in 1996, is the most authoritative account of the Great Famine. It was caused by the Great Leap Forward, a millennial political campaign aimed at catapulting China into the ranks of developed nations by abandoning everything (including economic laws and common sense) in favor of steel production. Farm work largely stopped, iron tools were smelted in “backyard furnaces” to make steel—most of which was too crude to be of any use—and the Party confiscated for city dwellers what little grain was sown and harvested. The result was one of the largest famines in history. From the government documents he consulted, Yang concluded that 36 million people died and 40 million children were not born as a result of the famine. Yang’s father was among the victims and Yang says this book is meant to be his tombstone.

The strange things people swallow

From Salon:

A set of drawers in Philadelphia's Mütter Museum of human pathology contains some very curious artifacts: thousands of objects, from umbrella tips to diminutive opera glasses, that have been extracted from the human body. They were swallowed or inhaled (sometimes accidentally, sometimes on purpose) and later removed by Dr. Chevalier Jackson, a man who dedicated much of his life to removing odd objects from people's insides. The collection is a remarkable testament to the strangeness of the human experience — and our ability to swallow.

In her new book, “Swallow,” Mary Cappello uncovers the stories behind those objects, and the peculiar life story of Chevalier himself. Cappello is a professor at the University of Rhode Island and the author of the bestselling book “Awkward,” a meditation on uncomfortableness. Here, she packs her story with surprising imagery and extravagant lyricism, taking a highly literary approach to the subject — meandering from Chevalier's biography to the odd story of early 1920s women who compulsively ingested pieces of hardware.

More here.

Young Female Chimps Play Out Motherly Role

From The New York Times:

Young female chimpanzees like to play with sticks as if they were dolls, according to a new study in the journal Current Biology. Although both juvenile male and female chimpanzees were seen playing with sticks in Kibale National Park in Uganda, females were more likely to cradle the sticks and treat them like infants. In human children, societal stereotypes may dictate what boys and girls play with, said Sonya Kahlenberg, a biologist at Bates College in Maine and one of the study’s authors. “The monkeys tell us there is something different there,” she said. The researchers studied juvenile behavior in a single chimpanzee colony over 14 years, and observed 15 females and 16 males.

Of the 15 females, 10 carried around sticks, while five of the males were seen with sticks. The young females were apparently mimicking their mothers, she said. “Females are the main caretakers,” Dr. Kahlenberg said. “Though it’s not that we didn’t see that in male chimps at all.” In one instance, an eight-year-old male with a stick stepped out of his mother’s nest, built a smaller nest and laid his stick in it. Although adult chimpanzees are also known to use sticks, they use them as foraging tools, not toys. Juveniles were defined as chimpanzees between the ages of five and 7.9. This is roughly equivalent to the human age range of six to nine, Dr. Kahlenberg said.

More here.

Selling a disability

If you're an American and you have a job, you're supposed to get an annual statement from the Social Security Administration explaining how much money you stand to receive at retirement. It also reports what your dependents will get from the SSA if you die, and what you'll get if you become disabled.

For me, the statement is a stark reminder of how much I rely on my wife's income to survive. As a writer, my income is sporadic, and if I couldn't work, I'd have a difficult time living on my Social Security benefits alone. Many people see the Social Security program as a sort of charity, but fundamentally it is not: The more you put in, the more you get back from it. If a person hasn't made much money, they won't be able to collect enough benefits from Social Security to live on. But even when people do pay in, the system has made it nearly impossible for some people to receive the benefits they deserve.

For physical laborers, the very work they do can end up causing disabilities that prevent them from working. My stepbrother Mark had always had a bad back, but he'd dealt with the problem by loading up on Advil and taking an occasional day off. He never visited a doctor about the problem because his jobs never provided health coverage. Often, before starting a job, his boss would pull him aside and remind him that he was not an employee; he was an “independent contractor,” which meant that the boss wasn't responsible for any injuries or other problems that occurred on the job site. There was no health coverage, no unemployment insurance, no safety net at all, physical or financial.

Once Mark was working on a makeshift bit of scaffolding in the cavernous great room of a partially-completed McMansion. He was 30 feet above the rough plywood floor, balancing on a narrow plank, attaching blocks to the rafters with a nail gun so heavy it was difficult for him to hold it over his head, weakened as he was by his deteriorating back. A nail got caught in the gun, causing it to backfire; the 15-pound piece of equipment glanced off the ceiling before crashing down on his face. The blow cracked a tooth and nearly knocked him unconscious. He's still not sure how he managed to stay on that plank. If he had fallen—supposing he managed to survive—he would have had no way to pay his medical bills.

About eight years ago, Mark realized that he wasn't going to be able to continue doing construction work and other low-paying manual labor. He enrolled in a vocational school to become a dental technician, but as I mentioned last month, even this quickly became too demanding for him. Hours of sitting in class only made his condition worse.

Read more »

Language, on and off Holiday


Which of the following would best be described as an hiatus?

Which of the following would most likely be considered a furlough?

For extra money, an hour or two most days, I pose variations of these questions over and over. Each of them, along with its set of possible answers, goes into the database of an on-line vocabulary-building tool. It's a pretty straightforward formula, the interrogative of multiple choice. However, what strikes me with each variation is the tense in which it must be formed: would be most likely to, would best be described as, would most likely. Always the conditional – to signal, I suppose, that these are all hypothetical instances – and thus the words here deployed are equivalent to blanks in a loaded gun: they make the same sound but do not pierce us in any way.

And so I compose these questions, one after another, ten to fifteen an hour, careful to insert the conditional, as if I were setting up a practice shooting range, a multiple choice of clay pigeons and cardboard targets. I do wonder, though, how effective this sort of vocabulary building will be for its subscribers. (I have no concrete research before me, but I suspect that learning vocabulary outside of its natural habitat is somewhat analogous to swallowing vitamin pills instead of eating actual vegetables: less is absorbed.) But to be perfectly candid, I'm not so much concerned about the improved verbal scores of these potential subscribers as I am just a bit saddened each time I confine another word to a purely hypothetical existence.

“Philosophical problems arise when language goes on holiday,” Wittgenstein so quaintly tells us in his Philosophical Investigations. He was getting at how difficult it is to actually learn much at all about words and their attendant conventions once you've removed them from the everyday speech and printed page that is their office – once you're fanning an isolated word with the palm frond of philosophical analysis. And certainly the practice shooting range of this vocabulary-building tool is just such a holiday setting. Being presented with a cursory definition of a word, its part of speech, and then asked to identify the most plausible instantiation of it in a lineup of four, is hardly akin to encountering it under workaday circumstances. But this, of course, is true of any number of tools and programs aimed at improving one's vocabulary. What really underscores the disparity between holiday and workaday, it seems, is the use here of the conditional tense – that single block of would – that confines each word to something like a cryogenic chamber of unreality.

Which of the following would most likely be considered a sabbatical?

Because, you see, what is never explicitly stated in these questions, but what's undeniably understood, is the condition for using the vocabulary word in question – the condition, of course, being actuality. If this weren't a practice shooting range; if you were ever to encounter these words in their natural habitat – each question implicitly (hauntingly) begins. The would that always follows, in the explicitly stated clause of the question, is that ghostly class of the conditional called the speculative, or counter-factual, conditional. The Bedford Handbook's description of it gives me chills: speculative conditional sentences express unlikely, contrary-to-fact, or impossible conditions in the present or future. I.e., it is unlikely, contrary-to-fact, or even impossible that you, the type who subscribes to this kind of vocabulary-building tutelage, will ever, ever encounter these words in their natural habitat.

Read more »

On being a Shia

by Feisal Hussain Naqvi

Being a Shia means different things to different people. In my case, being a Shia means that if I am as professionally successful as I hope to be, someone will want to kill me. Or, to be less melodramatic, it means I can’t play golf on Ashura.

The fact that being a Shia means such wildly divergent and generally irrelevant things to me also indicates that I am not much of a Shia, which I confess to be true. The question then is, why do I feel compelled to identify myself as a Shia?

Before I try to answer, some background is in order. The roots of Shi’ism go deep into Islamic history, more specifically, into the issue of who was to succeed the Prophet as the leader of the Muslim community. Ali, the Prophet’s son-in-law, was favoured by one group, while Abu Bakr, one of the Prophet’s closest companions, was favoured by another. Ultimately, the group supporting Abu Bakr prevailed, so much so that Ali did not become Caliph until two other members of the community (Omar and Usman) had preceded him. When Ali did become Caliph, he faced a challenge led by Aisha, one of the Prophet’s wives (and also the daughter of Abu Bakr).

The schism worsened after the assassination of Ali in 661 AD. Ali was succeeded as Caliph by Muawiya, the governor of Syria under Ali. When Muawiya’s son, Yazid, took over as Caliph in 680 AD, the stage was set for another clash.

Shortly after Yazid’s accession to the caliphate, Ali’s son, Hussain, was called to Kufa (a city in present-day Iraq) by leaders of the community there. Hussain set off from Medina with his family and a small group of followers, but he never made it to Kufa. Instead, at a place called Karbala, he and his family were surrounded by the armies of Yazid. For three days, Hussain and his family were not allowed access to water. On the 10th day of the Islamic month of Muharram (now called Ashura), Hussain, his family and his followers were slaughtered to a man. The only survivors of the battle were the women of Hussain’s family and his son Zain ul Abedin, who had been too ill to accompany his father into battle.

The Shias, then, are those people who mourn the tragedy of Karbala and who believe that Ali should have succeeded the Prophet. But that is only the simplest aspect of Shi’ism. What Shias also believe is that the Prophet, Ali and his descendants (the Imams) were intrinsically superior to other humans and that they were thus qualified to lead the Muslims in a way that no other Muslim could ever match.

There is, of course, much more to Shi’ism than what I have just outlined. The point I am trying to make, though, is that mourning the martyrdom of Hussain is essential to the concept of being a Shia. Each community of Shias differs in its mourning rituals, but every year, the first ten days of Muharram bring the lives of devout Shias to a halt.

Read more »

Standing Erect in the Face of Christmas

Every year, it seems, the weeks at the end of the calendar designated as the “Christmas Season” expand further and further in the direction of the blessed sunny months. Like a plodding, methodical, remorseless, invading imperial army, moving inward slowly and ineluctably towards the capital, grinding up territory with terrifying banality, the “Christmas Season's” expansion is relentless. I fear it shall not be content until it reaches the holy shrine of Memorial Day, which currently stands as an indefatigable bulwark, ushering in our unofficial beginning of summer, whether it be in long-simmering Georgia or resplendently spring-like Wisconsin.

But it was not always this way. In days of yore, not even the most precocious child dared to speak of it before December, lest they incur the wrath of a Santa Claus who was still more associated with donations to the Salvation Army than frenzied 40% markdowns on garish clothing made by exploited Indonesian children; a stern, Teutonic St. Nick who really did keep two lists, who never dreamed of offering punch-card guarantees on the latest electronic doodads, whose ire manifested itself in the form of coal lumps, who demanded to be placated not only with modesty and obedience but also with offerings of milk and cookies, and who seemed to more closely resemble a red-robed Karl Marx than some jolly, docile servant whose fetching and offering was at the beck and call of screaming, sugar-crazed children.

But that was then. Things were different. During the war there was rationing. Before that, during the Great Depression, expectations were understandably minimal. If Santa showed up at all, you cried tears of joy, stared to the heavens, and thanked him earnestly for that raw wooden block with crudely drawn wheels. My friend Tom, who’s proudly pushing 90, remembers that the most amazing gift he would get each Christmas was an orange. Not an orange iPod, but the actual fruit, which was rare, delicious, and expensive.

And before the Depression? Hell, today’s consumer economy was just a twinkle in the cold, glassy eyes of early ad men. The portable vacuum cleaner was a modern fucking day miracle in 1907 when it was invented by a janitor in Canton, Ohio. His brother-in-law was a saddle maker named William Hoover who, despite not being of Scottish descent, nonetheless figured out a way to make the new contraption look more like a bagpipe, and to make it sound just as sweet. A Wii and a Netflix subscription? You’ve gotta be kidding me, right?

But after the war ended, another war began: a war against crass commercialism, spiraling consumer debt, and the guilt and shame born of unreasonable expectations. After the war. THE WAR. That’s when it all started to change. First the barrier got pushed back to Thanksgiving weekend, and a late-November beachhead was established. Then the weekend itself eroded, and the Friday after Thanksgiving yielded, in a precursor to the grotesque spectacle that is Black Friday. The dominoes continued to fall, and soon the entire month of November was occupied, marking Halloween as a new Last Stand. And by the 21st century, perhaps even sooner in some quarters, Christmas displays had begun to precede cardboard turkey cut-outs and pre-fabricated children’s costumes in stores.

Something inside you dies the first time you see a box of candy canes sitting on a store aisle with quiet confidence and indiscretion in late October.

Read more »