Addition Is Useless, Multiplication Is King

Sanjoy Mahajan in Freakonomics:

TIME magazine has been running a series called “Brilliant: The science of smart” by Annie Murphy Paul. The latest column, “Why Guessing Is Undervalued,” quoted several results from research on learning estimation, a topic near to my heart. One result surprised me particularly:

…good estimators possess a clear mental number line — one in which numbers are evenly spaced, or linear, rather than a logarithmic one in which numbers crowd closer together as they get bigger. Most schoolchildren start out with the latter understanding, shedding it as they grow more experienced with numbers.

I do agree that children start out with a logarithmic understanding. I first learned this idea from a wonderful episode of WNYC’s Radio Lab on “Innate numbers” (Nov. 30, 2009). The producers had asked Stanislas Dehaene to discuss his research on innate number perception.

One of his studies involved an Indian tribe in the Amazon. This tribe does not have words for numbers beyond five, and does not have formal teaching of arithmetic. But the members have a sophisticated understanding of numbers. They were given the following problem: on a line with one object at one end, and nine objects at the other end, they were asked, “How many objects would you place directly in the middle of the line?”

What number would you choose?

Twenty years ago I would have chosen five, for five is the average of one and nine: It is larger than one by four, and smaller than nine also by four. Thus, it is halfway on a linear (or additive) number line. Almost all Western students also choose five. However, the members of the Indian tribe chose three. That choice also makes sense, but in a different way: Three is larger than one by a factor of 3, and smaller than nine also by a factor of 3. Thus, three is halfway between one and nine on a multiplicative or logarithmic number line.

Dehaene concludes, and I agree, that our innate perception of numbers is logarithmic (or multiplicative); and that we learn our linear (or additive) scale through our culture.
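
To make the two senses of “halfway” concrete: five is the arithmetic mean of one and nine, while three is their geometric mean, the midpoint once you work in logarithms. A minimal sketch in Python (an illustration of the arithmetic only, not something from Mahajan's post):

```python
import math

def linear_midpoint(a, b):
    # Halfway on an additive number line: the arithmetic mean.
    return (a + b) / 2

def logarithmic_midpoint(a, b):
    # Halfway on a multiplicative number line: average the logs,
    # then exponentiate back. This is the geometric mean sqrt(a*b).
    return math.exp((math.log(a) + math.log(b)) / 2)

print(linear_midpoint(1, 9))       # 5.0, the answer most Western students give
print(logarithmic_midpoint(1, 9))  # 3.0, the answer the tribe members gave
```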

More here. [Thanks to Annie Murphy Paul.]

The Worst Book Ever Written

Gabe Habash at the Publishers Weekly blog PWxyz:

Before you get all riled up about how we’ve previously called two other books (How to Avoid Huge Ships and Dildo Cay) the Worst Book Ever, you should know that sometimes PWxyz makes mistakes. Please forgive us our mis-pronouncement and come, walk with us down the hallowed halls of literary infamy, for we have a whopper of a book to show you.

In 1987, The Book Services Ltd published a slim, 144-page cookbook called Microwave for One. The book is by Sonia Allison, who has quite a few publications under her belt. But she’s best known for her masterpiece of tragedy, a book whose title and cover are so rife with sadness that one almost has the urge to brush the invisible tears from Ms. Allison’s face as she leans over her microwave and her food spread.

Very little is known about the contents of the book, except by the few who have been lucky enough to chance upon a copy. Let’s turn to these Amazon customer reviews for some insight.

“After the divorce” by Benjamin L. Hamilton

After the divorce my diet consisted primarily of uncooked ramen and whiskey. Occasionally I wondered aloud if I’d ever have another home cooked meal again.

Then I discovered “Microwave for One” and everything changed.

My favorite chapters were:

Chapter 1: Plugging in your Microwave and You

Chapter 4: How to Wait 3 Minutes

Chapter 11 [BONUS CHAPTER]: Eating with Cats

In closing, I give this book 2 thumbs up (and a paw!). Thanks Sonia Allison!

More here. [Thanks to Charles J. Shields.]

the iron lady


Not long after she resigned as prime minister, in 1990, Margaret Thatcher began to write her memoirs. I met her at a dinner party and asked her what she would call them. The famous blue eyes flashed at me: “Undefeated!” she declared. This expressed a sober arithmetical fact. Uniquely at that time in British politics, Margaret Thatcher had won three general elections in a row as party leader and had never lost any. Before she had the chance to contest her fourth, she was deposed by members of Parliament from her own party in a coup. Yet, even in that contest, the pure numbers were on her side. In 1990, when the Conservative Party staged a challenge to her leadership, she won more legislators’ votes than her main rival, but not enough to avoid a second ballot. Her Cabinet colleagues convinced her that she would be humiliated in the runoff, and she resigned.

more from Charles Moore at Vanity Fair here.

A Jew in the Northwest


I was standing, like a good Northwesterner, in the produce section of my locally owned organic-food supermarket—this was a couple of years ago, not long after I had moved to Portland from the New York City area—when I heard a voice in my ear. “Excuse me,” it said. “You’re a Jew, aren’t you?” My sphincter clenched. There were two ways this could go, and neither one was good. Either the guy I could now sense hovering at my elbow was a Lubavitcher, doing outreach among his fallen brethren (drawing them near, in the term of art), or he was a Jew for Jesus, hoping to tell me about the Lord. In the first case, I would sling the brushback pitch that I had learned to keep at hand for such occasions, amply familiar from life in New York. Ma ha’avodah hazose lachem? I would say: What is this worship to you?—the words of the Wicked Son in the Passover story. (“To you,” the Haggadah explains, “and not to him. By excluding himself from the community, he has negated the essential.”) In the second case, I would probably just start screaming and ripping up his pamphlets, as I did to a guy in the subway once. Christian missionaries tend to transform me into a kind of Semitic Incredible Hulk, a ball of ethno-historical rage. (A third possibility, that I’d been teleported back to Poland, circa 1941, and was about to be invited into a cattle car, I discounted as unlikely.)

more from William Deresiewicz at The American Scholar here.

fighting the agreeable ooze


If one were to point out that the wider authority of literary criticism is barely discernible today, one could hardly be accused of courting a controversy or kicking up a fuss. There certainly is a coterie of Americans for whom literature and its criticism is a matter of urgency or livelihood or both, but the notion of the literary critic as a cultural gatekeeper, whose judgments shape tastes and move units, sounds either fanciful or anachronistic, depending on whether you believe that such a creature ever really existed. Our culture is now so big and so varied, the population so diverse and so fragmented, that the very idea of anything or anyone having “wider authority” sounds silly, if not absurd. Masscult and Midcult: Essays Against the American Grain, a selection of Dwight Macdonald’s work from the 1950s and ’60s, includes the kinds of big, critical pronouncements that today would be met with eye rolls of annoyance or, more likely, blank stares of indifference. Macdonald started out as a journalist, and he wrote literary criticism that was as politically informed as it was aesthetically attuned. His voice was cantankerous and opinionated; he provided readers with the larger context as well as the close read. And he wrote much of this biting, caustic criticism for The New Yorker, where he was a staff writer for more than a decade.

more from Jennifer Szalai at The Nation here.

Henry Morton Stanley’s Unbreakable Will

From Smithsonian:

Is willpower a mood that comes and goes? A temperament you’re born with (or not)? A skill you learn? In Willpower: Rediscovering the Greatest Human Strength, Florida State University psychologist Roy F. Baumeister and New York Times journalist John Tierney say willpower is a resource that can be renewed or depleted, protected or wasted. This adaptation from their book views Henry Morton Stanley’s iron determination in the light of social science.

In 1887, Henry Morton Stanley went up the Congo River and inadvertently started a disastrous experiment. This was long after his first journey into Africa, as a journalist for an American newspaper in 1871, when he’d become famous by finding a Scottish missionary and reporting the first words of their encounter: “Dr. Livingstone, I presume?” Now, at age 46, Stanley was leading his third African expedition. As he headed into an uncharted expanse of rain forest, he left part of the expedition behind to await further supplies. The leaders of this Rear Column, who came from some of the most prominent families in Britain, proceeded to become an international disgrace. Those men allowed Africans under their command to perish needlessly from disease and poisonous food. They kidnapped and bought young African women. The British commander of the fort savagely beat and maimed Africans, sometimes ordering men to be shot or flogged almost to death for trivial offenses.

While the Rear Column was going berserk, Stanley and the forward portion of the expedition spent months struggling to find a way through the dense Ituri rain forest. They suffered through torrential rains. They were weakened by hunger, crippled by festering sores, incapacitated by malaria and dysentery. They were attacked by natives with poisoned arrows and spears. Of those who started with Stanley on this trek into “darkest Africa,” as he called that sunless expanse of jungle, fewer than one in three emerged with him. Yet Stanley persevered. His European companions marveled at his “strength of will.” Africans called him Bula Matari, Breaker of Rocks. “For myself,” he wrote in an 1890 letter to The Times, “I lay no claim to any exceptional fineness of nature; but I say, beginning life as a rough, ill-educated, impatient man, I have found my schooling in these very African experiences which are now said by some to be in themselves detrimental to European character.”

More here.

Bam! How comics teach science

From MSNBC:

Can you really learn relativity from a comic book? The Japanese have been using manga for decades to teach complex subjects, and now Americans are doing it too. No Starch Press, a San Francisco publishing house, puts out a whole line of manga-style books on math and science, picked up from the original Japanese and translated for the American market. Yes, there's a “Manga Guide to Relativity,” as well as calculus, linear algebra, biochemistry and other head-banging subjects. The plot lines may sound sappy to grown-ups. Usually they involve a cute schoolgirl or schoolboy who's challenged by an equally cute teacher to master a seemingly impenetrable subject. But Bill Pollock, the founder and president of No Starch Press, says the books get the job done, especially for students who are at a crucial age for math and science education. “We're not out to publish the best manga ever,” Pollock told me. “The manga is a vehicle.”

Educational comics are nothing new, of course: Classics Illustrated, for example, was delivering comic-book versions of English lit and science class back in the '50s. (I still get the heebie-jeebies when I recall the Classics Illustrated version of “Jane Eyre” that sat in the comic-book box at Grandma's house.) More recently, cartoonist Larry Gonick has been using the comic-book format to explain subjects ranging from chemistry to physics to sex. This year, one of the items on my holiday book list is “Feynman,” a graphic-novel biography of the bongo-playing physicist. But manga books come from a different cultural tradition — the same tradition that spawned Pokemon, Hello Kitty and other Japanese imports that American kids have grown up with. In Japan, there's a manga subgenre (“gakushu manga”) that is completely focused on education. These books, which run to around 200 pages, are the ones that have been adapted into English-language “manga guides.”

More here.

The Crises of Democratic Capitalism

Wolfgang Streeck in New Left Review:

The collapse of the American financial system that occurred in 2008 has since turned into an economic and political crisis of global dimensions. How should this world-shaking event be conceptualized? Mainstream economics has tended to conceive society as governed by a general tendency toward equilibrium, where crises and change are no more than temporary deviations from the steady state of a normally well-integrated system. A sociologist, however, is under no such compunction. Rather than construe our present affliction as a one-off disturbance to a fundamental condition of stability, I will consider the ‘Great Recession’ and the subsequent near-collapse of public finances as a manifestation of a basic underlying tension in the political-economic configuration of advanced-capitalist societies; a tension which makes disequilibrium and instability the rule rather than the exception, and which has found expression in a historical succession of disturbances within the socio-economic order. More specifically, I will argue that the present crisis can only be fully understood in terms of the ongoing, inherently conflictual transformation of the social formation we call ‘democratic capitalism’.

Democratic capitalism was fully established only after the Second World War and then only in the ‘Western’ parts of the world, North America and Western Europe. There it functioned extraordinarily well for the next two decades—so well, in fact, that this period of uninterrupted economic growth still dominates our ideas and expectations of what modern capitalism is, or could and should be. This is in spite of the fact that, in the light of the turbulence that followed, the quarter century immediately after the war should be recognizable as truly exceptional. Indeed I suggest that it is not the trente glorieuses but the series of crises which followed that represents the normal condition of democratic capitalism—a condition ruled by an endemic conflict between capitalist markets and democratic politics, which forcefully reasserted itself when high economic growth came to an end in the 1970s. In what follows I will first discuss the nature of that conflict and then turn to the sequence of political-economic disturbances that it produced, which both preceded and shaped the present global crisis.

Walk-Through-Wall Effect Might Be Possible With Humanmade Object

Nathan Collins in Science Now:

If you've ever tried the experiment, you know you can't walk through a wall. But subatomic particles can pull off similar feats through a weird process called quantum tunneling. Now, a team of physicists says that it might just be possible to observe such tunneling with a larger, humanmade object, though others say the proposal faces major challenges.

If successful, the experiment would be a striking advance toward fashioning mechanical systems that behave quantum mechanically. In 2010, physicists took a key first step in that direction by coaxing a tiny object into states of motion that can be described only by quantum mechanics. Tunneling would be an even bigger achievement.

So how does quantum tunneling work? Imagine that an electron, for example, is a marble sitting in one of two depressions separated by a small hill, which represent the effects of a sculpted electric field. To cross the hill from one depression to the other, the marble needs to roll with enough energy. If it has too little energy, then classical physics predicts it can never reach the top of the hill and cross over it.

Tiny particles such as electrons, however, can still make it across even if they don't have enough energy to climb the hill. Quantum physics describes such particles as extended waves of probability—and it turns out that there is a probability that one of them will “tunnel” through the hill and suddenly materialize in the other depression, even though the electron can't occupy the high ground between the two low spots.

It sounds unlikely, but scientists and engineers have amply demonstrated quantum tunneling in semiconductors in which electrons tunnel through nonconducting layers of material.
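
To get a feel for the numbers, here is a minimal sketch of the textbook transmission probability for a particle of energy E striking a rectangular barrier of height V0 and width L. The formula is standard introductory quantum mechanics rather than anything from the article, and the 1 eV electron meeting a 2 eV, one-nanometre barrier is an invented example:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # one electronvolt in joules

def transmission(E, V0, L, m=M_E):
    # Exact transmission probability through a rectangular barrier
    # for the classically forbidden case E < V0 (all SI units).
    kappa = math.sqrt(2 * m * (V0 - E)) / HBAR   # decay rate inside the barrier
    s = math.sinh(kappa * L)
    return 1.0 / (1.0 + (V0 ** 2 * s ** 2) / (4 * E * (V0 - E)))

# A 1 eV electron hitting a 2 eV barrier one nanometre thick:
print(transmission(1 * EV, 2 * EV, 1e-9))   # roughly 1e-4: rare, but it happens
```

Swap in the mass of a marble and kappa times L becomes so astronomically large that the probability is zero for every practical purpose, which is why demonstrating tunneling with even a tiny humanmade object would be such a feat.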

Slavoj Zizek and Harum Scarum

Hamid Dabashi in Al Jazeera:

In Gene Nelson's “Harum Scarum” (1965), featuring Elvis Presley as the Hollywood heartthrob Johnny Tyronne, we meet the action movie star travelling through the Orient while promoting his new film, “Sands of the Desert”. Upon arrival, however, Elvis Presley/Johnny Tyronne is kidnapped by a gang of assassins led by a temptress “Oriental” named Aishah, who wish to hire him to carry out an assassination. Emboldened by proper “Western virtues”, Elvis will do no such thing and manages to sing and dance his way out of the way of the conniving “Orientals”.

In an interview with Al Jazeera, Slavoj Zizek, the Slovenian philosopher, made a rather abrupt staccato observation – a hit-and-run strike worthy of an action hero – very much reminiscent of the fate of Elvis Presley and his Oriental sojourn:

“I think today the world is asking for a real alternative. Would you like to live in a world where the only alternative is either anglo-saxon neoliberalism or Chinese-Singaporean capitalism with Asian values? I claim if we do nothing we will gradually approach a kind of a new type of authoritarian society. Here I see the world historical importance of what is happening today in China. Until now there was one good argument for capitalism: sooner or later it brought a demand for democracy … What I'm afraid of is, with this capitalism with Asian values, we get a capitalism much more efficient and dynamic than our western capitalism. But I don't share the hope of my liberal friends – give them ten years [and there will be] another Tiananmen Square demonstration – no, the marriage between capitalism and democracy is over.”

What precisely are these “Asian values,” when uttered by an Eastern European, we Asians of one sort or another may wonder? Did capitalism really have to travel all the way to China and Singapore (as Elvis did to the Orient) to lose all its proper Western virtues (and what exactly might they be) and become corrupted (or indeed carry its destructive forces to its logical conclusions)? So, are we to believe, when it flourishes in “the West”, capitalism flowers in democracy and when it assumes “Asian values” it divorces that virtue and becomes a promiscuous monster?

Elvis Presley indeed. Let us rescue capitalism from that treacherous Aishah and her Asian values and have it go back to its Western virtues.

What Zizek is warning the world against, he insists, is capitalism with its newly acquired “Asian values”, as distinct from what he calls “our [his] Western capitalism”, obviously adorned with “Western virtues” – a promiscuity that has already decoupled the happily-ever-after marriage of capitalism and democracy.

Richard Rhodes explores Hedy Lamarr’s other career

Sam Kean in Slate:

Imagine that, on Sept. 12, 2001, an outraged Angelina Jolie had pulled out a pad of paper and some drafting tools and, all on her own, designed a sophisticated new missile system to attack al-Qaida. Now imagine that the design proved so innovative that it transcended weapons technology, and sparked a revolution in communications technology over the next half-century.

Believe it or not, this essentially happened to Hedy Lamarr. Often proclaimed “the most beautiful woman in the world,” the 26-year-old Lamarr was thriving in Hollywood when, in mid-September 1940, Nazi U-boats hunted down and sank a cruise ship trying to evacuate 90 British schoolchildren to Canada. Seventy-seven drowned in the bleak north Atlantic. Lamarr, a Jewish immigrant from Nazi-occupied Austria, was horrified. She decided to fight back, but instead of the usual celebrity posturing, she sat down at a drafting table at home and sketched out a revolutionary radio guidance system for anti-submarine torpedoes.

This unlikely tale is the subject of Richard Rhodes’ new book, Hedy’s Folly. Compared to his other works, like the magisterial (and quite hefty) The Making of the Atomic Bomb, this book breezes by in 272 chatty pages. Rhodes succeeds in the most vital thing—capturing the spirit of a willful woman who wanted recognition for more than her pretty face—but he skims over the deeper questions that Lamarr’s life story raises about the nature of creative genius.
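
The excerpt leaves the invention itself offstage: what Lamarr sketched, and patented in 1942 with the composer George Antheil, was frequency hopping, in which transmitter and receiver jump between radio channels in a synchronized, hard-to-predict sequence, so a jammer never knows where the signal will be next. A minimal sketch of the idea in Python (the seeded random generator is only a stand-in for the patent's synchronized piano-roll mechanism):

```python
import random

CHANNELS = 88  # the number of frequencies famously used in the Lamarr-Antheil patent

def hop_sequence(shared_seed, n_hops):
    # Both ends derive the same pseudo-random channel list from a shared
    # secret, so they stay in lockstep while an eavesdropper without the
    # seed sees only brief, scattered bursts across the band.
    rng = random.Random(shared_seed)
    return [rng.randrange(CHANNELS) for _ in range(n_hops)]

torpedo = hop_sequence(shared_seed=1942, n_hops=8)
ship = hop_sequence(shared_seed=1942, n_hops=8)
assert torpedo == ship  # transmitter and receiver agree on every hop
print(torpedo)
```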

More here.

Occupation as Fairness

Joshua Cohen and Seth Resler in The Boston Review:

Seth Resler: John Rawls’s magnum opus is A Theory of Justice, published in 1971. Let’s talk about what the theory actually is. It has its own name, which is “justice as fairness,” and there are two principles involved. Tell me about them.

Joshua Cohen: A Theory of Justice defends two principles of justice. The first principle is an expression of what we conventionally refer to in the United States as liberal ideas about liberty. The idea is that everyone is entitled to equal fundamental liberties including political liberty, freedom to participate in the political process, religious liberty, freedom of speech and association, freedoms associated with the rule of law—including protection of bodily integrity. Rawls says that principle has priority. That’s the first principle of the theory. We’ll call it the Liberty Principle.

The second principle has two parts, and because it has two parts it is a little more complicated than the first one. The first part of the second principle provides a way to think about equality of opportunity. The idea is that where you end up shouldn’t depend on where you start out, that your birth should not fix your fate. A little more precisely, it says that if you take two people who are equally motivated and equally able, their chances in life should not depend on differences in their social backgrounds. Your chances in life shouldn’t depend on your class background, your family background, the neighborhood you grow up in; they should depend on what you’re able to do and what you’re motivated to do. So, equally able and equally motivated, you have equal chances. That is Equality of Fair Opportunity.

The Debated Truth About the Crackdown on Occupy

Naomi Wolf in The Guardian:

US citizens of all political persuasions are still reeling from images of unparalleled police brutality in a coordinated crackdown against peaceful OWS protesters in cities across the nation this past week. An elderly woman was pepper-sprayed in the face; the scene of unresisting, supine students at UC Davis being pepper-sprayed by phalanxes of riot police went viral online; images proliferated of young women – targeted seemingly for their gender – screaming, dragged by the hair by police in riot gear; and the pictures of a young man, stunned and bleeding profusely from the head, emerged in the record of the middle-of-the-night clearing of Zuccotti Park.

But just when Americans thought we had the picture – was this crazy police and mayoral overkill, on a municipal level, in many different cities? – the picture darkened. The National Union of Journalists and the Committee to Protect Journalists issued a Freedom of Information Act request to investigate possible federal involvement with law enforcement practices that appeared to target journalists. The New York Times reported that “New York cops have arrested, punched, whacked, shoved to the ground and tossed a barrier at reporters and photographers” covering protests. Reporters were asked by NYPD to raise their hands to prove they had credentials: when many dutifully did so, they were taken, upon threat of arrest, away from the story they were covering, and penned far from the site in which the news was unfolding. Other reporters wearing press passes were arrested and roughed up by cops, after being – falsely – informed by police that “It is illegal to take pictures on the sidewalk.”

Corey Robin responds:

On Friday, Naomi Wolf made the attention-grabbing accusation in the Guardian that federal officials were involved in, indeed ordered, the violent crackdowns against Occupy Wall Street protesters that we’ve been seeing across the country these past few weeks.

Congressional overseers, with the blessing of the White House, told the DHS [Department of Homeland Security] to authorise mayors to order their police forces – pumped up with millions of dollars of hardware and training from the DHS – to make war on peaceful citizens.

The next day, Joshua Holland debunked Wolf’s claims on Alternet.

I don’t have anything to add to Holland’s excellent critique. Wolf gets her facts wrong, and he shows it.

To my mind, though, the problem is bigger than that: The reason Wolf gets her facts wrong is that she’s got her theory wrong. And though many were quick to jump off her conspiracy bandwagon once Holland pointed out its flaws, I suspect that one of the reasons they were so quick to jump on it in the first place is that they subscribe to her theory.

Why Marriage is a Declining Option for Modern Women

Kate Bolick in The Observer:

For thousands of years, marriage had been a primarily economic and political contract between two people, negotiated and policed by their families, church and community. It took more than one person to make a farm or business thrive, and so a potential mate's skills, resources, thrift and industriousness were valued as highly as personality and attractiveness. This held true for all classes. In the American colonies, wealthy merchants entrusted business matters to their landlocked wives while off at sea, just as sailors, vulnerable to the unpredictability of seasonal employment, relied on their wives' steady income as domestics in elite households. Two-income families were the norm.

Not until the 18th century did labour begin to be divided along a sharp line: wage-earning for the men and unpaid maintenance of household and children for the women. Coontz notes that as recently as the late 17th century, women's contributions to the family economy were openly recognised, and advice books urged husbands and wives to share domestic tasks. But as labour became separated, so did our spheres of experience – the marketplace versus the home – one founded on reason and action, the other on compassion and comfort. Not until the postwar gains of the 1950s, however, were a majority of American families able to actually afford living off a single breadwinner.

All of this was intriguing, for sure – but even more surprising to Coontz was the realisation that those alarmed reporters and audiences might be on to something. Coontz still didn't think that marriage was falling apart, but she came to see that it was undergoing a transformation far more radical than anyone could have predicted, and that our current attitudes and arrangements are without precedent. “Today we are experiencing a historical revolution every bit as wrenching, far-reaching, and irreversible as the Industrial Revolution,” she wrote.

Last summer I called Coontz to talk to her about this revolution. “We are without a doubt in the midst of an extraordinary sea change,” she told me. “The transformation is momentous – immensely liberating and immensely scary. When it comes to what people actually want and expect from marriage and relationships, and how they organise their sexual and romantic lives, all the old ways have broken down.”

Giving thanks for error bars

Sean Carroll in Cosmic Variance:

Error bars are a simple and convenient way to characterize the expected uncertainty in a measurement, or for that matter the expected accuracy of a prediction. In a wide variety of circumstances (though certainly not always), we can characterize uncertainties by a normal distribution — the bell curve made famous by Gauss. Sometimes the measurements are a little bigger than the true value, sometimes they’re a little smaller. The nice thing about a normal distribution is that it is fully specified by just two numbers — the central value, which tells you where it peaks, and the standard deviation, which tells you how wide it is. The simplest way of thinking about an error bar is as our best guess at the standard deviation of what the underlying distribution of our measurement would be if everything were going right. Things might go wrong, of course, and your neutrinos might arrive early; but that’s not the error bar’s fault.

Now, there’s much more going on beneath the hood, as any scientist (or statistician!) worth their salt would be happy to explain. Sometimes the underlying distribution is not expected to be normal. Sometimes there are systematic errors. Are you sure you want the standard deviation, or perhaps the standard error? What are the error bars on your error bars?
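
Those last two questions are worth making concrete. The standard deviation describes the spread of the individual measurements; the standard error describes the uncertainty of their mean, and it shrinks as the square root of the number of measurements. A minimal sketch in Python, with invented readings:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def sample_std(xs):
    # Spread of the individual measurements (n - 1 in the denominator
    # gives the usual unbiased sample estimate).
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def standard_error(xs):
    # Uncertainty of the mean itself: shrinks like 1/sqrt(n).
    return sample_std(xs) / math.sqrt(len(xs))

readings = [9.8, 10.1, 10.0, 9.9, 10.3]  # invented measurements
print(f"mean = {mean(readings):.2f} +/- {standard_error(readings):.2f} (standard error)")
print(f"spread of single readings = {sample_std(readings):.2f} (standard deviation)")
```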

More here.

the suffering of the vegetables


Sir Jagadish Chandra Bose, the aforementioned carrot vivisector, was a serious man of science. Born in what is today Bangladesh in 1858, Bose was a quintessential polymath: physicist, biologist, botanist, archaeologist. He was the first person from the Indian subcontinent to receive a U.S. patent, and is considered one of the fathers of radio science, alongside such notables as Tesla, Marconi, and Popov. He was elected Fellow of the Royal Society in 1920, becoming the first Indian to be honored by the Royal Society in the field of science. It’s clear that Sir Jagadish Chandra Bose was a scientist of some weight. And, like many scientists of weight, he has become popularly known for his more controversial pursuits — in Bose’s case, his experiments in plant physiology. Perhaps it was his work in radio waves and electricity that inspired Bose’s investigations into what we might call the invisible world. Bose strongly felt that physics could go far beyond what was apparent to the naked eye. Around 1900, Bose began his investigations into the secret world of plants. He found that all plants, and all parts of plants, have a sensitive nervous system not unlike that of animals, and that their responses to external stimuli could be measured and recorded. Some plant reactions can be seen easily in sensitive plants like the Mimosa, which, when irritated, will react with the sudden shedding or shrinking of its leaves. But when Bose attached his magnifying device to plants from which it was more difficult to witness a response, such as vegetables, he was astounded to discover that they, too, became excited when vexed. All around us, Bose realized, the plants are communicating. We just don’t notice it.

more from Stefany Anne Golberg at The Smart Set here.

on the American way of death


There are, in other words, two aspects to the phenomenon of death. On the one hand, there is death itself — immutable, the single certainty all of us face, unchanging as it has always been. On the other hand, though, is how we living face the death of others, which is constantly changing, composed of ritual, emotion, and something that each culture and each generation must define — and redefine — for itself. Our current culture seems generally uncomfortable with facing the prospect of mourning, and even more uncomfortable with the dead body itself. Only nine days after the attacks of September 11, 2001, George W. Bush forcefully declared that it was time to turn grief into action, attempting to foreclose any extended period of public mourning. And personal losses aren’t much different; half a century ago, Jessica Mitford’s The American Way of Death laid bare the amount of chemicals, makeup, and money we waste in order to give death a pleasant, less death-like appearance. Death is a thing to be acknowledged but not dwelled on, not faced head-on.

more from Colin Dickey at the LA Review of Books here.

ken russell (1927-2011)


It is the commonplace fate of British cinema’s more visionary talents to end their careers marginalised and even mocked. This was certainly what happened to Ken Russell, who has died aged 84. In his latter years, with his shock of white hair and his red face, the director cut a cantankerous and slightly buffoonish figure. He asked for money for interviews. His greatest work wasn’t much in circulation. Those who knew him from such lesser efforts as The Fall Of The Louse Of Usher (2002), his eccentric and low-budget Edgar Allan Poe adaptation, or for his Cliff Richard and Sarah Brightman videos, were probably baffled that he had such a glowing reputation. The director’s son Alex Verney-Elliott said his father had died in hospital after a series of strokes. Russell’s widow, Elize, said she was “devastated” by her husband’s death, which had been “completely unexpected”. Even in his pomp, he had always been a figure of considerable controversy. He was so often called the “enfant terrible” of British film that no one paid as much attention to his craftsmanship as they should have done.

more from Geoffrey Macnab at The Independent here.

Tuesday Poem

Vocation

Each cold October morning he went out
into the Gate Field and walked up and down,
like the horse-drawn seed-drill quartering every inch
to make sure the harvest was kept constant,
reading his Office, every sentence
of the forty pages for the day. In the evening,
as the colder darkness fell with the crows’
harsh calling, he sat alone in the back
benches of the unheated chapel, hour
after hour, staring for inspiration
at the golden, unresponsive tabernacle.

by Bernard O'Donoghue
Publisher: PIW, © 2011