How should I live my life?

So what does it mean for the country that our cultural common denominator is shrinking? That increasingly Americans have very few experiences through which to understand the lives of our fellow citizens? And why, in the midst of these trends, is there general agreement on an issue as potentially flammable as contraception? Recently I found good answers to these questions in an unexpected place — in an essay on literature and ethics that provides a convincing account of the rock-bottom consequences of a fractured population. The essay is called “Perceptive Equilibrium: Literary Theory and Ethical Theory,” and it was first given as a talk by the philosopher Martha Nussbaum 25 years ago. The purpose of the paper was to merge literary theory with ethical theory — to show how forms of art like the novel can help us answer arguably the two most fundamental philosophical questions: How should I live my life? How should we live together? Here is Nussbaum describing the centrality of literature to ethics: “One of the things that makes literature something deeper and more central for us than a complex game, deeper even than those games, for example chess and tennis, that move us to wonder by their complex beauty, is that it speaks like Strether. It speaks about us, about our lives and choices and emotions, about our social existence and the totality of our connections.”

more from Kevin Hartnett at The Millions here.

proust’s mom

There are texts that seem to require a certain craziness of us, a mismeasure of response to match the extravagance of their expression. But can a mismeasure be a match? All we know is that we don’t want to lose or reduce the extravagance but can’t quite fall for it either. An example would be Walter Benjamin’s wonderful remark about missed experiences in Proust: ‘None of us has time to live the true dramas of the life that we are destined for. This is what ages us – this and nothing else. The wrinkles and creases on our faces are the registration of the great passions, vices, insights that called on us; but we, the masters, were not at home.’ Even without the ‘nothing else’ this is a pretty hyperbolic proposition. With the ‘nothing else’ it turns into a form of madness, a suggestion that we shall not grow old at all unless we keep failing to receive the passions, vices and insights that come to see us. This would be a life governed by new necessities, entirely free from the old ones, exempt from time and biology. The sentences are clear enough but don’t read easily as fantasy or figure of speech. Benjamin is asking us to entertain this magical thought for as long as we can, and not to replace it too swiftly by something more sensible.

more from Michael Wood at the LRB here.

Tuesday, March 27, 2012

In the Land of Blood and Honey

Srecko Horvat on Angelina Jolie's new film In the Land of Blood and Honey, about an affair between a Serb and a Muslim during the Balkan war, in Eurozine (Warning: the article contains spoilers):

The movie tells the story of Danijel, a soldier fighting for the Bosnian Serbs, and Ajla, a Bosnian Muslim who was involved with him before the war and is now a captive in the concentration camp he oversees. It's a bad repetition of the same good old story depicted most recently in The Reader (Stephen Daldry, 2008), and unforgettably in The Night Porter (Liliana Cavani, 1974). In short, it's a story about the perpetrator and the victim and a reversal of these perspectives as the story goes on. On the one hand you have a war criminal (a concentration camp guard in The Reader, the former SS officer in The Night Porter, the Serbian officer in Jolie's movie), on the other hand you have the victim (the boy who read to the concentration camp guard, the concentration camp survivor, the innocent Muslim woman in the Bosnian war). What all three films have in common is a fatal love affair between a criminal and an innocent victim, the only difference being that, in The Reader, the boy finds out eight years later, when as a law student, he observes a trial of several women (including his former lover) accused of letting 300 Jewish women die in a burning church.

Common to all these films is also that the roles become less and less clear as the story develops. The best example is The Night Porter, where thirteen years after the concentration camp, Lucia meets Maximilian again, who is now working at a Vienna hotel; instead of exposing him, she falls back into their sadomasochistic relationship. The relationship is what Primo Levi – remembering the case of the Sonderkommando, the “special units” of camp inmates in charge of bringing their neighbours to the gas chambers – calls the “gray zone”, the zone in which the “long chain of conjunction between victim and executioner” comes loose. Or, as Giorgio Agamben puts it in his Remnants of Auschwitz, “where the oppressed becomes oppressor and the executioner in turn appears as victim. A gray, incessant alchemy in which good and evil and, along with them, all the metals of traditional ethics reach their point of fusion”.[2]

The best expression of this new terra ethica was articulated by Michael in Bernhard Schlink's novel The Reader, on which the film was based: “I wanted simultaneously to understand Hanna's crime and to condemn it. But it was too terrible for that. When I tried to understand it, I had the feeling I was failing to condemn it as it must be condemned. When I condemned it as it must be condemned, there was no room for understanding. But even as I wanted to understand Hanna, failing to understand her meant betraying her all over again. I could not resolve this. I wanted to pose myself both tasks – understanding and condemnation. But it was impossible to do both.”[3] In other words, when we try to understand the crime, then we stop condemning it; and when we condemn, then we stop understanding it.

So, what is missing in Jolie's movie?

A Short Course in Thinking About Thinking

Daniel Kahneman in Edge:

SESSION ONE

I'll start with a topic that is called an inside-outside view of the planning fallacy. And it starts with a personal story, which is a true story.

Well over 30 years ago I was in Israel, already working on judgment and decision making, and the idea came up to write a curriculum to teach judgment and decision making in high schools without mathematics. I put together a group of people that included some experienced teachers and some assistants, as well as the Dean of the School of Education at the time, who was a curriculum expert. We worked on writing the textbook as a group for about a year, and it was going pretty well—we had written a couple of chapters, we had given a couple of sample lessons. There was a great sense that we were making progress. We used to meet every Friday afternoon, and one day we had been talking about how to elicit information from groups and how to think about the future, and so I said, Let's see how we think about the future.

I asked everybody to write down on a slip of paper his or her estimate of the date on which we would hand the draft of the book over to the Ministry of Education. That by itself, by the way, was something that we had learned: you don't want to start by discussing something, you want to start by eliciting as many different opinions as possible, which you then pool. So everybody did that, and we were really quite narrowly centered around two years; the range of estimates that people had—including myself and the Dean of the School of Education—was between 18 months and two and a half years.

But then something else occurred to me, and I asked the Dean of the School of Education whether he could think of other groups similar to ours that had been involved in developing a curriculum where no curriculum had existed before. At that period—I think it was the early 70s—there was a lot of activity in the biology curriculum, and in mathematics, and so he said, yes, he could think of quite a few. I asked him whether he knew specifically about these groups and he said there were quite a few of them about which he knew a lot. So I asked him to imagine them, thinking back to when they were at about the same state of progress we had reached, after which I asked the obvious question—how long did it take them to finish?

An Interview with Margarethe von Trotta on Her Upcoming Film About Hannah Arendt

Over at the Goethe Institute:

Thinking and writing, those are the things that really defined the great philosopher Hannah Arendt. The objective of the film was to transform this thought into a film, to make it a visual embodiment of a real person.

How does one use film to describe a woman who thinks? How can we watch her while she thinks? That is of course the big challenge when making a film about intellectual personalities. I insisted that Barbara Sukowa play Hannah because she is the only actress I know who I could imagine showing me how someone thinks, or that someone is thinking. And she managed to do it. For me, it was clear from the beginning that she was the one, and I had to push for her to get the role because some of the investors couldn’t visualize it. I said to them, “I am not doing this film without her.” I had the same situation with Rosa Luxemburg and again with Hildegard von Bingen – she really experienced the intellectual nature of Rosa’s political speeches, for example. That is how it is with Hannah Arendt. The viewer has to see that she is really thinking. She does two speeches in this film as well. Arendt was a professor at various universities in the United States and she did seminars and speeches on philosophical and political subject matter. In situations like that, it’s not about just reading your lines. You have to be able to improvise and develop the speech as you go. In the film there is a six-minute speech in English, with the strong German accent that Arendt had, and Sukowa is able to get viewers to experience, think and follow her analyses.

What were the preparations for the film like? And what about your contact with Arendt’s world?

Before we started writing the screenplay we met with a lot of people in New York who had known Arendt well on a personal level. People like Lotte Köhler, her longtime colleague and friend who died in 2011 at the age of 92, or Elisabeth Young-Bruehl, who also died in 2011, as well as others like Lore Jonas, widow of Hans Jonas, and Jerome Kohn, her last assistant and publisher of her posthumous writings. Those were amazing encounters, the stuff you need when you are writing a script about this type of real person who you’ve never met yourself.

Hilton Kramer, 1928-2012

William Grimes in the NYT:

Admired for his intellectual range and feared for his imperious judgments, Mr. Kramer emerged as a critic in the early 1950s and joined The Times in 1965, a time when the tenets of high modernism were being questioned and increasingly attacked. He was a passionate defender of high art against the claims of popular culture and saw himself not simply as a critic offering informed opinion on this or that artist, but also as a warrior upholding the values that made civilized life worthwhile.

This stance became more marked as political art and its advocates came to the fore, igniting the culture wars of the early 1980s, a struggle in which Mr. Kramer took a leading role as the editor of The New Criterion, where he was also a frequent contributor.

In its pages, Mr. Kramer took dead aim at a long list of targets: creeping populism at leading art museums; the incursion of politics into artistic production and curatorial decision making; the fecklessness, as he saw it, of the National Endowment for the Arts; and the decline of intellectual standards in the culture at large.

A resolute high modernist, he was out of sympathy with many of the aesthetic waves that came after the great achievements of the New York School, notably Pop (“a very great disaster”), conceptual art (“scrapbook art”) and postmodernism (“modernism with a sneer, a giggle, modernism without any animating faith in the nobility and pertinence of its cultural mandate”).

At the same time, he made it his mission to bring underappreciated artists to public attention and open up the history of 20th-century American art to include figures like Milton Avery and Arthur Dove, about whom he wrote with insight and affection.

the fate of the western

However much certain optimists may talk about the survival or possible resurrection of the Western, I fear—much to my regret—that, as a genre, it is pretty well dead and buried, a relic of a more credulous, more innocent, more emotional age, an age less crushed or suffocated by the ghastly plague of political correctness. Nonetheless, whenever a new Western comes out, I dutifully go and see it, albeit with little expectation that it will be any good. In the last decade, I can recall three pointless remakes, vastly inferior to the movies on which they were modelled and which weren’t exactly masterpieces themselves: 3:10 to Yuma by James Mangold, The Alamo by John Lee Hancock, and True Grit by the Coen brothers, all of them uninspired and unconvincing, and far less inspired than the distinctly uneven originals made, respectively, by Delmer Daves, John Wayne, and Henry Hathaway. I recall, too, Andrew Dominik’s interesting but dull The Assassination of Jesse James by the Coward Robert Ford, Ed Harris’s bland, soulless Appaloosa, David von Ancken’s unbearable Seraphim Falls, and the Australian John Hillcoat’s The Proposition, of which my memory has retained not a single image. The only recent Westerns that have managed to arouse my enthusiasm have been those made for TV: Walter Hill’s Broken Trail, and Deadwood, whose third and final season no one has even bothered to bring out on DVD in Spain, which gives you some idea of how unsuccessful the magnificent first two series must have been. In my view, Kevin Costner’s Open Range, which came out slightly earlier, was the last decent Western to be made for the big screen, even though it has long been fashionable to denigrate anything this admirable actor and director does.

more from Javier Marías at Threepenny Review here.

whoever we may be, we are aliens too

Vincent Gallo is one of the most disliked of current film actors, while George Clooney is one of the most admired, but most viewers of Essential Killing—American, Belgian, Sri Lankan, or Japanese—probably have more in common with Gallo’s “Mohammed” than they have with Clooney. Anyone can be targeted, victimized, have their eardrums blasted out, be forced to hide and kill in order to survive. All these are possibilities of human existence that, at the advanced stage of civilization we enjoy, are available to everyone. But to be George Clooney? He may make it look easy. It’s in the voice, however, that the deceptive quality of the Clooney figure can best be detected. Clooney, who is from Lexington, Kentucky, speaks with an unmarked accent, an accent of zero. His vocal deadpan (so soothing in Wes Anderson’s Fantastic Mr. Fox [2009]) projects a reasonableness and an authority that do not impose themselves through any apparent violence. When he talks, it’s as if he were saying nothing. Such a talent makes him indeed The American.

more from Chris Fujiwara at n+1 here.

first act is final curtain

It’s impossible to know how Francesca Woodman’s photographs would strike us if she hadn’t thrown herself out of a window at 22. Her suicide makes every image feel portentous. Each is a memento mori, a harbinger of imminent death. She specialised in self-portraits and the suite of choreographed scenes she shot with a timer or a remote trigger seems in retrospect a record of her unravelling. We rarely see her face. She bleeds into the background in very long exposures and disappears into crumbling walls. Her limbs vanish behind wallpaper and blur into architecture. Her flesh is barely solid, melting into mist and yielding to the rigid surface of a windowpane. The new exhibition of her work at New York’s Guggenheim Museum prompts a series of unanswerable questions. Would Woodman’s fierce self-scrutiny have ebbed with maturity or would it have inflected her entire career? Did the monomaniacal intensity of her work propel her towards death?

more from Ariella Budick at the FT here.

More than Health Insurance

From The New Yorker:

On Monday, the case of the century got even bigger. In challenges to the Affordable Care Act in lower courts, several judges gave the Supreme Court an escape hatch. These judges, including Brett Kavanaugh, a young judge sure to make Republican short lists for the Supreme Court, said that the Justices should kick the can down the road and put off a decision for a year or two. Specifically, Kavanaugh said that the Tax Anti-Injunction Act (a deeply obscure law) compelled the Justices to put off a decision on the law until it takes full effect, in 2014.

Across the ideological spectrum, the Justices, through their questions to the lawyers arguing for and against upholding the A.C.A., declined the invitation for delay. They all (that is, the eight who asked questions; Clarence Thomas did not) seemed to recognize that there were legal and prudential reasons to resolve this issue now. As Justice Ruth Bader Ginsburg said, the act “does not apply to penalties that are designed to induce compliance with the law, rather than to raise revenue. And this is not a revenue-raising measure because, if it’s successful, they—nobody will pay the penalty, and there will be no revenue to raise.” The Court, it now seems clear, will decide this case on the merits.

More here.

At Bottom of Pacific, Director Sees Dark Frontier

From The New York Times:

No sea monsters. No strange life. No fish. Just amphipods — tiny shrimplike creatures swimming across a featureless plain of ooze that stretched off into the primal darkness. “It was very lunar, a very desolate place,” James Cameron, the movie director, said in a news conference on Monday after completing the first human dive in 52 years to the ocean’s deepest spot, nearly seven miles down in the western Pacific. “We’d all like to think there are giant squid and sea monsters down there,” he said, adding that such creatures still might be found. But on this dive he saw “nothing larger than about an inch across” — just the shrimplike creatures, which are ubiquitous scavengers of the deep.

His dive, which had been delayed by rough seas for about two weeks, did not go entirely as planned: his submersible’s robot arm failed to operate properly, and his time at the bottom was curtailed from a planned six hours to about three. It was not entirely clear why. But he did emerge safely from the perilous trip, vowing to press on. The area he wants to explore, he said, was 50 times larger than the Grand Canyon. “I see this as the beginning,” Mr. Cameron said. “It’s not a one-time deal and then you move on. It’s the beginning of opening up this frontier.” National Geographic, which helped sponsor the expedition to the area known as the Challenger Deep, said that Mr. Cameron, the maker of the movies “Avatar” and “Titanic,” began his dive on Sunday at 3:15 p.m. Eastern Daylight Time, landed on the bottom at 5:52 p.m. and surfaced at 10 p.m. He conducted the news conference via satellite as he was being rushed to Guam in the hope of reaching London for the debut on Tuesday of “Titanic 3-D.”

More here.

Monday, March 26, 2012

Sunday, March 25, 2012

The Originality of the Species

Ian McEwan in The Guardian (via Mark Trodden):

[T]he modern artefact bears the stamp of personality. The work is the signature. The individual truly possesses his or her own work, has rights in it, defines himself by it. It is private property that cannot be trespassed on. A great body of law has grown up around this possessiveness. Countries that do not sign up to the Berne Convention and other international agreements relating to intellectual property rights find themselves excluded from the mainstream of a globalised culture. The artist owns his work, and sits glowering over it, like a broody hen on her eggs. We see the intensity of this fusion of originality and individuality whenever a plagiarism scandal erupts. (I’ve had some experience of it myself.)

The dust-jacket photograph, though barely relevant to an appreciation of a novel, seals the ownership. This is me, it says, and what you have in your hands is mine. Or is me. We see it too in the cult of personality that surrounds the artist – individuality and personality are driven to inspire near-religious devotion. The coach parties at Grasmere, the cult of Hemingway, or Picasso, or Neruda. These are big figures – their lives fascinate us sometimes even more than their art.

This fascination is relatively new. In their day, Shakespeare, Bach, Mozart, even Beethoven were not worshipped, they did not gleam in the social rankings the way their patrons did, or in the way that Byron or Chopin would do, or in the way a Nobel Prize-winner does today. How the humble artist was promoted to the role of secular priest is a large and contentious subject, a sub-chapter in the long discussion about individuality and modernity. The possible causes make a familiar list – capitalism, a growing leisured class, the Protestant faith, the Romantic movement, new technologies of communication, the elaboration of patent law following the Industrial Revolution. Some or all of these have brought us to the point at which the identification of the individual and her creativity is now complete and automatic and unquestionable. The novelist today who signs her name in her book for a reader, and the reader who stands in line waiting for his book to be signed collude in this marriage of selfhood and art.

There is an antithetical notion of artistic creation, and though it has been expressed in different forms by artists, critics and theoreticians, it has never taken hold outside the academies. This view holds that, of course, no one escapes history. Something cannot come out of nothing, and even a genius is bound by the constraints and opportunities of circumstance. The artist is merely the instrument on which history and culture play. Whether an artist works within his tradition or against it, he remains its helpless product. The title of Auden’s essay, “The Dyer’s Hand”, is just a mild expression of the drift. Techniques and conventions developed by predecessors – perspective, say, or free indirect style (the third person narrative coloured by a character’s subjective state) are available as ready-made tools and have a profound effect. Above all, art is a conversation conducted down through the generations. Meaningful echoes, parody, quotation, rebellion, tribute and pastiche all have their place. Culture, not the individual talent, is the predominant force; in creative writing classes, young writers are told that if they do not read widely, they are more likely to be helplessly influenced by those whose work they do not know.

Such a view of cultural inheritance is naturally friendly to science.

An Antimatter Breakthrough

From Liz Mermin's documentary in progress: “On 7 March, the journal Nature published the latest results from the ALPHA experiment at CERN. The findings were called ‘historic’. ALPHA first made science history in 2010, when they created atoms of anti-hydrogen; in 2011 they succeeded in trapping and holding these atoms for an astonishing 1000 seconds. In these three short films, members of the ALPHA collaboration explain their latest triumph, revealing the excitement behind this extraordinary scientific process.”

Read more »

Cynthia Nixon, Joseph Massad, and Not Being an American Gigolo

Scott Long in A Paper Bird:

In the politics of identity, bisexuals are hated because they stand for choice. The game is set up so as to exclude the middle; bisexuals get squeezed out. In the “LGBT” word, the “B” is silent. John Aravosis, for instance, says that if you’re into both genders, “that’s fine” — great! — but “most people” aren’t. First off, that rather defies Freud and the theory of universal infantile bisexuality. But never mind that. The business of “outing,” of which Aravosis has been an eloquent proponent, also revolves around the excluded middle. It’s not a matter of what you think of outing’s ethics, on which there’s plenty of debate. It’s that the underlying presumption is that one gay sex act makes you “gay” — not errant, not bisexual, not confused or questioning: gay, gay, gay. I saw you in that bathroom, for God’s sake! You’re named for life! It’s also that the stigma goes one way only: a lifetime of heterosexual sex acts can’t make up for that one, illicit, overpowering pleasure. As I’ve argued, this both corresponds to our own buried sense, as gays, that it is a stigma, and gives us perverse power. In the scissors, paper, rock game of sexuality, gay is a hand grenade. It beats them all.

And this fundamentalism infects other ways of thinking about sexuality, too. Salon today carries an article about multiple sex-and-love partners: “The right wants to use the ‘slippery slope’ of polyamory to discredit gay marriage. Here’s how to stop them.” I’ll leave you to study the author’s solution. He doesn’t want to disrespect the polyamorists:

I reject the tactic of distinguishing the good gays from the “bad” poly people. Further marginalizing the marginalized is just the wrong trajectory for any liberation movement to take.

That’s true — although whether we’re still really a liberation movement, when we deny the liberty of self-description, is a bit doubtful. But he goes on, contemplating how polyamory might in future be added to the roster of rights:

Really, there are a host of questions that arise in the case of polyamory to which we just don’t know the answer. Is polyamory like sexual orientation, a deep trait felt to be at the core of one’s being? Would a polyamorous person feel as incomplete without multiple partners as a lesbian or gay person might feel without one? How many “truly polyamorous” people are there?

Well, what if it’s not? What if you just choose to be polyamorous? God, how horrible! You beast! What can be done for the poor things? Should some researcher start looking for a gene for polyamory, so it can finally become respectable, not as a practice, but as an inescapable doom? (I shudder to think there’s one gene I might share with Newt Gingrich.)

What, moreover, if sexual orientation itself is not “a deep trait felt to be at the core of one’s being,” one that people miraculously started feeling in 1869, when the word “homosexual” was coined? What if it’s sometimes that, sometimes a transient desire, sometimes a segment of growth or adolescent exploration, sometimes a recourse from the isolations of middle age, sometimes a Saturday night lark, sometimes a years-long passion? What if some people really do experience it as … a choice?

What if our model for defending LGBT people’s rights were not race, but religion? What if we claimed our identities were not something impossible to change, but a decision so profoundly a part of one’s elected and constructed selfhood that one should never be forced to change it?

Hey Dude

Robert Lane Greene in More Intelligent Life (for Sophie Schulte-Hillen):

Slang rarely has staying power. That is part of its charm; the young create it, and discard it as soon as it becomes too common. Slang is a subset of in-group language, and once that gets taken up by the out-group, it’s time for the in-crowd to come up with something new. So the long life of one piece of American slang, albeit in many different guises, is striking. Or as the kids would say, “Dude!”

Though the term seems distinctly American, it had an interesting birth: one of its first written appearances came in 1883, in the American magazine, which referred to “the social ‘dude’ who affects English dress and the English drawl”. The teenage American republic was already a growing power, with the economy booming and the conquest of the West well under way. But Americans in cities often aped the dress and ways of Europe, especially Britain. Hence dude as a dismissive term: a dandy, someone so insecure in his Americanness that he felt the need to act British. It’s not clear where the word’s origins lay. Perhaps its mouth-feel was enough to make it sound dismissive.

From the specific sense of dandy, dude spread out to mean an easterner, a city slicker, especially one visiting the West. Many westerners resented the dude, but some catered to him. Entrepreneurial ranchers set up ranches for tourists to visit and stay and pretend to be cowboys themselves, giving rise to the “dude ranch”.

By the 1950s or 1960s, dude had been bleached of specific meaning. In black culture, it meant almost any male; one sociologist wrote in 1967 of a group of urban blacks he was studying that “these were the local ‘dudes’, their term meaning not the fancy city slickers but simply ‘the boys’, ‘fellas’, the ‘cool people’.”

From the black world it moved to hip whites, and so on to its enduring associations today—California, youth, cool. In “Easy Rider” (1969) Peter Fonda explains it to the square Jack Nicholson: “Dude means nice guy. Dude means a regular sort of person.” And from this new, broader, gentler meaning, dude went vocative.

The Return of Mad Men and the End of TV’s Golden Age

Andy Greenwald in Grantland:

[L]ike the Komodo dragon or Kirk Cameron, a few Golden Age shows remain in production even if their evolutionary time has passed. Larry David will keep kvetching as long as there's bile in his body, and the brilliant Breaking Bad has one more batch of crystal to cook. But with three full seasons stretching out before us like the red carpet at the Clios, Mad Men will be the last of the Golden Age shows to grace our flat-screens. With a typically outstanding new episode, the first in 17 months, due to premiere on Sunday, it's worth asking: Is it also the best?

The line of inheritance from first to last is almost too neat: David Chase hired Matt Weiner to The Sopranos off of the cigarette-stained spec of Mad Men, a script originally written by Weiner in an aspirational frenzy while toiling on the Bronze Age Ted Danson sitcom Becker. Weiner's infamous penchant for micromanaging and rewriting was learned at the foot of Chase, and Don Draper is a direct descendant of Tony Soprano; the two share a charismatic corruption, the last of the troubled titans. But this is where the comparisons end. The Sopranos, in all its digressive genius, was a show dedicated to the impossibility of change. Season by season, Chase built a red-sauce-spattered shrine to a lifetime of lessons learned on Dr. Melfi-esque couches: that people are who they are, no matter what. At its core, The Sopranos was Chase's grand F.U. to all the hard-worn stereotypes of Television 1.0, the boring brontosaur he'd finally managed to dump in the Meadowlands. There was no hugging in Tony's New Jersey. No learning or smoothing or straightening. Tony Soprano was Tony Soprano: an amiable monster. In the end, Chase argued with nihilistic aplomb, it doesn't much matter how the Satriale sausage was made, just whether it was spicy or sweet. And when he began to feel revulsion toward his audience's bloodlust, he denied them even that: The finale's fade to black ensured Tony would be stuck with himself for eternity. To Chase it was a fate worse than prison or a slug to the head from a mook in a Members Only jacket; a karmic feedback loop in the shape of an onion ring.

Mad Men is different. It's less dark and more expansive than its ancestor because, unlike Chase, Weiner isn't asking questions that he's already convinced himself can't be answered. Where The Sopranos was angry, Mad Men is curious. Even at his grief-wracked, whiskey-bloated nadir last season, being Don Draper wasn't a life sentence because Don Draper doesn't exist. He's merely a particularly dapper suit that Dick Whitman is trying on for size. On Mad Men, identity is what's fungible, not nature.

The Birangana and the birth of Bangladesh

From Himal Southasian:

The year 1971 was a landmark in Southasian history for many reasons. It saw not only the birth of Bangladesh but also the war fought by Pakistan and India. It was perhaps the only such conflict involving the three most populous Southasian countries, clashing for the first time since the end of colonial rule. High-level politics and the tumultuous times spawned a number of books on war, international relations and human rights. However, an uncanny silence has remained about one aspect of the war – the sexual crimes committed by the Pakistan Army and its collaborators, the Razakar militia, against Bangladeshi women. It is only now, 40 years on, that some of that silence is being broken.

Bina D’Costa’s new Nationbuilding, Gender and War Crimes in South Asia takes on the mammoth task of placing violence against women during the war in a larger political context. While what D’Costa calls the ‘original cartographic trauma’ of the Subcontinent has been well researched, gendered nation-building narratives have been given little consideration. Yet D’Costa proposes that any theorisation of nation-building in post-Partition India and Pakistan, or post-Liberation Bangladesh, is incomplete without a gendered analysis. Recognising that women have largely been silenced by state historiography, feminist scholars and activists in Southasia – Veena Das, Kamla Bhasin, Ritu Menon, Urvashi Butalia – have attempted to explore this sordid aspect of war. That rape has been used as a weapon of war has been well documented. One of the more famous examples is American feminist Susan Brownmiller’s investigation of rapes committed during the two World Wars, in Vietnam and then in Bangladesh, which emerged as the 1975 classic Against Our Will: Men, women and rape. The idea of defiling the enemy population by raping its women and impregnating them, often while their helpless and ‘feminised’ menfolk watch, is based on notions of honour, purity and emasculating the opposition. These notions of defilement also led to the sacrificial killing, sometimes by their own families, of women who had either been raped or even simply exposed to the potential of sexual violence.

More here.