How Conservatives Lost their Faith in Science

Alan Boyle in MSNBC's Cosmic Log:

Gauchat cross-referenced attitudes toward the scientific community with various demographic categories, and found that two categories showed a significant erosion of trust in science: conservatives and frequent churchgoers. People who identified themselves as conservatives voiced more confidence in science than moderates or liberals in 1974, but by 2010, that level had fallen by more than 25 percent.

This graph shows the unadjusted mean values for public trust in science, classified by self-reported political ideology between 1974 and 2010. The figures are derived from the General Social Survey.

Why the drop? Gauchat suggested that the character of the conservative movement has changed over the past three and a half decades — and so has the character of the scientific establishment.

“Over the last several decades, there's been an effort among those who define themselves as conservatives to clearly identify what it means to be a conservative,” he said. “For whatever reason, this appears to involve opposing science and universities, and what is perceived as the 'liberal culture.' So, self-identified conservatives seem to lump these groups together and rally around the notion that what makes 'us' conservatives is that we don't agree with 'them.'”

Meanwhile, the perception of science's role in society has shifted as well.

“In the past, the scientific community was viewed as concerned primarily with macro structural matters such as winning the space race,” Gauchat said. “Today, conservatives perceive the scientific community as more focused on regulatory matters such as stopping industry from producing too much carbon dioxide.”

30 Years of Subaltern Studies: Conversations with Gyanendra Pandey and Partha Chatterjee

In Cultural Anthropology:

McGrail: I’d like to start by asking if you could give us an overview of the term “subaltern studies” and explain how it has evolved in the past few decades.

Chatterjee: When the Subaltern Studies Collective began, our initial move was a reading of Antonio Gramsci’s Prison Notebooks, which had just been published in English. We were compelled by the fact that Gramsci used the term “subaltern” instead of “proletariat.” Now, he used this term because he was writing in prison under conditions of extreme censorship; he didn’t want to use the standard Marxist term, and so he coined the term “subaltern.” But as a result, Gramsci was fundamentally altering the core definition of classes in the orthodox version of Marxism at the time. By simply renaming the proletarian class the subaltern, he was suggesting that the classical Marxist division of European industrial society into classes was not entirely adequate. The classical understanding of class didn’t quite work in a country like Italy, where the North had a large industrial structure, while most parts of the South were agrarian and most of the exploited people were peasants. Gramsci was suggesting that the classical understanding of the “proletariat” didn’t fit the political situation in Italy. So in using a term like subaltern, he was trying to incorporate this very large, pre-industrial formation into the understanding of political strategies for the Left or the Communist movement.

We found this extremely relevant in trying to understand the situation in countries like India, for instance, which in the early 80s was more or less in exactly the same situation: there was an important and developing industrial sector with industrial working classes, but a very large part of the country essentially consisted of agrarian formations. Therefore most Indians were in fact still peasants. So it was in trying to reorient or reformulate the problem of what it is to write the history of the “people” in a country like India that we found the idea of using “subaltern classes”—rather than the orthodox formulation of classes in Marxism—much more useful and, in a sense, full of new possibilities. That’s how it began; we actually began by using the term “subaltern classes.”

Initially in our thinking, subalternity still referred to a certain class structure that was perhaps not entirely frozen or well-defined—i.e., it was often indeterminate, fuzzy and so on—but the term still referred to a certain structure of class relations. It’s work that happened later on—particularly with Gayatri Spivak’s interventions—that allowed for a different inflection to be given to the term subaltern.

Was the Nazi rise inevitable?

From The Telegraph:

German history has been shaped by one central trauma: the rise of the Nazis culminating in the horror of the concentration camps. There has been an understandable tendency for scholars to interpret everything that went before as a prelude to the emergence of fascism. Just as the Whig school notoriously interpreted the path of British history as an inexorable process leading to the triumph of parliamentary democracy in the 19th century, so the rise of Hitler has haunted German historians.

One major victim of this tendency has been the Holy Roman Empire, a sprawling confederation of German-speaking states that embraced Italy, Germany and much of France at one point in the high Middle Ages. Contemporary historians have tended to lose interest in the Holy Roman Empire after the death in 1250 of Frederick II, the powerful and charismatic emperor who challenged the authority of the Pope. Thereafter they have assumed that the empire fell into decline, part of a pattern of neglect and institutional collapse that sowed the seeds for the failure of the Weimar Republic and the rise of the Nazis. Indeed, in the words of one historian, the Holy Roman Empire had “no history at all” after the mid-17th century, though “it continued for a while longer to lead a miserable, meaningless existence because its patient, slow-moving subjects lacked the initiative and in many cases the intelligence to effect its actual dissolution”.

More here.

Cancer screen yields drug clues

From Nature:

Two compendia of data unite genetic profiling with drug testing to create the most complete picture yet of how mutations can shape a cancer’s response to therapy. The results, published today in Nature, suggest that the effectiveness of most anticancer agents depends on the genetic make-up of the cancer against which they are used. One study found a link between drug sensitivity and at least one mutation in a cancer-related gene for 90% of the compounds tested.

Lab-grown cancer cells are a mainstay of research into the disease. The two projects catalogue the genetic features of hundreds of such cell lines, including mutations in cancer-associated genes and patterns of gene activation. They then match these features with how the cells respond to approved and potential drugs. “This is a very powerful finding,” says Tom Hudson, president of the Ontario Institute for Cancer Research in Toronto, Canada, who was not affiliated with the work. “It could provide valuable information for designing clinical trials, and lead to more focused and less expensive approaches to drug development.”

More here.

Thursday Poem

Line

She is standing, here, in a grocery store,
Under the fluorescent light suspended,
Above her head, and from the ceiling,
Standing in front of the refrigerated meat,
That is laid out in front of her, butchered,
A thigh, a breast, a leg,
Or chopped and ground,
Pieces of meat wrapped tightly in plastic that is
Stretching over them, like skin, and she forgets,
Forgets what she is looking for, because she is,
Remembering what he said on the telephone,
His voice in Afghanistan and, here, in her ear,
About what happened, there, in Kandahar, or
How an American soldier, how he lost his mind,
Went and killed sixteen Afghans, nine children,
A massacre, her husband whispers over it, this
Telephone line, and she is here, now, in America,
Moving down aisles of a grocery store, moving
Through the months, because she is still waiting,
Waiting for him to come home again, waiting
In a checkout line, and thinking about lines,
Lines she draws through the days on a calendar,
Bodies shot dead, lined up on the side of a road,
Or the lines of war,
Lines soldiers cross and lines they don’t,
And the imaginary lines that divide countries,
Our country from theirs,
Or how different he will be,
Her husband,
When he crosses over again, and comes home.

by Amalie Flynn

Rethinking the Literature Classroom

Jeff Hudson in Full Stop:

Here is something I know: I feel better when I read — not just good, but better. Anxieties are assuaged, burdens lightened, relationships enriched. I feel part of something hopeful, a connection to the writer, the characters, other readers. I feel smart, if it is okay to say that. I am moved to act after reading — to write, to talk. I have new questions and fresh answers. And I am hardly alone. Anne Lamott knows that “when writers make us shake our heads with the exactness of their prose and their truths, and even make us laugh about ourselves or life, our buoyancy is restored. We are given a shot at dancing with, or at least clapping along with the absurdity of life instead of being squashed by it over and over again.” After sharing stories, writer Barry Lopez feels exhilarated: “The mundane tasks which awaited me, I anticipated now with pleasure. The stories had renewed in me a sense of the purpose of my life.”

Here is something else I know: the power of literature to “renew a sense of purpose in our lives” gets killed in literature classrooms — unintentionally, no doubt, but killed nonetheless.

This isn’t an indictment. Writer Richard Ford found himself teaching literature as a graduate assistant in 1969 and realized, “What seemed worthwhile to teach was what I felt about literature . . . [literature] had mystery, denseness, authority, connectedness, closure, resolution, perception, variety, magnitude — value in other words . . . Literature appealed to me. But I had no idea how to teach its appealing qualities, how to find and impart the origins of what I felt.” This is a difficult question.

Signandsight.com Says Good-bye

We here at 3QD have been long-time fans of signandsight. Their presence on the web will be missed. Anja Seeliger and Thierry Chervel:

After seven years we are shutting down signandsight.com. The site will remain online, but for now no new texts will be posted.

We still believe in the idea behind signandsight.com. We are convinced that Europe needs a public sphere, and we think that this public sphere is best achieved by combining the possibilities of the internet and traditional media. As before, we still love our motto “Let's Talk European”. Yes, English is now the lingua franca of contemporary Europe. However, when English is used to bring articles written in another language to an international readership, it serves as a bridge to these other languages and helps create waves. Interestingly, the most lively reactions to signandsight.com came from the US, where there is an intellectual audience that wishes to escape its domestic borders.

Some of the most wonderful experiences with signandsight.com were when Harper's reprinted an interview with Thomas Bernhard because it had first been translated into English by signandsight.com, or when Anne Applebaum of the Washington Post and Paul Berman of the New Republic discussed texts and debates that had originated on signandsight.com and found echoes in Swedish, Hungarian, Spanish, Polish, and French newspapers. Signandsight.com never had a wide popular readership, but it was a catalyst for European public debate.

Marx at 193

Also in the LRB, John Lanchester:

In trying to think what Marx would have made of the world today, we have to begin by stressing that he was not an empiricist. He didn’t think that you could gain access to the truth by gleaning bits of data from experience, ‘data points’ as scientists call them, and then assembling a picture of reality from the fragments you’ve accumulated. Since this is what most of us think we’re doing most of the time it marks a fundamental break between Marx and what we call common sense, a notion that was greatly disliked by Marx, who saw it as the way a particular political and class order turns its construction of reality into an apparently neutral set of ideas which are then taken as givens of the natural order. Empiricism, because it takes its evidence from the existing order of things, is inherently prone to accepting as realities things that are merely evidence of underlying biases and ideological pressures. Empiricism, for Marx, will always confirm the status quo. He would have particularly disliked the modern tendency to argue from ‘facts’, as if those facts were neutral chunks of reality, free of the watermarks of history and interpretation and ideological bias and of the circumstances of their own production.

I, on the other hand, am an empiricist. That’s not so much because I think Marx was wrong about the distorting effect of underlying ideological pressures; it’s because I don’t think it’s possible to have a vantage point free of those pressures, so you have a duty to do the best with what you can see, and especially not to shirk from looking at data which are uncomfortable and/or contradictory. But this is a profound difference between Marx and my way of talking about Marx, which he would have regarded as being philosophically and politically entirely invalid.

In the Zeitgeist, Fact-Checking

Via Zite, there are two pieces on fact-checking this week. Atossa Araxia Abrahamian in The New Inquiry:

Brides magazine has a fact-checker. She does things like verify the cost of honeymoons, make sure that Vera Wang did, in fact, design that dress, and compare the captions on winter flower bouquet slideshows with pictures in botany reference books. It would be terrible to mistake a eucalyptus pod for a mere pussy willow.

Many American magazines, from trashy celebrity weeklies to highbrow general-interest journals, have fact-checkers of some sort. I worked as one in 2008, when, with three other Harper’s interns, I fact-checked the magazine’s Index from beginning to end. Being the primary speaker of foreign languages in the intern cubicle, I ended up doing a lot of the international checking for the magazine. Percentage of Russians who say one goal of U.S. foreign policy is “the complete destruction of Russia”: 43. Number of Iraqi stray dogs that Operation Baghdad Pups has helped emigrate to the United States since 2003: 66.

I quickly learned that fact-checking is a predominantly American phenomenon. The French don’t do much of it, most Russian papers certainly don’t either, and even the Swiss — possibly the most exacting and precise people on the planet — do not make use of fact-checkers in quite the same way as Americans do. Yet their presses keep rolling, and their readers keep reading, and their brides still buy roses, if by another name. People even trust the press in Switzerland much more than they do in the U.S.: 46 percent of Swiss people said they had confidence in their newspapers and magazines in 2010. Among Americans, it was only 25 percent.

Christian Lorentzen also has a piece in the LRB.

Women: The Libyan Rebellion’s Secret Weapon

From Smithsonian:

Inas Fathy’s transformation into a secret agent for the rebels began weeks before the first shots were fired in the Libyan uprising that erupted in February 2011. Inspired by the revolution in neighboring Tunisia, she clandestinely distributed anti-Qaddafi leaflets in Souq al-Juma, a working-class neighborhood of Tripoli. Then her resistance to the regime escalated. “I wanted to see that dog, Qaddafi, go down in defeat.” A 26-year-old freelance computer engineer, Fathy took heart from the missiles that fell almost daily on Col. Muammar el-Qaddafi’s strongholds in Tripoli beginning March 19. Army barracks, TV stations, communications towers and Qaddafi’s residential compound were pulverized by NATO bombs. Her house soon became a collection point for the Libyan version of meals-ready-to-eat, cooked by neighborhood women for fighters in both the western mountains and the city of Misrata. Kitchens across the neighborhood were requisitioned to prepare a nutritious provision, made from barley flour and vegetables, that could withstand high temperatures without spoiling. “You just add water and oil and eat it,” Fathy told me. “We made about 6,000 pounds of it.”

Fathy’s house, located atop a hill, was surrounded by public buildings that Qaddafi’s forces often used. She took photographs from her roof and persuaded a friend who worked for an information-technology company to provide detailed maps of the area; on those maps, Fathy indicated buildings where she had observed concentrations of military vehicles, weapons depots and troops. She dispatched the maps by courier to rebels based in Tunisia. On a sultry July evening, the first night of Ramadan, Qaddafi’s security forces came for her. They had been watching her, it turned out, for months. “This is the one who was on the roof,” one of them said, before dragging her into a car. The abductors shoved her into a dingy basement at the home of a military intelligence officer, where they scrolled through the numbers and messages on her cellphone. Her tormentors slapped and punched her, and threatened to rape her. “How many rats are working with you?” demanded the boss, who, like Fathy, was a member of the Warfalla tribe, Libya’s largest. He seemed to regard the fact that she was working against Qaddafi as a personal affront.

More here.

How to Write Like a Scientist

From Science:

I didn’t know whether to take my Ph.D. adviser’s remark as a compliment. “You don’t write like a scientist,” he said, handing me back the progress report for a grant that I had written for him. In my dream world, tears would have come to his eyes, and he would have squealed, “You write like a poet!” In reality, though, he just frowned. He had meant it as a criticism. I don’t write like a scientist, and apparently that’s bad. I asked for an example, and he pointed to a sentence on the first page. “See that word?” he said. “Right there. That is not science.”

The word was “lone,” as in “PvPlm is the lone plasmepsin in the food vacuole of Plasmodium vivax.” It was a filthy word. A non-scientific word. A flowery word, a lyrical word, a word worthy of — ugh — an MFA student. I hadn’t meant the word to be poetic. I had just used the word “only” five or six times, and I didn’t want to use it again. But in his mind, “lone” must have conjured images of PvPlm perched on a cliff’s edge, staring into the empty chasm, weeping gently for its aspartic protease companions. Oh, the good times they shared. Afternoons spent cleaving scissile bonds. Lazy mornings decomposing foreign proteins into their constituent amino acids at a nice, acidic pH. Alas, lone plasmepsin, those days are gone. So I changed the word to “only.” And it hurt. Not because “lone” was some beautiful turn of phrase but because of the lesson I had learned: Any word beyond the expected set — even a word as tame and innocuous as “lone” — apparently doesn’t belong in science. I’m still fairly new at this science thing. I’m less than 4 years beyond the dark days of grad school and the adviser who wouldn’t tolerate “lone.” So forgive my naïveté when I ask: Why the hell not? Why can’t we write like other people write? Why can’t we tell our science in interesting, dynamic stories? Why must we write dryly? (Or, to rephrase that last sentence in the passive voice, as seems to be the scientific fashion, why must dryness be written by us?)

More here.

collecting America’s other language

The scene is a mysterious one, beguiling, thrilling, and, if you didn’t know better, perhaps even a bit menacing. According to the time-enhanced version of the story, it opens on an afternoon in the late fall of 1965, when without warning, a number of identical dark-green vans suddenly appear and sweep out from a parking lot in downtown Madison, Wisconsin. One by one they drive swiftly out onto the city streets. At first they huddle together as a convoy. It takes them only a scant few minutes to reach the outskirts—Madison in the sixties was not very big, a bureaucratic and academic omnium-gatherum of a Midwestern city about half its present size. There is then a brief halt, some cursory consultation of maps, and the cars begin to part ways. All of this first group of cars head off to the south. As they part, the riders wave their farewells, whereupon each member of this curious small squadron officially commences his long outbound adventure—toward a clutch of carefully selected small towns, some of them hundreds and even thousands of miles away. These first few cars are bound for cities situated in the more obscure corners of Florida, Oklahoma, and Alabama. Other cars that would follow later went off to yet more cities and towns scattered evenly across every corner of every mainland state in America. The scene as the cars leave Madison is dreamy and tinted with romance, especially seen at the remove of nearly fifty years. Certainly nothing about it would seem to have anything remotely to do with the thankless drudgery of lexicography. But it had everything to do with the business, not of illicit love, interstate crime, or the secret movement of monies, but of dictionary making.

more from Simon Winchester at Lapham’s Quarterly here.

How should I live my life?

So what does it mean for the country that our cultural common denominator is shrinking? That Americans increasingly have few shared experiences through which to understand the lives of our fellow citizens? And why, in the midst of these trends, is there general agreement on an issue as potentially flammable as contraception? Recently I found good answers to these questions in an unexpected place — in an essay on literature and ethics that provides a convincing account of the rock-bottom consequences of a fractured population. The essay is called “Perceptive Equilibrium: Literary Theory and Ethical Theory,” and it was first given as a talk by the philosopher Martha Nussbaum 25 years ago. The purpose of the paper was to merge literary theory with ethical theory — to show how forms of art like the novel can help us answer arguably the two most fundamental philosophical questions: How should I live my life? How should we live together? Here is Nussbaum describing the centrality of literature to ethics: “One of the things that makes literature something deeper and more central for us than a complex game, deeper even than those games, for example chess and tennis, that move us to wonder by their complex beauty, is that it speaks like Strether. It speaks about us, about our lives and choices and emotions, about our social existence and the totality of our connections.”

more from Kevin Hartnett at The Millions here.

proust’s mom

There are texts that seem to require a certain craziness of us, a mismeasure of response to match the extravagance of their expression. But can a mismeasure be a match? All we know is that we don’t want to lose or reduce the extravagance but can’t quite fall for it either. An example would be Walter Benjamin’s wonderful remark about missed experiences in Proust: “None of us has time to live the true dramas of the life that we are destined for. This is what ages us – this and nothing else. The wrinkles and creases on our faces are the registration of the great passions, vices, insights that called on us; but we, the masters, were not at home.” Even without the ‘nothing else’ this is a pretty hyperbolic proposition. With the ‘nothing else’ it turns into a form of madness, a suggestion that we shall not grow old at all unless we keep failing to receive the passions, vices and insights that come to see us. This would be a life governed by new necessities, entirely free from the old ones, exempt from time and biology. The sentences are clear enough but don’t read easily as fantasy or figure of speech. Benjamin is asking us to entertain this magical thought for as long as we can, and not to replace it too swiftly by something more sensible.

more from Michael Wood at the LRB here.

In the Land of Blood and Honey

Srecko Horvat on Angelina Jolie's new film In the Land of Blood and Honey, about an affair between a Serb and a Muslim during the Balkan war, in Eurozine (Warning: the article contains spoilers):

The movie tells the story of Danijel, a soldier fighting for the Bosnian Serbs, and Ajla, a Bosnian Muslim who was involved with him before the war and is now a captive in the concentration camp he oversees. It's a bad repetition of the same good old story depicted most recently in The Reader (Stephen Daldry, 2008), and unforgettably in The Night Porter (Liliana Cavani, 1974). In short, it's a story about the perpetrator and the victim and a reversal of these perspectives as the story goes on. On the one hand you have a war criminal (a concentration camp guard in The Reader, the former SS officer in The Night Porter, the Serbian officer in Jolie's movie), on the other hand you have the victim (the boy who read to the concentration camp guard, the concentration camp survivor, the innocent Muslim woman in the Bosnian war). What all three films have in common is a fatal love affair between a criminal and an innocent victim, the only difference being that, in The Reader, the boy finds out eight years later, when, as a law student, he observes a trial of several women (including his former lover) accused of letting 300 Jewish women die in a burning church.

Common to all these films is also that the roles become less and less clear as the story develops. The best example is The Night Porter, where, thirteen years after the concentration camp, Lucia meets Maximilian again, who is now working at a Vienna hotel; instead of exposing him, she falls back into their sadomasochistic relationship. The relationship is what Primo Levi – remembering the case of the Sonderkommando, the “special units” of camp inmates in charge of bringing their neighbours to the gas chambers – calls the “gray zone”, the zone in which the “long chain of conjunction between victim and executioner” comes loose. Or, as Giorgio Agamben puts it in his Remnants of Auschwitz, “where the oppressed becomes oppressor and the executioner in turn appears as victim. A gray, incessant alchemy in which good and evil and, along with them, all the metals of traditional ethics reach their point of fusion”.[2]

The best expression of this new terra ethica was articulated by Michael in Bernhard Schlink's novel The Reader, on which the film was based: “I wanted simultaneously to understand Hanna's crime and to condemn it. But it was too terrible for that. When I tried to understand it, I had the feeling I was failing to condemn it as it must be condemned. When I condemned it as it must be condemned, there was no room for understanding. But even as I wanted to understand Hanna, failing to understand her meant betraying her all over again. I could not resolve this. I wanted to pose myself both tasks – understanding and condemnation. But it was impossible to do both.”[3] In other words, when we try to understand the crime, then we stop condemning it; and when we condemn, then we stop understanding it.

So, what is missing in Jolie's movie?

A Short Course in Thinking About Thinking

Daniel Kahneman in Edge:

SESSION ONE

I'll start with a topic that is called an inside-outside view of the planning fallacy. And it starts with a personal story, which is a true story.

Well over 30 years ago I was in Israel, already working on judgment and decision making, and the idea came up to write a curriculum to teach judgment and decision making in high schools without mathematics. I put together a group of people that included some experienced teachers and some assistants, as well as the Dean of the School of Education at the time, who was a curriculum expert. We worked on writing the textbook as a group for about a year, and it was going pretty well—we had written a couple of chapters, we had given a couple of sample lessons. There was a great sense that we were making progress. We used to meet every Friday afternoon, and one day we had been talking about how to elicit information from groups and how to think about the future, and so I said, Let's see how we think about the future.

I asked everybody to write down on a slip of paper his or her estimate of the date on which we would hand the draft of the book over to the Ministry of Education. That by itself, by the way, was something that we had learned: you don't want to start by discussing something, you want to start by eliciting as many different opinions as possible, which you then pool. So everybody did that, and we were really quite narrowly centered around two years; the range of estimates that people had—including myself and the Dean of the School of Education—was between 18 months and two and a half years.

But then something else occurred to me, and I asked the Dean of Education of the school whether he could think of other groups similar to our group that had been involved in developing a curriculum where no curriculum had existed before. At that period—I think it was the early 70s—there was a lot of activity in the biology curriculum, and in mathematics, and so he said, yes, he could think of quite a few. I asked him whether he knew specifically about these groups and he said there were quite a few of them about which he knew a lot. So I asked him to imagine them, thinking back to when they were at about the same state of progress we had reached, after which I asked the obvious question—how long did it take them to finish?

An Interview with Margarethe von Trotta on Her Upcoming Film About Hannah Arendt

Over at the Goethe Institute:

Thinking and writing, those are the things that really defined the great philosopher Hannah Arendt. The objective of the film was to transform this thought into a film, to make it a visual embodiment of a real person.

How does one use film to describe a woman who thinks? How can we watch her while she thinks? That is of course the big challenge when making a film about intellectual personalities. I insisted that Barbara Sukowa play Hannah because she is the only actress I know who I could imagine showing me how someone thinks, or that someone is thinking. And she managed to do it. For me, it was clear from the beginning that she was the one, and I had to push for her to get the role because some of the investors couldn’t visualize it. I said to them, “I am not doing this film without her.” I had the same situation with Rosa Luxemburg and again with Hildegard von Bingen – she really experienced the intellectual nature of Rosa’s political speeches, for example. That is how it is with Hannah Arendt. The viewer has to see that she is really thinking. She does two speeches in this film as well. Arendt was a professor at various universities in the United States and she did seminars and speeches on philosophical and political subject matter. In situations like that, it’s not about just reading your lines. You have to be able to improvise and develop the speech as you go. In the film there is a six-minute speech in English, with the strong German accent that Arendt had, and Sukowa is able to get viewers to experience, think and follow her analyses.

What were the preparations for the film like? And what about your contact with Arendt’s world?

Before we started writing the screenplay we met with a lot of people in New York who had known Arendt well on a personal level. People like Lotte Köhler, her longtime colleague and friend who died in 2011 at the age of 92, or Elisabeth Young-Bruehl, who also died in 2011, as well as others like Lore Jonas, widow of Hans Jonas, and Jerome Kohn, her last assistant and publisher of her posthumous writings. Those were amazing encounters, the stuff you need when you are writing a script about this type of real person who you’ve never met yourself.

Hilton Kramer, 1928-2012

William Grimes in the NYT:

Admired for his intellectual range and feared for his imperious judgments, Mr. Kramer emerged as a critic in the early 1950s and joined The Times in 1965, a time when the tenets of high modernism were being questioned and increasingly attacked. He was a passionate defender of high art against the claims of popular culture and saw himself not simply as a critic offering informed opinion on this or that artist, but also as a warrior upholding the values that made civilized life worthwhile.

This stance became more marked as political art and its advocates came to the fore, igniting the culture wars of the early 1980s, a struggle in which Mr. Kramer took a leading role as the editor of The New Criterion, where he was also a frequent contributor.

In its pages, Mr. Kramer took dead aim at a long list of targets: creeping populism at leading art museums; the incursion of politics into artistic production and curatorial decision making; the fecklessness, as he saw it, of the National Endowment for the Arts; and the decline of intellectual standards in the culture at large.

A resolute high modernist, he was out of sympathy with many of the aesthetic waves that came after the great achievements of the New York School, notably Pop (“a very great disaster”), conceptual art (“scrapbook art”) and postmodernism (“modernism with a sneer, a giggle, modernism without any animating faith in the nobility and pertinence of its cultural mandate”).

At the same time, he made it his mission to bring underappreciated artists to public attention and open up the history of 20th-century American art to include figures like Milton Avery and Arthur Dove, about whom he wrote with insight and affection.

The Fate of the Western

However much certain optimists may talk about the survival or possible resurrection of the Western, I fear—much to my regret—that, as a genre, it is pretty well dead and buried, a relic of a more credulous, more innocent, more emotional age, an age less crushed or suffocated by the ghastly plague of political correctness. Nonetheless, whenever a new Western comes out, I dutifully go and see it, albeit with little expectation that it will be any good. In the last decade, I can recall three pointless remakes, vastly inferior to the movies on which they were modelled and which weren’t exactly masterpieces themselves: 3:10 to Yuma by James Mangold, The Alamo by John Lee Hancock, and True Grit by the Coen brothers, all of them unconvincing and far less inspired than the distinctly uneven originals made, respectively, by Delmer Daves, John Wayne, and Henry Hathaway. I recall, too, Andrew Dominik’s interesting but dull The Assassination of Jesse James by the Coward Robert Ford, Ed Harris’s bland, soulless Appaloosa, David von Ancken’s unbearable Seraphim Falls, and the Australian John Hillcoat’s The Proposition, of which my memory has retained not a single image. The only recent Westerns that have managed to arouse my enthusiasm have been those made for TV: Walter Hill’s Broken Trail, and Deadwood, whose third and final season no one has even bothered to bring out on DVD in Spain, which gives you some idea of how unsuccessful the magnificent first two seasons must have been. In my view, Kevin Costner’s Open Range, which came out slightly earlier, was the last decent Western to be made for the big screen, even though it has long been fashionable to denigrate anything this admirable actor and director does.

more from Javier Marías at Threepenny Review here.