Deconstructing Paul De Man


Carlin Romano in The Chronicle of Higher Education:

Did Paul de Man and Martin Heidegger ever meet? If so, they could have compared notes on how to bamboozle de-Nazification officials after, well, one’s side loses.

No matter. Now de Man has joined that august cultural club that includes Caravaggio, Wagner, Céline, Pound, Heidegger, and a slew of other accomplished artists, thinkers, and intellectuals who were also no-goodniks. The nasty ethics in the personal lives of those cultural heavies force us to ask two tough questions that are simpler than many pretend:

(1) Is there an inevitable link between a person’s ethics and his creative and intellectual work?

(2) Is it morally acceptable to honor or enjoy the work of artists and intellectuals whom we condemn for their nonprofessional, unethical actions?

Even when a cultural figure simply bears accusations of ethical misdeeds—as in the case of Woody Allen over the years, a matter reopened after Hollywood’s Golden Globes tribute and an Academy Awards nomination—the questions, when retriggered, produce frenzied media meditations. Now, with the long-awaited publication of Evelyn Barish’s The Double Life of Paul de Man (Liveright), a two-decades-in-the-making investigative biography of the Yale literary theorist whose version of “deconstruction” shook up English and comp-lit departments in the 1970s and 80s, the high literary and intellectual worlds face their own revisiting.

According to Barish, de Man (1919-83) committed fraud, forgery (16 separate acts), swindling, embezzlement, and theft as a postwar Belgian book publisher. For his sins as head of the Hermes publishing house, he was, in 1951, “found guilty in absentia and sentenced to six years in prison with heavy fines.” Apparently de Man played fast and loose with more than a million Belgian francs to fuel his lifelong luxury spending. Cornered, he skipped out to the United States on a visa probably obtained illegally by his father.

More here.

Epigenetics: The Sins of the Father


Virginia Hughes in Nature:

Biologists first observed this 'transgenerational epigenetic inheritance' in plants. Tomatoes, for example, pass along chemical markings that control an important ripening gene. But, over the past few years, evidence has been accumulating that the phenomenon occurs in rodents and humans as well. The subject remains controversial, in part because it harks back to the discredited theories of Jean-Baptiste Lamarck, a nineteenth-century French biologist who proposed that organisms pass down acquired traits to future generations. To many modern biologists, that's “scary-sounding”, says Oliver Rando, a molecular biologist at the University of Massachusetts Medical School in Worcester, whose work suggests that such inheritance does indeed happen in animals. If it is true, he says, “Why hasn't this been obvious to all the brilliant researchers in the past hundred years of genetics?”

One reason why many remain sceptical is that the mechanism by which such inheritance might work is mysterious. Explaining it will require a deep dive into reproductive biology to demonstrate how the relevant signals might be formed in the germ line, the cells that develop into sperm and eggs and carry on, at a minimum, a person's genetic legacy.

A mother might pass on effects of environmental exposures to a fetus during pregnancy. So, to study the phenomenon of transgenerational epigenetics cleanly, biologists are focusing on fathers, and have been looking at how sperm might gain and lose epigenetic marks. “In the past two to three years there's been a lot of new information,” says Michelle Lane, a reproductive biologist at the University of Adelaide in Australia. But proposals for how it all works are themselves embryonic. “It's a huge black box,” Lane says.

More here.

Thursday, March 6, 2014

Will Scotland Go Independent?


Jonathan Freedland on Scotland's future, in the NYRB:

[I]t is, paradoxically, Scotland that has been clinging to an idea of Britain, one that has been abandoned by the rest of the UK—at least if that idea is defined in part as the collectivist spirit of 1945. As Macwhirter writes, “Scots have arguably been more committed to the idea of Britain than the English over the last 200 years. What Scotland didn’t buy into was the abandonment of what used to be called the post-war consensus: universalism and the welfare state.”

Which is why the Yes campaign’s offer, set out in Scotland’s Future, consists as much of social policy as constitutional change. The document contains few abstractions about democracy, but promises instead “a transformational change in childcare,” the scrapping of London-imposed changes to welfare benefits, and, in the move most likely to attract international attention, the removal of the UK’s Trident nuclear weapon system from Scotland. “We’re half an hour away from the biggest collection of weapons of mass destruction in western Europe,” Jenkins told me. “There’s no version of devolution that allows us to get rid of that.” In other words, only independence allows Scotland to fully realize the distinct political culture that has arisen there.

Some on the left of the No campaign warn that it will be a cruel irony if, by breaking away, Scotland ensures the isolation of its more social democratic ethos. For once Scotland no longer sends fifty-nine MPs to Westminster, many of whom represent safe Labour seats, then Labour’s chances of forming a UK government diminish sharply. If independence happens in 2016, then an England-dominated UK could be the land that is forever Tory. Some electoral analysts dispute that arithmetic; nevertheless it will be this country to which an independent, left-leaning Scotland might be bound in monetary, fiscal, and political union, with the UK Treasury and Bank of England together making major decisions affecting Scotland’s economy. Scottish social democracy could discover it was able to flourish more easily inside Britain than out.

It will be a greater irony still if the ultimate consequence of the program pursued by the great patriot and would-be latter-day Britannia, Margaret Thatcher, was to be the unraveling of the United Kingdom.

More here.

Just Right Inequality


Thomas B. Edsall in the NYT:

[Simon] Kuznets’s research into the relationship between inequality and growth laid the foundation for modern thinking about what has become a critical question: Has inequality in this country reached a tipping point at which it no longer provides an incentive to strive and to innovate, but has instead created a permanently disadvantaged class, as well as an ongoing threat of social instability?

One of the most articulate contemporary proponents of the “optimal inequality” thesis is Richard Freeman, a labor economist at Harvard. In a 2011 paper, Freeman wrote: “Is there a level of inequality that optimizes economic growth, stability, and shared prosperity? My answer is yes. The relation between inequality and economic outcomes follows an inverted-U shape, so that increases in inequality improve economic performance up to the optimum and then reduce it.”
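Freeman's inverted-U is easy to state as a formula. As a minimal sketch of my own (the quadratic functional form is hypothetical; Freeman's paper does not commit to one), write economic performance P as a function of inequality I:

P(I) = P_max − β(I − I*)², with β > 0,

where I* is the optimal level of inequality and β measures how steeply performance falls away from that optimum. Then dP/dI = −2β(I − I*): performance rises with inequality below I* and falls above it, which is exactly the pattern Freeman describes, in which "increases in inequality improve economic performance up to the optimum and then reduce it."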

Freeman argues that the costs of excessive inequality are high: “Inequality that results from monopoly power, rent-seeking or activities with negative externalities that enrich their owners while lowering societal income (think pollution or crime), adversely affect economic performance. High inequality reinforces corruption by allowing a few ‘crony capitalists’ to lobby politicians or regulators to protect their economic advantages. When national income goes mostly to those at the top, there is little left to motivate people lower down. The 2007 collapse of Wall Street and bailout of banks-too-big-to-fail showed that inequality in income and power can threaten economic stability and give the few a stranglehold on the economy.”

Conservative economists look at the issue of equality from the opposite vantage point: when do government efforts to remedy inequality and to redistribute income worsen conditions by serving as a deterrent to work and productive activity?

More here.

READING DARWIN IN ARABIC

Robert Irwin at The Times Literary Supplement:

The title Reading Darwin in Arabic notwithstanding, most of the men discussed in this book did not read Charles Darwin in Arabic. Instead they read Jean-Baptiste Lamarck, Ernst Haeckel, Herbert Spencer, Thomas Huxley, Gustave Le Bon, Henri Bergson and George Bernard Shaw in European or Arabic versions. They also read popularizing accounts of various aspects of Darwinism in the scientific and literary journal al-Muqtataf (“The Digest”, 1876–1952). The notion of evolution that Arab readers took away from their reading was often heavily infected by Lamarckism and by the social Darwinism of Spencer. Darwin’s The Origin of Species by Means of Natural Selection was published in 1859, but Isma‘il Mazhar’s translation of the first five chapters of Darwin’s book into Arabic only appeared in 1918.

For a long time, the reception of Darwinism was bedevilled by the need to find either neologisms or new twists to old words. As Marwa Elshakry points out, there was at first no specific word in Arabic for “species”, distinct from “variety” or “kind”.

more here.

on Ibsen’s “A Doll’s House”

Hilton Als at The New Yorker:

Ibsen starts off by telling us something about who Nora is—or, rather, the conditions she lives under. It’s Christmastime in Norway, and the Helmer household is filled with excitement. A sweet-tempered maid, Helene (Mabel Clements), scurries about the Helmers’ tidy house; she opens the front door, and our fair-haired heroine enters Ian MacNeil’s ingenious set, which sometimes revolves, like a dancer in a music box, as the actors move from room to room, trailed by Stuart Earl’s lovely score. Nora is carrying a number of packages; they’re gifts for her three children. As she sets her packages down and takes off her coat, Helene tells her that her husband, Torvald (Dominic Rowan), is in his study. After years of struggle, he’s about to be made the manager of a local bank. Things are on the upswing in the Helmer household, but something’s wrong.

Before Nora can alert Torvald or the children to her presence, she devours a chocolate that she’s secreted away. But why is her pleasure a secret?

more here.

rereading roth

George O'Brien at The Dublin Review of Books:

In the beginning was Newark. Everything that Philip Roth turned to such rich account in his great final spate of works inaugurated by American Pastoral is not only set in his native place but from the start derived its moral energy and edge from it. The city of Newark and especially the Weequahic neighbourhood, the local spaces that reflect the intricate geography of class and ethnicity, the mentalities of old Jews and their superannuated ways and of new Jews with their suburban affluence and unacknowledged assimilation anxieties, men’s moral crossroads and the unreasonable and irrational women who supply the materials for them – that repertoire of essential Roth concerns and interests is as central to Goodbye, Columbus (1959), his first book, whose eponymous novella made his name, as to the novels that crown his achievement forty years later. But instead of devoting himself to that repertoire’s potential, Roth wandered far and wide, following in Henry James’s footsteps in the long, slow, rather airless Letting Go (1962), doing a Mark Twain in Our Gang (1971), and in general trying a lot of modes and tones without ever seeming quite to satisfy the demands of harnessing his talent’s restless fluency to his smarts, his savvy, his wit and his ideas. He was out of Newark, but what did that mean?

more here.

The psychology of hate: How we deny human beings their humanity

Nicholas Epley in Salon:

One of the most amazing court cases you probably have never heard of had come down to this. Standing Bear, the reluctant chief of the Ponca tribe, rose on May 2, 1879, to address a packed audience in a Nebraska courtroom. At issue was the existence of a mind that many were unable to see. Standing Bear’s journey to this courtroom had been excruciating. The U.S. government had decided several years earlier to force the 752 Ponca Native Americans off their lands along the fertile Niobrara River and move them to the desolate Indian Territory, in what is now northern Oklahoma. Standing Bear surrendered everything he owned, assembled his tribe, and began marching a six-hundred-mile “trail of tears.” If the walk didn’t kill them (as it did Standing Bear’s daughter), then the parched Indian Territory would. Left with meager provisions and fields of parched rock to farm, nearly a third of the Poncas died within the first year. This included Standing Bear’s son. As his son lay dying, Standing Bear promised to return his son’s bones to the tribe’s burial grounds so that his son could walk the afterlife with his ancestors, according to their religion. Desperate, Standing Bear decided to go home.

Carrying his son’s bones in a bag clutched to his chest, Standing Bear and twenty-seven others began their return in the dead of winter. Word spread of the group’s travel as they approached the Omaha Indian reservation, midway through their journey. The Omahas welcomed them with open arms, but U.S. officials welcomed them with open handcuffs. General George Crook was ordered by government officials to return the beleaguered Poncas to the Indian Territory. Crook couldn’t bear the thought. “I’ve been forced many times by orders from Washington to do most inhuman things in dealings with the Indians,” he said, “but now I’m ordered to do a more cruel thing than ever before.” Crook was an honorable man who could no more disobey direct orders than he could fly, so instead he stalled, encouraging a newspaper editor from Omaha to enlist lawyers who would then sue General Crook (as the U.S. government’s representative) on Standing Bear’s behalf. The suit? To have the U.S. government recognize Standing Bear as a person, as a human being.

More here.


Thursday Poem

The Tao that can be named is not the real Tao.
—Lao Tzu

Myself

I am planted in the earth
Happily, like a cabbage
Carefully peel away the layers of language
That clothe me and soon
It will become clear I am nowhere to be found
And yet even so, my roots lie beneath . . .

by Chimako Tada
from Hanabi (Fire Works)
publisher: Shoshi Yuriika, Tokyo, 1956
translation: 2010, Jeffrey Angles

How Novels Widen Your Vision


Joe Fassler in The Atlantic [h/t: Tunku Varadarajan]:

Dinaw Mengestu is a National Book Award Foundation “5 Under 35” writer, a New Yorker “20 Under 40” writer to watch, and a recipient of a MacArthur Foundation Fellowship. His other novels are The Beautiful Things That Heaven Bears and How to Read the Air.


Dinaw Mengestu: I came to Tayeb Salih’s Season of Migration to the North late in life, shortly after I had finished my second novel and was just beginning to make the first tentative steps into the third. I read it once, and then a few weeks later, once more. I began to carry it in my bag, next to my laptop, or in my coat pocket where it easily fit. I opened it at least once a week to no particular page. After a few minutes, I would close the book, slightly uncertain about what I had just read, even though I knew the outlines of the story better than almost any other novel. I would often wonder why I had never heard of the novel before, and why the same was true for most people I knew. Under the broad banner of post-colonial literature, it deserved a place next to Achebe’s Things Fall Apart, but to think of it only in those terms undercuts its value as a stunning work of literature, as a novel that actively resists the division of art into poorly managed categories of race and history.

Those divisions are a fundamental part of Salih’s novel. The story, set in a recently independent Sudan, with footprints in England and Egypt, mocks and eviscerates the clichés that come with looking at the world as a division between us and the Other. That fractured gaze, whether it is born out of race, gender, or privilege, destroys the characters in the novel, none of whom are merely victims or perpetrators. Through them, the story becomes an argument for a better way of seeing, which has always struck me as being one of the novel’s better gifts, something which it is uniquely poised to do, if only because it demands the reader’s imagination, and by doing so affirms our capacity to live beyond the limited means of our private lives. We read not to encounter the Other, but to see ourselves refracted in a different landscape, in a different time, in shoes and clothes that perhaps bear no resemblance to our own.

More here.

How Conservatives and Liberals Misunderstand “Social Construct” Sexuality


Jesi Egan in Slate:

[L]ast month, the religious journal First Things published a controversial essay by Michael W. Hannon called “Against Heterosexuality,” which offers an ultra-conservative take on the issue of whether our sexual orientations are natural conditions or chosen constructs. Hannon’s piece is just the latest in a number of recent articles in the “choice wars.” Brandon Ambrosino, writing for the New Republic, set off a small firestorm in January when he described his homosexuality as a choice, not a biological fact. His article provoked vitriolic responses from, among others, Gabriel Arana and Slate’s own Mark Joseph Stern. Clearly, the biology vs. choice (or nature vs. culture) debate remains a point of serious contention within the LGBTQ community and beyond.

But does “construct” mean what these new adopters think it does? Though Hannon and Ambrosino have different political endgames, they both invoke a very unlikely ally: Michel Foucault, the French philosopher who’s known as the grandfather of queer theory and a central architect of the “construct” conception of sexuality. Though Foucault died in 1984, his History of Sexuality, Volume I is still mandatory reading in LGBTQ studies courses. His theories about where sexuality comes from have been hugely influential in academia for decades. But Foucault is also responsible for a lot of the confusion surrounding the biology vs. choice debate—largely because his work has been taken out of context by liberals and social conservatives alike. While Hannon’s essay is a particularly disturbing piece of work (see Stern’s scathing take-down for more), all of these popular misinterpretations tend to muddy the political waters, and risk obscuring Foucault’s most important contributions to our understanding of sexuality.

Let’s start with a quick primer. In The History of Sexuality, Foucault writes that Western society’s views on sex have undergone a major shift over the past few centuries. It’s not that same-sex relationships or desires didn’t exist before—they definitely did. What’s relatively new, though, is 1) the idea that our desires reveal some fundamental truth about who we are, and 2) the conviction that we have an obligation to seek out that truth and express it.

Within this framework, sex isn’t just something you do. Instead, the kind of sex you have (or want to have) becomes a symptom of something else: your sexuality.

More here.

Gene-editing method tackles HIV in first clinical test

Sarah Reardon in Nature:

A clinical trial has shown that a gene-editing technique can be safe and effective in humans. For the first time, researchers used enzymes called zinc-finger nucleases (ZFNs) to target and destroy a gene in the immune cells of 12 people with HIV, increasing their resistance to the virus. The findings are published today in The New England Journal of Medicine. “This is the first major advance in HIV gene therapy since it was demonstrated that the ‘Berlin patient’ Timothy Brown was free of HIV,” says John Rossi, a molecular biologist at the Beckman Research Institute of the City of Hope National Medical Center in Duarte, California. In 2008, researchers reported that Brown gained the ability to control his HIV infection after they treated him with donor bone-marrow stem cells that carried a mutation in a gene called CCR5. Most HIV strains use a protein encoded by CCR5 as a gateway into the T cells of a host’s immune system. People who carry a mutated version of the gene, including Brown's donor, are resistant to HIV.

But similar treatment is not feasible for most people with HIV: it is invasive, and the body is likely to attack the donor cells. So a team led by Carl June and Pablo Tebas, immunologists at the University of Pennsylvania in Philadelphia, sought to create the beneficial CCR5 mutation in a person’s own cells, using targeted gene editing. The researchers drew blood from 12 people with HIV who had been taking antiretroviral drugs to keep the virus in check. After culturing blood cells from each participant, the team used a commercially available ZFN to target the CCR5 gene in those cells. The treatment succeeded in disrupting the gene in about 25% of each participant’s cultured cells; the researchers then transfused all of the cultured cells into the participants. After treatment, all had elevated levels of T cells in their blood, suggesting that the virus was less capable of destroying them. Six of the 12 participants then stopped their antiretroviral drug therapy, while the team monitored their levels of virus and T cells. Their HIV levels rebounded more slowly than normal, and their T-cell levels remained high for weeks. In short, the presence of HIV seemed to drive the modified immune cells, which lacked a functional CCR5 gene, to proliferate in the body. Researchers suspect that the virus was unable to infect and destroy the altered cells. “They used HIV to help in its own demise,” says Paula Cannon, who studies gene therapy at the University of Southern California in Los Angeles. “They throw the cells back at it and say, ‘Ha, now what?’”

More here.

Arundhati Roy, the Not-So-Reluctant Renegade


Siddhartha Deb in the NYT Magazine:

“I’ve always been slightly short with people who say, ‘You haven’t written anything again,’ as if all the nonfiction I’ve written is not writing,” Arundhati Roy said.

It was July, and we were sitting in Roy’s living room, the windows closed against the heat of the Delhi summer. Delhi might be roiled over a slowing economy, rising crimes against women and the coming elections, but in Jor Bagh, an upscale residential area across from the 16th-century tombs of the Lodi Gardens, things were quiet. Roy’s dog, Filthy, a stray, slept on the floor, her belly rising and falling rhythmically. The melancholy cry of a bird pierced the air. “That’s a hornbill,” Roy said, looking reflective.

Roy, perhaps best known for “The God of Small Things,” her novel about relationships that cross lines of caste, class and religion, one of which leads to murder while another culminates in incest, had only recently turned again to fiction. It was another novel, but she was keeping the subject secret for now. She was still trying to shake herself free of her nearly two-decade-long role as an activist and public intellectual and spoke, with some reluctance, of one “last commitment.” It was more daring than her attacks on India’s occupation of Kashmir, the American wars in Iraq and Afghanistan or crony capitalism. This time, she had taken on Mahatma Gandhi.

More here.

Wednesday, March 5, 2014

Imaginary Jews


Michael Walzer reviews David Nirenberg's Anti-Judaism: The Western Tradition, in the NYRB:

What Nirenberg has written is an intellectual history of Western civilization, seen from a peculiar but frighteningly revealing perspective. It is focused on the role of anti-Judaism as a constitutive idea and an explanatory force in Christian and post-Christian thought—though it starts with Egyptian arguments against the Jews and includes a discussion of early Islam, whose writers echo, and apparently learned from, Christian polemics. Nirenberg comments intermittently about the effects of anti-Judaism on the life chances of actual Jews, but dealing with those effects in any sufficient way would require another, and a very different, book.

Anti-Judaism is an extraordinary scholarly achievement. Nirenberg tells us that he has left a lot out (I will come at the end to a few things that are missing), but he seems to know everything. He deals only with literature that he can read in the original language, but this isn’t much of a limitation. Fortunately, the chapter on Egypt doesn’t require knowledge of hieroglyphics; Greek, Hebrew, and Latin are enough. Perhaps it makes things easier that the arguments in all the different languages are remarkably similar and endlessly reiterated.

A certain view of Judaism—mainly negative—gets established early on, chiefly in Christian polemics, and then becomes a common tool in many different intellectual efforts to understand the world and to denounce opposing understandings. Marx may have thought himself insightful and his announcement original: the “worldly God” of the Jews was “money”! But the identification of Judaism with materialism, with the things of this world, predates the appearance of capitalism in Europe by at least 1,500 years.

More here.

The ‘Failure’ of the ‘Reset’: Obama’s Great Mistake? Or Putin’s?


Daniel Nexon in the Washington Post:

Russia’s political organization is fundamentally imperial in character, composed of a hodgepodge of political units that range from the fully integrated to the semi-sovereign and autonomous. As my colleague, Charles King, wrote in 2003:

“Central power, where it exists, is exercised through subalterns who function as effective tax- and ballot-farmers; they surrender up a portion of local revenue and deliver the votes for the center’s designated candidates in national elections in exchange for the center’s letting them keep their own fiefdoms.”

Nowhere is this kind of arrangement more vividly illustrated than in Chechnya, where Moscow ‘solved’ its separatist problem by devolving power to a local viceroy, Akhmed Kadyrov and, after his assassination, his son, Ramzan Kadyrov.

Second, Moscow’s strategy for managing its internal relations — relying on subalterns, exploiting ethnic divisions, deploying military forces and using the toolkit of electoral authoritarianism — extends, if often in attenuated form, to those states it considers as falling within its “privileged sphere of influence.” It has long backed and leveraged secessionist movements in, among other places, Moldova, Georgia and Azerbaijan in the pursuit of political control. Although not always successful, this pattern dates back to the Soviet era and, before that, the Russian Empire. Indeed, the Kremlin’s power-political practices are as perennial as they are limited. In a theoretically sophisticated Security Studies article, Iver B. Neumann and Vincent Pouliot show that Moscow’s quest for great-power status and recognition, combined with its inability to achieve “insider status” in the international order, invariably drives it back to the same repertoire of asserting great-power perquisites in ways that shock and alarm the international community. Did German Chancellor Angela Merkel describe Putin as living in “another world” after her March 2 phone conversation with him? If not, then sometimes truth does indeed reside in fiction.

More here. For more on the Ukraine, also see Anatol Lieven, and Alexander Motyl.

Trigger Happy


Jenny Jarvie in The New Republic:

[T]he headline above would, if some readers had their way, include a “trigger warning”—a disclaimer to alert you that this article contains potentially traumatic subject matter. Such warnings, which are most commonly applied to discussions about rape, sexual abuse, and mental illness, have appeared on message boards since the early days of the Web. Some consider them an irksome tic of the blogosphere’s most hypersensitive fringes, and yet they've spread from feminist forums and social media to sites as large as The Huffington Post. Now, the trigger warning is gaining momentum beyond the Internet—at some of the nation's most prestigious universities.

Last week, student leaders at the University of California, Santa Barbara, passed a resolution urging officials to institute mandatory trigger warnings on class syllabi. Professors who present “content that may trigger the onset of symptoms of Post-Traumatic Stress Disorder” would be required to issue advance alerts and allow students to skip those classes. According to UCSB newspaper The Daily Nexus, Bailey Loverin, the student who sponsored the proposal, decided to push the issue after attending a class in which she “felt forced” to sit through a film that featured an “insinuation” of sexual assault and a graphic depiction of rape. A victim of sexual abuse, she did not want to remain in the room, but she feared she would only draw attention to herself by walking out.

On college campuses across the country, a growing number of students are demanding trigger warnings on class content. Many instructors are obliging with alerts in handouts and before presentations, even emailing notes of caution ahead of class. At Scripps College, lecturers give warnings before presenting a core curriculum class, the “Histories of the Present: Violence,” although some have questioned the value of such alerts when students are still required to attend class. Oberlin College has published an official document on triggers, advising faculty members to “be aware of racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression,” to remove triggering material when it doesn't “directly” contribute to learning goals, and to “strongly consider” developing a policy to make “triggering material” optional. Chinua Achebe's Things Fall Apart, it states, is a novel that may “trigger readers who have experienced racism, colonialism, religious persecution, violence, suicide and more.”

More here.

Derrida and the Death Penalty


Jan Mieszkowski reviews Jacques Derrida's The Death Penalty, Volume I in the LA Review of Books:

[T]he driving concern of the seminar is as clear as it is provocative. “Never to my knowledge,” Derrida declared in a contemporaneous conversation with French historian Élisabeth Roudinesco, “has any philosopher as a philosopher, in his or her own strictly and systematically philosophical discourse, never has any philosophy as such contested the legitimacy of the death penalty.” As an experiment, I shared this claim with a number of academic philosophers. Their initial skepticism quickly turned to surprise as they realized that, as Derrida observes, virtually all of the major philosophers were either ardent advocates of capital punishment, reluctant apologists for it, or markedly silent on the topic. Even those, Derrida adds, “who maintained a public discourse against the death penalty never did so, to my knowledge — and this is my provisional hypothesis — in a strictly philosophical way.”

One may raise an eyebrow at the formulation “in a strictly philosophical way,” if only because one can’t help imagining how Derrida himself, in another mood, might have pounced on it: can philosophy ever be strictly philosophical? Doesn’t philosophy come into its own precisely by losing itself when it seeks a way of its own? Yet these are precisely Derrida’s concerns, for at issue is not just what certain philosophers have said about the death penalty, but whether Western philosophy is in some way organized by its investment in this particular doctrine of punishment. Derrida’s suggestion is that the death penalty is both one penalty among others and the penalty of penalties, a transcendental condition of possibility of justice and punishment. Criminal law as we know it, if not law in general, would be inconceivable in its absence. The death penalty, he writes, “has always been the effect of an alliance between a religious message and the sovereignty of the state,” state sovereignty, first and foremost, being the power over the life and death of subjects. It is therefore not simply a question of maintaining that we can only understand the death penalty by explaining the relations between traditional theological, juridical, and political discourses. The reigning theological-juridico-political constellation can be approached and understood only through a study of capital punishment.

More here.

Christopher Lasch and the Role of the Social Critic

Michael J. Kramer at The Point:

The Culture of Narcissism solidified Lasch’s reputation as a leading anti-modernist critic of an America that seemed to have lost its balance as it rollerskated into oblivion. Mistrusting America’s affluence and growing technological achievements, Lasch even critiqued the anti-authoritarian liberation struggles of the 1960s, which belonged for him to the same modernist cult of progress that, failing to recognize necessary limits, would destroy all in its path. The counterculture’s myth of exaggerated self-realization was but the flipside of the retreat into basic self-preservation. Detached by state and market from connections to a more sustaining sense of purpose or obligation, Americans inhabited a culture that left them rootless.

But Lasch should not be remembered merely as a grumbling reactionary. What he feared was “liberation,” not “modernity”—dismissing anti-modernist nostalgia as the fantasy of progress in reverse. For most of his life (he died of cancer in 1994), he remained committed to a more egalitarian society and clung to the hope that change might still occur. As he said toward the end of a career that had turned, beginning with Narcissism, increasingly dark and pessimistic: he still had faith even though he lacked optimism. It was a statement that flummoxed many interviewers, but it is key to understanding Lasch’s complex vision of American culture—and of the role of the social critic within it.

more here.

On Tolstoy and Mortality

William Giraldi at The Virginia Quarterly Review:

Written in 1886, The Death of Ivan Ilyich was the first fiction Tolstoy published after the spiritual upheaval he chronicles in Confession. It’s easy to imagine Ilyich as the old and bearded sage-looking man Tolstoy was upon his death, but he’s only forty-five years old, and this fact adds to the tremendous pathos of the story: The death of a young man is always more awful than the death of an old man. The priest gives Ilyich little spiritual consolation, and the doctors are self-important fools, incapable of mitigating his pain. His co-workers are disgusted by the thought of his wasting body and care only about jockeying for cozier positions once he dies. His wife and children, occupied by the minutiae of their quotidian lives, refuse to admit what has befallen him. He finds their refusal to confront this fanged truth most disgusting of all: “Ivan Ilyich’s chief torment was the lie—that lie, for some reason recognized by everyone, that he was only ill but not dying.” His sole comfort comes from Gerasim, the peasant servant who does not recoil from the foul stench, who accepts the inevitability of all flesh. If Ilyich’s upper-crust friends regard death as indecent, Gerasim knows otherwise: His peasant’s dirty-hands understanding of life, his calm acceptance of every person’s fate, helps to calm Ilyich into his own acceptance. (The peasantry’s calm acceptance of death, by the way, can be noticed in Turgenev, Dostoevsky, and Solzhenitsyn, to name a few—it seems to fall somewhere in line between Russian literary trope and Russian cultural myth.) Relief for Ilyich comes only after he has followed Gerasim’s lead and acquiesced to his fate.

more here.

A stacked deck at the New York Times

Alex Pareene at The Baffler:

One great problem with financial journalism, especially in the decades leading up to the crash, has been that it’s often written in an argot understandable only to the already highly financially literate. Sorkin doesn’t usually employ such specialized language. This has led to the mistaken belief that he’s explaining the industry to regular people. In fact, he is a dutiful Wall Street court reporter, telling important people what other important people are thinking and saying. At the same time, he is Wall Street’s most valuable flack. He isn’t explaining finance to the people—you’d be better served reading John Kenneth Galbraith to understand how finance works—he’s justifying it.

The modern finance industry is at a loss when it comes to justifying its own existence. Its finest minds can’t explain why we wouldn’t be better off with a much simpler and more heavily circumscribed model of capital formation. Sorkin likewise can’t make his readers fully grasp why the current system—which turns large amounts of other people’s money and even more people’s debt into huge paper fortunes for a small super-elite, and in such a way as to regularly imperil the entire worldwide economic order—is beneficial or necessary. But the New York Times and Wall Street each need him to try.

more here.