From The Second Sex to The Beauty Myth

Barbara Ellen in The Guardian:

The Second Sex by Simone de Beauvoir (1949)

To ask what influence this book had on gender politics is akin to wondering what the sun ever did for the earth. The answer? Everything. Today, The Second Sex is still hailed as the mothership of feminist philosophy. “One is not born, but rather becomes (a) woman,” muses De Beauvoir (the quote varying according to the translation). Exploring topics from sex, work and family to prostitution, abortion and the history of female subordination, De Beauvoir challenges the notion of men as the default (the ideal), and women as “other”. For many, The Second Sex represents not just key feminist reading, but rather essential feminist thinking and being.

The Feminine Mystique by Betty Friedan (1963)

This book was no mere pity-party for dissatisfied 1950s/60s Valium-gobbling US housewives. It was a call to arms, demystifying what became known as “second-wave feminism” for ordinary women all over the world. Friedan also identified “the problem that has no name”, probing the lack of fulfilment in women’s lives – where everything “domestic” and trivial was deemed theirs, and everything important was “men only”. With a precision and defiance that still resonates today, The Feminine Mystique challenged the notion that, for women, anatomy was destiny.

Sexual Politics by Kate Millett (1970)

Sexual Politics brought the fizz of iconoclasm to gender politics, tackling how women were routinely diminished and over-sexualised in literature and wider culture. Calling out the likes of Norman Mailer, Henry Miller and DH Lawrence for what might be politely termed patriarchal/male-dominant gender bias, and impolitely, literary macho dick-swinging, Millett set the benchmark for in-depth, no-holds-barred feminist critique. Her book remains relevant today because it encouraged readers to question not just the topics cited, but everything around them, and to understand better how sexism could be systematically ingrained, culturally as well as politically.

More here.

Rushdie’s New York Bubble

Nathaniel Rich in the New York Review of Books:

Whether by design, chance, or oracular divination, Salman Rushdie has managed, within a year of the 2016 election, to publish the first novel of the Trumpian Era. On purely technical merits this is an astounding achievement, the literary equivalent of Katie Ledecky lapping the Olympic field in the 1500-meter freestyle. The publishing industry still operates at an aristocratic pace; Egypt built the new Suez Canal in less time than it typically takes to convert a finished manuscript into a hardcover. As a point of comparison, the first novel to appear about September 11, Windows on the World, by the French author Frédéric Beigbeder, was not published until August 2003. Yet less than eight months into the administration, Rushdie has produced a novel that, if not explicitly about the president, is tinged a toxic shade of orange.

Trump poses a risky temptation for novelists, especially those writing amid the shit torrent of his presidency. As political journalists have discovered, the volume of revelations erupting from the White House and the presidential Twitter feed threatens to undermine the reliability of even daily news reports by the time they appear in print. It would seem masochistic to attempt to write a book about such a swiftly moving target, when events could at any time be hijacked by a new revelation of collusion with the enemy, impeachment charges, a nuclear war, a race war. In a nod to the futility of this enterprise, Rushdie uses as an epigraph a line from François Truffaut: “La vie a beaucoup plus d’imagination que nous” (“Life has far more imagination than we do”).

Far more perilous to a novelist, however, is the prospect of writing about a public figure whose name, in the decades before his ascension to the presidency, has carried a fixed set of cultural associations, has been a brand, a trademark, a cliché, appearing in the consciousness if not on the page in boldface type, a textual black hole that threatens to vacuum into itself any gesture toward nuance, complexity, or original thought. Rushdie parries this hazard by omitting Donald Trump’s name and distributing his signature qualities among several characters. The abstraction allows him to scrutinize in turn various aspects of the presidential character, and ours, without succumbing to the familiar catechisms of contemporary political debate.

More here.

Bruno Latour, a veteran of the ‘science wars,’ has a new mission

Jop de Vrieze in Science:

French sociologist of science Bruno Latour, 70, has long been a thorn in the side of science. But in the age of “alternative facts,” he’s coming to its defense.

Latour, who retired last month from his official duties at Sciences Po, a university for the social sciences here, shot to fame with the 1979 book Laboratory Life: The Construction of Scientific Facts, written with U.K. sociologist Steve Woolgar. To research it, Latour spent 2 years at the Salk Institute for Biological Studies in San Diego, California, acting as an anthropologist observing scientists at work. In 1987, Latour elaborated on his thinking in the textbook Science in Action.

Central to Latour’s work is the notion that facts are constructed by communities of scientists, and that there is no distinction between the social and technical elements of science. Latour received praise for his approach and insights, but his relativist and “social-constructivist” views triggered a backlash as well. In their 1994 book Higher Superstition: The Academic Left and its Quarrels with Science, biologist Paul Gross and mathematician Norman Levitt accused Latour and other sociologists of discrediting their profession and jeopardizing trust in science.

The heated debate that followed, known as the “science wars,” lasted for many years. In later writings, Latour acknowledged that the criticism of science had created a basis for antiscientific thinking and had paved the way in particular for the denial of climate change, now his main topic. Today, he hopes to help rebuild confidence in science.

Science Insider spoke with Latour in his apartment here in the French capital. This interview has been edited for clarity and brevity.

More here.

Two new books on the environment show that left vs. right is no longer the relevant political divide

Alex Trembath in Slate:

In 1980, Stanford ecologist Paul Ehrlich famously bet libertarian economist Julian Simon that the price index of several precious metals would increase over the ensuing 10 years. Ehrlich, author of the apocalyptic hit The Population Bomb, expected that resource scarcity would inevitably drive up prices. Simon took the bet, insisting that the free market and human ingenuity would find ways to produce more resources at lower costs. Simon won.

That wager, expertly recalled in Paul Sabin’s The Bet, is a microcosm of the left-right divide on the environment and economic growth that has existed for the past several decades. Ehrlich was on the left: an angry, pessimistic academic demanding government policy to halt the growth of population, technology, and consumption. Simon was on the right: a sunny, gregarious economist arguing the free market would forever solve humanity’s resource challenges and that government should get the hell out of the way. Lefties and righties across Western economies have played these roles consistently for a generation.

That divide no longer matters.

More here.

Is there a doctor in my pocket?

Natasha Loder in 1843:

Humans have always dreamed of better, fitter, longer-lasting bodies. But while many science-fiction fantasies, from videophones to self-driving cars, have been realised, health technology has lagged behind our hopes. Artificial organs and smart pills have been a long time coming. There are a number of reasons for this. Biology is an order of magnitude more complicated than other forms of engineering. And it is hard to innovate in health, as there are many rules to protect us from products that might otherwise kill us. The pill or device that promises a longer life needs to prove that it actually works before it can be sold. The price of patient safety is sluggish innovation. Yet, despite these obstacles, there are signs that a digital revolution in health care is imminent. It will be more personalised, and potentially more useful, than anything the world has seen before. It promises to help us manage our health and inform us about the risks ahead.

…Britain’s Babylon Health, based in Kensington in London, is particularly ambitious. At its offices, fake greenery and flowering plants proliferate in a largely unsuccessful attempt to evoke the Hanging Gardens of Babylon. Its app answers medical queries, provides access to doctors and offers users a dashboard of their health stats drawn from the phone or supplemental devices. These data can be supplemented with results from at-home blood-testing kits that one can order via the app. These take readings of liver and kidney function, vitamin levels, bone density and cholesterol. I tried the thyroid test and drew blood with a special device that punches a tiny hole with surprisingly little pain. Then I posted the sample to Babylon. The results (all OK) popped up in the app a day later. If Babylon recommends an appointment with a doctor, it can provide one via video-conferencing almost immediately for £25 ($32). As with many other doctor-on-demand services, it is possible to share notes, or even a video from a consultation, with your regular doctor.

One of the most exciting aspects of digital health is the capacity of mobile phones to gather information as well as deliver it. They can collect data from their own sensors and screens, as well as associated devices such as watches, headbands and the growing constellation of add-ons. Increasingly, such devices are clinically validated and medically useful.

More here.

President Clinton Looks Back at President Grant

Bill Clinton in The New York Times:

This is a good time for Ron Chernow’s fine biography of Ulysses S. Grant to appear, as we live with the reality of Faulkner’s declaration, “The past is never dead. It’s not even past.” We are now several years into revisiting the issues that shaped Grant’s service in the Civil War and the White House, from the rise of white supremacy groups to successful attacks on the right of eligible citizens to vote to the economic inequalities of the Gilded Age. In so many ways “Grant” comes to us now as much a mirror as a history lesson. As history, it is remarkable, full of fascinating details sure to make it interesting both to those with the most cursory knowledge of Grant’s life and to those who have read his memoirs or any of several previous biographies. It tells well the story of a country boy’s unlikely path to leadership, his peculiarities, strengths, blind spots and uncanny powers of concentration and courage during battle. It covers Grant’s amazing feats on horseback at West Point, where in jumping hurdles “he exceeded all rivals,” clearing the bar a foot higher than other cadets. His mediocre grades have long obscured his interests and abilities: He was president of the literary society, had a talent for drawing and was trusted by classmates to mediate disputes.

For all its scholarly and literary strengths, this book’s greatest service is to remind us of Grant’s significant achievements at the end of the war and after, which have too long been overlooked and are too important today to be left in the dark. Considered by many detractors to be, as a general, little more than a stoic butcher, Grant, in the written terms of surrender at Appomattox, showed the empathy he felt toward the defeated and downtrodden — conditions he knew from harsh personal experience. The terms presented to Robert E. Lee carried “no tinge of malice” and “breathed a spirit of charity reminiscent of Lincoln’s Second Inaugural Address.” He notably allowed the exhausted and starving Confederate regulars to keep their mules and horses, knowing from the rough experience of his failed Missouri farm (Grant presciently named its log cabin “Hardscrabble”) that only by putting in a crop as soon as they returned home would these destitute farmers — and their families — have a chance to survive the coming winter. Grant also knew that if the country had any chance of being brought back together, it needed something other than a harsh peace. In making national healing a priority, he — like Lincoln — took the long view.

More here.

Incomprehensible Things: Philosophy and the Mexico Earthquake

Emmanuel Ordóñez Angulo in The Point:

This summer was my longest stay back home, in Mexico City, in my life as a philosophy student abroad. Because of course I failed to meet my goal of finishing coursework before the holiday, I went back to the studio space I used to rent for writing in Colonia Juárez, near the city centre. By “studio space” I mean the one unusable corner of an apartment in an early twentieth-century building that, the landlord claims, used to house the British diplomatic corps prior to the Mexican revolution, and that now brims with wild flora and peeling green walls.

It was there that the earthquake found me.

The essay I was grappling with deals with the old question whether the things we perceive—the things that we see and touch—have a reality that is independent of us. The relevant discussion starts with Immanuel Kant’s argument against Descartes’ skepticism about the empirical world.

While Descartes aimed to show that the only thing I can be certain of is my own existence, Kant argued that in order for that to be possible I need to in fact be aware of the world around me as actually existing independent of me. This is because, if I am aware of my existence as flowing in time, as I am, then there must be something fixed by reference to which I can be aware that I am not fixed but flowing. Precisely because Descartes is right that I can be certain that I exist, says Kant, I must be certain that a world distinct from me exists as well.

This argument is liable to numerous objections. A famous one, raised by contemporary philosopher Barry Stroud, is that Kant reasons illegitimately from a premise about subjective experience to a conclusion about the existence of objective reality. The problem is that one field of inquiry concerns how we experience and know the world, i.e. what our conceptual framework is like, while the other concerns what actually exists. According to Stroud, the most Kant’s premise can prove is that we experience the external world as existing.

More here.

The world’s first “negative emissions” plant has begun operation—turning carbon dioxide into stone

Akshat Rathi in Quartz:

There’s a colorless, odorless, and largely benign gas that humanity just can’t get enough of. We produce 40 trillion kg of carbon dioxide each year, and we’re on track to cross a crucial emissions threshold that will cause global temperature rise to pass the dangerous 2°C limit set by the Paris climate agreement.

But, in hushed tones, climate scientists are already talking about a technology that could pull us back from the brink. It’s called direct-air capture, and it consists of machines that work like a tree does, sucking carbon dioxide (CO2) out from the air, but on steroids—capturing thousands of times more carbon in the same amount of time, and, hopefully, ensuring we don’t suffer climate catastrophe.

There are at least two reasons that, to date, conversations about direct air capture have been muted. First, climate scientists have hoped global carbon emissions would come under control, and we wouldn’t need direct air capture. But most experts believe that ship has sailed. That brings up the second issue: to date, all estimates suggest direct air capture would be exorbitantly expensive to deploy.

More here.

What’s Behind India’s ‘Beef Lynchings’?

Amitava Kumar in The Nation:

I’ll confess to the sin of beef eating in a moment. Let me first confess to the sin of not having a true knowledge of science.

In May of this year, Justice Mahesh Chandra Sharma of the Rajasthan High Court suggested that the cow be adopted as the national animal of India. His rationale was that millions of gods and goddesses reside in the cow. And here’s the crucial science bit: According to the judge, the “cow is the only living being which intakes oxygen and emits oxygen.”

I grew up in India during the 1960s and ’70s in a meat-eating Hindu family. Only my mother and my grandparents were vegetarians. The rest of us enjoyed eating—on special occasions—chicken, or fish, or mutton. But I had never eaten beef in India until this summer. And what I ate in restaurants in Mumbai and Delhi, I was repeatedly informed, technically wasn’t beef—it was buffalo meat, or “buff.” It has become too dangerous, in the current political climate, to kill a cow. On the very day I had my first taste of what turned out to be a surprisingly tender buffalo steak in Mumbai, national newspapers carried a report from my hometown of Patna, headlined “Three thrashed in Bihar on suspicion of carrying beef.”

When Prime Minister Narendra Modi led the right-wing Bharatiya Janata Party (BJP) to a landslide victory in the national parliamentary elections in 2014, one of the planks of his campaign was a ban on cow slaughter.

More here.

How the human got his paintbrush

Philip Ball in Prospect Magazine:

Edward O Wilson, the octogenarian Harvard biologist and ethologist, is one of the most productive, broad-thinking and important scientists of the past century. The central question of his work is why animals do what they do, and how evolution has shaped their behaviour. His new book, The Origins of Creativity, seeks to draw lessons from that understanding about “the unique and defining trait of our species”: creativity, which he defines, not without controversy, as “the innate quest for originality.” Like Charles Darwin, Wilson’s research has mainly focused on non-human behaviour. His specialism is social insects, especially ants. His monumental book The Ants (1991), written with fellow myrmecologist Bert Hölldobler, won a Pulitzer Prize—his second such award—a testament to the fact that Wilson writes as eloquently as he thinks. His first Pulitzer was for On Human Nature (1978), in which his readiness to generalise the lessons of natural history to humankind made him both influential and notorious. He was a pioneer of evolutionary psychology, which explains our impulses and instincts from a Darwinian perspective. These are, in this view, hardwired into our brains because of the reproductive success they conferred on our ancestors. Public resistance to this idea, which he called “sociobiology,” has been widespread and vociferous. In the 1970s, Wilson was denounced as a crypto-fascist who was attempting to offer scientific justification for racism, sexism and bigotry. There were demonstrations at his lectures; during one talk he had water poured over his head.

Frustratingly, both sides seem more interested in trashing each other’s perspective than understanding it. For there is more than science at stake. To Dawkins, a gene’s-eye view of all evolutionary change is the currency of his success and reputation. Many evolutionary biologists, however, accept that natural selection can happen at many levels, not just the genetic. At the group level, it seems possible that cooperation between individuals not closely linked by kinship may sometimes boost their reproductive success. This modern version of group selection, however, if it happens at all, is probably rather rare—except in one species in which complex cultures create a propensity for selective pressure to depend on the specific circumstances of the group. That species is us. The Origins of Creativity shows why group selection matters so much to Wilson: because it enables a close and two-way interplay between evolutionary biology and culture. “It is impossible to overestimate the importance of group selection to both science and the humanities, and further, to the foundation of moral and political reasoning,” he writes.

More here.

A Lamentation for a Life Cut Short

Greg Howard in The New York Times:

On the morning of Sept. 15, 1995, a 15-year-old black boy named Michael Allen was rushed to Harbor-UCLA hospital, bleeding out from a bullet wound through his neck. In the ambulance, Michael confessed he had tried to rob an older man who was buffing his car, and things went awry when the man lunged for the gun, got it, and then shot the teenager in self-defense. Michael was arrested for the first time in his life, and spent much of the next 13 years in prison, serving time for the attempted carjacking. In June 2008, he was released; in July 2009, four months shy of his 30th birthday, he was found shot dead in his car. Danielle Allen is a political theorist and professor at Harvard University, and “Cuz: Or the Life and Times of Michael A.” is her attempt to understand the circumstances that ripped her cousin Michael, eight years her junior, from their sprawling, close-knit family before eventually claiming his life.

Who’s Michael? For Allen, Michael might as well be her own baby. “It’s a cliché to say that someone has an electric smile,” she admits, “but what else can you call it when someone beams and all the lights come on?” He had “high cheekbones” and a “bob in his step.” She calls him “beautiful,” “a source of vitality and warmth.” Her descriptions of Michael often verge on cliché or folklore. In Allen’s eyes, Michael is soft, sensitive and flawless; it’s no wonder that he never quite comes into full view. When she peers upon his face at his funeral, she’s astonished by his solidity — this isn’t her little, lithe Michael, but “Big Mike,” as he was known on the street. You realize that Allen didn’t know Michael much at all. And that’s sad too; Michael’s life and potential were stolen from him, and he was stolen from her. “Cuz,” then, is chiefly a story about these thefts, and the merciless carceral state that perpetrated them.

More here.

Pioneering codebreaker Elizebeth Friedman, a poet and mother of two, smashed spy rings by solving secret messages

Simon Worrall in National Geographic:

British codebreaker Alan Turing had a movie, The Imitation Game, made out of his life—and Benedict Cumberbatch to play him. The great American codebreaker Elizebeth Friedman hasn’t been so lucky. Although she put gangsters behind bars and smashed Nazi spy rings in South America, Friedman’s name has been forgotten. Her work remained classified for decades, and others took credit for her achievements. (Find out what secret weapon Britain used against the Nazis.)

Jason Fagone rescues this extraordinary woman’s life and work from oblivion in his new book, The Woman Who Smashed Codes. When National Geographic caught up with Fagone by phone, he explained how Friedman, like Alan Turing, broke the Enigma codes to expose a notorious Nazi spy, how J. Edgar Hoover rewrote history to sideline her achievements, and how the cryptology methods that she and her husband, William Friedman, developed became the foundation for the work of the National Security Agency (NSA). (Go inside the daring mission that stopped a Nazi atomic bomb.)

Elizebeth Friedman is probably not a name familiar to most of our readers. Introduce us to this remarkable woman—and explain what drew you to her.

Well, it’s an amazing American story. A hundred years ago, a young woman in her early twenties became one of the greatest codebreakers America had ever seen. She taught herself how to solve secret messages without knowing the key. That’s codebreaking. And she started from absolutely nothing.

She wasn't a mathematician. She was a poet. But she turned out to be a genius at solving these very difficult puzzles, and her solutions changed the 20th century. She caught gangsters and organized-crime kingpins during Prohibition. She hunted Nazi spies during World War II.

She also helped to invent the modern science of secret writing—cryptology—that lies at the base of everything from government institutions like the NSA to the fluctuations of our daily online lives. Not bad for a Quaker girl from a small Indiana town!

More here.

The Case for Contrarianism

Oliver Traldi in Quillette:

Another semester, another academic publishing scandal, complete with calls for penitence and punishment. This time the catalyst is “The Case for Colonialism,” a “Viewpoint” editorial in Third World Quarterly. In this essay, Bruce Gilley argued that “it is high time to question [the anti-colonial] orthodoxy. Western colonialism was, as a general rule, both objectively beneficial and subjectively legitimate in most of the places where it was found, using realistic measures of those concepts.” Gilley’s article has since been withdrawn due to “serious and credible threats of personal violence” made against the journal’s editor. This obviously troubling development should make us wonder: just what evil would this article have brought about if not withdrawn? The Streisand effect is on full display here. The article – detailed, abstruse, and not always beautifully written – has no doubt been far more widely read than it would have been without the controversy.

The publication of “The Case for Colonialism” faced criticism on several grounds: it was offensive; it was unscholarly; the journal did not follow its normal procedures in publishing it (now officially disputed by the publisher); the journal is a special venue for anti-colonial perspectives. This last one is particularly reminiscent of last spring’s Hypatia affair. As in that case, we should be skeptical of appeals to “academic standards” in political disciplines. Often such standards are simply substantive moral or political stances for which the field provides a “safe space”. In a representative attack on the article’s scholarly quality, Sahar Khan says that “the article seems like a bad joke. Can someone, a scholar no less, actually make a case for colonialism?”. A Change.org petition asserts that its “goal is to raise academic publishing standards and integrity,” but then calls on Third World Quarterly‘s editors to “apologize for further brutalizing those who have suffered under colonialism”. And a letter of resignation from some members of the journal’s editorial board even suggests that “caus[ing] offence and hurt . . . clearly violates [the] principle of free speech”.

More here.

Italo Calvino, The Art of Fiction No. 130

Interviewed by William Weaver and Damien Pettigrew in The Paris Review, 1992:

Upon hearing of Italo Calvino’s death in September of 1985, John Updike commented, “Calvino was a genial as well as brilliant writer. He took fiction into new places where it had never been before, and back into the fabulous and ancient sources of narrative.” At that time Calvino was the preeminent Italian writer, the influence of his fantastic novels and stories reaching far beyond the Mediterranean. Two years before, The Paris Review had commissioned a Writers at Work interview with Calvino to be conducted by William Weaver, his longtime English translator. It was never completed, though Weaver later rewrote his introduction as a remembrance. Still later, The Paris Review purchased transcripts of a videotaped interview with Calvino (produced and directed by Damien Pettigrew and Gaspard Di Caro) and a memoir by Pietro Citati, the Italian critic. What follows—these three selections and a transcript of Calvino’s thoughts before being interviewed—is a collage, an oblique portrait.

Rowan Gaither, 1992

Thoughts Before an Interview

Every morning I tell myself, Today has to be productive—and then something happens that prevents me from writing. Today . . . what is there that I have to do today? Oh yes, they are supposed to come interview me. I am afraid my novel will not move one single step forward. Something always happens. Each morning I already know I will be able to waste the whole day. There is always something to do: go to the bank, the post office, pay some bills . . . always some bureaucratic tangle I have to deal with. While I am out I also do errands such as the daily shopping: buying bread, meat, or fruit. First thing, I buy newspapers. Once one has bought them, one starts reading as soon as one is back home—or at least looking at the headlines to persuade oneself that there is nothing worth reading. Every day I tell myself that reading newspapers is a waste of time, but then . . . I cannot do without them. They are like a drug. In short, only in the afternoon do I sit at my desk, which is always submerged in letters that have been awaiting answers for I do not even know how long, and that is another obstacle to be overcome.

Eventually I get down to writing and then the real problems begin.

More here.

Why Are More American Teenagers Than Ever Suffering From Severe Anxiety?

Benoit Denizet-Lewis in The New York Times:

Over the last decade, anxiety has overtaken depression as the most common reason college students seek counseling services. In its annual survey of students, the American College Health Association found a significant increase — to 62 percent in 2016 from 50 percent in 2011 — of undergraduates reporting “overwhelming anxiety” in the previous year. Surveys that look at symptoms related to anxiety are also telling. In 1985, the Higher Education Research Institute at U.C.L.A. began asking incoming college freshmen if they “felt overwhelmed by all I had to do” during the previous year. In 1985, 18 percent said they did. By 2010, that number had increased to 29 percent. Last year, it surged to 41 percent. Those numbers — combined with a doubling of hospital admissions for suicidal teenagers over the last 10 years, with the highest rates occurring soon after they return to school each fall — come as little surprise to high school administrators across the country, who increasingly report a glut of anxious, overwhelmed students. While it’s difficult to tease apart how much of the apparent spike in anxiety is related to an increase in awareness and diagnosis of the disorder, many of those who work with young people suspect that what they’re seeing can’t easily be explained away.

…When I asked Eken about other common sources of worry among highly anxious kids, she didn’t hesitate: social media. Anxious teenagers from all backgrounds are relentlessly comparing themselves with their peers, she said, and the results are almost uniformly distressing. Anxious kids certainly existed before Instagram, but many of the parents I spoke to worried that their kids’ digital habits — round-the-clock responding to texts, posting to social media, obsessively following the filtered exploits of peers — were partly to blame for their children’s struggles. To my surprise, anxious teenagers tended to agree.

More here.

Reading Kazuo Ishiguro in Tehran

Arash Azizi in Iran Wire:

The annual announcement of the recipient of the Nobel prize for literature is always big news. Last year, many were shocked (some even offended) when the award went to the songwriter Bob Dylan. Why celebrate a big celebrity when the award could shed light on lesser known talents? In previous years, of course, some grumbled precisely because the award went to what the New York Times recently called “obscure European writers whose work was not widely read in English.” The article listed a few laureates who supposedly fit this description, including the French novelist Patrick Modiano, who won in 2014.

But what if we look at the award given by the Swedish Academy more globally? The Iranian literary community, for instance, would not have considered Modiano “obscure,” as his work was widely translated into Persian and the subject of numerous studies and book events in the country, some in small provincial towns.

This year’s winner, the Japanese-born British writer Kazuo Ishiguro, is well known around the world, a fact partly explained by his writing in English. But he is also very well known in Iran, where he is perhaps one of the country’s most-read novelists.

Every single novel by Ishiguro has been translated into Persian, often more than once, and not just by anybody, but by the giants of Persian literature and translation. Ishiguro’s Persian life began when Najaf Daryabandari, arguably the greatest living literary translator working in the Persian language, translated The Remains of the Day.

More here. [Thanks to Asad Raza.]