The Making of Edward Said’s ‘Orientalism’

Timothy Brennan in The Chronicle of Higher Education:

Now an academic classic, Orientalism was at first an unlikely best seller. Begun just as the Watergate hearings were nearing their end and published in 1978, it opens with a stark cameo of the gutted buildings of civil-war Beirut. Then, in a few paragraphs, readers are whisked off to the history of an obscure academic discipline from the Romantic era. Chapters jump from 19th-century fiction to the opéra bouffe of the American news cycle and the sordid doings of Henry Kissinger. Unless one had already been reading Edward Said or was familiar with the writings of the historian William Appleman Williams on empire “as a way of life” or the poetry of Lamartine, the choice of source materials might seem confusing or overwhelming. And so it did to the linguists and historians who fumed over the book’s success. For half of its readers, the book was a triumph, for the other half a scandal, but no one could ignore it.

As an indictment of English and French scholarship on the Arab and Islamic worlds, Orientalism made its overall case clearly enough. The field of Oriental studies had managed to create a fantastical projection about Arabs and Islam that fit the biases of its Western audience. At times, these images were exuberant and intoxicating, at times infantilizing or hateful, but at no time did they describe Arabs and Muslims accurately. Over centuries, these images and attitudes formed a network of mutually reinforcing clichés mirrored in the policies of the media, the church, and the university. With the authority of seemingly objective science, new prejudices joined those already in circulation. This grand edifice of learning deprived Arabs of anything but a textual reality, usually based on a handful of medieval religious documents. As such, the Arab world was arrested within the classics of its own past. This much about Orientalism, it seems, was uncontroversial, although readers agreed on little else.

More here.

How to End a Conversation Without Making Up an Excuse

Joe Pinsker in The Atlantic:

A vintage French postcard illustration featuring a sophisticated, stylishly-attired mature couple seated on opposite sides of a banquette, in Paris, circa June, 1909. (Photo by Paul Popper/Popperfoto via Getty Images)

Later this year, if all goes well, Americans will be awash in social interactions again. At offices and schools, on sidewalks and in coffee shops, we’ll be bumping into one another like it’s 2019. The resulting flood of conversations will be extremely welcome. But less front of mind, at this still socially stifled moment, are the awkwardness and discomfort that will return along with day-to-day interactions. The co-worker who yammers on, the chatty subway seatmate who keeps you from reading your book, the friend of a friend who bores you at parties—they are all very excited to see you again, and have lots to catch you up on.

Perhaps this period before social life fully resumes is an occasion to revisit what we want from conversations and, more to the point, how we end them. In this regard, people generally have a poor sense of timing. “Conversations almost never ended when both conversants wanted them to,” concluded the authors of a study published earlier this month that asked people about recent interactions with loved ones, friends, and strangers. About two-thirds of them said they wanted the conversation to end sooner; on average, that group wanted the conversation to be about 25 percent shorter, Adam Mastroianni, a psychology doctoral student at Harvard and a co-author of the study, told me.

More here.

Saturday, March 20, 2021

COVID-19 is here to stay. Now we must redesign our economies around it.

James Meadway in OpenDemocracy:

The demand for zero COVID rightly sets a high bar for the current lockdown conditions, insisting on working towards the virtual elimination of COVID in Britain. It is critical that, unlike the experience last year, lockdown is not ended too soon. Aiming not only to “protect the NHS”, as is still the aim of government, but to reduce infections to near-zero would place this country (like any other) in a far better place, post-lockdown, than it was following the first or second lockdown. It is also undeniable that the countries that went in hard against the virus early on have been reaping the benefits – from Vietnam and Taiwan to New Zealand and Australia.

But it would be a major error to think that zero COVID is a permanent solution to the crisis we are now in. The Left and progressives absolutely must not become enthusiasts for lockdown: it is a terrible necessity, not some desirable point to get to. We should no more be cheering for this than we would cheer for war – a war may well be necessary at some point, but it’s hardly something to be called for gladly. The fact is that we have a terrible disease to deal with, and have to do so in a way that minimises death and illness from the disease – but also, importantly, from how we deal with the disease.

The cost of lockdowns is high: not because Gross Domestic Product takes a knock, or because the government has to borrow money, but because of the strains on mental health, on children’s education, and the sharp rise in reported domestic violence cases. We should aim to minimise the costs of COVID, but we also need to minimise the costs of lockdown. This means looking to leave this lockdown at an appropriate point, and acting now to ensure we never return to lockdown again.

More here.

Thanks for all the fish

Thomas Moynihan in Aeon:

The year is 1961. As Cold War tensions crescendo, an American neuroscientist named John C Lilly makes a bold claim. He announces that he has made contact with the first ‘alien’ intelligence. But Lilly wasn’t talking about little green men from Tau Ceti; he was talking of minds much closer to home: bottlenose dolphins.

Lilly had spent the previous decade hammering electrodes through animals’ craniums, attempting to map the reward systems of the brain. Having started by probing the grey matter of macaques, he was shocked by what he found when he acquired some dolphins to test upon. Swiftly, he became convinced of their smarts. Upon hearing dolphins seemingly mimic human vocalisations – in their ‘high-pitched, Donald Duck, quacking-like way’ – he became certain that they also spoke to each other in ‘dolphinese’.

Lilly was the first to really demonstrate how socially intelligent these beings are. Of course, others had long made similar claims. Ancient Greek authors celebrated the nobility and philanthropy of the cetacean, recounting tales of human-dolphin companionship. But, in the modern era, the aquatic mammal fell into disrepute. One 19th-century captain referred to them as ‘warlike and voracious’. In 1836, the French zoologist Frédéric Cuvier remarked on this fall from benevolent angel to carnivorous brute, deeming the wild dolphin a ‘stupid glutton’. But, given their prodigious brains, he was certain of the potential for intelligence. Dolphins have no natural competition, he reasoned, and thus no need to cultivate their intellect. Venturing that humans raised in the same state would also be feral, Cuvier suggested that we civilise dolphins – thereby unleashing their potential for rationality.

More here.

A Proudhon for Postmoderns?

Alexander Zevin in New Left Review:

The arrival of Piketty’s latest work, Capital and Ideology, prompts a comparison with another French thinker, who also won widespread fame for a generic attack on inequality published at a time of profound economic crisis. In 1840, Pierre-Joseph Proudhon’s What Is Property? rebutted claims that the answer ‘It is theft!’ was the signal for another 1793. The proposition should be ‘recognized as a lightning rod to shield us from the coming thunderbolt’, he wrote, just as Piketty hoped his warnings that rising levels of inequality in the 21st century could be incompatible with democratic values would produce tax reforms to fend off violent upheavals comparable to those that put an end to the Belle Époque.

Mutatis mutandis, of course. For the journeyman printer, born into a family of Besançon peasants and small-traders, going barefoot to school, read: the son of ex-Trotskyist soixante-huitards, growing up in the leafy Parisian suburb of Clichy Hauts-de-Seine. For La Voix du Peuple, the World Incomes Database; for imprisonment at the Conciergerie, chairs at the LSE, Berkeley and EHESS; for the people’s bank, the global tax on capital. Proudhon’s pamphlet was also a slower burn than Capital in the Twenty-First Century. It took two years before scandal, prosecution and counter-polemic elevated What Is Property? to international notoriety, hailed as a ‘penetrating work’ in Marx’s paper, the Rheinische Zeitung. When they met in Paris, the young German radicals did their best to educate Proudhon in political economy and the dialectic. In response, six years later, he produced the two fat volumes of his System of Economic Contradictions, or Philosophy of Poverty—drawing from Marx the stinging Poverty of Philosophy. Later, Marx would laughingly chastise himself for having infected Proudhon with Hegelianism—‘for his “sophistication”, as the English call the adulteration of commercial goods.’

More here.

This Veil of Smoke

Erica Eisen in Boston Review:

Stepping out of my apartment building in southern Bishkek one cold November morning in 2019, I was met with a smell that I immediately recognized as fire. I had grown up in southern California, remembered drought-spawned chaparral blazes that would leap over highways and engulf whole tracts of housing, closing schools for a week at a time as waves of people fled for the safety of the coast. I remembered a red sun, a grey sky, a rain of ash, and above all else the acrid smell that closed around me now.

But scrolling through news site after news site revealed nothing: no warehouse gone up in smoke, no stray spark from an electrical wire. The men and women who walked past me did so unhurriedly, without panic, seeming not to register the scent of the air, the smudgy sky. Still unsure, I crossed the street to the weekend bazaar, which bustled as usual with butchers, fishmongers, vegetable sellers all calmly bagging produce and doling out change. I picked some potatoes from a tarp, some carrots from a cardboard box. When I returned home I realized that the smell was on my clothes, my hair, my skin. In the ensuing hours and days it would come to leak into the apartment itself, and then I stopped noticing it, and life, as it always does, went on.

More here.

Mike Davis: Excavating the Future

John Thomason at Commonweal:

Aside from the flicker of fame that followed City of Quartz, Davis has managed to largely avoid the limelight for nearly four decades, despite receiving a MacArthur “Genius” Fellowship, a Lannan Literary Award, and many other honors along the way. For his devoted readers, part of his appeal is surely found in his writing style, which, though forceful, self-assured, and playful, is also unapologetically precise, even scientific, making full use of a century-and-a-half’s worth of Marxist vocabulary. And part of it lies in his seemingly dour and idiosyncratic interests, which have led him to write books about the history of the car bomb, developmental patterns in contemporary slums, and the role of El Niño famines in nineteenth-century political economy.

But topics as weighty as these are only idiosyncratic as long as they have no immediately obvious bearing on the present—and 2020 appears to be the year that many of the apocalyptic futures excavated by Davis have finally come into full view. In 1998, Davis argued that megafires of increasing virulence were an inevitable feature of California’s future, given its rampant, loosely regulated development boom and the counterproductive policy of total fire suppression demanded by real-estate interests.

more here.

Was 1925 Literary Modernism’s Most Important Year?

Ben Libman at the New York Times:

“An illiterate, underbred book it seems to me: the book of a self-taught working man, & we all know how distressing they are, how egotistic, insistent, raw, striking & ultimately nauseating.” So goes Virginia Woolf’s well-known complaint about “Ulysses,” scribbled into her diary before she had finished reading it. Her disparagement is catnip to those many critics who like to view “Mrs. Dalloway” — that other uber-famous, if more lapidary, modernist novel that spans the course of a single day — as Woolf’s rejoinder to Joyce. More than that, though, it tells us something important about our literary history. Nineteen twenty-two, the year of “Ulysses,” may well be ground zero for the explosion of modernism in literature. But the resultant shock wave is better captured by another year: 1925, that of “Mrs. Dalloway” and several other works, all now in the spotlight in 2021, as they emerge from under copyright.

more here.

Saturday Poem

Revisions

Before the poet was a poet
nothing was reworked:

not the smudge of ink on twelve sets of clothes
not the fearsome top berth on the train
not a room full of boxes and dull windows
not the cat that left its kittens and afterbirth in a pair of jeans
not doubt.

Before the poet was a poet
everything had a place:

six years were six years      ………   parallel lines followed rules
like obedient children
[the Dewey Decimal System]
………………………………………………..homes remained where they’d
been left.

Before the poet was a poet
many things went unseen:

clouds sometimes wheedled a ray out of the sun parents kept
…… photographs under their
pillows letters never said everything they wanted to lectures
…… were interrupted by a
commotion of leaves               every step was upon a blind spot.

by Sridala Swami
from Escape Artist
Aleph Book Co., New Delhi, 2014

The life of Philip Roth and the art of literary survival

Christian Lorentzen in Bookforum:

When Roth died at age eighty-five in 2018, Dwight Garner wrote in the New York Times that it was the end of a cultural era. Roth was “the last front-rank survivor of a generation of fecund and authoritative and, yes, white and male novelists.” Never mind that at least four other major American novelists born in the 1930s—DeLillo, McCarthy, Morrison, Pynchon—were still alive. Forget about pigeonholing as white and male an author who at the beginning of his career was invited to sit beside Ralph Ellison on panels about “minority writing”—because Jews were still at the margins. No matter that the modes that sustained Roth—autobiography with comic exaggeration, autobiographical metafiction, historical fiction of the recent past—are the modes that define the current moment. Roth was not an end point but the beginning of the present. There had been fluke golden boys before him, like Fitzgerald and Mailer, but Roth, twenty-six when he won the National Book Award for Goodbye, Columbus in 1960, reset the template for the prodigy author in the age of television, going at it with Mike Wallace in prime time. The morning before he spoke to Wallace he gave an interview to a young reporter for the New York Post, who asked him about a critic who’d called his book “an exhibition of Jewish self-hate.” A few weeks later the piece turned up in the mail Roth received from his clipping service while he was staying in Rome. He was quoted as saying the critic ought to “write a book about why he hates me. It might give insights into me and him, too.” “I decided then and there,” his biographer Blake Bailey quotes him saying at the time, “to give up a public career.”

At the time the remark might have been wishful thinking. In retrospect it’s laughably disingenuous. Far from retreating from public view, Roth embarked on a decades-long campaign of public-image control. He always hated critics, but reserved his vitriol for lengthy letters to the editor (one to the New York Review of Books in 1974 suggested that Times staff critic Christopher Lehmann-Haupt be sacked and his job be filled by an annual contest among undergraduates) or fictionalized rebukes where he and his alter egos had the last word.

More here.

Biologist Marie Fish Catalogued the Sounds of the Ocean for the World to Hear

Ben Goldfarb in Smithsonian:

Among the many puzzles that confronted American sailors during World War II, few were as vexing as the sound of phantom enemies. Especially in the war’s early days, submarine crews and sonar operators listening for Axis vessels were often baffled by what they heard. When the USS Salmon surfaced to search for the ship whose rumbling propellers its crew had detected off the Philippines coast on Christmas Eve 1941, the submarine found only an empty expanse of moonlit ocean. Elsewhere in the Pacific, the USS Tarpon was mystified by a repetitive clanging and the USS Permit by what crew members described as the sound of “hammering on steel.” In the Chesapeake Bay, the clangor—likened by one sailor to “pneumatic drills tearing up a concrete sidewalk”—was so loud it threatened to detonate defensive mines and sink friendly ships.

Once the war ended, the Navy, which had begun to suspect that sea creatures were, in fact, behind the cacophony, turned to investigating the problem. To lead the effort it chose a scientist who, though famous in her day, has been largely overlooked by posterity: Marie Poland Fish, who would found the field of marine bioacoustics. By the time the Navy brought her on board in 1946, Fish was already a celebrated biologist. Born in 1900, Marie Poland—known to friends as Bobbie, on account of her flapper hairstyle—grew up in Paterson, New Jersey, and was a premedical student at Smith College. Upon graduating in 1921, though, she’d turned to the sea to spend more time with Charles Fish, a young plankton scientist whom she’d met while conducting cancer research at a laboratory on Long Island. In 1923, after spending a year as Charles’ research assistant, she took a job with the U.S. Bureau of Fisheries in Massachusetts; that same year, they married.

More here.

Friday, March 19, 2021

Literary Scholars Should Argue Better

Hannah Walser in the Chronicle of Higher Education:

A couple of weeks ago, I attended an interdisciplinary seminar featuring work in progress on law and humanities. After the guest presenter finished reading his chapter draft, the floor opened for discussion: Legal scholars pushed for more terminological precision, historians suggested alternative timelines, political scientists offered comparative context that called some of the author’s conclusions into question. It wasn’t until the frank, fun, productive conversation had wrapped up that I put my finger on what had been missing. Where was the praise?

In my own field, literary studies, almost every talk involves some kind of panegyric, from an effusive speaker introduction to a closing moment of gratitude for the power and timeliness of the event. In between, there’s a very good chance that audience members will begin their questions and comments with an expression of devout gratitude. Thank you so much for this beautiful, this important, this fascinating, this marvelous talk!

More here.

Quantum Mischief Rewrites the Laws of Cause and Effect

Natalie Wolchover in Quanta:

Alice and Bob, the stars of so many thought experiments, are cooking dinner when mishaps ensue. Alice accidentally drops a plate; the sound startles Bob, who burns himself on the stove and cries out. In another version of events, Bob burns himself and cries out, causing Alice to drop a plate.

Over the last decade, quantum physicists have been exploring the implications of a strange realization: In principle, both versions of the story can happen at once. That is, events can occur in an indefinite causal order, where both “A causes B” and “B causes A” are simultaneously true.

“It sounds outrageous,” admitted Časlav Brukner, a physicist at the University of Vienna.

The possibility follows from the quantum phenomenon known as superposition, where particles maintain all possible realities simultaneously until the moment they’re measured. In labs in Austria, China, Australia and elsewhere, physicists observe indefinite causal order by putting a particle of light (called a photon) in a superposition of two states. They then subject one branch of the superposition to process A followed by process B, and subject the other branch to B followed by A. In this procedure, known as the quantum switch, A’s outcome influences what happens in B, and vice versa; the photon experiences both causal orders simultaneously.
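
The quantum switch lends itself to a small calculation. Here is a minimal numerical sketch of the idea, assuming single-qubit unitaries standing in for processes A and B; the gate choices and variable names are illustrative, not drawn from the article or any particular experiment.

```python
import numpy as np

# Quantum switch sketch: a control qubit in superposition decides the
# order in which two unitaries act on a target qubit.
A = np.array([[0, 1], [1, 0]], dtype=complex)    # X gate standing in for process A
B = np.array([[1, 0], [0, -1]], dtype=complex)   # Z gate standing in for process B

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # control: (|0> + |1>)/sqrt(2)
target = np.array([1, 0], dtype=complex)              # target starts in |0>

# Control |0> routes the target through B.A; control |1> through A.B.
P0 = np.array([[1, 0], [0, 0]], dtype=complex)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)
W = np.kron(P0, B @ A) + np.kron(P1, A @ B)

state = W @ np.kron(plus, target)

# Measuring the control in the |+>/|-> basis exposes interference between
# the two causal orders: the |-> outcome isolates the commutator [A, B],
# so its probability vanishes when the two orders are equivalent.
minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
proj = np.kron(np.outer(minus, minus.conj()), np.eye(2))
p_minus = np.linalg.norm(proj @ state) ** 2
print(f"P(control measured as |->) = {p_minus:.3f}")  # 1.000 here: X and Z anticommute
```

For commuting choices of A and B the |-> outcome never occurs, a signature that distinguishes a coherent superposition of causal orders from a classical mixture of them, which would yield that outcome half the time.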

More here.

Meritocracy is bad

Matthew Yglesias in Slow Boring:

And if you talk to people with a curious and open mind, you’ll pretty quickly find out that New York Times reporters are really smart. So are McKinsey consultants. So are the people working at successful hedge funds. So are Ivy League professors. Probably the smartest person I know was in a great grad program in the humanities, couldn’t quite get a tenure-track job because of timing and the generally lousy job market in academia, and wound up with a job in finance at a firm that is famous for hiring really smart people with unorthodox backgrounds. Our society is great at identifying smart people and giving them important or lucrative jobs.

This just turns out to be an outcome that still has some problems.

More here.

Broken English is My Mother Tongue

Kuba Dorabialski in the Sydney Review of Books:

When I started school in Australia I was put in a special class for ESL children. I was horrified to learn that I couldn’t speak English. I thought I spoke English just fine. Little did I know, it was actually Broken English that I spoke.

Many years later, as an adult, I was involved in a little open mic poetry community. Someone posted a recording from one of these events, and once again I was horrified; this time by the sound of my voice. It sounded so foreign to me. So Anglo-Australian. I had dropped my guard somewhere along the way and my Broken English had given way to an Art School Anglo-Aussie English with hints of Westie.

It took a while to recover from this shock and I momentarily stopped performing my work. After a while it became apparent to me that the only way to reclaim my voice was to return to my mother tongue: Broken English.

More here.

The Doctor Will Sniff You Now

Lina Zeldovich in Nautilus:

It’s 2050 and you’re due for your monthly physical exam. Times have changed, so you no longer have to endure an orifices check, a needle in your vein, and a week of waiting for your blood test results. Instead, the nurse welcomes you with, “The doctor will sniff you now,” and takes you into an airtight chamber wired up to a massive computer. As you rest, the volatile molecules you exhale or emit from your body and skin slowly drift into the complex artificial intelligence apparatus, colloquially known as Deep Nose. Behind the scenes, Deep Nose’s massive electronic brain starts crunching through the molecules, comparing them to its enormous olfactory database. Once it’s got a noseful, the AI matches your odors to the medical conditions that cause them and generates a printout of your health. Your human doctor goes over the results with you and plans your treatment or adjusts your meds.

That’s how Alexei Koulakov, a researcher at Cold Spring Harbor Laboratory, who studies how the human olfactory system works, envisions one possible future of our healthcare. A physicist turned neuroscientist, Koulakov is working to understand how humans perceive odors and to classify millions of volatile molecules by their “smellable” properties. He plans to catalogue the existing smells into a comprehensive artificial intelligence network. Once built, Deep Nose will be able to identify the odors of a person or any other olfactory bouquet of interest—for medical or other reasons. “It will be a chip that can diagnose or identify you,” Koulakov says. Scent uniquely identifies a person or merchandise, so Deep Nose can also help with border patrol, sniffing travelers, cargo, or explosives. “Instead of presenting passports at the airport, you would just present yourself.” And doctor’s visits would become a breeze—literally.
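
The database-matching step the article describes can be pictured with a toy sketch: represent a sample as a vector of volatile-molecule concentrations and rank labeled reference profiles by similarity. Everything below (the molecule list, the condition labels, the numbers) is invented for illustration; the article describes the concept, not an implementation.

```python
import numpy as np

# Toy "Deep Nose" matching step: compare a measured volatile profile
# against hypothetical reference profiles and rank by cosine similarity.
MOLECULES = ["acetone", "isoprene", "ammonia", "ethanol"]  # measured volatiles

reference = {                                    # condition -> typical profile
    "baseline":    np.array([1.0, 1.0, 1.0, 1.0]),
    "condition_A": np.array([4.0, 1.1, 0.9, 1.0]),  # elevated acetone
    "condition_B": np.array([1.0, 0.8, 3.5, 1.2]),  # elevated ammonia
}

def cosine(u, v):
    """Cosine similarity between two concentration vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def match_sample(sample):
    """Rank reference profiles by similarity to the measured sample."""
    scores = {name: cosine(sample, prof) for name, prof in reference.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

sample = np.array([3.8, 1.0, 1.0, 1.1])  # one measured profile
for name, score in match_sample(sample):
    print(f"{name}: similarity {score:.3f}")
```

A real system would learn these mappings from data rather than store hand-written profiles, but the measure-then-match structure is the part the article sketches.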

More here.

Can Cyrus Vance, Jr., Nail Trump?

Jane Mayer in The New Yorker:

On February 22nd, in an office in White Plains, two lawyers handed over a hard drive to a Manhattan Assistant District Attorney, who, along with two investigators, had driven up from New York City in a heavy snowstorm. Although the exchange didn’t look momentous, it set in motion the next phase of one of the most significant legal showdowns in American history. Hours earlier, the Supreme Court had ordered former President Donald Trump to comply with a subpoena for nearly a decade’s worth of private financial records, including his tax returns. The subpoena had been issued by Cyrus Vance, Jr., the Manhattan District Attorney, who is leading the first, and larger, of two known probes into potential criminal misconduct by Trump. The second was opened, last month, by a county prosecutor in Georgia, who is investigating Trump’s efforts to undermine that state’s election results.

Vance is a famously low-key prosecutor, but he has been waging a ferocious battle. His subpoena required Trump’s accounting firm, Mazars U.S.A., to turn over millions of pages of personal and corporate records, dating from 2011 to 2019, that Trump had withheld from prosecutors and the public. Before Trump was elected, in 2016, he promised to release his tax records, as every other modern President has done, and he repeated that promise after taking office. Instead, he went to extraordinary lengths to hide the documents. The subpoena will finally give legal authorities a clear look at the former President’s opaque business empire, helping them to determine whether he committed any financial crimes. After Vance’s victory at the Supreme Court, he released a typically buttoned-up statement: “The work continues.”

If the tax records contain major revelations, the public probably won’t learn about them anytime soon: the information will likely be kept secret unless criminal charges are filed. The hard drive—which includes potentially revealing notes showing how Trump and his accountants arrived at their tax numbers—is believed to be locked in a high-security annex in lower Manhattan. A spokesman for the Manhattan District Attorney’s office declined to confirm the drive’s whereabouts, but people familiar with the office presume that it has been secured in a radio-frequency-isolation chamber in the Louis J. Lefkowitz State Office Building, on Centre Street. The chamber is protected by a double set of metal doors—the kind used in bank vaults—and its walls are lined with what looks like glimmering copper foil, to block remote attempts to tamper with digital evidence. It’s a modern equivalent of Tutankhamun’s tomb.

More here.

Looking at Cicely Tyson

Danielle A. Jackson at The Current:

“I see the beauty now,” my mother told me when I asked her what she thought of Cicely Tyson’s face, about a week after the pathbreaking actor died in January at ninety-six. “But I didn’t then.” By “then,” she meant the decade and a half in the middle of the twentieth century, when Tyson won role after role in well-financed productions in the Hollywood system and made-for-TV films broadcast on the networks. Those were the years people learned her name. In Tyson’s earliest roles—starting with 1956’s Carib Gold, in which she was part of an ensemble that included Diana Sands and Ethel Waters—she’d made uncredited appearances, customary for actors who were not yet in the union. Tyson was in her early thirties when she began acting, yet she’d shave a decade off her age at her agent’s request. It was a plausible lie because Tyson kept a youthful glow, with taut, espresso-brown skin that had rosy undertones, round black eyes that pierced and trembled, and an erudite poise that made it seem as though her reach stretched well beyond her diminutive frame. For events, she was wise and precise with her attire, a trait she attributed to her parents, who’d arrived at Ellis Island from Nevis toward the end of the 1910s and settled in an East Harlem tenement. “When she and my dad strode into [church]—Mom in her rayon frock, high heels, and straw hat cocked to one side—a hush fell over the sanctuary,” Tyson writes in her memoir, Just as I Am. This was beauty, with substance underneath—wielded as honor and armor.

more here.