Category: Recommended Reading
Jacques d’Amboise (1934 – 2021)
I can’t wait to get back to normal. How long before I’m bored?
Ann Wroe in More Intelligent Life:
Many years ago, on a road trip in America, I found myself in Arcadia. It was not as expected. The Arcadia I imagined was all rolling green hills and verdant woods in which shepherds played their pipes and cross-dressing lovers lounged about on the grass. Arcadia, Kansas, was nothing like that: it was a tiny dot in a great sea of prairie flatness, under a hard blue sky.
This Arcadia had two red-brick storefronts, long abandoned, which were covered with ivy and leaning into the street; a few living stores, including a café with tired net curtains; and a shopfront with the words “City Hall” painted above the windows. Outside it, two middle-aged women were struggling to get the Stars and Stripes to half-mast on a tricky new flagpole, advised or obstructed by two plump young men in a pick-up truck that was parked in the middle of the street. Maybe I was the only other car that passed through Arcadia that day.
America’s pioneers, heading relentlessly west in their lumbering wagons, were easily excited. I also visited Eureka, Kansas, and apparently went through Climax too, though I never noticed. Any place that was halfway comfortable or sheltered, with grazing and water, was immediately hailed as the promised land. This was it: somewhere they could at last stop, settle and get on with the rest of their lives. Just like that place called Normal, for which we now all supposedly sigh. You can go direct to Normal, if you like. You will find that most of the five Normals in America are tiny unincorporated places given their name not for their deep ordinariness, but because they once had a teacher-training college, or “normal school”.
More here.
Saturday, May 8, 2021
Race, Policing, and The Limits of Social Science
Lily Hu in Boston Review:
Since the 1970s, the development of causal inference methodology and the rise of large-scale data collection efforts have generated a vast quantitative literature on the effects of race in society. But for all its ever-growing technical sophistication, scholars have yet to come to consensus on basic matters regarding the proper conceptualization and measurement of these effects. What exactly does it mean for race to act as a cause? When do inferences about race make the leap from mere correlation to causation? Where do we draw the line between assumptions about the social world that are needed to get the statistical machinery up and running and assumptions that massively distort how the social world in fact is and works? And what is it that makes quantitative analysis a reliable resource for law and policy making?
In both academic and policy discourse, these questions tend to be crowded out by increasingly esoteric technical work. But they raise deep concerns that no amount of sophisticated statistical practice can resolve, and that will indeed only grow more significant as “evidence-based” debates about race and policing reach new levels of controversy in the United States. We need a more refined appreciation of what social science can offer as a well of inquiry, evidence, and knowledge, and what it can’t. In the tides of latest findings, what we should believe—and what we should give up believing—can never be decided simply by brute appeals to data, cordoned off from judgments of reliability and significance. A commitment to getting the social world right does not require deference to results simply because the approved statistical machinery has been cranked. Indeed in some cases, it may even require that we reject findings, no matter the prestige or sophistication of the social scientific apparatus on which they are built.
More here.
How Radical Is President Biden?
Colonialism applied to Europe
Branko Milanovic over at his Substack:
Mark Mazower’s “Hitler’s Empire: How the Nazis Ruled Europe” is a magisterial book.
I read it on vacation, and it is not a book I would suggest you take with you to the beach. Unless you want to spoil your vacation. But once you have made such a choice, you cannot stop reading it and the book will stay with you throughout your stay (and I believe much longer).
This summer I read, almost back to back, Adam Tooze’s “The Deluge” and Mazower’s book. The first covers the period 1916–31, the second the Nazi rule of Europe, 1936–45. They can practically be read as a continuum, but they are two very different books. Tooze’s is, despite all the carnage of World War I and the Russian Civil War, an optimistic book in which sincere or feigned idealism battles conservatism and militarism. As I wrote in my review of Tooze’s book, the emphasis on the failed promise of liberal democracy (but a promise it still was) is a thread that runs through most of it. Mazower’s book, on the other hand, is unfailingly grim, and not only because the topic he writes about is much more sinister. The tone is bleaker. It is a book about unremitting evil. The steady accumulation of murders, betrayals, massacres, retaliations, burned villages, conquests, and annihilation makes for a despairing and yet compelling read. Europe was indeed, as another of Mazower’s books is titled, the dark continent.
Here I would like to discuss another aspect of Mazower’s book that is implicit throughout but is mentioned rather discreetly only in the concluding chapter. It concerns the place of the Second World War in global history. The conventional opinion is that the Second War should be regarded as a continuation of the First. While the First was produced by competing imperialisms, the Second was the outcome of the very imperfect settlement imposed at the end of the War, and of the difference in interpretations as to how the War really ended (was it an armistice, or an unconditional surrender?).
But that interpretation is (perhaps) faulty because it cannot account for the most distinctive character of World War II, namely that it was a war of extermination in the East (including the Shoah). That is where Mazower’s placing of the War in a much longer European imperial context makes sense.
More here.
Kurt Vonnegut’s Socialism From Outer Space
Matthew Gannon and Wilson Taylor in The Tribune:
Kurt Vonnegut died 14 years ago today. A few weeks beforehand, he had been taking his dog for a walk and got tangled up in the leash. The 84-year-old American fabulist fell and hit his head on the sidewalk outside his midtown Manhattan brownstone, slipping into a coma that he never came out of. So it goes, the late author might have said.
‘So it goes’ is the characteristically resigned phrase that recurs throughout Vonnegut’s bleakly witty and moving 1969 novel Slaughterhouse-Five. In marking the anniversary of his death, we might also remember another line from that work, which includes time travel and aliens and the real story of Vonnegut’s time as a prisoner of the Nazis during the 1945 Allied firebombing of Dresden: ‘When a person dies he only appears to die.’
This is something the novel’s protagonist, Billy Pilgrim, learns from the Tralfamadorians, an extraterrestrial species that experiences time all at once rather than sequentially. Even if someone is dead now, they tell Billy, ‘he is still very much alive in the past, so it is very silly for people to cry at his funeral.’
It’s a comforting idea of sorts. ‘When a Tralfamadorian sees a corpse,’ Billy explains, ‘all he thinks is that the dead person is in bad condition in that particular moment, but that the same person is just fine in plenty of other moments.’
Billy Pilgrim becomes ‘unstuck in time’ in Slaughterhouse-Five, and he is unexpectedly dragged from one moment to another—even before his birth and after his death—by some unknown power. He’s an old man one minute and a baby the next. He never knows when he’ll be, and without warning he can be brought back to that firebombing that Vonnegut himself only survived because he and the rest of the prisoners of war sheltered in the basement of an abandoned slaughterhouse.
More here.
Evil in Early Kabbalah
Clive Bell and the Making of Modernism
Kathryn Hughes at The Guardian:
Bell performed his project of, to use Hussey’s subtitle, “making modernism” chiefly through the championing of “modern art”. By this he meant painting that eschewed anecdote, nostalgia or moral messaging in favour of lines and colours combined to stir the aesthetic sense. For ease of reference, he called the thing he was after “significant form”. While sensible Britain saw cubism, together with post-impressionism, as incoherent and formless to the point of lunacy, Bell followed the example of the older and more expert critic Roger Fry in reframing these movements as heroic attempts to purge the plastic arts of any lingering attachment to representational fidelity. His great touchstones were French (he called Paul Cézanne “the great Christopher Columbus of a new continent of form”) but admitted that occasionally you found an English painter who was making the right shapes – Vanessa Bell, say, or Duncan Grant. The fact that Vanessa was his wife and Duncan her lover detracted only slightly from his pronouncements.
more here.
A Wide-Roaming Meditation on Dürer and His Art
John Williams at the NYT:
As a titan of the Renaissance, Dürer needs no puffing up, but Hoare doesn’t stint on the claims: “No one painted dirt before Dürer,” is a particularly arresting example. He created “the first self-portrait of an artist painted for its own sake.” In a later self-portrait, he is “where the modern world begins. That stare, that self, that star.” He became “the first genuinely international artist,” and his engraving “Melencolia I” is “the most analyzed object in the history of art,” a great cipher of a piece that at less than 10 by 8 inches created “a new existential state.”
The breadth of the artist’s work made room for the most granular natural detail and the most hallucinatory fantasy. He was working at a time when reality was asserting itself in new ways, not long before Copernicus and Galileo astonished but also disillusioned us.
more here.
How Adult Children Affect Their Mother’s Happiness
Arthur C. Brooks in The Atlantic:
“You are … irritating and unbearable, and I consider it most difficult to live with you.” So wrote Johanna Schopenhauer in an 1807 letter to her 19-year-old son Arthur. “No one can tolerate being reproved by you, who also still show so many weaknesses yourself, least of all in your adverse manner, which in oracular tones, proclaims this is so and so, without ever supposing an objection. If you were less like you, you would only be ridiculous, but thus as you are, you are highly annoying.”
The two-century-old letter amazes not just for its mix of archaic diction and sick burns, but also because it violates some of humanity’s most basic assumptions about how mothers feel about their children. Motherhood is supposed to bring unparalleled happiness. The Bible, for example, is full of stories of women—Sarah, Hannah, Elizabeth—who go from sorrow to joy when God grants them an unexpected child. In real life, the relationship between happiness and motherhood is more complicated. Raising small children is far from unmitigated bliss. Year after year, surveys that ask mothers what they most want for Mother’s Day find that their No. 1 answer is time alone. As children grow up, mothers’ mixed feelings seem to stick around. Research suggests that plenty of mothers, while perhaps not as up-front as Johanna, feel some resentment toward their adult progeny, especially when the relationship feels unequal. Thankfully, social science also offers clues to how adult children can patch things up and make their moms happier.
…Arthur Schopenhauer grew up to become one of the greatest thinkers of the 19th century, but he never figured out how to make his mom happy. “The door that you slammed so loudly yesterday, after you had conducted yourself extremely improperly toward your mother, closed forever between you and me,” Johanna wrote to him after an especially bad argument in 1813. By all accounts they never saw each other again.
More here.
The Women of NPR, When NPR Was a Start-Up
Zoe Greenberg in The New York Times:
When Nina Totenberg was a young reporter hustling for bylines in the 1960s, she pitched a story about how college women were procuring the birth control pill. “Nina, are you a virgin?” her male editor responded. “I can’t let you do this.” Such were the obstacles that Totenberg and the women journalists of her generation faced, largely relegated to the frivolous “women’s pages” and denied the chance to cover so-called hard news. But as Lisa Napoli’s “Susan, Linda, Nina & Cokie” chronicles, just as some women journalists were suing Newsweek and The New York Times over gender discrimination in the 1970s, an upstart nonprofit called National Public Radio arrived on the scene offering new opportunities.
NPR, unlike its well-resourced competitors, was eager to hire sharp, inventive, low-wage workers who couldn’t find jobs anywhere else — in other words, women. That decision launched the star-bright careers of Napoli’s subjects: Susan Stamberg, Linda Wertheimer, Totenberg and Cokie Roberts, and they in turn helped transform NPR from “the nation’s largest unlicensed Montessori school,” as an early study described it, to the vaunted institution it is today. (Finally allowed to do the stories she wanted, Totenberg became an iconic Supreme Court reporter.) Napoli, herself a reporter for print and radio who has written three other books, illuminates the terrifying, thrilling energy of NPR as start-up: “Not a day would go by without a tape reel being hurled like a Frisbee into the control room at the last minute, or breaking during playback, and it was never quite clear who’d show up for work or whether there’d be enough stories to fill the time.”
The book is a lesson in how the fringe project of one generation becomes the mainstream of the next.
More here.
Saturday Poem
Service Economy Fantastique
After the
lights are ….. turned off at the
………………………………. restaurant
………… the grillcook and dishwasher
………… wait for the floor they have
………… mopped to dry in the dark to the
………… humid ministrations of eight ceiling
………… fans and two stereo speakers
………… pulsing like the throat of
………… Marvin Gaye calling
……………… GET-UP GET-UP GET-UP GET-UP
as
they wait
in dining room darkness,
dancing in place before the Nero
neon of the clock on the wall, … as the … windows
…………………………………………………… …….blush
by Michael Veve
from El Coro
University of Massachusetts Press, 1997
Friday, May 7, 2021
The Hume paradox: how great philosophy leads to dismal politics
Julian Baggini in Prospect:
How did one of the greatest philosophers who ever lived get so much wrong? David Hume certainly deserves his place in the philosophers’ pantheon, but when it comes to politics, he erred time and again. The 18th-century giant of the Scottish Enlightenment was sceptical of democracy and—despite his reputation as “the great infidel”—in favour of an established church. He was iffy on the equality of women and notoriously racist. He took part in a pointless military raid on France without publicly questioning its legitimacy.
In unravelling the Hume paradox, what we find is that the very qualities that made Hume such a brilliant philosopher also made him a flawed political thinker. There are implications here for contemporary academic philosophy—whose much-vaunted “transferable critical skills” turn out not to transfer so well after all. Styles of thinking that work brilliantly in some domains fail miserably in others: indeed, some of our biggest mistakes arise when we transfer a way of thinking apt for one domain to another where it just doesn’t fit. There are consequences, too, for day-to-day and working life: Hume shows that the smartest person in the room isn’t necessarily the smartest choice for the job. And then there are general implications for the way in which a healthy intellectual scepticism, the essential precondition for rational enquiry in science and much else, can easily become a fatalistic cynicism about the prospects for building a better society.
More here.
The weird science of the placebo effect keeps getting more interesting
Brian Resnick in Vox:
The story of the placebo effect used to be simple: When people don’t know they are taking sugar pills, or think the pills might be a real treatment, the pills can work. It’s a foundational idea in medicine and in clinical drug trials dating back to the 1950s.
Then Ted Kaptchuk came along.
Kaptchuk is a professor at Harvard Medical School, and over the past decade, he and colleagues have shown, in study after study, that giving people placebos openly — that is, telling them they are taking a placebo — helps them feel better. Specifically, they found a placebo can relieve not just pain but also anxiety and fatigue.
In February, Kaptchuk and his colleagues published the results of a clinical trial comparing these open-label placebos to double-blind placebos (the gold standard in medical research) in treating irritable bowel syndrome. Both were equally effective.
More here.
Can capitalism be fixed?
Alyssa Battistoni in The Nation:
The economist Branko Milanovic has been a central participant in the debates of this emerging field, as well as one of its most idiosyncratic contributors. Born in Belgrade when it was part of Yugoslavia, Milanovic wrote his dissertation on income inequality in his home country long before it was a fashionable topic. He went on to research income inequality as an economist at the World Bank for nearly two decades before taking up a string of academic appointments; he currently teaches at the Graduate Center of the City University of New York. But he is not your typical World Bank economist: Milanovic knows his Marx and, though not a Marxist himself, has long insisted on the value of class analysis and historical perspectives to economics, while also dabbling in political-philosophy debates about distributive justice. His experience of life under actually existing socialism, meanwhile, gave him critical distance from the end-of-history narratives that were trumpeted in much of the West after the fall of the USSR—as well as from the end-of-the-end-of-history hand-wringing that has proliferated since 2016. The discourse, then, seems to be catching up to where Milanovic has been all along.
More here.
“Trains & Tempos”: A Timelapse Film by Michael Tretner
[Thanks to Rafaël Newman.]
Crush
Larissa Pham at The Believer:
To apply Claude Lévi-Strauss’s structural anthropology, a crush is called a crush because it crushes you. A crush is distinct from friendship or love by dint of its intensity and sudden onset. It is marked by passionate feeling, by constant daydreaming: a crush exists in the dreamy space between fantasy and regular life. The objects of our crushes, who themselves may also be referred to as crushes, cannot be figures central to our daily lives. They appear on the periphery of our days, made romantic by their distance.
Crush can act as both a noun and a verb: “You are my crush”; “I am crushing on you.”
Crush can be both subject and object: “You are my crush”; “I have a crush on you.”
According to my Google searches, the first recorded instance of the use of crush in a romantic sense, to mean a person one is infatuated with, is from 1884, in the diary of Isabella Maud Rittenhouse. As in: “Wintie is weeping because her crush is gone.” By 1913, it had entered usage as a verb.
more here.
Roxane Gay, Larissa Pham, and Kim Fu present Kink
The Travels of a Master Storyteller
Yasmine Seale at The Paris Review:
Aladdin, readers are sometimes surprised to learn, is a boy from China. Yet the text is ambivalent about what this means, and pokes gentle fun at the idea of cultural authenticity. Scheherazade has hardly begun her tale when she forgets quite where it is set. “Majesty, in the capital of one of China’s vast and wealthy kingdoms, whose name escapes me at present, there lived a tailor named Mustafa.” The story’s institutions are Ottoman, the customs half-invented, the palace redolent of Versailles. It is a mishmash and knows it.
Like Aladdin, like Aleppo, Diyab’s is a story of mixture. He knows French, Turkish, Italian, even Provençal—but not Greek: in Cyprus, unable to understand the language, he feels like “a deaf man in a wedding procession.” Slipping in and out of personae, he is alert to the masquerades of others.
more here.
