The earliest letters are sometimes very funny, as Larkin tries on attitudes. At Oxford he claims to be lumbered with ancient and/or mentally defective tutors; his work appears in magazines but is no good; he is upbraided when he reads for pleasure. He tailors his tone to the recipient: bluff and undeluded for Sydney Larkin (‘Pop’), safely and tenderly domestic for Eva (‘Mop’) and affectionately satirical for his older sister, Kitty. He includes some moody Oxford scenery: ‘the playing fields wait for the games of this afternoon; through the unecstatic street the gowned bicycles are whirling.’ This is pretty sophisticated for an eighteen-year-old – partly a parody of the promised bicycle races in Auden’s ‘Spain’, partly the kind of ‘real’ thing that finds its way into Larkin’s poetry and fiction. His description of trying to access a copy of Lady Chatterley’s Lover at the Bodleian is a small-scale classic that would slot perfectly into Lucky Jim. You enjoy the voice without ever quite believing what Larkin says, but ‘what he says’, the making over of the humdrum world of college and digs into curmudgeonly comedy, is what matters. This spirit of negation persists into his maturity, but it hardens from playfulness into habit.
Culture shapes who we are, so it follows that it would also shape our manifestations of stress, mental disorder, emotion. Yet, that also implies a kind of messiness that modern psychology and psychiatry, particularly the American kind, have spent the last 100 years struggling to tidy up.
Since their founding, psychology and psychiatry have striven to standardize the diagnosis and treatment of mental disorders — to bring some certainty to what can feel like a very uncertain field.
But increasingly, clinicians are recognizing the downside of those strictures. Delivering the best care for patients will require something broader and more adaptable — mental health care models that can accommodate hundreds of individual cultures. And because no individual patient experiences a culture the same way, those models will ultimately have to do something even more radical: create the sort of super-personalized mental health care that the profession has aspired to — or, perhaps, should have aspired to — all along.
It is worth noting how unerring Berlin’s taste was. She spoke of both Chekhov and William Carlos Williams as models. Her characters read Middlemarch the way other people read Flowers in the Attic: dangling from one hand. The editor of A Manual for Cleaning Women, Stephen Emerson, describes exchanging books with her. He gave her Dreiser once and she hated it, saying he wrote like a guy. I have a soft spot for Dreiser, but it’s because half his writing is made up of descriptions of girls’ trim waists in tight suits twinkling up the steps. The only reason to read Dreiser at age 11 is to become bisexual, and Lucia was far too straight to fall for that. Or perhaps I’m talking about keeping it classy. Do you read Racine when you’re drunk? No, I read a novelisation of One Tree Hill called A Heart So True and it’s awesome, and that’s the reason I’ll never have a story in the Atlantic Monthly.
There is less to say about writers who know what to leave out. Even Davis, in her introduction to A Manual for Cleaning Women, seems somewhat at a loss, though their affinity is a given: a woman who writes a story like Davis’s ‘Mown Lawn’ is going to like a woman who writes: ‘There are certain perfect particular sounds. A tennis ball, a golf ball hit just right. A fly ball in a leather glove. Lingering thud of a knockout. I get dizzy at the sound of a perfect pool break, a crisp bank shot followed by three or four muffled slides and consecutive clicks.’
Timothy Aubry in The Chronicle of Higher Education:
However inclined by their training to vacillate, scholars in the humanities are increasingly being asked to take sides. Should they support or oppose their students’ efforts to ban a reactionary speaker from campus? Should they defend the feminist philosopher who affirms the possibility of transracial identity or join those demanding her article be retracted? Should they remove an influential writer from the syllabus because of his fascist sympathies? Should they sign the petition urging their professional organization to join the Boycott, Divestment, and Sanctions campaign against Israel? So much of academic life seems colored by high-stakes political struggles, and so many decisions large and small are now treated as gestures of allegiance to particular ideological camps and as betrayals of others. It’s difficult to even list these polarizing campus scenarios without attracting political labels.
Literary scholars will very likely regard this situation as nothing new. Their discipline, after all, has been insisting for decades that everything is political. As far back as 1968, a group of radical scholars sought to take over the Modern Language Association’s annual conference. Louis Kampf was arrested for taping a poster to the wall of the conference hotel announcing that “The Tygers of Wrath are Wiser than the Horses of Instruction,” and Noam Chomsky led a Vietnam War teach-in with 500-plus attendees. Then, as anyone even glancingly familiar with the culture wars of the 1980s and 1990s knows, the emphasis on politics gave birth to an attitude of suspicion toward the canon. While authors like Shakespeare and Woolf were sometimes celebrated for their works’ subversive power, many others, like Conrad, were condemned for reinforcing dominant ideologies. The urge to dethrone literary heroes on the basis of their bad politics has persisted up to the present moment, gaining strength as the Black Lives Matter and #MeToo movements have emerged. The past year witnessed an especially heated debate about how to treat Junot Diaz’s work in light of allegations that he sexually harassed several women, with many now identifying conspicuous signs of misogyny in his fiction, while other scholars have maintained that he is being scapegoated because of his ethnic background.
If you’re an expert in climate science, you probably get this question a lot.
“I do,” said Kate Marvel, associate research scientist at the NASA Goddard Institute for Space Studies. “And I’ve been hearing it more recently.” It’s no mystery why. Reports of the threats from a warming planet have been coming fast and furiously. The latest: a startling analysis from the Intergovernmental Panel on Climate Change predicting terrible food shortages, wildfires and a massive die-off of coral reefs as soon as 2040, unless governments take strong action. The Paris climate accord set a goal of keeping the global temperature from rising more than 2 degrees Celsius, or 3.6 degrees Fahrenheit, above preindustrial levels. At 2 degrees, things are bad enough: Arctic sea ice is 10 times more likely to disappear over the summer, along with most of the world’s coral reefs. As much as 37 percent of the world’s population becomes exposed to extreme heat waves, with an estimated 411 million people subject to severe urban drought and 80 million people to flooding from rising sea levels.
But if we can hold the global temperature increase to 1.5 degrees Celsius, Arctic sea ice is far likelier to survive the summers. Coral reefs will continue to be damaged, but will not be wiped out. The percentage of people exposed to severe heat waves would plummet to about 14 percent. The number exposed to urban drought would drop by more than 60 million people. Still, no major industrialized nation is on track to meet the 2-degree goal, much less the 1.5-degree mark. And the Earth has already warmed by 1 degree. Even if, through huge effort and force of will, we cut our greenhouse gas emissions greatly, the effects of today’s carbon dioxide in the atmosphere will be felt for centuries to come.
While that is undoubtedly grim, it’s not as bad as it could be. Reducing the amount of greenhouse gases in the atmosphere could eventually reverse some of the most troublesome effects of warming.
I met Rene Magritte a few weeks ago at the Starline Social Club in Oakland. A surprisingly jolly fellow, it turns out he’s working these days as a pedicab driver in San Francisco. Surrealism isn’t my jam, but when he offered me a pickup the next morning at the BART and mushrooms and tickets for two to the retrospective of his work at SFMoMA, well—I had to accept.
There are many things one could say about the SFMoMA. It’s massive, it’s powerful, and it’s complicated. I have a sneaking suspicion that it operates as a kind of Leviathan in the Bay Area that sucks aesthetic energy into its great maw, gobbling up the local, less well-endowed swimmers and forcing an evacuation of the surrounding area that threatens to leave the gallery and studio scene like a bleached-out coral reef bereft of any but the largest predators. The regular reports of artists fleeing the Bay Area and moving to Los Angeles would seem to bear this out, as would the wriggling, lamprey-like presence of a Gagosian outpost since 2016 across the street.
That said, the collection SFMoMA contains is nothing short of incredible. Bracketing for the time being the delicious irony of the Fisher holdings—one of the world’s most extensive collections of post-war and contemporary art, a collection of some of the most sublime works of 20th century art, built from a fortune made by erasing the distinction between high and low culture (The Gap), valued at well over a billion dollars—visiting SFMoMA is an eye-widening, jaw-dropping experience. It has everything, all of it, and it is informed by a profound and generous curatorial intelligence. Each visit promises new understandings, a renewed interest in old favorites (“Here’s a room full of Paul Klee!”), and a reminder of what art and artists can do: the limitless reach of human creativity. It doesn’t engage in easy juxtapositions or cheap didactics. It just quietly and seductively invites you to join the conversation. There are some very smart people working there.
Le fils de l’homme (The Son of Man), by Rene Magritte, 1964.
The Magritte exhibit was proof of that. It was exquisitely curated. But whether it was because I didn’t eat the mushrooms or because surrealism is a movement that appeared during the hour when the sun casts its shortest shadow, I found the curatorial effort behind the exhibit more compelling than the work itself. Sorry, but I was underwhelmed. Magritte (for my taste) is neither creepy enough, nor playful enough, nor philosophical enough to warrant bathing with. Spending that much time with that much of his work was a lesson in the limits of puns and dreams. There’s a reason that it appeals to children, and we are not living in child-like times. These are ugly, impoverished times. Wounded times. Magritte doesn’t help us negotiate them. So at the end of it, I went in search of some art that would explode in my face or deepen my emotions or fill me with awe. Give me some Kara Walker, please, or some William Kentridge. Like I said, SFMoMA has it all. Finally, I found myself looking at a bunch of paintings by Philip Guston.
Popular political commentary from across the spectrum is replete with warnings about “bubbles,” “silos,” and “echo chambers.” These are said to produce “closure,” “groupthink,” and an “alternate reality.” In turn, these forces result in the dysfunction of polarization, a condition where political officials and ordinary citizens are so deeply divided that there is no basis for compromise or even productive communication among them.
That polarization is politically dysfunctional might seem obvious. Where polarization prevails, the ground for compromise recedes, and so politics becomes a series of standoffs and bottlenecks. Yet politics still needs to get done. Hence democracy devolves into a numbers game of modus vivendi truces and strained compromises, resembling nothing like self-government among social equals.
In order to know what to do about polarization, we need a more precise view of what it is. It is helpful to distinguish (as we have elsewhere) between two different kinds of polarization: political polarization and belief polarization. The former refers to various ways of measuring the distance between political rivals. This distance can be conceived in terms of policy and platform divides or else in terms of inter-party antipathy. But in either case, where political polarization prevails, the common ground among politically opposed parties falls out, resulting in political deadlock. The latter, belief polarization, refers to a phenomenon to which we are all subject, by which interactions with like-minded others transform us into more extreme versions of ourselves.
There’s a door behind her ear—or really in the fold where the ear meets the skull where a down of short brown hairs nestles up to her ear-cup, the door sometimes burns and she drags one long fingernail deep along the crease like an animal scratching itself with a claw to quiet whoever it is who lives there— she’s pretty sure it’s a door but it could be a wall.
* * *
Here you have the ear parade the hearing party the masquerade of ears that meet by the waterfront under the train bridge where the waves are at their reckless loudest—all cloaked in deep fur muffs or decorated with a little pretty lace hanging from a hat or jeweled with a flesh-tone hearing aid like a prehistoric statue.
I recently rewatched “12 Angry Men” with The Philosophy Club at the University of Iowa as part of their “Owl of Minerva” film series. The 1957 film has the late, great Henry Fonda as the lone holdout on a jury ready to convict a poor, abused 18-year-old boy for allegedly stabbing his father to death. Over one long, tense evening (shown in something close to real time), juror #8 – none of the jurors are identified by name, only number – forces the rest of the jury to methodically reexamine the evidence. It’s not a courtroom drama but a jury-room drama, in which only three of its 96 minutes of running time take place outside the sweaty, claustrophobic jury room. The film is intense, moving, and effective. Afterwards, I made the following remarks.
The number of jurors – the “12,” as they are starkly described in the 2007 Russian remake of “12 Angry Men” – is not entirely random. We have the Marquis de Condorcet, at least in part, to thank for that number. Condorcet was a moderate democrat during the French Revolution. He championed universal suffrage and was an early advocate of universal primary education. He went into hiding after voting against the death penalty for Louis XVI, but was captured and died in his cell nine months later. Ironically, his warders had lost track of who he was by the time he died, and he was identified only by the copy of Horace’s “Epistles” he had been carrying when he was arrested.
Condorcet had studied juries and concluded that, under the right circumstances, juries (and, by extension, voting) are an extremely effective procedure for getting right answers. This was a consequence of his famous “Jury Theorem.” I won’t rehearse the mathematics here. But on an issue with two alternatives, where each participant decides independently, where there is an objectively right decision, and where each decision-maker has a greater than 50% chance of making that right decision, a group of five or more people has a high likelihood of making the correct decision, a group of 12 has a higher likelihood of giving the correct verdict, and a group of 1,000 or more is nearly certain (out to several decimal places) to make the right call. In other words, if we think of a jury as a kind of procedure for determining the truth of a question, the more the better, but 12 makes a solid, practicable number.
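The theorem’s arithmetic can be sketched directly from the binomial distribution. This is a minimal illustration, not Condorcet’s own presentation: the juror-accuracy figure p = 0.6 is an assumed value, and ties on an even-sized jury (which the classical theorem sidesteps by assuming an odd number of voters) are broken here by a coin flip.

```python
# A minimal sketch of Condorcet's Jury Theorem, computed exactly.
# Assumptions: each juror votes independently and is correct with
# probability p; a tie on an even-sized jury is decided by a coin flip.
from math import comb

def p_majority_correct(n, p=0.6):
    """Exact probability that a majority of n independent jurors,
    each correct with probability p, reaches the right verdict."""
    def binom(k):
        # Probability that exactly k of the n jurors vote correctly
        return comb(n, k) * p**k * (1 - p)**(n - k)
    majority = n // 2 + 1                     # smallest strict majority
    prob = sum(binom(k) for k in range(majority, n + 1))
    if n % 2 == 0:                            # even jury: a split decision
        prob += 0.5 * binom(n // 2)           # is coin-flipped
    return prob

for n in (1, 5, 12, 1001):
    print(f"{n:>5} jurors: {p_majority_correct(n):.4f}")
```

With each juror only modestly reliable (60%), the majority's accuracy climbs from 0.60 for a single juror toward certainty as the group grows, which is the theorem's point: below 12 the gains per added juror are still steep, so 12 is a reasonable compromise between accuracy and practicality.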
It’s getting colder now in Beijing, and I can’t help but feel for the clothes left outside to dry. They had to hang through the night and on through the weak sunrise, doing their best to catch the wind before the temperature drops again. How do they feel being out there for passers-by to see, all exposed, caught up in the dust and very small toxic particles?
I wonder if they catch cold when the temperature drops and they’re still damp. They might huddle together for warmth, but then they’d have to stay out longer, and no one wants that. Is it wrong to forget about them, or retrieve them late, only when it is convenient? Not for the fabric—which I have no idea about—but, you know, morally. They’ll survive, sure, but maybe they deserve better.
“You start with a scarf…each 90-by-90-centimeter silk carré, printed in Lyon on twill made from thread created by the label’s own silkworms, holds a story. Since 1937, almost 2,500 original artworks have been produced, such as a 19th-century street scene from Rue du Faubourg St.-Honoré, the company’s home since 1880. The flora and fauna of Texas. A beach in Spain’s Basque country” – this is a fragment from an advertisement article for Hermès in this month’s issue of a luxury magazine. The article is called “The Silk Road.” Does it refer to the “Silk Road” in any way that justifies the title, beyond the allure of legend? No. Does it mention that the first scarves created for this very label, in 1937, were made with raw silk from China? No. Not necessary, not relevant to the target reader. In fact, the less we mention the “East” while trying to sell such luxury designer items, the better, aiming as we are for the rich collector, the global consumer of fashion (whether belonging to the East or West) willing to spend hundreds of dollars on a small square of silk, and more likely to associate such status symbols with Western Europe rather than with the “underdeveloped,” impoverished, overpopulated, conflict-ridden East.
While silk has always been a coveted item, a symbol of wealth and power for millennia in many cultures around the world, the detailed, de-mythologized, accurate history of what we have come to know as the “Silk Road” is not only of little interest but has been deliberately suppressed in the West. Besides a vague connection with Marco Polo, most people usually draw a blank at its mention. During the many years that I have been working on (and presenting from) a trilogy of poetry manuscripts based on aspects of this history, I have come across few readers (including writers and academics) in the US who have a clear idea of the regions that have been, since antiquity, a part of these trade routes we call the Silk Road (or “Seidenstrasse,” a term coined by the German historian Ferdinand von Richthofen in 1877), a network of land and sea routes of central and continuous importance to global trade as well as to civilizational influences and shifts in geographical borders around the world: a history that has been shaping not only how the world map looks from time to time, but how attitudes, knowledge, goods, technology, weapons, fashions, and even diseases and cures have been spreading across the continents and through the centuries.
A few months back my boss and I had lunch with the person who, wearing a t-shirt that read “black death spectacle”, stood in protest in front of a painting of Emmett Till by Dana Schutz called Open Casket at the last Whitney Biennial. Shortly after his gesture, another artist penned an open letter about how Schutz’s painting uses “black pain” as a medium, and how this use by non-Black artists needs to go. I’m not sure what the ethical verdict is (whether or not Schutz made a gravely racist error), or whether the artist’s letter voiced an instance of overreaching aesthetic censorship, nor will I make any attempt at resolving that issue here; it would take far more space than is available and is not my aim. Consider reading Aruna D’Souza’s recent book Whitewalling: Art, Race & Protest in 3 Acts for a thorough treatment (which, not so incidentally, the above-mentioned protester provided images for).
What I am interested in, however, is the broader idea of pain as a medium. That pain can be an aestheticized form is completely fascinating, and yet it has been employed since at least biblical times. We’re talking here about both physical and mental pains. How does one explain the benefits of painful aesthetics – of horror, of discomfort, of terror, of anything undesirable in real life; generally speaking, of pain – if pain is intrinsically undesirable for most people? Well, I understand pain to be valuable as a tool for education and experience because pain, more than pleasure, has the tendency to traumatize the people who suffer it. In other words, it makes a lasting impression. Pleasure leaves an impression too, but it doesn’t traumatize, and that seems relevant to this discussion.
“All the greatest hits from the past forty years have the same four chords,” Axis of Awesome taught us a decade ago. “You can take those four chords, repeat them, and pump out every pop song ever.”
The band has since released multiple versions of its famous four chord medley, cataloguing more than 80 chart-toppers that follow pretty much the exact same structure. Amongst them: “Let It Be” by the Beatles, “Can You Feel the Love Tonight” by Elton John, and “Under the Bridge” by the Red Hot Chili Peppers.
While they may have lifted the veil for some lay listeners, Axis of Awesome did not invent the music theory behind the four chord song. In fact, that progression has served as the backbone of popular music since the thirties, and its prevalence has only increased with time.
Recall such recent four chorders as “Despacito” by Luis Fonsi with Daddy Yankee, “The Edge of Glory” by Lady Gaga, “Someone Like You” by Adele, “Call Me Maybe” by Carly Rae Jepsen, and “We Are Never Ever Getting Back Together” by Taylor Swift. Roll your eyes if you must, but these were seminal to the Billboard Top 100 when they came out. The trend has its fingers in far more pies than just pop music; there are four chord songs as indie folk as “The Sounds of Silence” by Simon and Garfunkel, “In the Aeroplane Over the Sea” by Neutral Milk Hotel, and “Lying to You” by Keaton Henson.
Music wasn’t always like this; Brahms and Wagner and Strauss would all sob at what earns airplay these days. So when the permutations, instrumentation, length, and knowledge of music are more boundless than ever, why do we settle for the same four chords? And if most songs are built on that, what makes certain ones destined for mixtapes and wedding dances and long car rides and summer soundtracks? Why do we consider anything to be special?
In Mohammed Hanif’s third novel Red Birds, US Air Force Major Ellie despairs of mission simulations being “dreamt up by some kid who’d never seen the inside of a cockpit”. Readers of literary fiction about war, if not of fiction in general, may feel a similar despair. Does the writer have enough authority to make their simulation convincing? Before Hanif was a Booker-longlisted author, or wrote for the New York Times and BBC, he trained as a pilot in Pakistan’s air force. What Major Ellie says about ejector seats and fireproof suits has the confidence of truth. But much more importantly, Hanif knows about the absurdity of war in a way that a civilian never could.
Ellie has been sent on a mission that is simply an “opportunity” to perform an act of courage that might distinguish him in his “straight line” of a career. It involves bombing a refugee camp in the middle of nowhere, “an outpost in a war that the war itself is not interested in”. (Hanif doesn’t specify the desert location; we feel it is an amalgam of borderlands between South Asia and the Middle East.) But after his plane goes down he is taken in, reluctantly, by the inhabitants of the camp. As he looks to find a way back home, a 15-year-old inhabitant, Momo, sees his chance to gain access to the deserted neighbouring US base.
Zhen Dai holds up a small glass tube coated with a white powder: calcium carbonate, a ubiquitous compound used in everything from paper and cement to toothpaste and cake mixes. Plop a tablet of it into water, and the result is a fizzy antacid that calms the stomach. The question for Dai, a doctoral candidate at Harvard University in Cambridge, Massachusetts, and her colleagues is whether this innocuous substance could also help humanity to relieve the ultimate case of indigestion: global warming caused by greenhouse-gas pollution.
The idea is simple: spray a bunch of particles into the stratosphere, and they will cool the planet by reflecting some of the Sun’s rays back into space. Scientists have already witnessed the principle in action. When Mount Pinatubo erupted in the Philippines in 1991, it injected an estimated 20 million tonnes of sulfur dioxide into the stratosphere — the atmospheric layer that stretches from about 10 to 50 kilometres above Earth’s surface. The eruption created a haze of sulfate particles that cooled the planet by around 0.5 °C. For about 18 months, Earth’s average temperature returned to what it was before the arrival of the steam engine.
In the United States today, the rich lay claim to a higher share of our nation’s wealth than they have at any point since the Gilded Age — and foreign-born residents account for a higher share of our nation’s population than at just about any time since that same era.
In his 2016 book, The Great Exception: The New Deal & The Limits of American Politics, the historian Jefferson Cowie suggests that these two developments are related. His case is simple: It is hard to implement egalitarian economic policies in the absence of working-class solidarity — and it is hard to achieve the latter in a context of mass, multi-ethnic immigration.
According to this analysis, it wasn’t purely coincidence that American workers secured themselves a “New Deal” shortly after Congress passed (profoundly racist) restrictions on immigration, nor that the New Deal consensus unraveled shortly after those restrictions were lifted in 1965. Throughout the Gilded Age, America’s industrial working class was riven by bitter tensions between Protestants and Catholics, and between longtime Anglo-American citizens and newly arrived Irish, German, and Jewish immigrants. These ethnic and religious tensions divided workers (and the trade union movement) between the two major parties, preventing them from consolidating power within either.
The National Institute on Drug Abuse estimates that 72,000 Americans died from drug overdoses in 2017, up from some 64,000 the previous year and 52,000 the year before that—a staggering increase with no end in sight. Most involved opioids.
A few definitions are in order. The term opioid is now used to include opiates, which are derivatives of the opium poppy, and opioids, which originally referred only to synthesized drugs that act in the same way as opiates do. Opium, the sap from the poppy, has been used throughout the world for thousands of years to treat pain and shortness of breath, suppress cough and diarrhea, and, maybe most often, simply for its tranquilizing effect. The active constituent of opium, morphine, was not identified until 1806. Soon a variety of morphine tinctures became readily available without any social opprobrium, used, in some accounts, to combat the travails and boredom of Victorian women. (Thomas Jefferson was also an enthusiast of laudanum, one of the morphine tinctures.) Heroin, a stronger opiate made from morphine, entered the market later in the nineteenth century. It wasn’t until the twentieth century that synthetic or partially synthetic opioids, including fentanyl, methadone, oxycodone (Percocet), hydrocodone (Vicodin), and hydromorphone (Dilaudid), were developed.
In 1996 a new form of oxycodone called OxyContin came on the market, and three recent books—Beth Macy’s Dopesick, Chris McGreal’s American Overdose, and Barry Meier’s Pain Killer—blame the opioid epidemic almost entirely on its maker, Purdue Pharma.