Are My Stomach Problems Really All in My Head?

Constance Sommer in The New York Times:

The New Mexican desert unrolled on either side of the highway like a canvas spangled at intervals by the smallest of towns. I was on a road trip with my 20-year-old son from our home in Los Angeles to his college in Michigan. Eli, trying to be patient, plowed down I-40 as daylight dimmed and I scrolled through my phone searching for a restaurant or dish that would not cause me pain. After years of carefully navigating dinners out and meals in, it had finally happened: There was nowhere I could eat.

“I’m so sorry, honey,” I said. “I feel really, really bad.” And I did. I was on the verge of tears, as much out of self-pity and shame as any maternal concern. Eli shook his head. “It’s OK, Mom. It’s not your fault.” But it was. Because of me — or, to be precise, my digestive system — we would not eat until we reached Amarillo, Texas, at 10 p.m., and bought frozen food from a grocery store near our Airbnb. My gut is not a carefree traveler. Ingest the wrong items, and my stomach feels as though someone’s scoured it with a Brillo pad. For the next few hours, I may also experience migraines, achy joints and a foggy, feverish sensation like I’m coming down with the flu. My doctors call this irritable bowel syndrome, or I.B.S. I call it a terrible shame.

More here.

Sunday, December 5, 2021

Wisława Szymborska: Advice for Writers

Joanna Kavenna in Literary Review:

Inverting the old cliché, Christopher Hitchens said, ‘Everyone has a book in them and that, in most cases, is where it should stay.’ The journalist and satirist Karl Kraus agreed: journalists, especially, should never write novels. This was self-satire, partly. Yet there are writers who can barely go to the shops without publishing a voluminous account immediately afterwards. At the other end of this unscientific spectrum are writers who destroy their work, either because they think it’s rubbish (Joyce, Stevenson, etc) or because they’ve recently become convinced it was written by the Devil (Gogol). Some people doubt themselves far too much, others not remotely enough.

With some or none of this in mind, the Polish author Wisława Szymborska, who died in 2012, destroyed 90 per cent of her writing. Despite that (or maybe because of it), she won the Nobel Prize in 1996 for ‘poetry that … allows the historical and biological context to come to light in fragments of human reality’. I have no idea what that means either.

More here.

The Webb Space Telescope Will Rewrite Cosmic History. If It Works.

Natalie Wolchover in Quanta:

To look back in time at the cosmos’s infancy and witness the first stars flicker on, you must first grind a mirror as big as a house. Its surface must be so smooth that, if the mirror were the scale of a continent, it would feature no hill or valley greater than ankle height. Only a mirror so huge and smooth can collect and focus the faint light coming from the farthest galaxies in the sky — light that left its source long ago and therefore shows the galaxies as they appeared in the ancient past, when the universe was young. The very faintest, farthest galaxies we would see still in the process of being born, when mysterious forces conspired in the dark and the first crops of stars started to shine.

But to read that early chapter in the universe’s history — to learn the nature of those first, probably gargantuan stars, to learn about the invisible matter whose gravity coaxed them into being, and about the roles of magnetism and turbulence, and how enormous black holes grew and worked their way into galaxies’ centers — an exceptional mirror is not nearly enough.

More here.

John Rawls’s doctrine of fairness

Olúfémi O. Táíwò in The Nation:

With its doctrine of fairness, A Theory of Justice transformed political philosophy. The English historian Peter Laslett had described the field as “dead” in 1956; with Rawls’s book that changed almost overnight. Now philosophers were arguing about the nature of Rawlsian principles and their implications—and for that matter were once again interested in matters of political and economic justice. Rawls’s terms became the lingua franca: Many considered how his arguments, focused mostly on domestic or national issues of justice, might be applied to questions of international justice as well. Others sought to extend his theory’s set of political principles, while still others probed the limits of Rawls’s epistemology and the narrowness of his focus on individuals. A decade after A Theory of Justice appeared, Forrester notes, 2,512 books and articles had been published engaging with its central claims.

Rawls’s liberal theory of justice as fairness has continued to define the shape and trajectory of political philosophy and liberalism writ large to this day.

More here.

Master’s Degrees Are the Second Biggest Scam in Higher Education

Jordan Weissmann in Slate:

Last week, the Wall Street Journal published a troubling exposé on the crushing debt burdens that students accumulate while pursuing master’s degrees at elite universities in fields like drama and film, where the job prospects are limited and the chances of making enough to repay their debt are slim. Because it focused on MFA programs at Ivy League schools—one subject accumulated around $300,000 in loans pursuing screenwriting—the article rocketed around the creative class on Twitter. But it also pointed to a more fundamental, troubling development in the world of higher education: For colleges and universities, master’s degrees have essentially become an enormous moneymaking scheme, wherein the line between for-profit and nonprofit education has been utterly blurred. There are, of course, good programs as well as bad ones, but when you zoom out, there is clearly a systemic problem.

More here.

Jerrold Rosenbaum: Are Psychedelics an Effective Treatment for Mood Disorders?

From Harvard Magazine:

Nancy Kathryn Walecki: So first of all, most people probably think of psychedelics in the context of the 1960s countercultural movements, when they were being used recreationally. So is using psychedelic drugs in psychiatric treatment a new idea?

Jerrold Rosenbaum: No, it’s not a new idea. The discovery of the psychoactive properties of psychedelics goes pretty far back. The iconic molecule LSD was synthesized by the scientist Albert Hofmann in 1938, working for Sandoz. And it was really due to an inadvertent contact with the substance—he hadn’t determined what it would be good for—that he actually absorbed some and had these remarkable experiences, visual and perceptual changes, and he knew he had something interesting. There was a fair amount of research going on with LSD really into the 60s as a serious exploration of what it might mean for patients with psychiatric disorders. It was explored as a model of psychosis, as a potential treatment for psychotic disorders, and there was some recognition that it might play a role in treating addictive disorders. And so, serious scientists in the early 60s at our National Institutes of Health were looking at LSD and related substances. The patent life for LSD ended and Sandoz stopped making it, but it turns out to be a pretty easy compound to manufacture. And so, home labs came into existence and LSD made its way into non-medical and non-research use, and there was a fair amount of interest in the experience that LSD generated. It became sort of a hallmark of, as you pointed out, the counterculture population, particularly with disaffected youth in the 60s, triggered by political events, the Vietnam War and other frustrations with society. And, as you know, a Harvard faculty member, Timothy Leary, was enthusiastic and was encouraging people to use it; the alternative experience from day-to-day reality that these substances afforded was viewed as preferable. And so he encouraged people to turn on, tune in, drop out, and many did, setting up alternative lifestyles. But it was also associated with the protest movement against the Vietnam War.

More here.

When You Can’t Change the World, Change Your Feelings

Arthur C. Brooks in The Atlantic:

Everyone—even the most privileged among us—has circumstances they would like to change in their life. As the early sixth-century Roman philosopher Boethius put it, “One has abundant riches, but is shamed by his ignoble birth. Another is conspicuous for his nobility, but through the embarrassments of poverty would prefer to be obscure. A third, richly endowed with both, laments the loneliness of an unwedded life.”

…Sometimes, changing your circumstances is difficult but absolutely necessary, such as in cases of abuse or violence. And sometimes, changing your circumstances is fairly easy: If you are lethargic every morning, start going to bed earlier. But in the gray areas in between, fighting against reality can be impossible, or incredibly inefficient. Maybe you have been diagnosed with a chronic illness for which there are no promising treatment options. Perhaps your romantic partner has left you against your wishes and cannot be persuaded otherwise. Maybe you have a job you like but a manager you don’t, and no one will give you a new boss.

In these sorts of situations, changing how you feel can actually be much easier than changing your physical reality, even if it seems unnatural. Your emotions can seem out of your control at the best of times, and even more so during a crisis—which is exactly when changing them would give you the greatest benefit. That can be blamed in part on biology. Negative emotions such as anger and fear activate the amygdala, which increases vigilance toward threats and improves your ability to detect and avoid danger. In other words, stress makes you fight, flee, or freeze—not think, What would a prudent reaction be at this moment? Let’s consider the options. This makes good evolutionary sense: Half a million years ago, taking time to manage your emotions would have made you a tiger’s lunch.

More here.

Sunday Poem

The Last Day of November

My wife is at her spinning wheel. She
first cleans, dries, and combs the fleece,
then dyes the wool. She will spin yarn to make

a shawl, stocking cap, socks. She disappears
into her gentle quiet. I am a third of the way
through reading four books, but I don’t want

to read any of them. I want what I know you
want: to be happy, actually happy, to love
in a happy world. Today there was yet another

school shooting. Some students felt it coming.
Three kids who thought they were grown up,
dead. One more thought likely to die, did. The

others will live. The news dares to say recover.
Tonight we played Christmas carols for the first
time this season. Yes, ’tis the season. This morning

surgeons at three different hospitals awakened
assuming yet another routine day of rounds and
operations. When they were seventeen, did they

Read more »

Saturday, December 4, 2021

Lawrence Weiner (1942–2021)

more at Artforum:

Lawrence Weiner, a towering figure in the Conceptual art movement that arose in the 1960s, who profoundly altered the landscape of American art, died December 2 at the age of seventy-nine. Known for his text-based installations incorporating evocative or descriptive phrases and sentence fragments, typically presented in bold capital letters accompanied by graphic accents and occupying unusual sites and surfaces, Weiner rose to prominence among a cohort that included Robert Barry, Douglas Huebler, Joseph Kosuth, and Sol LeWitt. A firm believer that an idea alone could constitute an artwork, he established a practice that stood out for its consistent embodiment of his famous 1968 “Declaration of Intent”:

The artist may construct the piece.
The piece may be fabricated.
The piece need not be built.
Each being equal and consistent with the intent of the artist the decision as to condition rests with the receiver upon the occasion of receivership.

more here.

Mandy Patinkin on Stephen Sondheim

Leo Robson at The New Statesman:

“Stephen’s story is well documented, the pain of it. Now here he was writing a beautiful song for the mother and wanting to write the son’s part. I had a relationship with my mother that I don’t think was as difficult, it had a little more grace, but it was challenging nonetheless. Stephen and I came to the conclusion that we never made the connection in the way we were searching for it. We kept passing by each other like ships in the night. A few days later, he hands me my part of the mother’s song. He’d taken our conversation and poeticised it. I got to be a teeny tiny part of what he was trying to say for this character. He wrote the most beautiful love song of two human beings trying to reach each other. That was the highlight of my entire professional life.”

more here.

Competition Is Not the Cure

Brian Callaci in The Boston Review:

A little less than a decade ago, after spending several years as a union staffer helping workers organize in low-wage industries, I was assigned to conduct research in support of fast food workers on strike for a $15 minimum wage and a union. It was exciting; workers were making bold demands on some of the most powerful corporations in the country, including a wage increase to double the current level of the federal minimum wage. Too bold, in fact, for many. Democratic policymakers balked at $15. And with rare exceptions, the entire academic economics profession was opposed. Economists argued that if we forced employers to pay higher wages, they would simply hire fewer workers. A famous liberal economist even wrote a New York Times op-ed opposing $15. However, the Fight for $15 largely sidestepped the debates, so important to economists, about whether higher minimum wages would result in zero or nonzero job loss. Instead, it articulated a social vision of worker rights: workers had the right to the dignity of a living wage—their living in poverty was intolerable in a rich society.

Then a funny thing happened. Ignoring the experts, cities started passing $15 hourly minimum wage ordinances, and the economic sky didn’t fall. Unemployment didn’t skyrocket, or even rise perceptibly. The economists were wrong. Intellectually honest if initially mistaken, economists looked for theories that would better reflect reality, and a previously out-of-fashion theory known as “monopsony” became the new reigning conventional wisdom.
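For readers wondering how monopsony reconciles a higher wage floor with no job loss, a textbook sketch (my gloss, not anything from Callaci’s piece) runs as follows. A monopsonist facing an upward-sloping labor supply curve w(L) chooses employment to maximize profit:

\[
\max_L \; \big[\, p\, f(L) - w(L)\, L \,\big] \;\implies\; p\, f'(L) = w(L) + L\, w'(L) > w(L).
\]

Because each extra hire requires raising the wage for the whole workforce, the marginal cost of labor exceeds the wage, so the firm stops hiring short of the competitive level and pays less than workers’ marginal product. A minimum wage set anywhere between the monopsony wage and the competitive wage makes the marginal cost of labor flat at the mandated level over a range, so hiring more workers becomes profitable rather than costly; this is consistent with cities raising wages and seeing no rise in unemployment.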

More here.

A Century of Disappointment: Reappraising Neoliberalism

Yulia Gromova interviews Quinn Slobodian in Strelka:

Yulia Gromova: The founding fathers of neoliberalism—Friedrich Hayek, Ludwig von Mises, and others—as you describe in your book, created the basis for today’s global world order. They laid the foundation for institutions including the European Union and the World Trade Organization (WTO). What was particular about the Geneva School of neoliberalism?

Quinn Slobodian: What occurred to me was that capitalism hadn’t really had to deal with democracy until the twentieth century. It could reproduce itself and all of its inequalities by simply keeping some people out of the political picture, and denying them a voice in the political process.

At the beginning of the twentieth century, most of the world was in colonial status. The parts of the world that were in the metropole—the center of the empire—or that had independence, like the United States or the Latin American countries, did not have anything close to universal suffrage. Women in every case were still denied the right to vote. Allowing women to vote was tried experimentally in the Paris Commune in the 1870s, but it was quickly withdrawn, and outside of exceptional cases the franchise was only extended to women after the First World War. And in places like France, not until the Second World War.

So, the core question for Vienna school neoliberals in the 1920s and 1930s was first of all how to expand the voice within rich Western populations to include people without property and women. And then how to expand it beyond Europe to the former colonial countries of Asia, Africa, and parts of Latin America. And how to do that while preserving a system which has been proven to produce jaw-dropping inequalities between populations and parts of the world.

The first problem that they saw was decolonization. The dissolution of the Habsburg Empire after the First World War accompanied the end of the Russian and Ottoman Empires. It was the triumph of the nationality principle for the first time. That also came along with a certain idea of self-determination and popular sovereignty at the national level, which was more asserted than practiced until after the First World War.

More here.

Why Some People Find It Harder To Be Happy

Jolanta Burke in IFL Science:

The self-help industry is booming, fuelled by research on positive psychology – the scientific study of what makes people flourish. At the same time, the rates of anxiety, depression and self-harm continue to soar worldwide. So are we doomed to be unhappy, despite these advances in psychology?

According to an influential article published in Review of General Psychology in 2005, 50% of people’s happiness is determined by their genes, 10% depends on their circumstances and 40% on “intentional activity” (mainly, whether you’re positive or not). This so-called happiness pie put positive-psychology acolytes in the driving seat, allowing them to decide on their happiness trajectory. (Although the unspoken message is that if you are unhappy, it’s your own fault.)

The happiness pie was widely critiqued because it was based on assumptions about genetics that have since been discredited. For decades, behavioural genetics researchers carried out studies with twins and established that between 40% and 50% of the variance in their happiness was explained by genetics, which is why the percentage appeared in the happiness pie.

Behavioural geneticists use a statistical technique to estimate the genetic and environmental components based on people’s familial relatedness, hence the use of twins in their studies. But these figures assumed that both identical and fraternal twins experience the same environment when growing up together – an assumption that doesn’t really hold water. In response to the criticism of the 2005 paper, the same authors wrote a paper in 2019 that introduced a more nuanced view of the effect of genes on happiness, one which recognised the interactions between our genetics and our environment.
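To make the twin logic concrete, here is the classic Falconer decomposition that underlies estimates of this kind; this is the standard textbook sketch, not necessarily the exact model of the papers discussed above. Write r_MZ and r_DZ for the happiness correlations of identical and fraternal twin pairs, and h², c², e² for the shares of variance attributed to genes, shared environment, and everything else:

\[
r_{MZ} = h^2 + c^2, \qquad r_{DZ} = \tfrac{1}{2} h^2 + c^2
\]
\[
\implies\; h^2 = 2\,(r_{MZ} - r_{DZ}), \qquad c^2 = 2\, r_{DZ} - r_{MZ}, \qquad e^2 = 1 - r_{MZ}.
\]

With illustrative correlations of, say, 0.45 for identical twins and 0.20 for fraternal twins, this yields h² = 0.50, squarely in the 40–50% range cited above. The equal-environments assumption criticised in the passage is precisely what licenses treating the gap between the two correlations as purely genetic.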

Nature and nurture are not independent of each other. On the contrary, molecular genetics, the study of the structure and function of genes at the molecular level, shows that they constantly influence one another. Genes influence the behaviour that helps people choose their environment.

More here.

Who Said Science and Art Were Two Cultures?

Kevin Berger in Nautilus:

On a May evening in 1959, C.P. Snow, a popular novelist and former research scientist, gave a lecture before a gathering of dons and students at the University of Cambridge, his alma mater. He called his talk “The Two Cultures and the Scientific Revolution.” Snow declared that a gulf of mutual incomprehension divided literary intellectuals and scientists.

“The non-scientists have a rooted impression that the scientists are shallowly optimistic, unaware of man’s condition,” Snow said. “On the other hand, the scientists believe that the literary intellectuals are totally lacking in foresight, peculiarly unconcerned with their brother men, in a deep sense anti-intellectual, anxious to restrict both art and thought to the existential moment.”

Snow didn’t expect much of his talk. “I thought I might be listened to in some restricted circles,” he said. “Then the effect would soon die down.” It didn’t. Snow tapped a cultural fault line that continues to rumble to this day. In his 2018 book, Enlightenment Now, Harvard psychologist Steven Pinker wrote that “Snow’s argument seems prescient.” The “disdain for reason, science, humanism and progress has a long pedigree in elite intellectual and artistic culture.”

More here.