Suffering, not just happiness, weighs in the utilitarian calculus

Scott Samuelson in Aeon:

In 1826, at the age of 20, John Stuart Mill sank into a suicidal depression, which was bitterly ironic, because his entire upbringing was governed by the maximisation of happiness. How this philosopher clambered out of the despair generated by an arch-rational philosophy can teach us an important lesson about suffering. Inspired by Jeremy Bentham’s ideals, James Mill subjected his son to a rigorous tutelage in which every subject was subordinated to the utilitarian goal of bringing about the greatest good for the greatest number. Music played a small part in the curriculum, as it was sufficiently mathematical – an early ‘Mozart for brain development’. Otherwise, subjects useless to material improvement were excluded. When J S Mill applied to Cambridge at the age of 15, he’d so mastered law, history, philosophy, economics, science and mathematics that the university turned him away because its professors had nothing more to teach him.

The young Mill soldiered on with efforts for social reform, but his heart wasn’t in it. He’d become a utilitarian machine with a suicidal ghost inside. With his well-tuned calculative abilities, the despairing philosopher put his finger right on the problem:

[I]t occurred to me to put the question directly to myself: ‘Suppose that all your objects in life were realised; that all the changes in institutions and opinions which you are looking forward to could be completely effected at this very instant: would this be a great joy and happiness to you?’ And an irrepressible self-consciousness distinctly answered: ‘No!’ At this my heart sank within me: the whole foundation on which my life was constructed fell down.

For most of our history, we’ve seen suffering as a mystery, and dealt with it by placing it in a complex symbolic framework, often one in which this life is conceived as a testing ground. In the 18th century, the mystery of suffering becomes the ‘problem of evil’: for utilitarian reformers, pain and misery turn into clear-cut refutations of God’s goodness. As Mill says of his father: ‘He found it impossible to believe that a world so full of evil was the work of an Author combining infinite power with perfect goodness and righteousness.’

For a utilitarian, the idea of worshipping the creator of suffering is not only absurd; it undercuts the purpose of morality. It channels our energies toward the acceptance of what we should remedy. To revere the natural order could even turn us into moral monsters. Mill says: ‘In sober truth, nearly all the things which men are hanged or imprisoned for doing to one another, are nature’s every day performances.’ What Mill calls the ‘Religion of Humanity’ involves pushing aside the old conception of God and taking over responsibility for what happens in the world. We’re to become the good architect that God never was.

More here.

CRISPR takes on Huntington’s disease

Michael Eisenstein in Nature:

Like most other neurological disorders, Huntington’s disease has proved to be a costly and frustrating target for drug developers. But it also has distinctive features that make it a good match for treatments that target genes. It arises from a mutation in a single gene that encodes the protein huntingtin, and a disease-causing copy of the gene can be readily distinguished from a normal copy by the presence of an overlong stretch of a repeated triplet of nucleotides, CAG. Before turning to CRISPR, Davidson and her colleagues had some success in treating animal models of Huntington’s disease with RNA interference (RNAi), which uses synthetic molecules of RNA to prevent the production of mutant huntingtin — although it took them a considerable amount of time to get there. “We’ve focused the last 17 years on RNAi-based approaches,” says Davidson. However, both this and a promising related treatment for Huntington’s disease that involves antisense oligonucleotides will probably require long-term, repeated administration to provide sustained benefits.

By contrast, CRISPR could achieve the same benefits through a single dose that permanently inactivates the defective gene with remarkable efficiency, as Davidson’s team demonstrated last year, both in cells from people with Huntington’s disease and in mouse models of the condition. “I was surprised how easy it was — I think that’s the beauty of the system,” she says. In the past five years, several teams of researchers have independently shown that genome editing can reliably eliminate the gene that encodes mutant huntingtin, thereby halting the production of the toxic protein and its accumulation into clumps in experimental models.

But clearing protein clumps in mice is of questionable value when researchers often struggle to translate such findings into treatments for people — in general, potential therapies for brain disorders have a long history of failure and disappointment in clinical trials. Accordingly, the early adopters of CRISPR are trying to obtain clearer evidence of its probable clinical benefits while grappling with thorny questions related to its safety, efficacy and delivery that it is crucial to answer before trials in people can take place. “I believe we can now seriously consider clinical strategies to edit huntingtin,” says Nicole Déglon, a neurologist at the Lausanne University Hospital in Switzerland, “but I would say we are still at the very beginning of the story.”
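The CAG-repeat signature described above is simple enough to sketch in code. The following toy Python example is not from the article: it assumes a cutoff of 36 uninterrupted CAG repeats (a commonly cited clinical threshold for Huntington’s disease) and treats the allele as a plain string, whereas real genotyping measures repeat length by DNA fragment sizing or sequencing.

```python
import re

# Commonly cited clinical threshold: alleles with 36 or more
# uninterrupted CAG repeats in the huntingtin gene are considered
# disease-associated.
PATHOGENIC_REPEATS = 36

def longest_cag_run(seq: str) -> int:
    """Length, in repeats, of the longest uninterrupted CAG run."""
    runs = re.findall(r"(?:CAG)+", seq.upper())
    return max((len(run) // 3 for run in runs), default=0)

def classify(seq: str) -> str:
    n = longest_cag_run(seq)
    status = ("expanded (disease-associated)"
              if n >= PATHOGENIC_REPEATS else "normal range")
    return f"{n} CAG repeats: {status}"

# Toy alleles: one in the normal range, one expanded.
print(classify("CAG" * 20))  # 20 CAG repeats: normal range
print(classify("CAG" * 45))  # 45 CAG repeats: expanded (disease-associated)
```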

More here.

Thursday Poem

To Have My Sister Back

Deeba, did you know 
I went to your room yesterday 
looking for you?

The room was so dark 
I thought you were sleeping.
I tiptoed, 
whispered your name in a hanged man’s voice 
– Deeba – Deeba – Deeba.
But you did not reply.
I drew the curtains 
to catch your eyes in the light, 
only to be disappointed.

You were not there 
only everything else that was yours: 
saris in the alna 
lipstick, hair clips, hair brush 
on the dressing table, 
finger marks on the mirror,
shadowy, without intent, 
tablas on the almira 
tanpura and sitar leaning 
on the side of your bed.

It looked as though 
nothing visits your room 
but the pungent dust 
of the growing city 
that is trying to claim 
hold of your belongings.

An overpowering silence in the room, 
so overpowering in fact that 
I could hear your invisible hands 
still tapping the tablas you revered, 
so I picked up the tanpura 
and pulled its string 
to bring back 
a little melody
to the room 
that died with you.

The Defeat of Reason

Tim Maudlin in the Boston Review:

People are gullible. Humans can be duped by liars and conned by frauds; manipulated by rhetoric and beguiled by self-regard; browbeaten, cajoled, seduced, intimidated, flattered, wheedled, inveigled, and ensnared. In this respect, humans are unique in the animal kingdom.

Aristotle emphasizes another characteristic. Humans alone, he tells us, have logos: reason. Man, according to the Stoics, is zoön logikon, the reasoning animal. But on reflection, the first set of characteristics arises from the second. It is only because we reason and think and use language that we can be hoodwinked.

Not only can people be led astray, most people are. If the devout Christian is right, then committed Hindus and Jews and Buddhists and atheists are wrong. When so many groups disagree, the majority must be mistaken. And if the majority is misguided on just this one topic, then almost everyone must be mistaken on some issues of great importance. This is a hard lesson to learn, because it is paradoxical to accept one’s own folly. You cannot at the same time believe something and recognize that you are a mug to believe it. If you sincerely judge that it is raining outside, you cannot at the same time be convinced that you are mistaken in your belief. A sucker may be born every minute, but somehow that sucker is never oneself.

The two books under consideration here bring the paradox home, each in its own way. Adam Becker’s What Is Real? chronicles the tragic side of a crowning achievement of reason, quantum physics. The documentarian Errol Morris gives us The Ashtray, a semi-autobiographical tale of the supremely influential The Structure of Scientific Revolutions (1962) by Thomas S. Kuhn. Both are spellbinding intellectual adventures into the limits, fragility, and infirmity of human reason.

More here.

All life on Earth, in one staggering chart

Brian Resnick and Javier Zarracina in Vox:

By weight, human beings are insignificant.

If everyone on the planet were to step on one side of a giant balance scale, and all the bacteria on Earth were to be placed on the other side, we’d shoot violently upward. That’s because all the bacteria on Earth combined are about 1,166 times more massive than all the humans.

Comparisons to other categories of life similarly demonstrate how very, very small we are. As a sweeping new study in the Proceedings of the National Academy of Sciences finds, in a census sorting all the life on Earth by weight (measured in gigatons of carbon, the signature element of life on Earth), we make up less than 1 percent of life.

There are an estimated 550 gigatons of carbon of life in the world. A gigaton is equal to a billion metric tons. A metric ton is 1,000 kilograms, or about 2,200 pounds.

We’re talking in huge, huge, mind-boggling terms here.

So, using the new data in PNAS, we tried to visualize the weight of all life on Earth to get a sense of the scale of it all.
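To make the arithmetic concrete, here is a minimal Python sketch using rounded headline figures from the PNAS census discussed above; treat the numbers as approximate estimates, and note that the small gap with the article’s “1,166 times” comes from rounding.

```python
# Rounded biomass estimates (gigatons of carbon) from the PNAS census
# discussed above; approximate headline figures only.
biomass_gt_c = {
    "plants": 450,
    "bacteria": 70,
    "fungi": 12,
    "archaea": 7,
    "animals": 2,
    "humans": 0.06,
}
TOTAL_GT_C = 550  # the study's headline total for all life on Earth

# Bacteria vs. humans: ~1,167x with these rounded inputs; the article's
# "about 1,166 times" reflects the study's unrounded estimates.
print(f"bacteria / humans: {biomass_gt_c['bacteria'] / biomass_gt_c['humans']:,.0f}x")

# Humans as a share of all life: ~0.011%, i.e. well under 1 percent.
print(f"human share: {biomass_gt_c['humans'] / TOTAL_GT_C:.3%}")

# Unit check: 1 gigaton = 1e9 metric tons, 1 metric ton = 1,000 kg.
KG_PER_GIGATON = 1e9 * 1_000
print(f"human biomass: {biomass_gt_c['humans'] * KG_PER_GIGATON:.1e} kg of carbon")
```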

More here.

Unions are too vital to democracy to be allowed to gentrify and die

Kenan Malik in The Guardian:

Two reports last week exposed both the changing character of the labour market and the degree to which the power of the organised working class has eroded.

The Office for National Statistics revealed that there were just 79 strikes (or, more specifically, stoppages) last year, the lowest figure since records began in 1891. Just 33,000 workers were involved in labour disputes, the lowest number since 1893. Victorian conditions have returned in more ways than one.

It’s not just the number of strikes that has fallen. Trade union membership has too. The latest figures from the Department for Business, Energy and Industrial Strategy show that just 23.2% of employees were unionised in 2017, around half the level of the late 1970s.

The fall has been greatest among the young. The proportion of union members under 50 has fallen over the past 20 years, while that above 50 has increased.

Strikingly, too, unions have increasingly become clubs for professionals. One in five employees works in a professional job, but professionals make up almost 40% of union members. These days, you are twice as likely to be unionised if you have a degree as if you have no qualifications. It’s a far cry from the old image of the trade unionist as an industrial worker. Unions have not just shrunk – their very character has changed. Like politics, trade unionism has become more professional and technocratic.

More here.

The Man Behind the Weegee

Christopher Bonanos at The Paris Review:

Usher Fellig was a greenhorn, a hungry shtetl child from eastern Europe who spoke no English. When he came through Ellis Island in 1909, at ten years old, he reinvented himself, as so many immigrants do. In his first years in New York, Usher became Arthur, a Lower East Side street kid who was eager to get out of what he called “the lousy tenements,” earn a living, impress girls, make a splash. He had made his name (slightly) less Jewish and his identity (somewhat) more American, as much as he could. As a young man, he was shy, awkward, broke, and unpolished, and at fourteen, he became a seventh-grade dropout. He was also smart, ambitious, funny, and (as he and then his fellow New Yorkers and eventually the world discovered) enormously expressive when you put a camera in his hands.

more here.

donald glover’s ‘atlanta’

Niela Orr at The Baffler:

The show’s visual grammar reflects this anxiety and paranoia. Murai’s choice to use film is a potent reminder that this is still fiction, but the choice makes it all the more real—not only because film adds texture, but also because when the celluloid occasionally pops, that jumpiness contributes to the show’s overall tone. Late night, after the strip club gets out, there’s an Edward Hopper tableau of small, illuminated spaces engulfed in darkness. The introduction of magical realism is straight out of Gabriel García Márquez, only Atlanta is a black Macondo made famous by Magic City, and instead of a woman floating up into the sky, there’s a disappearing club promoter and an invisible sports car.

more here.

Wednesday Poem

XX. In Which the Cartographer Tells Off the Rastaman

The cartographer sucks his teeth
and says – every language, even yours,
is a partial map of this world – it is
the man who never learnt the word
‘scrupe’ – sound of silk or chiffon moving
against a floor – such a man would not know
how to listen for the scrupe of his bride’s dress.
And how much life is land to which
we have no access? How much
have we not seen or felt or heard
because there was no word
for it – at least no word we knew?
We speak to navigate ourselves
away from dark corners and we become,
each one of us, cartographers.

by Kei Miller
from The Cartographer Tries to Map a Way to Zion
Carcanet, Manchester, 2014

When an extraordinary collection of talent gathered at the White House

Thomas Oliphant in The Washington Post:

The credit for arguably the best idea ever for a White House event goes to the fertile brain of Richard Goodwin. In a note to Jacqueline Kennedy in November 1961, the president’s aide was to the point: “How about a dinner for the American winners of the Nobel Prize?” Within six months, the largest social event of the New Frontier had occurred. By contemporary accounts, it was a smash. And it has resonated through the decades as a symbol of what that “one brief, shining moment” was capable of on its best days, and of the impact a White House can have on American culture and the creative minds who inhabit it. Comparisons to the disgusting atmosphere of the present are obvious.

John Kennedy was pleased, but not entirely. According to his and his wife’s friend, the artist William Walton, the president called him later to complain about the woman who had been seated on his right, Ernest Hemingway’s fourth wife and widow of a year, Mary, who gave him repeated guff for his Cuba policy; Kennedy was more impressed by the dignified woman on his left, Katherine Marshall, the widow of George C. Marshall, the World War II commander and architect of the postwar reconstruction plan that bore his name. “Well, your friend Mary Hemingway is the biggest bore I’d had for a long time,” Walton quoted JFK as saying. “If I hadn’t had Mrs. Marshall I would have had a terrible night.” Walton couldn’t say much in reply: Mary Hemingway was right next to him in his Georgetown home when the call came.

According to author Joseph A. Esposito — whose “Dinner in Camelot” is a delightful, detailed account of the dinner, its background, its repercussions and its lasting meaning — the 127 seated guests (at 19 tables, 14 in the State Dining Room and five more in the Blue Room, where Mrs. Kennedy was located) included 49 Nobel laureates and spouses. The vast majority had toiled in the hard sciences; for the April 29, 1962, affair, the list expanded to include a sprinkling of Latin American luminaries and one Canadian, Lester Pearson.

He was on his way to becoming prime minister after getting a Nobel Peace Prize in 1957 for his efforts as foreign minister to end the fighting over the Suez Canal. The list also included several bright lights in the emerging field of American letters. Of the five U.S. winners of the literature prize, three were dead (Sinclair Lewis, Eugene O’Neill and Hemingway); William Faulkner, in his final weeks and under care in Charlottesville, said he thought 100 miles was a bit too far to go for supper; and John Steinbeck, who got his Nobel later that year, pleaded business in Europe. T.S. Eliot, a St. Louis native and eventually a British subject, wasn’t invited; but Pearl Buck was there and briefly debated Korea policy with Kennedy before the post-supper program in the East Room.

More here.

Why Professors Distrust Beauty

Charlie Tyson in the Chronicle of Higher Education:

How easy it would be, a young Susan Sontag considered, to slide into academic life. Make good grades, linger in graduate school, write “a couple of papers on obscure subjects that nobody cares about, and, at the age of sixty, be ugly and respected and a full professor.” Nearly 70 years after this journal entry, the word “ugly,” planted so squarely in the sentence’s final phrase, still stings.

Academe styles itself as the aristocracy of the mind; it is generally disdainful of the body and of the luxury goods commercial society finds beautiful. In short, professors distrust beauty. The preening self-abasement with which they do so, Stanley Fish wrote in the 1990s, is why academics take pride in driving Volvos.

In truth, beauty’s conflicted status among academics probably derives less from the elevation of mind over body and more from the long exclusion of women from the professoriate. For most of the 20th century, to be a professor was to be male, and therefore theoretically unsexed, and thus seemingly exempt from the female-gendered standards of the fashion industry and mass entertainment. Female academics face a double bind: look attractive and you seem unserious; look homely and you seem dour. Male academics, for their part, loll in ink-stained corduroys and rumpled shirts. The fashion-conscious few adopt intellectual aesthetics: riffs on Foucault, with black turtlenecks, sleek shaved heads, and big plastic glasses thrown in for good measure.

That academics encounter beauty in their private lives as a mystifying or corrupted alien force was a cliché by the time Fish cast his eye on the faculty parking lot. Yet the inconsistent treatment beauty has received in scholarly research demands explanation. In the humanities, beauty is ignored or seen as a vague embarrassment, and in the social sciences the topic is treated only superficially. If beauty remains a serious subject of study anywhere, it is in the sciences, certain corners of which have enlisted beauty as an organizing ideal.

More here.

Why some scientists say physics has gone off the rails

Dan Falk at NBC News:

The 20th century was a remarkably productive one for physics. First, Albert Einstein’s theory of general relativity helped us view gravity not as a force but as a distortion of space. Then Max Planck, Erwin Schrödinger and Werner Heisenberg gave us quantum mechanics — and a fresh understanding of the subatomic world.

In the middle of the century, two new forces were discovered deep within the atom (the strong and weak nuclear forces). Finally, in the century’s last decades, we got the Standard Model of particle physics — an accounting of all the particles and forces known to exist in our universe.

But the new century brought a rough patch. Yes, there have been some remarkable findings, including the 2012 discovery of the Higgs boson and the discovery of gravitational waves four years later. But those triumphs were based on theories developed decades earlier — a full century earlier in the case of gravitational waves. And new ideas like string theory (which holds that matter is made up of tiny vibrating loops of energy) remain unverified.

More here.

Meet the Economist Behind the One Percent’s Stealth Takeover of America

Lynn Parramore at the Institute for New Economic Thinking:

Ask people to name the key minds that have shaped America’s burst of radical right-wing attacks on working conditions, consumer rights and public services, and they will typically mention figures like free-market champion Milton Friedman, libertarian guru Ayn Rand, and laissez-faire economists Friedrich Hayek and Ludwig von Mises.

James McGill Buchanan is a name you will rarely hear unless you’ve taken several classes in economics. And if the Tennessee-born Nobel laureate were alive today, it would suit him just fine that most well-informed journalists, liberal politicians, and even many economics students have little understanding of his work.

The reason? Duke historian Nancy MacLean contends that his philosophy is so stark that even young libertarian acolytes are only introduced to it after they have accepted the relatively sunny perspective of Ayn Rand. (Yes, you read that correctly). If Americans really knew what Buchanan thought and promoted, and how destructively his vision is manifesting under their noses, it would dawn on them how close the country is to a transformation most would not even want to imagine, much less accept.

More here.  [Thanks to Asad Raza.]

Imran Khan’s rise is a metaphor for a changing world the west has failed to see

Jason Burke in The Guardian:

It is election season in Pakistan. Expect massive rallies, dust, shouted slogans in stadiums, dirty tricks, a modicum of violence and industrial quantities of sweet tea consumed by candidates and voters alike.

The frontrunner in the poll is Imran Khan, the cricketer turned politician. Now 65, Khan has been on the stump for two decades. This is a long time in politics. At one of his first major rallies, in his hometown of Lahore in 1998, I stood close enough to read his speech over his shoulder. The first line on the first page read: “Believe in Pakistan.” I was sceptical of his prospects and my report was headlined No Khan Do.

Now the top job in one of the world’s most troubled, resilient and strategically important nations could soon be his. The story of how this happened contains a lesson for us all. Khan has attracted much attention in western media over the years, much of it for the wrong reasons. His sporting prowess, playboy reputation and marriage to and divorce from socialite heiress Jemima Goldsmith fuelled tabloid fascination. His midlife turn to religion, conservative values and political ambitions attracted more serious analysis. But what I, like most others, long missed was that Khan was ahead of his time, not behind it.

More here.

Tuesday Poem

Hurricane

Four tickets left, I let her go—
Firstborn into a hurricane.

I thought she escaped
The floodwaters. No—but her

Head is empty of the drowned
For now—though she took

Her first breath below sea level.
Ahhh       awe       &       aw
Mama, let me go—she speaks

What every smart child knows—
To get grown you unlatch

Your hands from the grown
& up & up & up & up
She turns—latched in the seat

Of a hurricane. You let
Your girl what? You let

Your girl what?
I did so she do I did
so she do so—

Girl, you can ride
A hurricane & she do
& she do & she do & she do

She do make my river
An ocean. Memorial,
Baptist, Protestant birth—my girl

Walked away from a hurricane.
& she do & she do & she do & she do
She do take my hand a while longer.

The haunts in my pocket
I’ll keep to a hum: Katrina was
a woman I knew. When you were

an infant she rained on you & she

do & she do & she do & she do

by Yona Harvey
from Hemming the Water
Four Way Books

Patrick Melrose, a nuanced portrait of addiction

Lucinda Smyth at Prospect Magazine:

What’s especially impressive about this adaptation is not only that it is enjoyable, but that it directly confronts the problem of addiction without glamorising it. Other dramas about womanising addicts—for example Mad Men or Californication—tend to focus on the debauchery as much as the interior consequences; this makes it easy for the viewer to avoid processing the trauma. Watching Mad Men’s Don Draper pour himself yet another glass of whiskey, having hit yet another rock bottom, doesn’t quell the desire (provoked by the sexy depiction of the world of the programme) to reach for a cigarette and a whiskey yourself. Even in the BBC’s Sherlock, the detective’s opium habit is seen as a facet of his genius: a necessary method for him to open the doors of perception. There is something beguiling and (literally) intoxicating about watching someone dance so close to the cliff edge. Indulging this behaviour onscreen can make it seem attractive rather than repellent.

more here.

Chimamanda Ngozi Adichie Comes to Terms with Global Fame

Larissa MacFarquhar at The New Yorker:

Around the time that Imasuen was getting yelled at by his mother, the author of “Purple Hibiscus,” Chimamanda Ngozi Adichie, who is now regarded as one of the most vital and original novelists of her generation, was living in a poky apartment in Baltimore, writing the last sections of her second book. She was twenty-six. “Purple Hibiscus,” published the previous fall, had established her reputation as an up-and-coming writer, but she was not yet well known.

Although there had been political violence in the background of her first book, she had written it as a taut, enclosed story of one family; her second, “Half of a Yellow Sun,” would be much larger. She was constructing a story of symphonic complexity, with characters from all over Nigeria and many levels of society, twisted together by love and the chance encounters of refugees.

more here.

Patrick Heron and British abstract art

Michael Prodger at the New Statesman:

Heron the critic was also one of the figures responsible for introducing the American abstract expressionists to a British audience. Although he was later to accuse Pollock, de Kooning et al (but not Rothko) of cultural imperialism and of not being painterly enough, the scale of their pictures, their rhythm, saturated palette and insistence that each part of the canvas was as important as every other had a profound effect on him. Heron believed that “Painting is thinking with one’s hand; or with one’s arm; in fact with one’s whole body,” and the epic AbEx works represented his aphorism in action.

Heron himself had not started as an abstract painter but he shared what Gillian Ayres defined as the postwar generation’s obsession with “what can be done with painting”.

more here.