Catherine Wilson in Aeon:
Like many people, I am skeptical of any book, lecture or article offering to divulge the secrets of happiness. To me, happiness is episodic. It’s there at a moment of insight over drinks with a friend, when hearing a new and affecting piece of music on the radio, sharing confidences with a relative or waking up from a good night’s sleep after a bout of the flu. Happiness is a feeling of in-the-moment joy that can’t be chased and caught, and that never lasts very long.
But satisfaction with how things are going is different from happiness. Satisfaction has to do with the qualities and arrangements of life that make us want to get out of bed in the morning, find out what’s happening in the world, and get on with whatever the day brings. There are obstacles to satisfaction, and they can be, if not entirely removed, at least lowered. Some writers argue that satisfaction mostly depends on my genes, where I live and the season of the year, or how other people, including the government, are treating me. Nevertheless, psychology and the sharing of first-person experience acquired over many generations can actually help.
So can philosophy. The major schools of philosophy in antiquity – Platonism, Stoicism, Aristotelianism and, my favourite, Epicureanism – addressed the question of the good life directly. The philosophers all subscribed to an ideal of ‘life according to nature’, by which they meant both human and nonhuman nature, while disagreeing among themselves about what that entailed.
More here.

Robert Hockett in Forbes:
The cable news talking heads seem obsessed with Joe Biden’s “significant” lead over Donald Trump in the national polls – as if this lead signifies a certain coming Biden victory in the presidential election. Also feeding the narrative that Biden is likely to win are stories and film clips of millions of Americans standing in long lines to vote early in record numbers.
Since the last volume of “His Dark Materials” appeared in 2000, Pullman has written two more (superb) volumes of a second trilogy called “The Book of Dust”.
In 2002 the Cambridge astrophysicist and Astronomer Royal Lord Rees predicted that, by the end of 2020, “bioterror or bioerror will lead to one million casualties in a single event”. In 2017 the Harvard psychologist Steven Pinker took the other side in a formal bet. As the terms of the wager defined casualties to include “victims requiring hospitalization”, Rees had already won long before the global death toll of Covid-19 passed the one million mark in September. Sadly for him, the stake was a meagre $400.
The 1990s now seem as distant as the 1950s. America is famously polarized into cultural tribes, each regarding the other with contempt and alarm. In the White House sits a populist entertainer with little evident commitment to constitutional norms. Sober scholars publish books with titles such as How Democracies Die.
Silliness is not such a stretch as one would think, should one think of Wittgenstein. The ludic is at play, from conception, in the language-game itself; or the duck-rabbit as a vehicle of ambiguity, a perceptual “third thing,” a visual neologism: now ears … now beak, or “Both; not side-by-side, however, but about the one via the other.” Silliness is ruminative, i.e., we must look back over time for its sense to resonate. And it seeks a beyond. I consider the question of whether the two-headed calf should be counted as one animal or two—a serious moment for the Vatican Belvedere Gardens of 1625. Three, of course. “For the physicians, the feature distinguishing the individual was the brain; for followers of Aristotle, the heart,” wrote Carlo Ginzburg. And the dissection of the animal “was done with the aim of establishing not the ‘character’ peculiar to that particular animal, but ‘the common character’ … of the species as a whole.” A parallelism: Wittgenstein’s insistence that a student learn a new word by experiencing it through practice, being responsible to it and its usage, and eventually making the word meaningful for a greater understanding of her (whole) (linguistic) (yet shared, and seen) world.
When he died in 1677 at the age of forty-four, Spinoza left behind a compact Latin manuscript, Ethics, and his disciples quickly got it printed. It is a book like no other. It urges us to change our whole way of thinking and in particular to rid ourselves of the deep-rooted conceit that makes us imagine that our petty, transient lives might have some ultimate significance. The argument is dynamic, disruptive and upsetting, but it is contained within a literary framework of extraordinary austerity, comprising an array of definitions, axioms, propositions, proofs and scholia. The overall effect is not so much beautiful as sublime, like watching a massive fortress being shaken by storms and earthquakes. Many readers have found themselves deeply moved by Ethics but unable to say exactly what it means.
Remarkably, in these fractious times, President Trump has managed to forge a singular area of consensus among liberals and conservatives, Republicans and Democrats: Nearly everyone seems to agree that he represents a throwback to a vintage version of manhood. After Mr. Trump proclaimed his “domination” over coronavirus, saluting Marine One and ripping off his mask, the Fox News host Greg Gutfeld cast the president in cinematic World War II terms — a tough-as-nails platoon leader who “put himself on the line” rather than abandon the troops. “He didn’t hide from the virus.”
American society is prone, political theorist Langdon Winner wrote in 2005, to “technological euphoria,” each bout of which is inevitably followed by a period of letdown and reassessment. Perhaps in part for this reason, reviewing the history of digital democracy feels like watching the same movie over and over again. Even Winner’s point has that quality: He first made it in the mid-eighties and has repeated it in every decade since. In the same vein, Warren Yoder, longtime director of the Public Policy Center of Mississippi, responded to the Pew survey by arguing that we have reached the inevitable “low point” with digital technology—as “has happened many times in the past with pamphleteers, muckraking newspapers, radio, deregulated television.” (“Things will get better,” Yoder cheekily adds, “just in time for a new generational crisis beginning soon after 2030.”)