99 Exercises in Style

by John Allen Paulos

Raymond Queneau was a French novelist, poet, mathematician, and co-founder of the Oulipo group, about which I wrote here last year. The group is primarily composed of French writers, mathematicians, and academics and explores the use of mathematical and quasi-mathematical techniques in literature. Their work is funny, experimental, weird, and thought-provoking.

A reader of my piece recently suggested I read Queneau’s 99 Exercises in Style. I’d read a lot of Oulipo’s writings, but never this collection of 99 retellings of the same simple anecdote. Each version is in a different style, if we interpret “style” loosely to encompass just about any variation in the telling. In almost all of the 99 variations, the narrator gets on a bus in Paris and sees an argument between a long-necked man in an unusual hat and another passenger. Elsewhere, and a while later, the narrator sees the same man speaking with someone about a button on his coat.

Queneau’s entries are clever, varied, and, taken in their entirety, constitute a brilliant tour de force. The basic anecdote is banal, but that’s the point. Banality varied and repeated is far from banal. On a long overnight flight with nothing much to do, I was inspired by the entries to try my hand at writing a few such short pieces of my own rather than cite a sample of them from the book, many of which, incidentally, are only a half dozen sentences long. So here goes my comparatively feeble quota of stories à la Queneau, which are based on a different anecdote and set in Philadelphia. I hope other readers might be tempted to play with the idea as well, perhaps with the help of ChatGPT. Read more »

The Constructive Culture of Gen X Cynicism

by Mindy Clegg

One of many Gen X memes about our aloof worldview, with Ally Sheedy in her role from the John Hughes film The Breakfast Club.

In a thread on a Gen X subreddit, a poster named QueenShewolf wondered about the truth of Gen X cynicism, as her own Gen X siblings seemed far less “cynical and disconnected” than herself, a millennial. Some responded that they were certainly cynical; others felt they were merely realistic. This more cynical, or realistic, worldview was driven by a skepticism rooted in their experiences growing up in the 70s and 80s. Many expressed distrust of institutions because of growing up during Watergate and the Reagan administration. Cynicism has its uses, according to the posters in the thread. For one, being at least a bit cynical about the world around you can help you engage with it more critically. But is too much cynicism dangerous for making positive social change? Has Gen X’s adoption of cynicism meant that our generation has been less engaged in larger social and political life, leaving it to other generations to shape? While there is some truth in the charges of ironic cynicism, some of that has been imposed on us from without. I argue that Gen X has made some positive contributions to our culture that are often overlooked. Our generation might be more cynical, but we also helped to create durable non-commercial, and even anti-commercial, culture, despite our small numbers.

The term Generation X came to describe the generation born from the mid-60s to the early 80s. The term was popularized by Douglas Coupland, a baby boomer. Paula Gottula Miles said that he settled on the “X” to illustrate how this generation had no great existential experiences unifying them “unlike previous generations.” Instead, “they do have as a unifying childhood experience … a phenomenal rise in the divorce rates, and a national debt that went from the millions to the billions.” Such shared experiences (which are apparently less meaningful than wars or political assassinations) meant they shared a common cynicism, as illustrated by his work.1 But I would argue that Coupland ignored major events that had a formative impact, such as the Reagan-era weapons build-up, the end of the Cold War itself, and several major conflicts that saw major acts of genocide (Yugoslavia and Rwanda). Read more »

On Outgrowing Books

by Mary Hrovat

There are many reasons to want to own a book. Most of them are straightforward: you can’t get it at the library, or you read the library’s copy and want your own copy to re-read or consult; you need it on hand as a reference; or it’s so beautiful that you want to be able to look at it whenever you like.

There are also more nebulous reasons for owning books that have to do with hopes, memories, and self-image—perhaps even illusions. I held onto my college textbooks for years after I graduated, even though I didn’t use my degree in my job. I finally got rid of most of them; space is always tight, and textbooks are typically bulky. Overall I suppose it was a good decision, but for years I missed my calculus textbook. I didn’t miss it for any practical reason. I never used calculus. I missed being the kind of person who needs to use calculus. I could probably pick up a calculus textbook easily enough at one of the used book sales held in my town, but there’s another reason I miss that book: It was mine. I carried it to class for four semesters; I worked the problems. It belonged to a past life that I still miss sometimes.

I’ve heard people use the term weeding to refer to the removal of books from library collections. In the collection development course I took in library school, I was taught to use the more formal terms deaccession and deaccessioning. The word withdrawal is also used. I agree that the word weeding isn’t suitable, not because it’s informal but because it suggests that the books being removed were never assets to the collection. I think pruning is better; when I decide to give away (or very occasionally throw away) some of my books, I’m removing things that no longer fit well enough to merit shelf space. Read more »

Simple Pleasures, Complicated Times

by David Greer

One of nature’s sweetest simple pleasures: a wild strawberry at the peak of perfection. David Greer photo.

In the mid-1990s, I asked an assortment of people to describe their favorite simple pleasures. As I expected, the responses were wide-ranging, everything from feelings of accomplishment (perfecting a Chopin étude on the piano) to intimate moments (nursing a baby) to bringing order to chaos (the zen of vacuuming). There were innovative comforts (storing sheets in the freezer for hot summer nights) and quirky delights (reading Hegel at 3 am at the all-night diner) and uninhibited messy moments (eating watermelon shirtless on a sweltering day with juice and seeds dribbling down your chest).

I remember being struck both by how pleased interviewees were to be asked about their pleasures, which they often described eloquently and at length, and by what often seemed like faint embarrassment, as if there was something a little improper about feeling so attached to experiences that weren’t somehow useful or productive. Disapproval of the deliberate pursuit of pleasure is nothing new. The Athenian philosopher Epicurus built an entire treatise around the pursuit of pleasure in the third century BCE. He was roundly criticized for it then, and Epicureanism, though it has attracted followers for more than two thousand years, is still widely dismissed as a celebration of unrestrained hedonism. Not only could nothing be further from the truth, but Epicureanism may be the philosophy best equipped for the navigation of our present age of high anxiety. More on that later. Read more »

Invasive Mammal Found In North Country For First Time

by Mike Bendzela

It took a couple of million years, but any careful observer could have seen it coming.

Dispersal of Homo species over time.

One of the most destructive and invasive mammalian species in the world has been seen striding across the continent.

This primate was recently spotted at a watering hole in the northern territories, confirming its presence in the area for the first time.*

Until recently, the primate had penetrated only as far as the margins of the central mountains. It is believed to have originated on the eastern African continent.

Genus Homo is a bipedal, diurnal, omnivorous and predatory mammal that can survive for decades.

This new ape can form encampments on hill and in dale; in forests, prairies, or deserts; on tundra and on islands. It establishes semi-permanent colonies that feed on surrounding flora and fauna until they are depleted. Then it moves on.

Even though it produces small litters and spaces out births by years, it can go from hundreds to thousands to millions to billions of individuals in a geologically brief time.

Many characteristic traits of Homo set it apart from the other primates. Therewith, it dismantles the natural checks on its existence, one by one. Read more »

Monday, May 29, 2023

The Problem With Stories

by Thomas R. Wells

Human minds run on stories, in which things happen at a human scale and for humanly meaningful reasons. But the actual world runs on causal processes, largely indifferent to humans’ feelings about them. The great breakthrough in human enlightenment was to develop techniques – empirical science – that allow us to grasp the real complexity of the world and to understand it in terms of the interaction of mindless (or at least unintentional) processes rather than humanly meaningful stories of, say, good vs evil. Hence, for example, the objectively superior neo-Darwinian account of adaptation by natural selection that has officially displaced premodern stories about human-like but bigger (‘God’) agents creating the world for reasons we can make sense of.

Science flourishes still, demonstrating the possibility for human minds to escape the fairy-tale epistemology that we have inhabited for tens of thousands of years and to inquire systematically into the world, or at least to benefit from the work of those who do. But, as the evolution example illustrates, stories continue to exert a powerful psychological hold over human minds. Although the US is one of the most educated societies in the world, surveys routinely find that only around a third of its adults accept the scientific account of evolution. Despite their deficiencies, stories continue to dominate our minds, and hence the world that we build together with our minds via politics. From our thinking on the economy to identity politics to Covid to climate change to climate-change activism, stories continue to blind us to reality and to generate mass conflict and stupidity. Read more »

Oppenheimer III: “Oppenheimer seemed to me, right from the beginning, a very gifted man.”

by Ashutosh Jogalekar

“Oppenheimer, Julius Robert”, by David A. Wargowski, December 7, 2018

This is the third in a series of posts about J. Robert Oppenheimer’s life and times. All the others can be found here.

In 1925, there was no better place to do experimental physics than Cambridge, England. The famed Cavendish Laboratory there had been created in 1874 with funds donated by a relative of the eccentric scientist-millionaire Henry Cavendish. It had been led by James Clerk Maxwell and J. J. Thomson, both physicists of the first rank. In 1924, the booming voice of Ernest Rutherford reverberated in its hallways. During its heyday and even beyond, the Cavendish would boast a record of scientific accomplishments unequalled by any other single laboratory before or since; the current roster of Nobel Laureates associated with the institution stands at thirty. By the 1920s, Rutherford was well on his way to becoming the greatest experimental physicist in history, having discovered the laws of radioactive transformation, the atomic nucleus, and the first example of artificially induced nuclear reactions. His students, half a dozen Nobelists among them, would include Niels Bohr – one of the few theorists the string-and-sealing-wax Rutherford admired – and James Chadwick, who discovered the neutron.

Robert Oppenheimer returned to New York in 1925, after a vacation in New Mexico, to disappointment. While he had been accepted into Christ’s College, Cambridge, as a graduate student, Rutherford had rejected his application to work in his laboratory in spite of – or perhaps because of – the recommendation letter from his undergraduate advisor, Percy Bridgman, which painted a lackluster portrait of Oppenheimer as an experimentalist. Instead, it was recommended that Oppenheimer work with the physicist J. J. Thomson, a Nobel Laureate who had discovered the electron in 1897 but who, by 1925, was well past his prime. Oppenheimer sailed for England in September. Read more »

The Erudite Philistine

by Ada Bronowski

One of the amusing things about academic conferences – for a European – is to meet with American scholars. Five minutes into an amicable conversation, an American scholar will inevitably confide to a European one of two complaints: either how all their fellow American colleagues are ‘philistines’ (a favourite term) or (but sometimes and) how taxing it is to be always called out as an ‘erudite’ by said fellow countrymen. As Arthur Schnitzler demonstrated in his 1897 play Reigen (better known through Max Ophüls’s 1950 film version, La Ronde), social circles close quickly in a confined space; and so, soon enough, by the end of day two of the conference, by pure mathematical calculation, as Justin Timberlake sings, ‘what goes around, comes around’: all the Americans in the room turn out to be both philistines and erudite.

A contradiction in terms? Not so fast and not so sure. Firstly, such confidences are made to the European in the room, identified as anyone from the old continent whose mother tongue is not English and who is thereby draped with the aura of natural bilingualism – the key to culture. Like Theseus’ ship, the European is both up-to-date and very, very old, and is therefore involuntarily tingling from direct contact with centuries of history and civilisation. It is a position which, from the point of view of the American colleague, is not without its own contradictions: for if the oceanic separation awakens a nagging inferiority complex in the erudite philistine, who fills the distance with fantasies of the old world and the riches of Culture, which (white) America has been lusting over from The Bostonians to Indiana Jones, it is a distance which also buttresses the sense of self-satisfaction suffused within the American psyche, and which, in the world of academia, has evolved into the figure of the self-made scholar. Yet another oxymoronic formula which brings us back to ‘the enigma wrapped up in a mystery’ that is the erudite philistine.

Two fundamental principles lie at the heart of this strange bird: first, that you are always secretly guilty of what you blame others for; second, that you are always somebody else’s philistine. Read more »

Everyday Quantum: On The Origin Of Force

by Jochen Szangolies

Magnetic field made visible using so-called ferrofluid. Image credit: Gregory F. Maxwell, own work, CC BY-SA 3.0, via Wikimedia Commons.

The ubiquity of a phenomenon sometimes causes its mystery to grow stale. Consider the strange lines along which a magnet orients iron filings in its vicinity: what we take in stride would, if experienced afresh, seem like the purest magic. Indeed, humble iron is probably the closest we get to genuine sci-fi ‘unobtainium’. Not only can the right treatment increase its durability many times over, enabling technologies unthinkable without it, but if you take a rod of iron and hit it hard with a hammer, it suddenly acquires the ability to attract other bits of iron—as if the imparted force were transformed and stored in a mysterious field surrounding it. Moreover, if you then take that empowered core and pass it through loops of very thinly wrought iron, you find that you can suddenly draw tiny bolts of lightning from the wire—or use the energy to power a light, or a motor, or, indeed, our civilization.

So, what is it that imbues iron with these near-magical capabilities? What grants it the power to attract or repel? Or, in the immortal words of the Insane Clown Posse, fucking magnets, how do they work?

Pace Violent J and Shaggy 2 Dope, the answer to this question is well known—in that particular sense of ‘well known’ that physicists use when they mean ‘don’t worry about it, the math says it works’. But even among physicists, the answer isn’t usually spelled out all that clearly—which is a bit of a shame: as we will see, it’s a cardinal example of how quantum mechanics, far from being just a ‘theory of the small’, is directly responsible for many everyday phenomena. Ultimately, the key to the question ‘why do magnets attract things’ lies in the phenomenon of quantum interference—though we’ll have to plot a bit of a roundabout course to get there. Read more »

Talent Surpluses and the Myth of Meritocracy in Science

by Joseph Shieber

It is ironic that one of the primary obstacles to meritocracy in science, the oversupply of talent, is simultaneously a key contributor to the myth that merit reigns in science.

To explain, I’ll proceed in three steps. First, I’ll devote most of my efforts to providing reasons for thinking that there IS an oversupply of talent in science. Second, I’ll briefly explain why the oversupply of talent is problematic for meritocratic conceptions of science. Third, I’ll suggest a reason why such oversupplies of talent paradoxically fuel the myth of meritocracy.

Is there a surplus of talent in science? In 2019 alone, there were roughly 23,000 Ph.D.s awarded in the natural sciences, computer science, and mathematics – and an additional 10,000 or so Ph.D.s awarded in engineering. This output dwarfs the number of research positions – particularly academic research positions – available.

Indeed, as a 2021 article in Areo magazine notes, “between 1982 and 2011 in the US, around 800,000 science and engineering PhDs were awarded, but only about 100,000 new tenure-track faculty positions were created in those fields” – roughly eight new PhDs for every new faculty position. Read more »

Responsibility Gaps: A Red Herring?

by Fabio Tollon

What should we do in cases where increasingly sophisticated and potentially autonomous AI-systems perform ‘actions’ that, under normal circumstances, would warrant the ascription of moral responsibility? That is, who (or what) is responsible when, for example, a self-driving car harms a pedestrian? An intuitive answer might be: Well, it is of course the company that created the car that should be held responsible! They built the car, trained the AI-system, and deployed it.

However, this answer is a bit hasty. The worry here is that the autonomous nature of certain AI-systems means that it would be unfair, unjust, or inappropriate to hold the company or any individual engineers or software developers responsible. To go back to the example of the self-driving car: it may be the case that, due to the car’s ability to act outside of the control of the original developers, their responsibility would be ‘cancelled’, and it would be inappropriate to hold them responsible.

Moreover, it may be the case that the machine in question is not sufficiently autonomous or agential for it to be responsible itself. This is certainly true of all currently existing AI-systems and may be true far into the future. Thus, we have the emergence of a ‘responsibility gap’: Neither the machine nor the humans who developed it are responsible for some outcome.

In this article I want to offer some brief reflections on the ‘problem’ of responsibility gaps. Read more »

Talking to Kids about Dead Kids

by Rebecca Baumgartner

I don’t want to write this and you don’t want to read it. But this is the world we live in. As I write this, it’s been three weeks since a man without a criminal record legally purchased a trunkful of guns and opened fire at the Allen Premium Outlets mall, killing eight people, including three children, and wounding seven others, all in the space of about three minutes. A place that previously had been known mostly for its contribution to traffic jams on Stacy Rd. will now be linked forever to white-shrouded bodies and blood splatters on the concrete outside the H&M store.

***

Because I live in a very conservative state, I’ve often had to swallow my true self, my beliefs, my reason, my outrage, my sadness, and my intelligence when listening to those around me speak about current events. A few years back, a conservative I know told me he was disturbed by the decision of his family’s school district to allow a trans person to serve as a substitute teacher for his daughter’s third-grade class. “I wasn’t prepared to talk to her about…all that yet,” he said, gesturing vaguely to indicate the existence of trans people. “I mean, she’s only eight.”

For complex reasons that any liberal living and working alongside conservatives in the Bible Belt will understand, I kept my response to a minimum. There was no point in telling him that talking about trans people with his daughter didn’t need to be a fraught conversation; it was really just about accepting people as they are and recognizing that difference exists. Eight-year-olds can understand that. In fact, it’s important that they understand that, because otherwise they will struggle to develop the empathy and emotional intelligence they need to connect with others who don’t see the world exactly as they do.

I had cause to recall that particular conversation when my son and niece asked me about the mall shooting. Read more »

I’m Him!

by Derek Neal

In the first round of this year’s NBA playoffs, Austin Reaves, an undrafted and little-known guard who plays for the Los Angeles Lakers, held the ball outside the three-point line. With under two minutes remaining, the score stood at 118-112 in the Lakers’ favor against the Memphis Grizzlies. LeBron James waited for the ball to his right. Instead of deferring to the star player, Reaves ignored James, drove into the lane, and hit a floating shot for his fifth field goal of the fourth quarter. He then turned around and yelled, “I’m him!” The initial reaction one might have to this statement—“I’m him”—is a question: who are you? The phrase sounds strange to our ears. Who could you be but yourself? And if you are someone else, shouldn’t we know who this other person is? Who is the referent of the pronoun “him”? Perhaps because of its cryptic nature, “I’m him” is an evocative statement, and it has quickly spread throughout sports and gaming culture. In a 2022 episode of “The Shop,” LeBron James himself declared “I’m him.” On YouTube, there are numerous compilations with titles such as “NFL ‘I’m Him’ Moments.” If you search the phrase on Twitter, people use it in relation to sports, music, video games, and themselves. But what does it mean, and where did it come from?

An internet search turns up a few articles explaining the history of this statement. Apparently, the rapper Kevin Gates was the first to popularize the phrase when he titled his 2019 album I’m Him. In this instance, “Him” acted as an acronym for “His Imperial Majesty.” We could then understand people saying “I’m Him” to be saying something along the lines of “I’m the king.” This expression would function much like another sports saying, “the GOAT,” meaning “the greatest of all time.” But I think there is more to it than this. In the uses that have spread since Gates’s album, “him” no longer functions as an acronym. Read more »

Monday, May 22, 2023

Midnight Judges And Jefferson’s Battle Over The Federal Courts

by Michael Liss

“Declaration of Independence,” John Trumbull. Capitol Rotunda.

November 1800. In the Presidential rematch between John Adams and Thomas Jefferson we have a clear loser, but not yet a winner. John Adams will be returning home. Thomas Jefferson, thanks to a bizarre tie in the Electoral College with his erstwhile running mate, Aaron Burr, will have to wait for the House of Representatives. Whatever that result might be, it is clear that a new team is coming to Washington. Jefferson’s Democratic-Republicans have flipped the House and have narrowed the gap in the Senate. Over the course of the next few months, thanks to by-elections, three more Federalist Senators will go down, and Jefferson’s party will control both the Executive and Legislative branches.

It’s fair to say that many Federalists are in a panic. Through Washington’s two terms and Adams’ one, being in power is the only thing they have known. It was so easy in the beginning, given Washington’s enormous personal prestige. Then, because people will talk, and there were more ambitious and talented men than there were positions to fill, the grumbling set in. It took just three years from Washington’s 1789 inauguration for Madison’s (and, sotto voce, Jefferson’s) new political party to emerge, and, although the Democratic-Republican team did not contest the Presidency against Washington in 1792, it was part of a loose Anti-Administration coalition that won the House.

The grumbling increased in Washington’s Second Term, first directed at his Cabinet, particularly Alexander Hamilton, then, respectfully, of course, at Washington himself. A great man, yes, it was whispered, but in decline and controlled by his advisors. Among the whisperers was Washington’s own Secretary of State, Thomas Jefferson, who left the Administration at the end of 1793 to return to Virginia and do what Jefferson did exceptionally well—ponder, and quietly, oh so quietly, move political chess pieces around on the board.

The Federalists’ reign was not over: in 1796, enough people thought John Adams had earned a stint in the hot seat, and, by the narrowest of margins in the Electoral College, Adams held the office for the party. Still, the balance of power was inevitably shifting away from the Federalists. The Party was basically “aging early,” becoming stiff, cranky, lacking in new ideas. Read more »