Oppenheimer I: “An unctuous, repulsively good little boy”

by Ashutosh Jogalekar

“Oppenheimer, Julius Robert”, by David A. Wargowski, December 7, 2018

This is the first of a series of short pieces on J. Robert Oppenheimer. The others can all be found here. Popular interest in Oppenheimer’s life has surged this year in anticipation of the release of Christopher Nolan’s mainstream film, “Oppenheimer”. Several books about Oppenheimer – and even a popular opera – have come out just in the last two decades. Analogies between nuclear weapons and new technologies like AI and gene editing, with Oppenheimer as the cautionary figure, are frequently drawn, sometimes incisively and often misleadingly. Why does this man continue to inspire so much interest? And why now?

My goal is to provide readers who might not want to read a full biography of Oppenheimer with some of the highlights of his life that address these questions and put them in context. Needless to say, this series – which is essentially chronological – is not supposed to be an exhaustive biography and is biased by what I personally think was most interesting about this brilliant, fascinating, complicated man’s life and times.

The physicist Isidor Rabi – a friend who perhaps knew Robert Oppenheimer better than he knew himself – once said that Oppenheimer was a man composed of many shining splinters. That succinct assessment captures the central dilemma of Oppenheimer’s life: identity. It was a dilemma that made him who he was and one that contributed to the myriad problems he faced in his life. I also believe that it is this dilemma that makes him a fascinating man of enduring interest, of more interest than many contemporaries who were far better-known scientists.

To understand this dilemma it’s worth starting, as is so often the case, with Oppenheimer’s childhood. The affluent household where Oppenheimer was born and grew up was, in the words of a biographer, “like Ibsen’s Rosmersholm, that aristocratic estate where voices and passions were always subdued, and where children never cried – and when they grew up never laughed.” Read more »

The Ghost in the Machine (Part I): Emergence and Intelligence in Large Language Models

by Ali Minai

Part II of this article can now be read here.

One of the most interesting debates within the larger discussion around large language models (LLMs) such as GPT-4 is whether they are just mindless generators of plausible text derived from their training – sometimes termed “stochastic parrots” – or systems with a spark of real intelligence and “emergent” abilities. In this article, I will try to look at these issues from a complex systems perspective. While I will focus mainly on LLMs such as the GPT systems, I will begin by laying out a more general conceptual framework.

Complexity and Complex Systems

Complex systems are defined as systems that consist of a large number of nonlinearly interacting components with the ability to generate large-scale organization without explicit design or control. This self-organization is the essential feature that distinguishes these systems from other complicated but specifically designed systems such as computers and aircraft. Complex systems are common in the natural world, including everything from galaxies and planets to forest fires and hurricanes, but the most profound examples occur in the domain of life. All living systems from bacteria to humans are complex systems. In more complex organisms, their subsystems are also complex systems, e.g., the brain and the immune system in humans. Ecosystems and ecologies too are complex systems, as are collective systems of living agents from slime molds and insect colonies to human societies and economies. In addition to self-organization, some complex systems – and, in particular, those involving living agents – also have the property of adaptivity, i.e., they can change in response to their environment in ways that enhance their productivity. Crucially, this adaptation too is not controlled explicitly by an external agency but occurs within the system through its interactions with the environment and the consequences of these interactions. These systems are called complex adaptive systems. An example of this is evolving species in changing ecosystems, but one that is more pertinent to the current discussion is the nervous system of complex animals such as humans. This system is embedded in another complex system – the rest of the animal’s body – and that, in turn, is embedded in the complex system that is the rest of the world.
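The essay keeps things at the conceptual level here, so a toy model may help make “large-scale organization without explicit design” concrete. The sketch below is my illustration, not the author’s: a minimal NumPy implementation of Conway’s Game of Life, the textbook example of self-organization from purely local rules. Grid size, seed, and step count are arbitrary choices.

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's Game of Life on a wraparound grid."""
    # Count each cell's eight neighbours via periodic shifts of the whole grid.
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Purely local rule: a live cell survives with 2 or 3 live neighbours,
    # a dead cell is born with exactly 3. No cell consults the global pattern.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(grid.dtype)

rng = np.random.default_rng(seed=1)
grid = rng.integers(0, 2, size=(32, 32))   # random initial soup of live/dead cells
for _ in range(200):
    grid = life_step(grid)                 # gliders, blinkers, still lifes emerge
```

Nothing in the update rule mentions gliders or oscillators, yet a random grid run for a few hundred steps reliably produces them; that gap between what the component-level rule says and what the system as a whole does is exactly the property the next paragraph names emergence.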

Complexity in the sense defined above has several profound implications. One of these is that a complex system’s behavior is inherently impossible to predict by reductionistic causal analysis, and thus impossible to control by any top-down mechanism. This is because almost all large-scale phenomena – attributes, structures, processes, functions – in the system arise bottom-up from the interaction of a very large number of components – often billions or more, as in the cells of the brain – and can neither be reduced to nor described by the behavior of individual components. This property is called emergence. Read more »

Sweet Truth

by Deanna K. Kreisel (Doctor Waffle Blog)

The first time I came across the “candy bar interiors” quiz, I was not disturbed by how many I got wrong, but rather by how many I got right. While a few of the pictured confections were alien to me (Zagnut? who the hell eats a Zagnut? did Charlie Chaplin enjoy one in Modern Times?), I was intimately familiar with enough of the images that I could extrapolate the rest.

I just picked the weirdest and creepiest-looking option as the Zagnut. I was correct.

I have been unhealthily obsessed with candy bars for as long as I can remember. My obsession is deep, tender, and gently festering, and I feel ambivalent about stirring up the dead leaves at the bottom of that pool. Even though I have written essays about my consistently abusive mother, my intermittently abusive father, my history of panic disorder, and many other delicate topics, an exploration of my feelings about candy bars feels like the most difficult thing I have ever attempted.

The outside layer of a Zagnut is coconut. Who does that?

My parents were largely uninterested in their children, but every once in a while they made a symbolic stab at parenting by doing something like restricting our food choices or giving us a curfew. They forbade sugary breakfast cereals, soda, candy, and chips of any kind—thus setting us up for an enduring obsession with junk food that I carry with me to this day. Even though I am now in my 50s and wholly in charge of my own snack choices, my stomach still flips over a little at the sight of a potato chip bag or a Chips Ahoy! cookie. So salty! So mouthfeely! So naughty. Read more »

Is there something fishy about radiocarbon dating?

by Paul Braterman

A map of the route taken by the Viking Great Heathen Army. Hel-hama, own work, via Wikipedia

The Vikings started out as raiders, but then, in the way of these things, ended up as rulers, and their influence stretched from Greenland to what is now Russia. They first enter English history in 793, with the sacking of the Monastery of Lindisfarne. By the late 9th century, they were colonising Iceland, and serving as mercenaries to the Emperor of Byzantium. In 862, Vikings under Rurik established themselves in Novgorod, forming the nucleus of what would become Kyivan Rus. In 885, Vikings besieged Paris, and although they were beaten back, they settled in what is now Normandy (Norman, Northmen). In 865, the Viking Great Heathen Army arrived in England, and a year later, under Ivar the Boneless, captured York, which would remain their capital in England until the defeat of Eric Bloodaxe in 954.

The Vikings’ goal was to establish themselves as rulers over Anglo-Saxon England, divided at that time into the four kingdoms of Northumbria, East Anglia, Mercia, and Wessex, and in this they were almost successful. After establishing their kingdom in York, they swept south, taking control of East Anglia and killing its king, who had earlier provided them with horses. They then spent the next five years consolidating their hold over what had been the most powerful of the Saxon kingdoms, Mercia, stretching from the Thames to the Humber, whose king took refuge in Rome. The Anglo-Saxon Chronicle tells us that in 873–74, a date that will prove significant for us, the army spent the winter in Repton, then a town of some importance. It then divided, one part going north to consolidate control over York, while the other swept south through Mercia into Wessex, which they effectively overran over the next two years. Read more »

On the use and abuse of the term “fascism” to describe current events

by David J. Lobina

For someone who grew up in the South of Europe but has lived in the UK for the last 20 or so years, and who, moreover, is a sort-of linguist, the recent proliferation of the word “fascism” to refer to certain political events and tendencies in the English-speaking world, especially in the US, is not a little surprising. After all, the people we used to refer to as fascists when I was growing up in Italy and Spain certainly bore a resemblance to classical fascism – some were the descendants of actual fascists, in fact – whereas the guys who get called fascist all the time these days, especially in the US, are nothing like them. And in any case, it was the 1990s then and it is 2023 now: is the term “fascism” still relevant today?[i]

Who were these fascists from my youth, then? If you lived in Italy or Spain in the 1990s and were politically active, you would most certainly run into them sooner or later, and there were a couple of dates in the calendar that you needed to look out for, as neo-fascists, to employ a perhaps more appropriate nomenclature, tended to come out to commemorate events such as the so-called March on Rome, on the 27th of October, in Italy, and Francisco Franco’s death, on the 20th of November, in Spain.[ii]

As mentioned, some of these people were the descendants of real fascists, and this is perhaps clearest in the case of the Movimento Sociale Italiano, or MSI (The Italian Social Movement), a political party founded in 1946 by veterans from the so-called Republic of Salò – more properly, the Repubblica Sociale Italiana (The Italian Social Republic), a Nazi puppet state nominally led by Benito Mussolini in his nadir days – and which, in 1994, and under the name of Alleanza Nazionale (The National Alliance), entered the government of Silvio Berlusconi, the business magnate turned politician.[iii] This is to some extent also true of the many offshoots of the original Falange Española (The Spanish Falange), a party that was founded in the 1930s on the model of Mussolini’s Partito Nazionale Fascista (the National Fascist Party), the latter created in 1921. In its latest iteration, the Falange goes by the name of the Falange Española de las JONS.[iv]

The modern versions of these organisations, however, differ greatly from classical fascism, by which I mean Italian fascism from 1922 to 1943 (roughly), as well as from each other, and these differences become a chasm when it comes to US politics. And yet seemingly every other week there is an article out there about how the Republican Party is becoming a fascist party, or about how Donald Trump, Tucker Carlson, or Ron DeSantis are all fascists. Read more »

Eugenics and the Biological Justification of Economic Exploitation in Southern Italy

by Andrea Scrima (excerpt from a work-in-progress)

When they arrived in the U.S., Southern Italians brought with them the sense that they’d been branded as underdogs, that they belonged and would forever belong to a lower class, but the birth of the Italian-American gangster was rooted in attitudes toward the Mezzogiorno that dated back far earlier. After Italy was unified under Vittorio Emanuele II in 1861, a new national government imposed Piedmont’s centralized administrative system on the South, which led to violent rebellion against State authority. Politicians and intellectuals took pains to deflect responsibility for what they saw as the “barbarism” of the Mezzogiorno, and were particularly receptive to theories that placed the blame for the South’s many problems on Southern Italians’ own inborn brutishness. The decades following Unification saw the nascent fields of criminal anthropology and psychiatry establish themselves in the universities of Northern Italy; implementing the pseudosciences of phrenology and anthropometry in their search for evolutionary remnants of an arrested stage of human development manifested in the people of the Mezzogiorno, they used various instruments to measure human skulls, ears, foreheads, jaws, arms, and other body parts, catalogued these, and correlated them with undesirable behavioral characteristics, inventing in the process a Southern Italian race entirely separate from and unrelated to a superior Northern race and officially confirming the biological origins of Southern “savagery.” Read more »

American Snowflakes: Banning Books and Beer Boycotts

by Mark Harvey

Censorship is the strongest drive in human nature; sex is a weak second. –Phil Kerby

A Big Mooncake for Little Star by Grace Lin

Here’s a book that could really harm your child: A Big Mooncake for Little Star. The title itself promises dark political theories of socialism, sexual deviance, and a character-corroding leitmotif. Thank goodness some astute parents saw the danger of the book and got it on a banned-books list.

Here’s the summary of the book’s plot: Little Star, who is a cute little girl, has a mother who likes to bake. She bakes a big mooncake and puts it into the night sky to cool. She asks Little Star not to eat the cake while it cools. The cake looks like the moon and Little Star can’t resist its deliciousness. So night after night, despite her best intentions to leave the cake alone, she tiptoes into the night sky and nibbles away at the cake. The cake shrinks like a waning moon—from full circle to thin crescent—until there are only crumbs left. The mother discovers that her daughter ate the big mooncake and gently admonishes her. Then the two decide to make another cake together.

That’s the entirety of the book. It’s beautifully illustrated and won a Caldecott Honor, one of the most prestigious distinctions for children’s books. Full disclosure, I had to know what happened to the mooncake and couldn’t put the book down. But it’s definitely something children should not be exposed to. I’m not exactly sure why, but if the book was banned, it was banned for a good reason and would surely set children on a dangerous life course. Mooncake is probably shorthand for moonbeam, and moonbeam has a vague reference to hippy culture, so maybe that’s it. Read the book to your five-year-old and the next thing you know she’s out on the road selling marijuana! Read more »

Could Be Worse (Part Two)

by Mike Bendzela

Brain MRI (public domain).

[Part One of this essay can be found here.]

Alexia, Redux

Throughout the winter of 2017, as he recovered from the stroke, Don went through a battery of therapies, including walking on a treadmill with and without handrails; navigating the winding corridors of the rehab center and having to find his way back to where he started; taking apart and putting back together block puzzles; scanning a large computer screen for numbered sequences.

I was invited to accompany him to his reading therapy sessions because his gay therapist got such a kick out of us. He would spend the first few minutes asking us about life on the farm. He wanted to know about milking cows and slopping pigs, shoveling shit, wringing the necks of unwanted roosters. Two husbands husbanding, har, har. He even devised a few bucolic reading exercises for Don.

Once the reading therapy began, it was painful to watch. Don would stare at the page upon which the therapist had printed a sentence in block letters:

IT WAS ON THE PIG.

The forefinger of Don’s left hand worked furiously as he struggled to recognize the words, like how I used to count on my fingers in math class. He had discovered he could bypass the damaged areas of his brain by transferring the task of letter recognition to his finger. Air writing, as it were. He eventually got around to naming letters without using his finger as a crutch. Read more »

The Future of Medicine: When Doctors Unionize

by Carol A Westbrook

Trucks in a traffic jam on I-81

It was the last straw.  “We’re transferring you, Dr. Westbrook,” my Medical Director said to me.

“One of our offices in another town has been desperately in need of a Hematologist ever since Dr. Paul died,” he continued, “and you are the best hematologist on our staff.” He was trying to cajole me with flattery.

“But I don’t want to be transferred. I really like working here,” I said. “I have a nice practice, which I built up over the last three years since I started here. I really like my patients and have a good rapport with them. Furthermore, I feel I am part of the community now.”

“Don’t worry. We will assign your patients to one of our other doctors,” he said, in a rather cold-blooded tone. It was no consolation at all.

“Do I have a choice?” I asked bluntly.

“No, not if you want to get paid. You can either transfer, or lose your job,” he said. The Director knew that I could be fired without cause, at the discretion of any of my superiors.

I gave it some thought and reviewed my options. The opportunity to practice full-time Hematology was actually appealing; it’s a difficult specialty, and I enjoy the challenge. And there would be no hospital call, either. But on the downside, it would mean a long commute—thirty miles from home on a crowded interstate, I-81, with a lot of speeding truck traffic and three mountain ranges to cross. Most importantly, I would have to leave a practice that I’d built up over three years, with many dear patients I hated to leave behind. If I refused the transfer, I would lose my job, and I doubted that I would be able to find another position at age 64. It would still be a year until I would qualify for Medicare and for full Social Security benefits. I would have to stick it out. Read more »

Monday, May 1, 2023

The ‘I’ and the ‘We’

by Martin Butler

Bemoaning the ills of individualism is nothing new. Jonathan Sacks’s bestseller, “Morality: Restoring the Common Good in Divided Times” (2020), provides us with one of the more comprehensive accounts of how we lost community and why we need it back. Justin Welby sums it up well in the foreword: “His message is simple enough: ours is an age in which there is too much ‘I’ and not enough ‘We’.” Sacks himself puts the point succinctly when he says: “The revolutionary shift from ‘We’ to ‘I’ means that everything that once consecrated the moral bonds binding us to one another – faith, creed, culture, custom and convention – no longer does … leaving us vulnerable and alone.”

The book is divided into five parts, the first four giving a detailed account of how the shift from ‘We’ to ‘I’ took place and the fifth, entitled ‘The Way Forward’, providing some suggestions as to how we might return to a more ‘We’-based society. The breadth and depth of knowledge Sacks displays is impressive, and he draws on a vast range of philosophers and numerous psychological and sociological studies to make his case, which is detailed yet accessible to the general reader. Sacks divides society into three domains, the state, the economy and the moral system, and it is in this third domain that he claims an ‘unprecedented experiment’ has taken place in the western world, the long-term consequences of which – the divisive politics of recent years, populism, the epidemic of anxiety and depression, increasing inequality, “the assault on free speech taking place on university campuses in Britain and America” and more – we are now living with. He identifies three phases. Read more »

A Re-Evaluation of Gratitude

by Rebecca Baumgartner

Photo by Gabrielle Henderson on Unsplash

Last year, I watched a Kurzgesagt video about the science of how gratitude practices can make you happier, so I decided to start a gratitude journal. It could only help, right? 

I started jotting down a variety of things I was grateful for: my family, my piano, my cats and dog, my financial stability. I also listed smaller and more fleeting pleasures like solving the day’s Wordle in two tries, listening to a good podcast, a lovely rainstorm to break up the scorching heat. 

Then, after only about three days of journaling, a switch flipped. I suddenly wanted to chuck the gratitude journal out the window. I was rebelling not just against the journaling exercise but against the very idea of being grateful, although I wasn’t sure why.

Over the next few days, the truth gradually bubbled to the surface. Gratitude journaling felt punitive and performative, like having to write lines in school or recite prayers on a rosary. Even though the journal entries were only for me, I’d felt like I was on trial, like I was being accused – of not appreciating my life, of not trying hard enough to live in the moment, of taking my life for granted. Each journal entry was an indictment: What could I possibly have to be unhappy about?

The fact is, the feeling of thankfulness is much more complex and ambivalent than common wisdom assumes. Gratitude has a dark side that we lie to ourselves about, one hardly ever acknowledged in the bubbly, feel-good cultural conversation surrounding it. Read more »

Monday Poem

What’s More . . .

who was the woman who, 13.787 ± 0.020 billion years ago,
birthed the universe from a bang? what an extravagant nativity,
what an immense and unruly child came forth, unfurling,
emerging in the space it made

how could she have imagined how singular and varied her child would become,
how extraordinary, how expansive, how crammed with darkness and light,
bulging with multitudes of whims and trajectories, how
creative and destructive in cosmic tantrums, blowing itself apart
in an inner life of psychotic episodes of collapses and collisions
of psychic arguments, mergers, its gravities contesting, warring,
yanking part from part, captivating them in inevitable attractions,
incarcerating them in humiliating orbits

how could the mother of the universe have imagined
what would become of her child,
how it would grow past comprehension
compounding the vastness of nothing
in its creation of time.

—what’s more, how could we, her child’s
chronically egotistical mites imagine?

Jim Culleny, © 4/29/23

How Not to Defend Science

by Joseph Shieber

“In Defense of Merit in Science,” a paper published recently in the Journal of Controversial Ideas, suggests that (1) success in science is currently determined by merit, (2) the success of science in discovering significant truths is due to the fact that science is a merit-based system, and (3) the greatest threat to the continued success of science is what the authors term “Critical Social Justice.” The paper is wrong on all three counts.

1. Success in science is not currently determined by merit

Throughout the paper, the authors extol the virtues of what they call “merit-based science.” For example, on p. 5 they write:

Merit-based science is truly fair and inclusive. It provides a ladder of opportunity and a fair chance of success for those possessing the necessary skills or talents. Neither socioeconomic privilege nor elite education is necessary. Indeed, several co-authors of this [article] have built successful careers in science, despite being immigrants, coming from lower socioeconomic backgrounds, and not being products of ‘elite education.’

The implication of this passage is not merely that a culture of science based on merit would be a worthwhile ideal, but also that the current culture of science is largely merit-based. This would seem to be the point of including the observation regarding the “successful careers in science” of some of the co-authors, despite their having come “from lower socioeconomic backgrounds, and not being products of ‘elite education.’”

While a meritocratically organized science might be a worthwhile ideal, there is little evidence to suggest that science currently is a meritocracy. Read more »

To Reach For Your Knife: Awe As An Ethical Principle

by Jochen Szangolies

Northern lights over Reykjavik.

I would not consider myself a particularly ‘outdoors’ sort of person. Much of my work and leisure is spent in front of one screen or another, or between the pages of a book (less of the latter than I’d like, at that). However, in those last years lost to COVID-19, I, like many others, found myself spending increasing amounts of time on long hikes through the countryside.

It came as no small revelation to me how restorative that time spent wandering—alone, occasionally, but mostly together with my wife, who deserves credit for motivating me to peel myself off of the couch and making my first tentative steps into the great outdoors—turned out to be. Indeed, it almost feels a bit like cheating: a week’s worth of work anxiety, stress, and general ill-ease washed away, or at least smoothed over, by a few hours spent in the woods.

That’s not to say I would submit it as a cure-all, or some great spiritual revelation. After all, there is presumably a perfectly ordinary evolutionary reason the woods seem to satisfy some inner longing to exchange my everyday surroundings for a more natural habitat. Most zoos have long since realized that their animals fare better in some approximation of their ancestral dwellings, while we still lock ourselves away in small boxes aggregated into great concrete and glass agglomerations. It’s just something that turns out to work for me, that I was lucky to add to my self-care tool-box. Read more »

Oh No, Not Another Essay on ChatGPT

by Derek Neal

At some point in the last couple of months, my reading about ChatGPT reached critical mass. Or so I thought. Try as I might, I couldn’t escape the little guy; everywhere I turned there was talk of ChatGPT—what it meant for the student essay, how it could be used in business, how the darn thing actually worked, whether it was really smart, or actually, really stupid. At work, where I teach English to international university students, I quickly found myself grading ChatGPT-written essays (perfect grammar but overly vague and general) and heard predictions that soon our job would be to teach students how to use AI, as opposed to teaching them how to write. It was all so, so depressing. This led to my column a few months ago wherein I launched a defense of writing and underlined its role in the shaping of humans’ ability to think abstractly, as it seemed to me that outsourcing our ability to write could conceivably lead us to return to a pre-literate state, one in which our consciousness is shaped by oral modes of thought. This is normally characterized as a state of being that values direct experience, or “close proximity to the human life world,” while shunning abstract categories and logical reasoning, which are understood to be the result of literate cultures. It is not just writing that is at stake, but an entire worldview shaped by literacy.

The intervening months have only strengthened my convictions. It seems even clearer to me now that the ability to write will become a niche subject, much like Latin, pursued by fewer and fewer students until it disappears from the curriculum altogether. The comparison to Latin is intentional, as Latin was once the language used for all abstract reasoning (Walter Ong, in Orality and Literacy: The Technologizing of the Word, even goes so far as to say that the field of modern science may not have arisen without the use of Learned Latin, which was written but never spoken). In the same way, a written essay will come to be seen as a former standard of argumentation and extended analysis. Read more »