Oppenheimer III: “Oppenheimer seemed to me, right from the beginning, a very gifted man.”

by Ashutosh Jogalekar

“Oppenheimer, Julius Robert”, by David A. Wargowski, December 7, 2018

This is the third in a series of posts about J. Robert Oppenheimer’s life and times. All the others can be found here.

In 1925, there was no better place to do experimental physics than Cambridge, England. The famed Cavendish Laboratory there had been created in 1874 with funds donated by a relative of the eccentric scientist-millionaire Henry Cavendish. It had been led by James Clerk Maxwell and J. J. Thomson, both physicists of the first rank. By 1925, the booming voice of Ernest Rutherford reverberated in its hallways. During its heyday and even beyond, the Cavendish would boast a record of scientific accomplishments unequalled by any other single laboratory before or since; the current roster of Nobel Laureates associated with the institution stands at thirty. By the 1920s Rutherford was well on his way to becoming the greatest experimental physicist in history, having discovered the laws of radioactive transformation, the atomic nucleus and the first example of artificially induced nuclear reactions. His students, half a dozen Nobelists among them, would include Niels Bohr – one of the few theorists the string-and-sealing-wax Rutherford admired – and James Chadwick, who discovered the neutron.

Robert Oppenheimer returned to New York in 1925 after a vacation in New Mexico, only to be disappointed. While he had been accepted into Christ's College, Cambridge, as a graduate student, Rutherford had rejected his application to work in his laboratory in spite of – or perhaps because of – the recommendation letter from his undergraduate advisor, Percy Bridgman, which painted a lackluster portrait of Oppenheimer as an experimentalist. Instead it was recommended that Oppenheimer work with the physicist J. J. Thomson. Thomson, a Nobel Laureate known for his 1897 discovery of the electron, was by 1925 well past his prime. Oppenheimer sailed for England in September. Read more »



The Erudite Philistine

by Ada Bronowski

One of the amusing things about academic conferences – for a European – is to meet with American scholars. Five minutes into an amicable conversation and the American scholar will inevitably confide to the European one of two complaints: either how all their fellow American colleagues are ‘philistines’ (a favourite term) or (but sometimes and) how taxing it is to be always called out as an ‘erudite’ by said fellow countrymen. As Arthur Schnitzler demonstrated in his 1897 play Reigen (better known through Max Ophüls’s 1950 film version La Ronde), social circles close quickly in a confined space; and so, soon enough, by the end of day two of the conference, by pure mathematical calculation, as Justin Timberlake sings, ‘what goes around, comes around’, all the Americans in the room turn out to be both philistines and erudite.

A contradiction in terms? Not so fast and not so sure. Firstly, such confidences are made to the European in the room, identified as anyone from the old continent whose mother-tongue is not English and who is thereby draped with the aura of natural bilingualism – the key to culture. Like Theseus’ ship, the European is both up-to-date and very, very old, and therefore involuntarily tingling from direct contact with centuries of history and civilisation. A position which, from the point of view of the American colleague, is not without its own contradictions: for if the oceanic separation awakens a nagging inferiority complex in the erudite philistine, who fills the distance with fantasies of the old world and the riches of Culture, which (white) America has been lusting after from the Bostonians to Indiana Jones, it is a distance which also buttresses the sense of self-satisfaction suffused within the American psyche, and which, in the world of academia, has evolved into the self-made scholar. Yet another oxymoronic formula which brings us back to ‘the enigma wrapped up in a mystery’ that is the erudite philistine.

Two fundamental principles lie at the heart of this strange bird: first, that you are always secretly guilty of what you blame others for; second, that you are always somebody else’s philistine. Read more »

Everyday Quantum: On The Origin Of Force

by Jochen Szangolies

Magnetic field made visible using so-called ferrofluid. Image credit: Gregory F. Maxwell, own work, CC BY-SA 3.0, via wikimedia commons.

The ubiquity of a phenomenon sometimes causes its mystery to grow stale. Consider the strange lines along which a magnet orients iron filings in its vicinity: what we take in stride would, if experienced afresh, seem like the purest magic. Indeed, humble iron is probably the closest we get to genuine sci-fi ‘unobtainium’. Not only can the right treatment increase its durability by a multiple, enabling technologies unthinkable without it, but if you take a rod of iron, and hit it hard with a hammer, it suddenly acquires the ability to attract other bits of iron—as if the imparted force is transformed and stored in a mysterious field surrounding it. Moreover, if you then take that empowered core, and pass it through loops of very thinly wrought iron, you find that you can suddenly draw tiny bolts of lightning from the wire—or use the energy to power a light, or a motor, or, indeed, our civilization.

So, what is it that imbues iron with these near-magical capabilities? What yields it the power to attract or repel? Or, in the immortal words of the Insane Clown Posse, fucking magnets, how do they work?

Pace Violent J and Shaggy 2 Dope, the answer to this question is well known—in that particular sense of ‘well known’ that physicists use when they mean ‘don’t worry about it, the math says it works’. But even among physicists, the answer isn’t usually spelled out all that clearly—which is a bit of a shame: as we will see, it’s a cardinal example of how quantum mechanics, far from being just a ‘theory of the small’, is directly responsible for many everyday phenomena. Ultimately, the key to the question ‘why do magnets attract things’ lies in the phenomenon of quantum interference—though we’ll have to plot a bit of a roundabout course to get there. Read more »

Talent Surpluses and the Myth of Meritocracy in Science

by Joseph Shieber

It is ironic that one of the primary obstacles to meritocracy in science, the oversupply of talent, is simultaneously a key contributor to the myth that merit reigns in science.

To explain, I’ll proceed in three steps. First, I’ll devote most of my efforts to providing reasons for thinking that there IS an oversupply of talent in science. Second, I’ll briefly explain why the oversupply of talent is problematic for meritocratic conceptions of science. Third, I’ll suggest a reason why such oversupplies of talent paradoxically fuel the myth of meritocracy.

Is there a surplus of talent in science? In 2019 alone, there were roughly 23,000 Ph.D.s awarded in the natural sciences, computer science, and mathematics – and an additional 10,000 or so Ph.D.s awarded in engineering. This dwarfs the number of research positions – particularly academic research positions – available.

Indeed, as a 2021 article in Areo magazine notes, “between 1982 and 2011 in the US, around 800,000 science and engineering PhDs were awarded, but only about 100,000 new tenure-track faculty positions were created in those fields.” Read more »

Responsibility Gaps: A Red Herring?

by Fabio Tollon

What should we do in cases where increasingly sophisticated and potentially autonomous AI-systems perform ‘actions’ that, under normal circumstances, would warrant the ascription of moral responsibility? That is, who (or what) is responsible when, for example, a self-driving car harms a pedestrian? An intuitive answer might be: Well, it is of course the company who created the car who should be held responsible! They built the car, trained the AI-system, and deployed it.

However, this answer is a bit hasty. The worry here is that the autonomous nature of certain AI-systems means that it would be unfair, unjust, or inappropriate to hold the company or any individual engineers or software developers responsible. To go back to the example of the self-driving car: it may be that, due to the car’s ability to act outside the control of the original developers, their responsibility would be ‘cancelled’, and it would be inappropriate to hold them responsible.

Moreover, it may be the case that the machine in question is not sufficiently autonomous or agential for it to be responsible itself. This is certainly true of all currently existing AI-systems and may be true far into the future. Thus, we have the emergence of a ‘responsibility gap’: Neither the machine nor the humans who developed it are responsible for some outcome.

In this article I want to offer some brief reflections on the ‘problem’ of responsibility gaps. Read more »

Talking to Kids about Dead Kids

by Rebecca Baumgartner

I don’t want to write this and you don’t want to read it. But this is the world we live in. As I write this, it’s been three weeks since a man without a criminal record legally purchased a trunk-full of guns, opened fire at the Allen Premium Outlets mall and killed eight people, including three children, and wounded seven others, all in the space of about three minutes. A place that previously had been known mostly for its contribution to traffic jams on Stacy Rd. will now be linked forever to white-shrouded bodies and blood splatters on the concrete outside the H&M store.

***

Because I live in a very conservative state, I’ve often had to swallow my true self, my beliefs, my reason, my outrage, my sadness, and my intelligence when listening to those around me speak about current events. A few years back, a conservative I know told me he was disturbed by the decision of his family’s school district to allow a trans person to serve as a substitute teacher for his daughter’s third-grade class. “I wasn’t prepared to talk to her about…all that yet,” he said, gesturing vaguely to indicate the existence of trans people. “I mean, she’s only eight.”

For complex reasons that any liberal living and working alongside conservatives in the Bible Belt will understand, I kept my response to a minimum. There was no point in telling him that talking about trans people with his daughter didn’t need to be a fraught conversation; it was really just about accepting people as they are and recognizing that difference exists. Eight-year-olds can understand that. In fact, it’s important that they understand that, because otherwise they will struggle to develop the empathy and emotional intelligence they need to connect with others who don’t see the world exactly as they do.

I had cause to recall that particular conversation when my son and niece asked me about the mall shooting. Read more »

I’m Him!

by Derek Neal

In the first round of this year’s NBA playoffs, Austin Reaves, an undrafted and little-known guard who plays for the Los Angeles Lakers, held the ball outside the three-point line. With under two minutes remaining, the score stood at 118-112 in the Lakers’ favor against the Memphis Grizzlies. LeBron James waited for the ball to his right. Instead of deferring to the star player, Reaves ignored James, drove into the lane, and hit a floating shot for his fifth field goal of the fourth quarter. He then turned around and yelled, “I’m him!” The initial reaction one might have to this statement—“I’m him”—is a question: who are you? The phrase sounds strange to our ears. Who could you be but yourself? And if you are someone else, shouldn’t we know who this other person is? Who is the referent of the pronoun “him”? Perhaps because of its cryptic nature, “I’m him” is an evocative statement, and it has quickly spread throughout sports and gaming culture. In a 2022 episode of “The Shop,” LeBron James himself declared “I’m him.” On YouTube, there are numerous compilations with titles such as “NFL ‘I’m Him’ Moments.” If you search the phrase on Twitter, people use it in relation to sports, music, video games, and themselves. But what does it mean, and where did it come from?

An internet search turns up a few articles explaining the history of this statement. Apparently, the rapper Kevin Gates was the first to popularize the phrase when he titled his 2019 album I’m Him. In this instance, “Him” acted as an acronym for “His Imperial Majesty.” We could then understand people saying “I’m Him” to be saying something along the lines of “I’m the king.” This expression would function much like another sports saying, “the GOAT,” meaning “the greatest of all time.” But I think there is more to it than this. In the examples since Gates, “him” no longer functions as an acronym. Read more »

Monday, May 22, 2023

Midnight Judges And Jefferson’s Battle Over The Federal Courts

by Michael Liss

“Declaration of Independence,” John Trumbull. Capitol Rotunda.

November 1800. In the Presidential rematch between John Adams and Thomas Jefferson we have a clear loser, but not yet a winner. John Adams will be returning home. Thomas Jefferson, thanks to a bizarre tie in the Electoral College with his erstwhile running mate, Aaron Burr, will have to wait for the House of Representatives. Whatever that result might be, it is clear that a new team is coming to Washington. Jefferson’s Democratic-Republicans have flipped the House and have narrowed the gap in the Senate. Over the course of the next few months, thanks to by-elections, three more Federalist Senators will go down, and Jefferson’s party will control both the Executive and Legislative branches.

It’s fair to say that many Federalists are in a panic. Through Washington’s two terms and Adams’ first, being in power is the only thing they have known. It was so easy in the beginning, given Washington’s enormous personal prestige. Then, because people will talk, and there were more ambitious and talented men than there were positions to fill, the grumbling set in. It took just three years from Washington’s 1789 inauguration for Madison’s (and, sotto voce, Jefferson’s) new political party to emerge, and, although the Democratic-Republican team did not contest the Presidency against Washington in 1792, it was part of a loose Anti-Administration coalition that won the House.

The grumbling increased in Washington’s Second Term, first directed at his Cabinet, particularly Alexander Hamilton, then, respectfully, of course, at Washington himself. A great man, yes, it was whispered, but in decline and controlled by his advisors. Among the whisperers was Washington’s own Secretary of State, Thomas Jefferson, who left the Administration at the end of 1793 to return to Virginia and do what Jefferson did exceptionally well—ponder, and quietly, oh so quietly, move political chess pieces around on the board.

The Federalists’ reign was not over: in 1796, enough people thought John Adams had earned a stint in the hot seat, and, by the narrowest of margins and with the help of the House of Representatives, Adams held the office for the party.  Still, the balance of power was inevitably shifting away from the Federalists. The Party was basically “aging early,” becoming stiff, cranky, lacking in new ideas. Read more »

Oppenheimer II: “Work…frantic, bad and graded A”

by Ashutosh Jogalekar

“Oppenheimer, Julius Robert”, by David A. Wargowski, December 7, 2018

This is the second in a series of posts about J. Robert Oppenheimer’s life and times. All the others can be found here.

In the fall of 1922, after the New Mexico sojourn had strengthened his body and mind, Oppenheimer entered Harvard with an insatiable appetite for knowledge; in the words of a friend, “like a Goth looting Rome”. He wore his clothes on a spare frame – he weighed no more than 120 pounds at any time during his life – and had striking blue eyes. Harvard required its students to take four classes every semester for a standard graduation schedule. Robert would routinely take six classes every semester and audit a few more. Nor were these easy classes; a typical semester might include, in addition to classes in mathematics, chemistry and physics, ones in French literature and poetry, English history and moral philosophy.

The best window we have into Oppenheimer’s personality during his time at Harvard comes from the collection of his letters from this period edited by Alice Kimball Smith and Charles Weiner. They are mostly addressed to his Ethical Culture School teacher, Herbert Smith, and to his friends Paul Horgan and Francis Fergusson. Fergusson and Horgan were both from New Mexico, where Robert had met them during his earlier trip. Horgan was to become an eminent historian and novelist who would win the Pulitzer Prize twice; Fergusson, who soon departed Harvard as a Rhodes Scholar, became an important literary and theater critic. They were to be Oppenheimer’s best friends at Harvard.

The letters to Fergusson, Horgan and Smith are fascinating and provide penetrating insights into the young scholar’s scientific, literary and emotional development. In them Oppenheimer exhibits some of the traits that he was to become well known for later; these include a prodigious diversity of reading and knowledge and a tendency to dramatize things. Also, most of the letters are about literature rather than science, which indicates that Oppenheimer had still not set his heart on becoming a scientist. He also regularly wrote poetry that he tried to get published in various sources. Read more »

Beyond artificial intelligence?

by William Benzon

I’m stumped. I’ve hit a wall. More than one most likely.

I don’t know how to think about artificial intelligence. Well, that’s a rather broad statement. After all, I’ve done quite a bit of thinking about it. Six of my most recent articles here at 3QD have been about it, and who knows how many blog posts, working papers, and formal academic papers going back decades. I’ve thought a lot about it. And yet I’ve hit a wall.

First I should say that it’s only relatively recently that I’ve given a great deal of focal attention to artificial intelligence as such. What I’ve been interested in all these years has been the human mind, which I’ve often approached from a computational point of view, as I explained in From “Kubla Khan” through GPT and beyond. In pursuing that interest I read widely in the cognitive sciences, including A.I. My objective was always to understand the human mind and never to create an artificial human being.

It’s the prospect of creating an artificial human being that has me stumped. Of course, an artificial intelligence isn’t necessarily an artificial human being. Computer systems that play chess or Jeopardy at a championship level are artificial intelligences, but they certainly aren’t artificial human beings. ChatGPT and the other recent large language models (LLMs) are artificial intelligences, but they aren’t artificial human beings.

But the capabilities of these recent systems, certainly the LLMs but other systems as well, are so startling that, it seems to me, they have changed the valence, for lack of a better word, of inquiry into the computational view of the human mind. What do I mean by that, by valence? My dictionary links the term with chemistry and with linguistics. In both contexts the term is about capacity for combination: the ability of one element to join with others in forming chemical compounds, the ability of a word to combine with others in a sentence. Something like that. Read more »

Parenthood, Conservatism, and the Existing World

by David Kordahl

Portrait of the Family Hinlopen (Gabriël Metsu, c. 1663)

I’m writing this column in the cool semi-darkness of a municipal auditorium. I will be here for several hours, and my main duty is to stay put. This is the dress rehearsal for a dance recital where my daughters (ages four and six) will perform. When the time comes, I will take a video with my phone.

There is nothing especially noteworthy about this, as I am just one among the dozens of parents in this room, and the millions of Americans elsewhere, who regularly schlep their children from activity to activity. Recently, however, I came across a journal article claiming that these hours spent taking care of children may have political consequences reaching far beyond the cost of dance lessons. “Experimental and cross-cultural evidence that parenthood and parental care increase social conservatism,” a psychology study from 2022 by the international collaboration of Kerry et al., argues that, across the globe, parenthood makes people more conservative.

Specifically, the article claims that parents, on average, have more conservative attitudes than non-parents on questions involving promiscuity, homosexuality, prostitution, and abortion. Moreover, the article suggests that this relationship may be causal: parenthood might induce people to adopt conservative attitudes (though only on social issues, not on economics). Read more »

30 Times

by Akim Reinhardt

S.S. Edmund Fitzgerald Online

I can’t sing. Or so I always thought. A notorious karaoke warbler, I would sometimes pick a country tune, preferably Hank Williams, so that when my voice cracked, I could pretend I was yodeling. Then one night, I stepped up to the bar’s microphone and sang a Gordon Lightfoot song.

I wasn’t terrible. For once. Why? It turns out that most pop songs are for tenors, and I’m a baritone with a range similar to Dean Martin and Fats Domino, and even Lou Rawls and Johnny Cash if they don’t drift too low, but especially Gordon Lightfoot. No, I still can’t sing particularly well. But thanks to crooning one by Gord, I know which songs won’t make me croak and quaver.

The legend lives on from the Chippewa on down
Of the big lake they called Gitche Gumee
The lake, it is said, never gives up her dead
When the skies of November turn gloomy

With a load of iron ore twenty-six thousand tons more
Than the Edmund Fitzgerald weighed empty
That good ship and true was a bone to be chewed
When the gales of November came early

Lightfoot did meticulous research while writing “The Wreck of the Edmund Fitzgerald.” For example, on its final, ill-fated trip, the Edmund Fitzgerald did in fact leave a factory in Wisconsin headed for Cleveland, carrying 26,000 tons of iron ore. Later, he even made small changes to the lyrics in live performances as new facts about the ship’s sinking eventually came to light. But his research wasn’t perfect. “Chippewa” is a French/English corruption of “Ojibwe.” He got closer on the Ojibwemowin (Ojibwe language) name for what Anglo settlers call Lake Superior: Gichigame. Read more »

Lost in Translation: In Praise of Learning Languages

Chinese Oracle Bone Script

by Leanne Ogasawara

1.

In Arkady Martine’s 2019 debut novel A Memory Called Empire, newly appointed Ambassador Mahit Dzmare travels via jumpgate from her home planet at the edge of the Teixcalaani empire to the capital. Teixcalaan is like the sun around which all the other planets in the empire revolve. The city of cities, it is the center of the Teixcalaani universe.

Arriving in the capital, the new ambassador is immediately invited to attend a gathering of the glittering literati at court. Mahit stands in awe of what to her eyes is the pinnacle of civilization. The gathering doubles as a poetry contest. And as she listens to a recitation of a poem of self-sacrifice at war, which simultaneously spells the name of one of the lost soldiers in the opening glyphs of each line, Mahit realizes that right here, in this very place, is everything she has wanted since she was a young girl back on her home planet at the edge of the empire.

The novel is, not surprisingly, dedicated to “anyone who has ever fallen in love with a culture that is devouring their own.”

A Memory Called Empire received the 2020 Hugo Award for Best Novel. It has also gotten some attention from linguists.

The written language of the empire, Teixcalaanli, is based on the Aztec hieroglyphics of the Mesoamerican language Nahuatl, which was spoken in central Mexico before the arrival of the Spanish. A logosyllabic script written in glyphs, like Chinese, it allows for wonderful poetic artistry and playful wordplay. Read more »