This is the third in a series of posts about J. Robert Oppenheimer’s life and times. All the others can be found here.
In 1925, there was no better place to do experimental physics than Cambridge, England. The famed Cavendish Laboratory there had been created in 1874 with funds donated by a descendant of the eccentric scientist-millionaire Henry Cavendish. It had been led by James Clerk Maxwell and J. J. Thomson, both physicists of the first rank. In 1924, the booming voice of Ernest Rutherford reverberated in its hallways. During its heyday and even beyond, the Cavendish would boast a record of scientific accomplishment unequalled by any other single laboratory before or since; the current roster of Nobel Laureates associated with the institution stands at thirty. By the 1920s Rutherford was well on his way to becoming the greatest experimental physicist in history, having discovered the laws of radioactive transformation, the atomic nucleus, and the first example of an artificially induced nuclear reaction. His students, half a dozen Nobelists among them, would include Niels Bohr – one of the few theorists the string-and-sealing-wax Rutherford admired – and James Chadwick, who discovered the neutron.
Robert Oppenheimer returned to New York in 1925 after a vacation in New Mexico, only to meet with disappointment. While he had been accepted into Christ's College, Cambridge, as a graduate student, Rutherford had rejected his application to work in his laboratory, in spite of – or perhaps because of – a recommendation letter from his undergraduate advisor, Percy Bridgman, that painted a lackluster portrait of Oppenheimer as an experimentalist. Instead it was recommended that Oppenheimer work with the physicist J. J. Thomson. Thomson, a Nobel Laureate, was known for his discovery of the electron, a feat he had accomplished in 1897; by 1925 he was well past his prime. Oppenheimer sailed for England in September.
One of the amusing things about academic conferences – for a European – is meeting American scholars. Five minutes into an amicable conversation, an American scholar will inevitably confide in a European one of two complaints: either how all their fellow American colleagues are ‘philistines’ (a favourite term) or (but sometimes and) how taxing it is to always be called out as an ‘erudite’ by said fellow countrymen. As Arthur Schnitzler demonstrated in his 1897 play Reigen (better known through Max Ophüls’ 1950 film version, La Ronde), social circles close quickly in a confined space; and so, soon enough, by the end of day two of the conference, by pure mathematical calculation, as Justin Timberlake sings, ‘what goes around, comes around’: all the Americans in the room turn out to be both philistines and erudite.
A contradiction in terms? Not so fast and not so sure. Firstly, such confidences are made to the European in the room, identified as anyone from the old continent whose mother tongue is not English and who is thereby draped with the aura of natural bilingualism – the key to culture. Like Theseus’ ship, the European is both up-to-date and very, very old, and therefore involuntarily tingling from direct contact with centuries of history and civilisation. A position which, from the point of view of the American colleague, is not without its own contradictions: for if the oceanic separation awakens a nagging inferiority complex in the erudite philistine, who fills the distance with fantasies of the old world and the riches of Culture, which (white) America has been lusting over from The Bostonians to Indiana Jones, it is a distance which also buttresses the sense of self-satisfaction suffused within the American psyche, and which, in the world of academia, has evolved into the self-made scholar. Yet another oxymoronic formula which brings us back to ‘the enigma wrapped up in a mystery’ that is the erudite philistine.
Two fundamental principles lie at the heart of this strange bird: first, that you are always secretly guilty of what you blame others for; second, that you are always somebody else’s philistine.
The ubiquity of a phenomenon sometimes causes its mystery to grow stale. Consider the strange lines along which a magnet orients iron filings in its vicinity: what we take in stride would, if experienced afresh, seem like the purest magic. Indeed, humble iron is probably the closest we get to genuine sci-fi ‘unobtainium’. Not only can the right treatment increase its durability by a multiple, enabling technologies unthinkable without it, but if you take a rod of iron, and hit it hard with a hammer, it suddenly acquires the ability to attract other bits of iron—as if the imparted force is transformed and stored in a mysterious field surrounding it. Moreover, if you then take that empowered core, and pass it through loops of very thinly wrought iron, you find that you can suddenly draw tiny bolts of lightning from the wire—or use the energy to power a light, or a motor, or, indeed, our civilization.
So, what is it that imbues iron with these near-magical capabilities? What yields it the power to attract or repel? Or, in the immortal words of the Insane Clown Posse, fucking magnets, how do they work?
Pace Violent J and Shaggy 2 Dope, the answer to this question is well known—in that particular sense of ‘well known’ that physicists use when they mean ‘don’t worry about it, the math says it works’. But even among physicists, the answer isn’t usually spelled out all that clearly—which is a bit of a shame: as we will see, it’s a cardinal example of how quantum mechanics, far from being just a ‘theory of the small’, is directly responsible for many everyday phenomena. Ultimately, the key to the question ‘why do magnets attract things’ lies in the phenomenon of quantum interference—though we’ll have to plot a bit of a roundabout course to get there.
It is ironic that one of the primary obstacles to meritocracy in science, the oversupply of talent, is simultaneously a key contributor to the myth that merit reigns in science.
To explain, I’ll proceed in three steps. First, I’ll devote most of my efforts to providing reasons for thinking that there IS an oversupply of talent in science. Second, I’ll briefly explain why the oversupply of talent is problematic for meritocratic conceptions of science. Third, I’ll suggest a reason why such oversupplies of talent paradoxically fuel the myth of meritocracy.
Is there a surplus of talent in science? In 2019 alone, roughly 23,000 Ph.D.s were awarded in the natural sciences, computer science, and mathematics – and an additional 10,000 or so in engineering. This dwarfs the number of research positions – particularly academic research positions – available.
Indeed, as a 2021 article in Areo magazine notes, “between 1982 and 2011 in the US, around 800,000 science and engineering PhDs were awarded, but only about 100,000 new tenure-track faculty positions were created in those fields.”
What should we do in cases where increasingly sophisticated and potentially autonomous AI-systems perform ‘actions’ that, under normal circumstances, would warrant the ascription of moral responsibility? That is, who (or what) is responsible when, for example, a self-driving car harms a pedestrian? An intuitive answer might be: Well, it is of course the company that created the car that should be held responsible! They built the car, trained the AI-system, and deployed it.
However, this answer is a bit hasty. The worry here is that the autonomous nature of certain AI-systems means that it would be unfair, unjust, or inappropriate to hold the company or any individual engineers or software developers responsible. To go back to the example of the self-driving car: it may be the case that, due to the car’s ability to act outside of the control of the original developers, their responsibility would be ‘cancelled’, and it would be inappropriate to hold them responsible.
Moreover, it may be the case that the machine in question is not sufficiently autonomous or agential for it to be responsible itself. This is certainly true of all currently existing AI-systems and may be true far into the future. Thus, we have the emergence of a ‘responsibility gap’: Neither the machine nor the humans who developed it are responsible for some outcome.
In this article I want to offer some brief reflections on the ‘problem’ of responsibility gaps.
I don’t want to write this and you don’t want to read it. But this is the world we live in. As I write this, it’s been three weeks since a man without a criminal record legally purchased a trunk-full of guns, opened fire at the Allen Premium Outlets mall and killed eight people, including three children, and wounded seven others, all in the space of about three minutes. A place that previously had been known mostly for its contribution to traffic jams on Stacy Rd. will now be linked forever to white-shrouded bodies and blood splatters on the concrete outside the H&M store.
Because I live in a very conservative state, I’ve often had to swallow my true self, my beliefs, my reason, my outrage, my sadness, and my intelligence when listening to those around me speak about current events. A few years back, a conservative I know told me he was disturbed by the decision of his family’s school district to allow a trans person to serve as a substitute teacher for his daughter’s third-grade class. “I wasn’t prepared to talk to her about…all that yet,” he said, gesturing vaguely to indicate the existence of trans people. “I mean, she’s only eight.”
For complex reasons that any liberal living and working alongside conservatives in the Bible Belt will understand, I kept my response to a minimum. There was no point in telling him that talking about trans people with his daughter didn’t need to be a fraught conversation; it was really just about accepting people as they are and recognizing that difference exists. Eight-year-olds can understand that. In fact, it’s important that they understand that, because otherwise they will struggle to develop the empathy and emotional intelligence they need to connect with others who don’t see the world exactly as they do.
I had cause to recall that particular conversation when my son and niece asked me about the mall shooting.
In the first round of this year’s NBA playoffs, Austin Reaves, an undrafted and little-known guard who plays for the Los Angeles Lakers, held the ball outside the three-point line. With under two minutes remaining, the score stood at 118-112 in the Lakers’ favor against the Memphis Grizzlies. LeBron James waited for the ball to his right. Instead of deferring to the star player, Reaves ignored James, drove into the lane, and hit a floating shot for his fifth field goal of the fourth quarter. He then turned around and yelled, “I’m him!” The initial reaction one might have to this statement—“I’m him”—is a question: who are you? The phrase sounds strange to our ears. Who could you be but yourself? And if you are someone else, shouldn’t we know who this other person is? Who is the referent of the pronoun “him”? Perhaps because of its cryptic nature, “I’m him” is an evocative statement, and it has quickly spread throughout sports and gaming culture. In a 2022 episode of “The Shop,” LeBron James himself declared “I’m him.” On YouTube, there are numerous compilations with titles such as “NFL ‘I’m Him’ Moments.” If you search the phrase on Twitter, people use it in relation to sports, music, video games, and themselves. But what does it mean, and where did it come from?
An internet search turns up a few articles explaining the history of this statement. Apparently, the rapper Kevin Gates was the first to popularize the phrase when he titled his 2019 album I’m Him. In this instance, “Him” acted as an acronym for “His Imperial Majesty.” We could then understand people saying “I’m Him” to be saying something along the lines of “I’m the king.” This expression would function much like another sports saying, “the GOAT,” meaning “the greatest of all time.” But I think there is more to it than this. Since Gates popularized the phrase, no other prominent uses have treated “him” as an acronym.
I’m bored; you’re bored; we’re all bored. By our books and movies and television shows, the endless blandness of the Netflix queue, by our music and theater and art. Culture now is strenuously cautious, nervously polite, earnestly worthy, ploddingly obvious, and above all, dismally predictable. It never dares to stray beyond the four corners of the already known. Robert Hughes spoke of the shock of the new, his phrase for modernism in the arts. Now there’s nothing that is shocking, and nothing that is new: irresponsible, dangerous; singular, original; the child of one weird, interesting brain. Decent we have, sometimes even good: well-made, professional, passing the time. But wild, indelible, commanding us without appeal to change our lives? I don’t think we even remember what that feels like.
Viruses are possibly even more maligned than bacteria, spoken of exclusively in terms of disease. Here, virologist Marilyn J. Roossinck ranges far beyond human pathogens to convince you how narrow that picture is. She instead reveals them as enigmatic entities that are intimately entwined with the entirety of Earth’s biosphere, exploiting and enabling it in equal measure. Backed by numerous infographics, the book alternates between chapters on basic principles of virology and brief portraits of noteworthy viruses. The result is an entry-level introduction to virology that fascinated me more than I expected.
“This is not a scenic drive,” said James Willcox, of adventure travel specialist Untamed Borders. “But what’s incredible about Route 1 is where it takes you: to the birthplace of some of the world’s earliest civilisations, the home of many of humankind’s greatest innovations.”
Willcox, who was charged with logistics and security for my journey, was briefing me before I embarked on a 530km, two-day road trip from Basra to Baghdad. My trip would be using Iraq’s first and longest freeway, the 1,200km-long Route 1, as a conduit to explore the heart of ancient Mesopotamia. Though the region has experienced decades of recent conflict, it was also once home to a series of illustrious historical empires (the Babylonians, Assyrians and Sumerians to name a few), and Willcox reassured me that the journey would be unforgettable so long as I followed some simple rules: “Keep a low profile, dress conservatively and don’t photograph any of the armed checkpoints,” he said.
I flew into Basra, Iraq’s largest port. The city straddles the Shatt al-Arab river, which is formed by the confluence of the Tigris and Euphrates – the two mighty waterways that inspired the name Mesopotamia (meaning “between two rivers” in Greek).
At the press conference following the Cannes premiere of Martin Scorsese’s Killers of the Flower Moon, someone asked Robert De Niro about his character, a kingpin of a sort with a tricky psyche. “It’s the banality of evil,” he said, describing the character’s moral ambiguity. “It’s the thing we have to watch out for. We see it today, of course. We all know who I’m going to talk about, but I’m not going to say his name.” (Everyone knew who he meant.)
The banality of evil was hot at Cannes this year. De Niro’s statement came on the heels of the premiere of Jonathan Glazer’s The Zone of Interest, which set Cannes critics abuzz about the same phrase. That movie — which I proposed might best be understood as an adaptation of Hannah Arendt’s Eichmann in Jerusalem, even more than the Martin Amis novel it’s loosely based on — is not much like Killers of the Flower Moon, at first blush. Glazer’s is short, taut horror that evokes the Holocaust by keeping it offscreen; Scorsese’s is epic, bloody, and relentless in its depiction of a series of murders from a century ago.
Thematically, however, they often rhyme. Both are about people’s capacity to exterminate one another while deluding themselves into thinking they’re doing the right thing. Both are about atrocities so heinous they’re hard to wrap your mind around. And both feel eerily contemporary, in an age when prejudice, racism, and fascism are on the rise around the globe.
I’d like to have a word with you. Could we be alone for a minute? I have been lying until now. Do you believe
I believe myself? Do you believe yourself when you believe me? Lying is natural. Forgive me. Could we be alone forever? Forgive us all. The word is my enemy. I have never been alone; bribes, betrayals. I am lying even now. Can you believe that? I give you my word.
by James Tate, from Strong Measures (Harper Collins, 1986)
Biden’s green industrial strategy is impressive both in its scale and in its passage amidst highly complex political circumstances. After fractious negotiations, Congress authorized over $4 trillion in new investment across its four major pieces of economic legislation. Officially, around $500 billion of that goes towards climate-related spending, most of which is contained in the Inflation Reduction Act. According to our estimates, the US federal government will spend an average of over $100 billion each year on climate mitigation over the next ten years, about two and a half times the annual average from 2009-2017.
Even this figure is likely a significant underestimate. Since many of the tax credits contained in the IRA are uncapped, total public expenditure is limited only by private demand for these incentives. If deployment of key clean energy technologies, such as solar or green hydrogen, were in line with a 2050 net-zero pathway, we estimate that total spending under the IRA alone could reach $1 trillion. A recent Brookings analysis reached a similar figure even without the net-zero constraint. The discrepancy is particularly large for renewable energy, which could see twice the officially estimated spending in a climate-aligned pathway; green hydrogen could see five times the spending, and the electric vehicle (EV) supply chain a staggering ten times the official Congressional Budget Office (CBO) estimates, if consumers rush to buy cheaper EVs.
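The headline figures above lend themselves to a quick back-of-the-envelope check. The sketch below uses only the estimates quoted in the text (the dollar amounts are the article's figures, not official data), showing how the "two and a half times" ratio and the $1 trillion ceiling relate:

```python
# Back-of-the-envelope check of the climate-spending figures quoted above.
# All values are in billions of dollars; these are the article's estimates,
# not official government data.

annual_climate_spend = 100   # estimated average annual federal climate-mitigation spend
multiple_vs_baseline = 2.5   # stated ratio versus the 2009-2017 annual average

# Implied 2009-2017 baseline: $100B / 2.5 = $40B per year
implied_baseline = annual_climate_spend / multiple_vs_baseline
print(f"Implied 2009-2017 annual average: ${implied_baseline:.0f}B")  # → $40B

# Ten years at the new pace already approaches the $1T that the IRA alone
# could reach under a net-zero deployment pathway.
ten_year_total = annual_climate_spend * 10
print(f"Ten-year total at the new pace: ${ten_year_total}B")  # → $1000B, i.e. ~$1T
```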