One of my favorite Earle performances is an acoustic cover of Paul Simon’s “Graceland,” which he recorded in 2017 for Hamburger Küchensessions, a German series in which musicians perform from the corner of a kitchen. Take a breath before you watch it. His eyes are fluttering, and he appears unshaven, a little jittery. But his voice is beautiful—fragile and strong. “Graceland” is a kind of American hymn, a song about tragedy and heartache and also the part of a person’s spirit that tells them to keep going anyway, always—to atone and reclaim, as many times as it takes. Earle speeds the song up, and doesn’t quite cling to the melody or the lyrics as Simon wrote them, but his rendition is heavy, spare, and stunning. “I’m going to Graceland,” he promises, over and over as the song ends. It feels good to think that he is there right now, received and at peace.
Here is another pair of stories, one that will be more familiar to American readers. Let’s begin this one in the 1980s, when a young Lindsey Graham first served with the Judge Advocate General’s Corps—the military legal service—in the U.S. Air Force. During some of that time, Graham was based in what was then West Germany, on the cutting edge of America’s Cold War efforts. Graham, born and raised in a small town in South Carolina, was devoted to the military: After both of his parents died when he was in his 20s, he got himself and his younger sister through college with the help of an ROTC stipend and then an Air Force salary. He stayed in the Reserves for two decades, even while in the Senate, sometimes journeying to Iraq or Afghanistan to serve as a short-term reserve officer. “The Air Force has been one of the best things that has ever happened to me,” he said in 2015. “It gave me a purpose bigger than myself. It put me in the company of patriots.” Through most of his years in the Senate, Graham, alongside his close friend John McCain, was a spokesperson for a strong military, and for a vision of America as a democratic leader abroad. He also supported a vigorous notion of democracy at home. In his 2014 reelection campaign, he ran as a maverick and a centrist, telling The Atlantic that jousting with the Tea Party was “more fun than any time I’ve been in politics.”
While Graham was doing his tour in West Germany, Mitt Romney became a co-founder and then the president of Bain Capital, a private-equity investment firm. Born in Michigan, Romney worked in Massachusetts during his years at Bain, but he also kept, thanks to his Mormon faith, close ties to Utah. While Graham was a military lawyer, drawing military pay, Romney was acquiring companies, restructuring them, and then selling them. This was a job he excelled at—in 1990, he was asked to run the parent firm, Bain & Company—and in the course of doing so he became very rich. Still, Romney dreamed of a political career, and in 1994 he ran for the Senate in Massachusetts, after changing his political affiliation from independent to Republican. He lost, but in 2002 he ran for governor of Massachusetts as a nonpartisan moderate, and won. In 2007—after a gubernatorial term during which he successfully brought in a form of near-universal health care that became a model for Barack Obama’s Affordable Care Act—he staged his first run for president. After losing the 2008 Republican primary, he won the party’s nomination in 2012, and then lost the general election.
Both Graham and Romney had presidential ambitions; Graham staged his own short-lived presidential campaign in 2015 (justified on the grounds that “the world is falling apart”). Both men were loyal members of the Republican Party, skeptical of the party’s radical and conspiratorial fringe. Both men reacted to the presidential candidacy of Donald Trump with real anger, and no wonder: In different ways, Trump’s values undermined their own. Graham had dedicated his career to an idea of U.S. leadership around the world—whereas Trump was offering an “America First” doctrine that would turn out to mean “me and my friends first.”
In January, Robert Williams, an African-American man, was wrongfully arrested because of an inaccurate facial recognition algorithm, a computerized approach that analyzes human faces and identifies them by comparison to database images of known people. He was handcuffed and arrested in front of his family by Detroit police without being told why, then jailed overnight after the police took mugshots, fingerprints, and a DNA sample. The next day, detectives showed Williams a surveillance video image of an African-American man standing in a store that sells watches. It immediately became clear that the man in the image was not Williams. Detailing his arrest in The Washington Post, Williams wrote, “The cops looked at each other. I heard one say that ‘the computer must have gotten it wrong.’” Williams learned that in investigating a theft from the store, a facial recognition system had tagged his driver’s license photo as matching the surveillance image. But the next steps, in which investigators are supposed to confirm the match and then gather further evidence before making an arrest, were poorly handled, and Williams was brought in. He had to spend 30 hours in jail and post a $1,000 bond before he was freed.
What makes the Williams arrest unique is that it received public attention, reports the American Civil Liberties Union.1 With over 4,000 police departments using facial recognition, it is virtually certain that other people have been wrongly implicated in crimes. In 2019, Amara Majeed, a Brown University student, was falsely identified by facial recognition as a suspect in a terrorist bombing in Sri Lanka. Sri Lankan police retracted the mistake, but not before Majeed received death threats. Even if a person goes free, his or her personal data remains listed among criminal records unless special steps are taken to expunge it.
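To see how such misidentifications arise, here is a minimal sketch of the matching step, assuming a generic embedding-and-threshold design rather than the actual system used in the Williams case; the function names and the threshold value are illustrative.

```python
import numpy as np

# A minimal sketch of the identification step in a facial recognition
# pipeline (illustrative only, not the Detroit system's actual design).
# Faces are reduced to embedding vectors; identification is a nearest-
# neighbor search over a gallery of known faces, gated by a threshold.

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, gallery, threshold=0.8):
    """Return the best-matching identity above the threshold, else None."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A grainy surveillance frame yields a noisy probe vector, so an innocent
# person's license photo in a large gallery can clear the threshold: a
# false positive that human review is supposed to catch before an arrest.
```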
Democracy imposes a substantial moral burden on citizens. They must regard one another as political equals, even when they disagree deeply about justice. Each side is likely to see the opposition as not only wrong about the issue, but on the side of injustice. How can citizens both stand up for justice and yet embrace a political arrangement that gives injustice an equal say? Political sympathy, a disposition to recognize in our opposition an attempt to live according to their conception of value, is proposed as a way to lighten democracy’s burden.
The frustrations of democracy are familiar. Politicians pander, surrogates spin, opponents object, and citizens pick sides much in the way that they attach themselves to sports teams. Meanwhile, the spectacle of democracy seems to obstruct its point, which is competent and representative government. Careful reflection on the common good tends to get lost amidst the politicking. Yet as citizens we have a responsibility to try to sort through all of the noise so that we can inform ourselves of the issues of the day and exercise our political power wisely. We typically cannot count on our fellow citizens to satisfy this duty. Accordingly, the task of collective self-government exposes us to a great deal of irritation.
What’s more, when our side loses at the polls, democracy’s only consolation is that we may continue working on behalf of our political convictions. We mustn’t resign or withdraw, but instead turn the page and begin anew. Democratic citizenship takes persistence. In both emotional and practical terms, democracy is not only frustrating, it’s also exhausting.
Yet democracy is demanding in another way as well.
Human civilization is only a few thousand years old (depending on how we count). So if civilization will ultimately last for millions of years, it could be considered surprising that we’ve found ourselves so early in history. Should we therefore predict that human civilization will probably disappear within a few thousand years? This “Doomsday Argument” shares a family resemblance to ideas used by many professional cosmologists to judge whether a model of the universe is natural or not. Philosopher Nick Bostrom is the world’s expert on these kinds of anthropic arguments. We talk through them, leading to the biggest doozy of them all: the idea that our perceived reality might be a computer simulation being run by enormously more powerful beings.
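The back-of-the-envelope version of the argument fits in one line. This is a minimal sketch of the textbook formulation, assuming the “self-sampling” setup in which you treat your own birth rank as a uniform random draw from all humans who will ever live (the figure of roughly $10^{11}$ births to date is a commonly cited ballpark):

$$P\!\left(n > 0.05\, N_{\text{total}}\right) = 0.95 \;\;\Longrightarrow\;\; N_{\text{total}} < 20\,n \ \text{at 95\% confidence},$$

where $n$ is your birth rank and $N_{\text{total}}$ is the total number of humans ever born. Taking $n \approx 10^{11}$ bounds the total at about $2 \times 10^{12}$ births, which at anything like current birth rates is exhausted in thousands, not millions, of years.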
“One way the internet distorts our picture of ourselves is by feeding the human tendency to overestimate our knowledge of how the world works,” writes philosophy professor Michael Patrick Lynch, author of the book The Internet of Us: Knowing More and Understanding Less in the Age of Big Data, in The Chronicle of Higher Education. “The Internet of Us becomes one big reinforcement mechanism, getting us all the information we are already biased to believe, and encouraging us to regard those in other bubbles as misinformed miscreants. We know it all—the internet tells us so.”
In other words, the internet encourages epistemic arrogance—the belief that one knows much more than one does. The internet’s tailored social media feeds and algorithms have herded us into echo chambers where our own views are cheered and opposing views are mocked. Sheltered from serious challenge, celebrated by our chosen mob, we gradually lose the capacity for accurate self-assessment and begin to believe ourselves vastly more knowledgeable than we actually are.
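As a toy illustration of that reinforcement loop (a deliberately crude sketch, not any platform’s actual ranking system; all parameters are invented), consider a feed that always serves whichever side of an issue a user already engages with more:

```python
import random

def simulate_feed(initial_lean=0.55, steps=200, nudge=0.01, seed=0):
    """Toy echo-chamber model: `lean` is the probability the user engages
    with side-A content; the feed shows the side it predicts will engage."""
    rng = random.Random(seed)
    lean = initial_lean
    for _ in range(steps):
        shown = "A" if lean >= 0.5 else "B"        # feed picks the likelier hit
        engaged = rng.random() < (lean if shown == "A" else 1 - lean)
        if engaged:                                 # engagement reinforces the lean
            lean += nudge if shown == "A" else -nudge
        lean = min(max(lean, 0.0), 1.0)
    return lean

print(simulate_feed())  # a mild 55% lean drifts toward 1.0: an echo chamber
```

The mechanism, not the numbers, is the point: the feed’s prediction and the user’s response validate each other, so even a slight initial bias compounds.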
At its core, temperance means finding, respecting and defending boundaries, especially one’s own. This is sometimes equated with a form of non-action, or with abstention from excess and extremes, especially in relation to bodily pleasures connected to alcohol, food and sex. As a concept, however, temperance offers a much richer array of meanings. The philosopher Alasdair MacIntyre, for example, has defined temperance as the choice not to use power that is at one’s disposal. Another possibility is to understand temperance as a practice of dispensation: one which includes being without that which one desires, admitting that every fulfilled desire has consequences, and inquiring into desires and their origins. The latter is an especially significant aspect of the Greek sophrosyne, which can be translated both as temperance and as self-knowledge. Yet another alternative is to view temperance as a practice of actively being with others, a form of self-restraint which turns one’s attention away from oneself towards others. From that perspective, many currently popular expressions of self-restraint, such as exercise, dieting or voluntary celibacy, are not temperate, since they are motivated by the individual’s own desires. Nor do enforced limitations, such as laws banning smoking or drugs, have anything to do with temperance, since the virtue involves the choice to act in a certain way as much as the action itself: acknowledging the boundary as well as adhering to it.
Waiting outside venues has been an integral part of the fangirl experience and something of an embodiment of their lifestyle from the very beginning. Columbus Day, New York City, 1944, a Frank Sinatra show, and the New York photojournalist Weegee was there when the first of the pop stars as we know them was about to take to the stage. “The line in front of the Paramount Theatre on Broadway starts forming at midnight,” he recalled. “By four in the morning, there are over 500 girls … They wear bobby sox (of course), bow ties (the same as Frankie wears) and have photos of Sinatra pinned to their dresses.” The three-thousand-seat house was quickly filled with girls waiting to see Sinatra for the first of his scheduled performances of that day. Apparently inside the theater the ammoniac smell of pee was heavy in the air because girls refused to leave their seats for food, water, anything, unless they were physically moved by attendants. Over the duration of the day, a riot bubbled over outside where, according to records, police had to deal with thirty to thirty-five thousand young female Sinatra fans, lining the streets around the theater, calling out to be let in.
There’s a trope of the British Asian identity narrative, once captured with such originality and brilliance in Hanif Kureishi’s The Buddha of Suburbia and much replicated – in Monica Ali’s Brick Lane, for example, or Ayub Khan-Din’s East Is East – that now fills most British Asians of my generation (I’m 40) with dread. It’s the one about the second-generation immigrant held back by the ignorance of parents or a community that’s either comically absurd or violently fundamentalist. Against this backdrop, the second-generation hero or heroine emerges once they find the strength to stand apart from this reactionary past and assimilate into the mainstream of British life.
This narrative certainly had its time, but we have come to dread it, because through it we dehumanise ourselves and demean the journeys that have made us who we are. It’s also a lie, since no degree of immigrant assimilation can overturn the racism that is systemic in British life, and that our mainstream culture has the habit of perpetuating.
For me, the potted biography on the dust jacket of Mohsin Zaidi’s book (subtitled A Memoir of a Gay Muslim’s Journey to Acceptance) was enough to provoke the first stirrings of that familiar sense of foreboding. Here we learn that the author, raised in a “devout Muslim community”, was the “first person from his school to go to Oxford University” before going on to become an accomplished criminal barrister, a board member of Stonewall, the UK’s biggest LGBT rights charity, and a governor of his former school in east London. It’s only a blurb, but there is a subtly Islamophobic framing here that sets Zaidi’s “devout Muslim” background against the progressive attainment of his life and career.
With the refinement and reduced cost of sequencing technology, science has reached another inflection point: population-scale genomics, where clinical-grade assays can be used to advance healthcare while fueling research. The Healthy Nevada Project (HNP) epitomizes that movement. A population health initiative run by the Renown Institute for Health Innovation (Renown IHI), a partnership between Renown Health and the Desert Research Institute, the project aims to combine genomic, environmental and medical data from 250,000 participants to assess the influence of genetics on health and disease. The study also uses sequencing data to screen participants for medically actionable genetic conditions, such as familial hypercholesterolemia (FH), hereditary breast and ovarian cancer syndrome (HBOC), and Lynch syndrome (LS).
…In July, Helix and the HNP team reported findings from the first 27,000 participants in an article published in Nature Medicine1. Among HNP participants, the researchers found that approximately 1 in 75 carried a variant predisposing them to HBOC, LS, or FH. Of those predisposed, nearly 22 percent had already begun to develop symptoms of disease. Importantly, 90 percent of those with medically actionable results would not have known about their increased risks for disease had they not participated in the study; patients often don’t qualify for genetic screening under current clinical guidelines.
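In absolute numbers, those percentages work out roughly as follows; the sketch below simply restates the article’s rounded statistics, so treat the results as approximations:

```python
participants = 27_000
carriers = participants / 75      # ≈ 360 people carrying a predisposing variant
symptomatic = 0.22 * carriers     # ≈ 79 already showing signs of disease
missed = 0.90 * carriers          # ≈ 324 who wouldn't have qualified for
                                  #   screening under current clinical guidelines
print(round(carriers), round(symptomatic), round(missed))  # 360 79 324
```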
An American parent who named their child “Kamala Devi” in the 1960s is likely to have been either Indian or a hippie. When I first heard Kamala Harris’s name during the 2020 Democratic presidential race, I assumed it was the latter, since she was generically described as a “black woman”. What I have since learned is that her mother, Shyamala Gopalan, was an Indian immigrant. I barely understand my own personal life, so I won’t pretend to have any insight into Kamala Harris’s. Maybe she’s close to her Indian family, maybe she’s not. I’ll never know, and frankly neither will you. The only thing I can safely conclude from this information is that Kamala Harris has an Indian side.
As a country of immigrants, Americans are familiar with diversity even if they are not always accepting of it. Diversity across different people is, however, different from diversity within the same person. Americans seem disoriented by individual diversity. My admittedly rudimentary understanding is that they alleviate this unease by reducing the dimensionality of ethnic, racial or cultural identity in a person. The thankfully obsolete one-drop rule is an example of this. Harris’s mother seems to have understood it early on. As Harris explains in a New Yorker profile, her mother “knew that her adopted homeland would see Maya [Kamala’s sister] and me as black girls, and she was determined to make sure we would grow into confident, proud black women.”
We know by now that the description of Harris as a “black woman” is incomplete. The opening paragraph of “Kamala’s story” on her new campaign website describes her as both “Black” and “Indian”, and notes her parents’ immigration to the U.S. from Jamaica and India. Then, in her DNC convention acceptance speech, she gave a shout out to “my uncles, my aunts and my chithis”. Padma Lakshmi, the model-turned-celebrity-chef, explained through tears of joy that chithi meant “auntie” in Tamil. Harris didn’t announce the fact that her mother’s side of the family is Brahmin, but that would have been obvious to any Tamilian looking at Harris’s Indian family photo. So now, on her Indian side alone, we have geography, language, religion and caste in the mix. Sorting all of that out has got to be confusing for anyone. Read more »
Lincoln consistently ranks first, or at least in the top three, in every ranking of US presidents. This high standing has long puzzled me. After all, this is the leader who presided over a long, brutal civil war that killed 620,000 of his own people. For context, as a percentage of the population, that is more American lives than all other presidents put together have managed to expend in all America’s other wars. On the face of it, that is a massive failure of statesmanship. The usual response is that the war was a necessary sacrifice to end the supreme evil of slavery. I do not find this convincing.
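The percentage claim is easy to sanity-check with commonly cited figures; the essay doesn’t spell them out, so the numbers below are ballpark estimates:

```python
civil_war_deaths = 620_000
us_population_1860 = 31_400_000    # 1860 census, approximately
print(civil_war_deaths / us_population_1860)  # ≈ 0.0197, about 2% of the population

# Other wars under U.S. presidents sum to very roughly 650,000-700,000 deaths
# (WWII ~405k, WWI ~117k, Vietnam ~58k, Korea ~37k, and the rest), but each
# against a far larger population (7 million in 1812 up to 330 million today),
# so even summed as percentages they fall short of the Civil War's ~2%.
```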
I. The civil war was not inevitable: Lincoln caused it by his election and choices
War happens because politics fails. Lincoln’s Republican party was perceived as anti-Southern, as evidenced by its immediate promise to impose import tariffs that would hurt the South’s economy (to the benefit of the industrial North) and its long-term commitment to the abolition of slavery. Although not the most radical candidate the Republicans could have nominated (more on which below), Lincoln was thus an extraordinarily divisive presidential candidate who was elected entirely by Northern voters. The Southern states launched their secession as soon as they heard Lincoln had won because they believed he (and his Republican party) intended to pursue an anti-South agenda that would further cement their political marginalisation within the USA. Perhaps reflecting his party’s political ignorance of the South and his administration’s inexperience, Lincoln seems not to have understood this. He therefore dramatically underestimated popular support for secession in the South and assumed a show of force would quickly crush it.
It is reasonable to suppose that if a different party’s candidate had won the 1860 election, secession would not have happened then (or perhaps at all), despite the long-running tensions between South and North. Moreover, it seems likely that a different president could have managed any secession crisis without resorting to war because that president would have had the trust and legitimacy in the South that Lincoln lacked. (For example, one of Congress’s various efforts to resolve the secession crisis by constitutionally protecting slavery might have succeeded.) Or the South could have been allowed to secede, on the same principle that the colonies had claimed in declaring their independence from Britain. The South’s goal was just to leave the USA, and this was perfectly compatible with the continued existence and flourishing of the country that remained (though perhaps not with its pride). It was the South’s choice to declare independence from the USA, but it was the choice of President Lincoln’s government to go to war to force them to stay. Read more »
first their concerted honks—
unseen,
…………then as apparitions
they rise from foliage
at the foot of the hill
framed in a window sash
they rise
to the cackles of crows
already at breakfast in our yard
arrayed upon green,
black notes of an almost endless chord,
ostinato of the articulated sounds
of vees that appear overhead,
levitating geese in glass,
all of a boundless chord drawn-out,
a day in the life
……………………..they vanish over the window top
truck tires hum on tar
dopplering down around a bend
played out
a last crow lifts off
speaking a
……………………..last word
As the quadrennial presidential and vice presidential debate season nears once more, we’ll soon see opinion pieces reminding us that debates have at most a very slight impact on the outcome of presidential contests.
Typically, the writers of those pieces frame their discussion as a warning not to place too much importance on the debates. I want to make a different argument about the presidential and vice presidential debates: we would do better to get rid of them altogether.
The crux of my argument against the presidential and vice presidential debates is that they don’t serve any of the audiences who might tune into them.
It used to be the guilty fantasy
of theory bros and voluntary beggars.
When turning points expired or wavered
they’d hold it to their chests at night,
a Molotov they didn’t dare ignite:
Could revolution, usually so plodding,
be brought about by some proud, precise Nothing?

By standing very still and thinking,
could you will the universe into your orbit?
Rather than Changing or its hotter cousin, Rearranging,
why not secrete a thin disinterest
and let society struggle to absorb it?
The best part was, it made the change-obsessed
look foolish for their interest
in lobbying, petitions, and the rest—
in lieu of noisy sawing, one sharp swish
to make the engagé seem babyish
and also, one supposes, slay the state,
though even that might be too animate,
too quick to grant the state its dying wish.
Put better: one smart drop of crimson paint
to make the canvas as a whole look faint.

When it came true, as guilty wishes often do,
the shrieks of celebration shattered glass:
ten million bros in witty dresses
deadpanning in manufactured messes
or scribbling alien Os in fields of grass—
anything at all that had once seemed Nothing
now marked, from skydiver eyes, a happening
alloyed with Nothing, stronger than Nothing alone,
shuddering at so many hertzes
it made dogs moan,

dogs and the unlucky some
born with an ear for pandemonium,
who hear at all hours what they cannot see,
chords rich and fat and spoiled as gravity,
and feel, in bed or in the shower, a gnawing
lust for slow chromatic scales of sawing.