How science has shifted our sense of identity

Nathaniel Comfort in Nature:

In the iconic frontispiece to Thomas Henry Huxley’s Evidence as to Man’s Place in Nature (1863), primate skeletons march across the page and, presumably, into the future: “Gibbon, Orang, Chimpanzee, Gorilla, Man.” Fresh evidence from anatomy and palaeontology had made humans’ place on the scala naturae scientifically irrefutable. We were unequivocally with the animals — albeit at the head of the line. Nicolaus Copernicus had displaced us from the centre of the Universe; now Charles Darwin had displaced us from the centre of the living world. Regardless of how one took this demotion (Huxley wasn’t troubled; Darwin was), there was no doubting Huxley’s larger message: science alone can answer what he called the ‘question of questions’: “Man’s place in nature and his relations to the Universe of things.”

Huxley’s question had a prominent place in the early issues of Nature magazine. Witty and provocative, ‘Darwin’s bulldog’ was among the most in-demand essayists of the day. Norman Lockyer, the magazine’s founding editor, scored a coup when he persuaded his friend to become a regular contributor. And Huxley knew a soapbox when he saw one. He hopped up and used Nature’s pages to make his case for Darwinism and the public utility of science. It was in the seventh issue — 16 December 1869 — that Huxley advanced a scheme for what he called ‘practical Darwinism’ and we call eugenics. Convinced that continued dominance of the British Empire would depend on the “energetic enterprising” English character, he mused about selecting for a can-do attitude among Britons. Acknowledging that the law, not to mention ethics, might get in the way, he nevertheless wrote: “it may be possible, indirectly, to influence the character and prosperity of our descendants.” Francis Galton — Darwin’s cousin and an outer planet of Huxley’s solar system — was already writing about similar ideas and would come to be known as the father of eugenics. When this magazine appeared, then, the idea of ‘improving’ human heredity was on many people’s minds — not least as a potent tool of empire.

More here.

Trump’s America Is Not The Handmaid’s Tale

Cathy Young in The Atlantic:

A protester dressed as a character from the novel-turned-TV series “The Handmaid’s Tale” walks into the Senate Hart building during a rally against Supreme Court nominee Brett Kavanaugh on Capitol Hill in Washington, DC, on October 4, 2018. (Photo by Andrew Caballero-Reynolds/AFP/Getty Images)

Woven through The Testaments—the new novel by the eminent Canadian author Margaret Atwood and a sequel to her 1985 classic, The Handmaid’s Tale—are harrowing flashbacks in which women deemed undesirable by the new men in charge are herded into a football stadium, held in brutal and degrading captivity, and periodically gunned down in the arena. These powerful, sickening scenes evoke both radical Islamist regimes and South American juntas. But they take place in a dystopian version of the United States; the makeshift prison is Harvard’s football stadium. To legions of Atwood’s American fans, this nightmarish vision is terrifyingly real. Yet the fantasy of martyrdom that Atwood taps into is self-indulgent and ultimately self-defeating—raising the question of why so many fans are so eager to believe the worst.

…At the heart of the Handmaid phenomenon is the belief—endorsed by the stars and producers of the TV series, and by Atwood herself—that Gilead is not mere fiction, but is almost upon us in real life. These parallels were endlessly hyped when the Hulu series, conceived and filmed before the 2016 election, premiered in April 2017, several months after Donald Trump’s inauguration. At the time, it was hailed in major publications as “timely,” “prescient,” and “alarmingly close to home,” despite bearing no resemblance to the actual alarming things happening under the Trump presidency. Notably, Atwood’s 1985 novel itself was partly inspired by the rise of the Christian right in the United States in the 1980s. And, for all its qualities—keen insights into the realities of totalitarianism, nuanced character dynamics, a sympathetic everywoman heroine struggling to survive under horrific oppression—it fails utterly if taken seriously as a potential scenario for America’s slide into religious dictatorship.

More here.

Tuesday, October 8, 2019

Julian Baggini on Patricia Churchland’s radical approach to the study of human consciousness

Julian Baggini in Prospect:

The nature of mind and consciousness had been one of the biggest and trickiest issues in philosophy for a century. Neuroscience was developing fast, but most philosophers resisted claims that it was solving the philosophical problems of mind. Scientists who trod on philosophers’ toes were accused of “scientism”: the belief that the only true explanations are scientific explanations and that once you had described the science of a phenomenon there was nothing left to say. Those rare philosophers like the Churchlands, who shared many of the enthusiasms and interests of these scientists, were even more despised. A voice in the head of Patricia Churchland told her how to deal with these often vicious critics: “outlast the bastards.”

Churchland’s work tried to take the philosophical implications of the new brain research seriously without falling into the scientistic traps. It quickly generated a huge amount of interest, from admirers and detractors alike. For her supporters, mostly scientists, studying the brain was essential to understanding how we perceive the world. For her detractors, mostly philosophers, the whole project of “neurophilosophy” was fundamentally naïve and misguided: it was all neuro and no philosophy, reducing humans to mere machines. Churchland still sometimes gets mocked as “the Queen of Neuromania,” as Raymond Tallis acidly described her; Colin McGinn once dismissed her work as “neuroscience cheerleading.”

Yet over the years, Churchland has received due recognition for avoiding the traps that lie in each extreme.

More here.

Can a machine learn to write for New Yorker Magazine?

The safety of any new technology often hinges on how it’s regulated. If machines can learn to think for themselves, that might be a concern. But if we really want to replicate human intelligence—as most of us want to—there are several directions that researchers might explore.

The italicized paragraph above was written by a machine. Here’s the story by John Seabrook in The New Yorker:

Machine translation, an enduring dream of A.I. researchers, was, until three years ago, too error-prone to do much more than approximate the meaning of words in another language. Since switching to neural machine translation, in 2016, Google Translate has begun to replace human translators in certain domains, like medicine. A recent study published in Annals of Internal Medicine found Google Translate accurate enough to rely on in translating non-English medical studies into English for the systematic reviews that health-care decisions are based on.

Ilya Sutskever, OpenAI’s chief scientist, is, at thirty-three, one of the most highly regarded of the younger researchers in A.I. When we met, he was wearing a T-shirt that said “The Future Will Not Be Supervised.” Supervised learning, which used to be the way neural nets were trained, involved labelling the training data—a labor-intensive process. In unsupervised learning, no labelling is required, which makes the method scalable. Instead of learning to identify cats from pictures labelled “cat,” for example, the machine learns to recognize feline pixel patterns, through trial and error.
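To make Sutskever’s distinction concrete, here is a small toy sketch in Python (my own illustration, not code from the article or from OpenAI): the supervised examples each need a human-supplied label, while the “unsupervised” bigram model pulls its training signal out of the raw text itself, which is what makes the approach scalable.

```python
# Toy contrast between supervised and unsupervised (self-supervised) learning.
# Hypothetical illustration only; not code from the article or from OpenAI.
from collections import Counter, defaultdict

# Supervised learning: every training example needs a human-provided label.
labelled_examples = [
    ("whiskers tail purr", "cat"),   # someone had to tag each example by hand
    ("bark fetch leash", "dog"),
]

# Unsupervised / self-supervised learning: raw, unlabelled text is enough.
# The "label" at each position is simply the next word in the text itself.
corpus = "the cat sat on the mat the cat chased the mouse".split()

next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1   # learn word-to-word patterns by counting

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(predict_next("the"))   # -> "cat" (it followed "the" twice in the corpus)
print(predict_next("cat"))   # -> "sat" (ties are broken by first appearance)
```

A real language model replaces the counting with a neural network and the toy corpus with billions of words, but the training signal comes from the data itself in the same label-free way.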

More here.

The story of Jamal Khashoggi’s murder

Evan Ratliff in Insider:

One year ago, the journalist Jamal Khashoggi walked into the Saudi Consulate in Istanbul and never walked out. In the months that followed, the facts of his disappearance and murder would emerge in fragments: an international high-tech spy game, a diabolical plot, a gruesome killing, and a preposterous cover-up reaching the highest levels of the Saudi government, aided by the indifference and obstinacy of the White House. Eventually those fragments came to comprise a macabre mosaic.

This June, the United Nations special rapporteur for extrajudicial, summary, or arbitrary executions issued a 100-page report detailing the Khashoggi affair. The report, the product of five months of independent investigation spanning six countries, added to the thrum of international indignation about Khashoggi’s murder. But so far it has largely failed to galvanize that indignation into action.

Here is the story, as we know it, illustrated by Chris Koehler and told as a nonfiction narrative by the author Evan Ratliff.

More here.

On Patrick White, Australia’s Great Unread Novelist

Madeleine Watts at Literary Hub:

In some respects this reflects a national pathology. Unlike an American or British child, an Australian student can go through thirteen years of education without reading much of their country’s literature at all (of the more than twenty writers I studied in high school, only two were Australian). This is symptomatic of the country’s famed “cultural cringe,” a term first coined in the 1940s by the critic A.A. Phillips to describe the ways that Australians tend to be prejudiced against home-grown art and ideas in favor of those imported from the UK and America. Australia’s attitude to the arts has, for much of the last two centuries, been moral. “What these idiots didn’t realize about White was that he was the most powerful spruiker for morality that anybody was going to read in an Australian work,” argued David Marr, White’s biographer, during a talk at the Wheeler Centre in 2013. “And here were these petty little would-be moral tyrants whinging about this man whose greatest message about this country in the end was that we are an unprincipled people.”

But if White could critique the country and name Australians as unprincipled, it was something he had earned by going back.

more here.

The Perseverance of Eve Babitz’s Vision

Molly Lambert at The Paris Review:

My god, isn’t it fun to read Eve Babitz? Just holding one of her books in your hand is like being in on a good secret. Babitz knows all the good secrets—about Los Angeles, charismatic men, and supposedly glamorous industries like film, music, and magazines. Cool beyond belief but friendly and unintimidating, Babitz hung out with all the best rock stars, directors, and artists of several decades. And she wrote just as lovingly about the rest of LA—the broad world that exists outside the bubble of “the Industry.” Thanks to New York Review Books putting together a collection of this work, we are lucky enough to have more of Babitz’s writing to read.

Alongside the Thelemic occultist Marjorie Cameron (whose husband, Jack Parsons, cofounded the Jet Propulsion Laboratory) and the Bay Area Beat painter Jay DeFeo (Babitz’s romantic rival), Babitz was one of a handful of female artists associated with LA’s landmark Ferus Gallery, which showed local contemporary artists and launched the careers of people like Ed Ruscha and Ed Kienholz.

more here.

What the Hell Was Modernism?

Jerry Saltz at New York Magazine:

Can a museum devoted to modernism survive the death of the movement? Can it bring that death about? Ever since the beginnings of the Renaissance in the 14th century, most art movements have lasted one generation, sometimes two. Today, after more than 130 years, modernism is, at least by some measures, insanely and incongruously popular — a world brand. The first thing oligarchs do to signal sophistication, and to cleanse and store money, is collect and build personal museums of modern art, and there’s nothing museumgoers love more than a survey of a mid-century giant. In the U.S., modernism represents the triumph of American greatness and wealth, and it is considered the height of 20th-century European culture — which Americans bought and brought over (which is to say, poached).

Kids sport tattoos of artworks by Gustav Klimt, Henri Matisse, Salvador Dalí, Edvard Munch, Piet Mondrian, and Andy Warhol (you might not think of him as a modernist, but we’ll get to that). Our cities are crowded with glass-walled luxury riffs on high-modernist architecture, the apartments inside full of knockoffs of “mid-century-modern” furniture.

more here.

These Butterflies Evolved to Eat Poison. How Could That Have Happened?

Carl Zimmer in The New York Times:

The caterpillar of the monarch butterfly eats only milkweed, a poisonous plant that should kill it. The caterpillars thrive on the plant, even storing its toxins in their bodies as a defense against hungry birds. For decades, scientists have marveled at this adaptation. On Thursday, a team of researchers announced they had pinpointed the key evolutionary steps that led to it. Only three genetic mutations were necessary to turn the butterflies from vulnerable to resistant, the researchers reported in the journal Nature. They were able to introduce these mutations into fruit flies, and suddenly the flies were able to eat milkweed, too. Biologists hailed it as a tour-de-force that harnessed gene-editing technology to unscramble a series of mutations evolving in one species and then test them in yet another.

“The gold standard is to directly test mutations in the organism,” said Joseph W. Thornton, an evolutionary biologist at the University of Chicago. The new study “finally elevates our standards.” Insects began dining on plants over 400 million years ago, spurring the evolution of many botanical defenses, including harsh chemicals. Certain plants, including milkweed, make particularly nasty toxins known as cardiac glycosides. The right dose can stop a beating heart or disrupt the nervous system. For thousands of years, African hunters have put these poisons on the tips of arrows. Agatha Christie wrote a murder mystery featuring foxglove, which produces cardiac glycosides.

More here.

Why Are Rich People So Mean?

Christopher Ryan in Wired:

Now, you may be thinking, “Fuck those guys and the private jets they rode in on.” Fair enough. But here’s the thing: those guys are already fucked. Really. They worked like hell to get where they are—and they’ve got access to more wealth than 99.999 percent of the human beings who have ever lived—but they’re still not where they think they need to be. Without a fundamental change in the way they approach their lives, they’ll never reach their ever-receding goals. And if the futility of their situation ever dawns on them like a dark sunrise, they’re unlikely to receive a lot of sympathy from their friends and family.

What if most rich assholes are made, not born? What if the cold-heartedness so often associated with the upper crust—let’s call it Rich Asshole Syndrome—isn’t the result of having been raised by a parade of resentful nannies, too many sailing lessons, or repeated caviar overdoses, but the compounded disappointment of being lucky but still feeling unfulfilled? We’re told that those with the most toys are winning, that money represents points on the scoreboard of life. But what if that tired story is just another facet of a scam in which we’re all getting ripped off?

The Spanish word aislar means both “to insulate” and “to isolate,” which is what most of us do when we get more money.

More here. (Note: Thanks to Mariam Mahmood in Lahore!)

Tuesday Poem

Shrine

That small deceptive bend
in what seems like a fast straight, where the boy-
racers would come to grief – remember?

Now the scars on the big sycamore tree
have all grown over

and the last of the silk flowers tied to it
are tattered, grey, that were kept
renewed through all these years.

I saw her once: a red-haired woman
middle-aged, in a pink top
wading into the ditch, her armful of artificial sunflowers
held high above the nettles

and her car parked in the bend
where everything northbound had to swerve around it
into the hidden traffic
coming the other way.

Like she could care
her heart dead
to the world, her only thought in that corner

not to forget him
not to permit forgetting him.

Where has she gone
to leave his garlands fading?
Has she laid down outliving him
after so long, her beautiful

careless boy?
What but death could keep her away
from the place of pilgrimage he gave her
by this cold road?

And who is left behind now
to remember all that sorrow
or to lay flowers
(and where in the world) for her?

by Judith Taylor 
from Not in Nightingale Country
Red Squirrel Press

Sunday, October 6, 2019

How wiping out $1.5 trillion in student debt would boost the economy

Jillian Berman in Market Watch:

Since at least the Great Recession a decade ago, borrowers, activists and others have been building a case that erasing debt acquired during students’ college years is a matter of economic justice. More recently, researchers have found that canceling some or all of the nation’s outstanding student debt has the potential to boost gross domestic product, narrow the widening racial wealth gap and liberate millions of Americans from a financial albatross that previous generations never had to contend with.

Sens. Elizabeth Warren and Bernie Sanders, both vying for the Democratic nomination for president, are proposing to wipe away some or all of the country’s burdensome debt as a key part of their campaigns. They plan to pay for it — and accompanying proposals to make public college tuition-free — by raising taxes on the wealthy. Warren estimates her plan would cost $1.25 trillion over 10 years, and Sanders says his would cost $2.2 trillion.

Critics question whether those proposals would fix the underlying problems in the way Americans pay for college, and also whether student-debt cancellation would entail a giveaway to well-off families.

More here.

The Fall of the Meritocracy

Sarah Leonard in The New Republic:

In 1958, sociologist Michael Young wrote a dark satire called The Rise of the Meritocracy. The term “meritocracy” was Young’s own coining, and he chose it to denote a new aristocracy based on expertise and test-taking instead of breeding and titles. In Young’s book, set in 2034, Britain is forced to evolve by international economic competition. The elevation of IQ over birth first serves as a democratizing force championed by socialists, but ultimately results in a rigid caste system. The state uses universal testing to identify and elevate meritocrats, leaving most of England’s citizens poor and demoralized, without even a legitimate grievance, since, after all, who could argue that the wise should not rule? Eventually, a populist movement emerges. The story ends in bloody revolt and the assassination of the fictional author before he can review his page proofs.

In 2001, Young would take to The Guardian to declare his disappointment that Tony Blair, alongside politicians in the United States, had adopted the term without irony. The New Labour government now talked of “breaking down the barriers to success” and declared that “the Britain of the elite is over. The new Britain is a meritocracy.” Blair’s Cabinet, like George W. Bush’s and then Obama’s, was packed with graduates of the most elite educational institutions in the world. Young’s vision of IQ testing may have been a little off, but elite education had become the prerequisite for power. “It is good sense to appoint individual people to jobs on their merit,” Young wrote. “It is the opposite when those who are judged to have merit of a particular kind harden into a new social class without room in it for others.”

Sixty years on from Young’s book, a pure product of meritocratic training has emerged to write what he hopes will be the epitaph for his class.

More here.

The People’s Republic of China Was Born in Chains

Frank Dikötter in The New Republic:

A landlord is executed near Fukang, Xinjiang, circa 1949. (Keystone/Hulton Archive/Getty Images)

China today, for any visitor who remembers the country from 20 or 30 years ago, seems hardly recognizable. One of the government’s greatest accomplishments is to have distanced itself so successfully from the Mao era that it seems almost erased. Instead of collective poverty and marching Red Guards, there are skyscrapers, new airports, highways, railway stations, and bullet trains. Yet scratch the glimmering surface and the iron underpinnings of the one-party state become apparent. They have barely changed since 1949, despite all the talk about “reform and opening up.” The legacy of liberation is a country still in chains.

Just what was China liberated from in 1949? It wasn’t the Japanese, defeated four years earlier by the Allies, including the Nationalists and their leader, Chiang Kai-shek. It wasn’t colonialism—all the foreign concessions in the country had been dissolved, some as early as 1929. The Republic of China was a sovereign state and a permanent member of the U.N. Security Council.

Nor was it tyranny. In 1912, when China became Asia’s first republic, it had an electorate of 40 million people, or 10 percent of the population, a level of popular representation not reached by Japan until 1928 and India until 1935. Participatory politics, despite many setbacks, continued to thrive over the following decades. When the National Assembly met in May 1948, upwards of 1,400 delegates from all parts of China adopted a constitution that contained an elaborate bill of rights.

More here.

How and Why Companies Will Engineer Your Emotions

Richard Johnson in IEEE Spectrum:

Affective computing systems are being developed to recognize, interpret, and process human experiences and emotions. They all rely on extensive human behavioral data, captured by various kinds of hardware and processed by an array of sophisticated machine learning software applications.

AI-based software lies at the heart of each system’s ability to interpret and act on users’ emotional cues. These systems identify and link nuances in behavioral data with the associated emotion.

The most obvious types of hardware for collecting behavior data are cameras and other scanning devices that monitor facial expressions, eye movements, gestures, and postures.  This data can be processed to identify subtle micro expressions that a human assessment might struggle to identify consistently.

What’s more, high-end audio equipment records variances and textures in users’ voices. Some insurance companies are experimenting with call voice analytics that can detect if someone is lying to their claim handlers.
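As a rough illustration of the supervised pattern such systems rely on, here is a minimal, self-contained Python sketch. The behavioural features, values, and emotion labels are invented for the example; they are not drawn from the article or from any vendor’s product.

```python
# Deliberately simplified sketch of the pattern affective-computing systems use:
# map numeric behavioural features to an emotion label with a trained classifier.
# Feature names and values are invented for illustration only.
from sklearn.linear_model import LogisticRegression

# Each row: [smile_intensity, brow_furrow, voice_pitch_variance], all on 0-1 scales.
X_train = [
    [0.9, 0.1, 0.4],   # broad smile, relaxed brow          -> "happy"
    [0.8, 0.2, 0.5],
    [0.1, 0.9, 0.8],   # no smile, furrowed brow, tense voice -> "angry"
    [0.2, 0.8, 0.7],
    [0.1, 0.2, 0.1],   # flat affect                          -> "neutral"
    [0.2, 0.1, 0.2],
]
y_train = ["happy", "happy", "angry", "angry", "neutral", "neutral"]

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# A new observation captured by camera and microphone, reduced to the same features.
new_sample = [[0.15, 0.85, 0.75]]
print(model.predict(new_sample)[0])   # likely "angry", given its closeness to those examples
```

Production systems differ mainly in scale: deep networks rather than logistic regression, and feature vectors extracted automatically from video and audio streams rather than entered by hand.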

More here.  [Thanks to Brian Whitney.]

Out of Time

Astra Taylor in Lapham’s Quarterly:

Twelve years, or so the scientists told us in 2018, which means now we are down to eleven. That’s how long we have to pull back from the brink of climate catastrophe by constraining global warming to a maximum of 1.5 degrees Celsius. Eleven years to prevent the annihilation of coral reefs, greater melting of the permafrost, and species apocalypse, along with the most dire consequences for human civilization as we know it. Food shortages, forest fires, droughts and monsoons, intensified war and conflict, billions of refugees—we have barely begun to conceive of the range of dystopian futures looming on the horizon.

One person who looks squarely and prophetically at the potential ramifications of climate change and insists on a response is Greta Thunberg, the sixteen-year-old Swedish environmentalist who launched a global wave of youth climate strikes. In April 2019 she gave a tour de force address in the British Parliament, invoking not just her peers who were regularly missing class to protest government inaction but those yet to be born. “I speak on behalf of future generations,” Thunberg said. “Many of you appear concerned that we are wasting valuable lesson time, but I assure you we will go back to school the moment you start listening to science and give us a future.”

Thunberg accepts what many influential adults seem unable to face: the inevitability of change. Change is coming, either in the form of adaptation or annihilation; we can respond proactively or reactively to this discomfiting fact. Perhaps she can accept this because she is so young. Eleven years, a little over a decade, is the time for a human infant to become a preteen and for a preteen to become a young adult. For Thunberg, eleven years is more than two-thirds of her life, a veritable expanse that, projected forward, will involve crossing the threshold from adolescence to the first stage of maturity. Yet for a relatively contented middle-aged or elderly adult, eleven years isn’t as substantial—not quite the blink of an eye but a continuation of the present, a deeper dive into one’s golden years. At a certain point, stasis is the goal, to ward off decline. But decline awaits us all—as the economist John Maynard Keynes bluntly put it, “In the long run we are all dead.” Everyone’s time on earth must come to an end. The question is, What do we do with such knowledge?

More here.