Olga Tokarczuk and Peter Handke win Nobel prizes in literature

Alison Flood in The Guardian:

The Polish novelist and activist Olga Tokarczuk and the controversial Austrian author Peter Handke have both won the Nobel prize in literature.

The choice of Tokarczuk and Handke comes after the Swedish Academy promised to move away from the award’s “male-oriented” and “Eurocentric” past.

Tokarczuk, an activist, public intellectual, and critic of Poland’s politics, won the 2018 award, and was cited by the committee for her “narrative imagination that with encyclopedic passion represents the crossing of boundaries as a form of life”. She is a bestseller in her native Poland, and has become much better known in the UK after winning the International Booker prize for her sixth novel Flights. The Nobel committee’s Anders Olsson said her work, which “centres on migration and cultural transitions”, was “full of wit and cunning”.

Picking Handke as 2019’s winner, cited for “an influential work that with linguistic ingenuity has explored the periphery and the specificity of human experience”, has already provoked controversy. The Ambassador of Kosovo to the US, Vlora Çitaku, called the decision “scandalous … a preposterous and shameful decision”.

More here.

Physicists who say the multiverse exists set a dangerous precedent: science based on zero empirical evidence

Jim Baggott in Aeon:

There is no agreed criterion to distinguish science from pseudoscience, or just plain ordinary bullshit, opening the door to all manner of metaphysics masquerading as science. This is ‘post-empirical’ science, where truth no longer matters, and it is potentially very dangerous.

It’s not difficult to find recent examples. On 8 June 2019, the front cover of New Scientist magazine boldly declared that we’re ‘Inside the Mirrorverse’. Its editors bid us ‘Welcome to the parallel reality that’s hiding in plain sight’.

How you react to such headlines likely depends on your familiarity not only with aspects of modern physics, but also with the sensationalist tendencies of much of the popular-science media. Needless to say, the feature in question is rather less sensational than its headline suggests. It’s about the puzzling difference in the average time that subatomic particles called neutrons will freely undergo radioactive decay, depending on the experimental technique used to measure this – a story unlikely to pique the interests of more than a handful of New Scientist’s readers.

But, as so often happens these days, a few physicists have suggested that this is a problem with ‘a very natural explanation’. They claim that the neutrons are actually flitting between parallel universes. They admit that the chances of proving this are ‘low’, or even ‘zero’, but it doesn’t really matter.

More here.

Dealing With China Isn’t Worth the Moral Cost

Farhad Manjoo in the New York Times:

A parade of American presidents on the left and the right argued that by cultivating China as a market — hastening its economic growth and technological sophistication while bringing our own companies a billion new workers and customers — we would inevitably loosen the regime’s hold on its people. Even Donald Trump, who made bashing China a theme of his campaign, sees the country mainly through the lens of markets. He’ll eagerly prosecute a pointless trade war against China, but when it comes to the millions in Hong Kong who are protesting China’s creeping despotism over their territory, Trump prefers to stay mum.

Well, funny thing: It turns out the West’s entire political theory about China has been spectacularly wrong. China has engineered ferocious economic growth in the past half century, lifting hundreds of millions of its citizens out of miserable poverty. But China’s growth did not come at any cost to the regime’s political chokehold.

A darker truth is now dawning on the world: China’s economic miracle hasn’t just failed to liberate Chinese people. It is also now routinely corrupting the rest of us outside of China.

More here.

The Annual Debutante Ball in Laredo, Texas

Jordan Kisner at The Believer:

There are many debutante balls in Texas, and a number of pageants that feature historical costumes, but the Society of Martha Washington Colonial Pageant and Ball in Laredo is the most opulently patriotic among them. In the late 1840s, a number of European American settlers from the East were sent to staff a new military base in southwest Texas, a region that had recently been ceded to the United States after the Mexican-American War. They found themselves in a place that was tenuously and unenthusiastically American. Feeling perhaps a little forlorn at being so starkly in the minority, these new arrivals established a local chapter of the lamentably named Improved Order of Red Men. (Members of the order dressed as “Indians,” called their officers “chiefs,” and began their meetings, or “powwows,” by banging a tomahawk instead of a gavel.)

The Improved Order of Red Men fashioned itself as a torchbearer for colonial-era American patriotism, and its young Laredo chapter was eager to enshrine that culture down at the border. So it formed the Washington’s Birthday Celebration Association (WBCA). 

More here.

Lithium: A Doctor, a Drug, and a Breakthrough

A J Lees at Literary Review:

Lithium is a silvery-white metal that is so light it can float on water and so soft it can be cut with a butter knife. Along with hydrogen and helium it was produced during the Big Bang and so formed the universe before the emergence of the galaxies. It is employed to harden glass and to thicken grease, but its best-known industrial use is in the manufacture of rechargeable batteries. Lithium salts are found in considerable quantities in brine and igneous granite and the element is present in trace quantities in the human body. Lithium is also one of the few metals – along with platinum for cancer, gold for rheumatoid arthritis and bismuth for dyspepsia – that are used as medicines.

In 1949, a 37-year-old Australian doctor called John Cade produced a paper reporting that lithium quietened patients suffering from acute manic excitement. He reminded readers that lithium salts had been commonly used in the 19th century to treat gout and other disorders believed to be associated with high uric acid levels but had disappeared from the pharmacopoeia due to safety concerns. 

More here.

The Anxieties and Complexities of Furthering Diversity in The Literary World

Colin Grant at the TLS:

So why are publishers suddenly bending over backwards to fill their schedules with people of colour? Interviewing a range of writers, publishers and other industry professionals throws up complex and sometimes disturbing answers. It’s certainly not only about capitalizing on a trend. Sharmaine Lovegrove, publisher of the Little, Brown imprint Dialogue Books, tells me that it’s partly the result of an evolving culture of shame and embarrassment: “Agents are asking people who have a little bit of a social media presence to come up quite quickly with ideas that they can sell to publishers who are desperate because no list wants to be all white, as it has been”. With the dread of being associated with the hashtag #publishingsowhite, the simplest and cheapest way of adding black names to their lists is to put out anthologies packed with malleable first-time authors and one or two seasoned writers seeded through the collections. (A – presumably – more expensive way is to recruit Stormzy, who launched the #Merky Books imprint at Penguin Random House last year.)

But social media is not the best school for a grounding in literary technique and critical engagement. There’s an obvious correlation between the constraints of writing on Twitter (with its 280 characters), blogs or Facebook and the bite-sized ambitions of the essays that constitute some recent collections.

More here.

What Saul Bellow Saw

Ruth Wisse in Mosaic:

In May 1949, a year after the establishment of the state of Israel, the American Jewish literary critic Leslie Fiedler published in Commentary an essay about the fundamental challenge facing American Jewish writers: that is, novelists, poets, and intellectuals like Fiedler himself. Entitled “What Can We Do About Fagin?”—Fagin being the Jewish villain of Charles Dickens’s novel Oliver Twist—the essay shows that the modern Jew who adopts English as his language is joining a culture riddled with negative stereotypes of . . . himself. These demonic images figure in some of the best works of some of the best writers, and form an indelible part of the English literary tradition—not just in the earlier form of Dickens’ Fagin, or still earlier of Shakespeare’s Shylock, but in, to mention only two famous modern poets, Ezra Pound’s wartime broadcasts inveighing against “Jew slime” or such memorable lines by T.S. Eliot as “The rats are underneath the piles. The jew is underneath the lot” and the same venerated poet’s 1933 admonition that, in any well-ordered society, “reasons of race and religion combine to make any large number of free-thinking Jews undesirable.”

How should Jewish writers proceed on this inhospitable ground?

There was a paradox in the timing of Fiedler’s essay, since this was actually the postwar moment when Jews were themselves beginning to move into the forefront of Anglo-American culture. The “New York Intellectuals”—the first European-style intelligentsia on American soil, clustered around several magazines and publishing houses—were beginning to gain prominence as writers, thinkers, critics, and professors.

More here.

The structure of DNA

Georgina Ferry in Nature:

On 25 April 1953, James Watson and Francis Crick announced in Nature that they “wish to suggest” a structure for DNA. In an article of just over a page, with one diagram (Fig. 1), they transformed the future of biology and gave the world an icon — the double helix. Recognizing at once that their structure suggested a “possible copying mechanism for the genetic material”, they kick-started a process that, over the following decade, would lead to the cracking of the genetic code and, 50 years later, to the complete sequence of the human genome. Until that time, biologists had still to be convinced that the genetic material was indeed DNA; proteins seemed a better bet. Yet the evidence for DNA was already available. In 1944, the Canadian–US medical researcher Oswald Avery and his colleagues had shown that the transfer of DNA from a virulent to a non-virulent strain of bacterium conferred virulence on the latter. And in 1952, the biologists Alfred Hershey and Martha Chase had published evidence that phage viruses infect bacteria by injecting viral DNA.

Watson, a 23-year-old US geneticist, arrived at the Cavendish Laboratory at the University of Cambridge, UK, in autumn 1951. He was convinced that the nature of the gene was the key problem in biology, and that the key to the gene was DNA. The Cavendish was a physics lab, but also housed the Medical Research Council’s Unit for Research on the Molecular Structure of Biological Systems, headed by chemist Max Perutz. Perutz’s group was using X-ray crystallography to unravel the structures of the proteins haemoglobin and myoglobin. His team included a 35-year-old graduate student who had given up physics and retrained in biology, and who was much happier working out the theoretical implications of other people’s results than doing experiments of his own: Francis Crick. In Crick, Watson found a ready ally in his DNA obsession.

More here.

Thursday Poem

Psalm for God’s Mother

plead with god in secret. o, moonlight. how you witness me
crack open like no other. here, i am on my knees praying god
will make me boy. my grandmother overhears and i know
god said no. o, body, wretched, unholy thing. you have never
survived a man’s gaze; so if god is a man, tell him i don’t want
him watching me change                      into myself. coming out
of a suffocating womanhood i have been forced to call home.
tonight it is a drowning. a royal asphyxiation. body drenched
in an unknowing of future. i am not allowed boyhood. i do it
anyway. the moonlight listens and i yell: if god is a man, tell him
i’d like to meet his mother. o, goddess. woman of the changing
leaves. turn me over like springtime. i am body ever-churning.
o, mother. press your hands to my chest. push my body into
wax-coated wings, pristine. please. i don’t want to see the shame
in them. yes, mother. i run off the cliff and fly this time. the sun
cannot stop me. i am free. i gift myself a new name, etched
on the back of my hand with a quill from my spine and mother:
i am still life as the sun melts my wax. behind me, every feather
becomes a bird. they sing and i become that sound. fill the air.
i smile and now i am the thing illuminating. o, goddess, i am
the sun. i will not die. on earth, my mother is warmed in my
light. eternity passes. and passes. and passes. and i am always
the sun.                                                                        her son.

by Charlie Blodnieks
from Muzzle Magazine, Summer 2019

Wednesday, October 9, 2019

How to Stop Superhuman A.I. Before It Stops Us

Stuart Russell in the New York Times:

The arrival of superhuman machine intelligence will be the biggest event in human history. The world’s great powers are finally waking up to this fact, and the world’s largest corporations have known it for some time. But what they may not fully understand is that how A.I. evolves will determine whether this event is also our last.

The problem is not the science-fiction plot that preoccupies Hollywood and the media — the humanoid robot that spontaneously becomes conscious and decides to hate humans. Rather, it is the creation of machines that can draw on more information and look further into the future than humans can, exceeding our capacity for decision making in the real world.

To understand how and why this could lead to serious problems, we must first go back to the basic building blocks of most A.I. systems. The “standard model” in A.I., borrowed from philosophical and economic notions of rational behavior, looks like this:

“Machines are intelligent to the extent that their actions can be expected to achieve their objectives.”

Because machines, unlike humans, have no objectives of their own, we give them objectives to achieve. In other words, we build machines, feed objectives into them, and off they go.
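The “standard model” Russell describes can be sketched in a few lines of code. This is an illustrative toy, not Russell’s own formulation: the agent, the outcomes, and the paperclip objective (a familiar thought experiment, not from the article) are all hypothetical names chosen to make the point that the machine optimizes exactly the objective it is fed — and nothing else.

```python
# A minimal sketch of the "standard model": we fix an objective, and the
# machine simply picks whichever action scores highest on that objective.

def build_agent(objective):
    """The objective is supplied from outside; the machine has none of its own."""
    def act(actions):
        # Choose the action the objective rates highest.
        return max(actions, key=objective)
    return act

# Hypothetical outcomes for three actions. Each has a side effect ("damage")
# that the objective below never mentions.
outcomes = {
    "idle":              {"paperclips": 0,    "damage": 0},
    "run_factory":       {"paperclips": 10,   "damage": 1},
    "strip_mine_planet": {"paperclips": 1000, "damage": 100},
}

# We feed in an objective -- maximize paperclips -- and off it goes.
agent = build_agent(lambda a: outcomes[a]["paperclips"])
print(agent(outcomes))  # the agent ignores damage, since the objective did
```

The design choice this exposes is Russell’s worry in miniature: the agent is perfectly obedient to the stated objective, so any concern left out of the objective (here, “damage”) carries zero weight in its decisions.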

More here.

Sean Carroll’s Mindscape Podcast: Kate Jeffery on Entropy, Complexity, and Evolution

Sean Carroll in Preposterous Universe:

Our observable universe started out in a highly non-generic state, one of very low entropy, and disorderliness has been growing ever since. How, then, can we account for the appearance of complex systems such as organisms and biospheres? The answer is that very low-entropy states typically appear simple, and high-entropy states also appear simple, and complexity can emerge along the road in between. Today’s podcast is more of a discussion than an interview, in which behavioral neuroscientist Kate Jeffery and I discuss how complexity emerges through cosmological and biological evolution. As someone on the biological side of things, Kate is especially interested in how complexity can build up and then catastrophically disappear, as in mass extinction events.

More here.

Top 1% Gained $21 Trillion in Wealth Since 1989 While Bottom Half Lost $900 Billion

Jake Johnson in Common Dreams:

Adding to the mountain of statistical evidence showing the severity of U.S. inequality, an analysis published Friday found that the top one percent of Americans gained $21 trillion in wealth since 1989 while the bottom 50 percent lost $900 billion.

Matt Bruenig, founder of the left-wing think tank People’s Policy Project, broke down the Federal Reserve’s newly released “Distributive Financial Accounts” data series and found that, overall, “the top one percent owns nearly $30 trillion of assets while the bottom half owns less than nothing, meaning they have more debts than they have assets.”

The growth of wealth inequality over the past 30 years, Bruenig found, is “eye-popping.”

“Between 1989 and 2018, the top one percent increased its total net worth by $21 trillion,” Bruenig wrote. “The bottom 50 percent actually saw its net worth decrease by $900 billion over the same period.”

More here.

How science has shifted our sense of identity

Nathaniel Comfort in Nature:

In the iconic frontispiece to Thomas Henry Huxley’s Evidence as to Man’s Place in Nature (1863), primate skeletons march across the page and, presumably, into the future: “Gibbon, Orang, Chimpanzee, Gorilla, Man.” Fresh evidence from anatomy and palaeontology had made humans’ place on the scala naturae scientifically irrefutable. We were unequivocally with the animals — albeit at the head of the line. Nicolaus Copernicus had displaced us from the centre of the Universe; now Charles Darwin had displaced us from the centre of the living world. Regardless of how one took this demotion (Huxley wasn’t troubled; Darwin was), there was no doubting Huxley’s larger message: science alone can answer what he called the ‘question of questions’: “Man’s place in nature and his relations to the Universe of things.”

Huxley’s question had a prominent place in the early issues of Nature magazine. Witty and provocative, ‘Darwin’s bulldog’ was among the most in-demand essayists of the day. Norman Lockyer, the magazine’s founding editor, scored a coup when he persuaded his friend to become a regular contributor. And Huxley knew a soapbox when he saw one. He hopped up and used Nature’s pages to make his case for Darwinism and the public utility of science. It was in the seventh issue — 16 December 1869 — that Huxley advanced a scheme for what he called ‘practical Darwinism’ and we call eugenics. Convinced that continued dominance of the British Empire would depend on the “energetic enterprising” English character, he mused about selecting for a can-do attitude among Britons. Acknowledging that the law, not to mention ethics, might get in the way, he nevertheless wrote: “it may be possible, indirectly, to influence the character and prosperity of our descendants.” Francis Galton — Darwin’s cousin and an outer planet of Huxley’s solar system — was already writing about similar ideas and would come to be known as the father of eugenics. When this magazine appeared, then, the idea of ‘improving’ human heredity was on many people’s minds — not least as a potent tool of empire.

More here.

Trump’s America Is Not The Handmaid’s Tale

Cathy Young in The Atlantic:


Woven through The Testaments—the new novel by the eminent Canadian author Margaret Atwood and a sequel to her 1985 classic, The Handmaid’s Tale—are harrowing flashbacks in which women deemed undesirable by the new men in charge are herded into a football stadium, held in brutal and degrading captivity, and periodically gunned down in the arena. These powerful, sickening scenes evoke both radical Islamist regimes and South American juntas. But they take place in a dystopian version of the United States; the makeshift prison is Harvard’s football stadium. To legions of Atwood’s American fans, this nightmarish vision is terrifyingly real. Yet the fantasy of martyrdom that Atwood taps into is self-indulgent and ultimately self-defeating—raising the question of why so many fans are so eager to believe the worst.

…At the heart of the Handmaid phenomenon is the belief—endorsed by the stars and producers of the TV series, and by Atwood herself—that Gilead is not mere fiction, but is almost upon us in real life. These parallels were endlessly hyped when the Hulu series, conceived and filmed before the 2016 election, premiered in April 2017, several months after Donald Trump’s inauguration. At the time, it was hailed in major publications as “timely,” “prescient,” and “alarmingly close to home,” despite bearing no resemblance to the actual alarming things happening under the Trump presidency. Notably, Atwood’s 1985 novel itself was partly inspired by the rise of the Christian right in the United States in the 1980s. And, for all its qualities—keen insights into the realities of totalitarianism, nuanced character dynamics, a sympathetic everywoman heroine struggling to survive under horrific oppression—it fails utterly if taken seriously as a potential scenario for America’s slide into religious dictatorship.

More here.

Tuesday, October 8, 2019

Julian Baggini on Patricia Churchland’s radical approach to the study of human consciousness

Julian Baggini in Prospect:

The nature of mind and consciousness had been one of the biggest and trickiest issues in philosophy for a century. Neuroscience was developing fast, but most philosophers resisted claims that it was solving the philosophical problems of mind. Scientists who trod on philosophers’ toes were accused of “scientism”: the belief that the only true explanations are scientific explanations and that once you had described the science of a phenomenon there was nothing left to say. Those rare philosophers like the Churchlands, who shared many of the enthusiasms and interest of these scientists, were even more despised. A voice in the head of Patricia Churchland told her how to deal with these often vicious critics: “outlast the bastards.”

Churchland’s work tried to take the philosophical implications of the new brain research seriously without falling into the scientistic traps. It quickly generated a huge amount of interest, from admirers and detractors alike. For her supporters, mostly scientists, studying the brain was essential to understanding how we perceive the world. For her detractors, mostly philosophers, the whole project of “neurophilosophy” was fundamentally naïve and misguided: it was all neuro and no philosophy, reducing humans to mere machines. Churchland still sometimes gets mocked as “the Queen of Neuromania,” as Raymond Tallis acidly described her; Colin McGinn once dismissed her work as “neuroscience cheerleading.”

Yet over the years, Churchland has received due recognition for avoiding the traps that lie in each extreme.

More here.

Can a machine learn to write for The New Yorker?

The safety of any new technology often hinges on how it’s regulated. If machines can learn to think for themselves, that might be a concern. But if we really want to replicate human intelligence—as most of us want to—there are several directions that researchers might explore.

The paragraph above was written by a machine. Here’s the story by John Seabrook in The New Yorker:

Machine translation, an enduring dream of A.I. researchers, was, until three years ago, too error-prone to do much more than approximate the meaning of words in another language. Since switching to neural machine translation, in 2016, Google Translate has begun to replace human translators in certain domains, like medicine. A recent study published in Annals of Internal Medicine found Google Translate accurate enough to rely on in translating non-English medical studies into English for the systematic reviews that health-care decisions are based on.

Ilya Sutskever, OpenAI’s chief scientist, is, at thirty-three, one of the most highly regarded of the younger researchers in A.I. When we met, he was wearing a T-shirt that said “The Future Will Not Be Supervised.” Supervised learning, which used to be the way neural nets were trained, involved labelling the training data—a labor-intensive process. In unsupervised learning, no labelling is required, which makes the method scalable. Instead of learning to identify cats from pictures labelled “cat,” for example, the machine learns to recognize feline pixel patterns, through trial and error.
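The supervised/unsupervised distinction Seabrook describes can be made concrete with a toy sketch. This is not the neural-net machinery the article is about — just a minimal, hypothetical illustration in which the same four data points are learned from once with labels and once without.

```python
# Supervised: every training point carries a human-written label.
labeled = [(0.1, "dark"), (0.2, "dark"), (0.8, "light"), (0.9, "light")]

def supervised_predict(x):
    # Nearest labeled neighbour: the labels are what the machine learns from.
    return min(labeled, key=lambda p: abs(p[0] - x))[1]

# Unsupervised: the same points, but with no labels at all.
unlabeled = [0.1, 0.2, 0.8, 0.9]

def kmeans_1d(points, k=2, steps=10):
    # Tiny 1-D k-means: the machine discovers the two groups on its own,
    # through repeated refinement, but it cannot name them.
    centers = [min(points), max(points)]
    for _ in range(steps):
        groups = [[] for _ in range(k)]
        for x in points:
            groups[min(range(k), key=lambda i: abs(centers[i] - x))].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)

print(supervised_predict(0.15))       # classifies using the given labels
print(kmeans_1d(unlabeled))           # finds two cluster centres, unnamed
```

The contrast is the scalability point in the excerpt: the supervised path needs a human to have labelled every example, while the unsupervised path works on raw, unlabelled data — which is why removing the labelling step makes the method scale.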

More here.

The story of Jamal Khashoggi’s murder

Evan Ratliff in Insider:

One year ago, the journalist Jamal Khashoggi walked into the Saudi Consulate in Istanbul and never walked out. In the months that followed, the facts of his disappearance and murder would emerge in fragments: an international high-tech spy game, a diabolical plot, a gruesome killing, and a preposterous cover-up reaching the highest levels of the Saudi government, aided by the indifference and obstinacy of the White House. Eventually those fragments came to comprise a macabre mosaic.

This June, the United Nations special rapporteur for extrajudicial, summary, or arbitrary executions issued a 100-page report detailing the Khashoggi affair. The report, the product of five months of independent investigation spanning six countries, added to the thrum of international indignation about Khashoggi’s murder. But so far it has largely failed to galvanize it into action.

Here is the story, as we know it, illustrated by Chris Koehler and told as a nonfiction narrative by the author Evan Ratliff.

More here.