Mary Shelley foresaw that artificial intelligence would be made monstrous, not by human hubris but by human cruelty

Eileen Hunt Botting in Aeon:

Mary Wollstonecraft Shelley’s 200-year-old creature is more alive than ever. In his new role as the bogeyman of artificial intelligence (AI), ‘the monster’ made by Victor Frankenstein is all over the internet. The British literary critic Frances Wilson even called him ‘the world’s most rewarding metaphor’. Though issued with some irony, this title suited the creature just fine.

From the editors of The Guardian to the engineers at Google have come stiff warnings about AI: it’s a monster in the closet. Hidden in computer consoles and in the shadows of the world wide web, from Moscow to Palo Alto, AI is growing stronger, faster, smarter and more dangerous than its clever programmers. Worse than the bioengineered and radiated creatures of Cold War B-movies, AI is the Frankenstein’s creature for our century. It will eventually emerge – like a ghost from its machine – to destroy its makers and the whole of humanity.

Thematically, not much has changed since 1818, when the 20-year-old Shelley’s first novel went to print. As with Frankenstein; or, the Modern Prometheus, apocalyptic media concerning AI relies for its big scare on the domestic conventions of gothic literature. The robots will rise up to destroy the world and your precious privacy at home. Cue Alexa, the Amazonian robot who knows every matter of your personal taste.

More here.

Why a little evil is good — and a lot of empathy is bad

Linda Rodriguez McRobbie in the Boston Globe:

Halloween is the one night of the year when we’re allowed to be just a little evil. But, say some scientists, it’s not just on Halloween that we give ourselves permission to be bad. It was the “everyday sadism,” the pleasure that people find in others’ pain, that struck psychologist Delroy Paulhus, who studies evil professionally.

The head of a University of British Columbia research lab that examines “dark” personality traits, Paulhus was part of a team of researchers who in 2002 identified the “dark triad,” three distinct antisocial personality traits: narcissism, or aggressive self-promotion; Machiavellianism, the desire to manipulate those around you; and callous, self-aggrandizing, impulsive psychopathy.

In 2013, based on research that came out of his lab, the trio was joined by a fourth — “everyday sadism.” In a set of experiments led by Erin Buckels, a scientist in Paulhus’s lab, participants were asked whether they’d rather kill bugs, help an exterminator kill them, clean toilets, or plunge their hands in ice water for 60 seconds. Fifty-three percent of the respondents said they’d either kill the bugs or help the exterminator; those who elected to kill the bugs, some 26.8 percent, were then presented with three woodlice — named Tootsie, Muffin, and Ike — and a coffee grinder (unbeknownst to the participants, the bugs were shielded from a crunchy death by a plastic insert over the blades). Not only did the 26 percent “kill” some or all of the bugs, but some of them also professed to enjoy it.

More here.

Future Politics: Living Together in a World Transformed by Tech

Clara Hendrickson in the Boston Review:

When we consider the future that technological change will bring about, it is tempting to envision a world taken over by robots, where the singularity has given way to superintelligent agents and human extinction. This is the image of our future we have grown accustomed to seeing in cinematic depictions, but it is not the future that British barrister Jamie Susskind wants us to worry about. Instead, in Future Politics: Living Together in a World Transformed by Tech, Susskind focuses on how digital technologies control human life rather than eliminate it.

All digital systems, after all, have their origin in code, and code, Susskind contends, does not merely direct the actions of machines or algorithmic platforms, it also directs our behavior and thought. For example, code can force us to do things we would not otherwise do. A self-driving car engineered to operate below the speed limit ensures its users obey the law. Code can also scrutinize our choices and persuade us to change our behavior. A smart fridge that monitors our eating habits, shaming our guilty pleasures, might lead us to abandon our late-night snacking routine. And code, of course, can shape our perception of the world. Search engines and algorithmic newsfeeds control the flow of information, determining what we see and know.
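
The self-driving-car example is worth pausing on, because it captures the mechanism Susskind is pointing at: a constraint written into code is enforced automatically, with no room for appeal. Below is a minimal sketch of that idea in Python; the class, names, and numbers are invented for illustration and come from neither Susskind nor Hendrickson.

```python
# Toy illustration of "code as constraint": the controller, not the passenger,
# has the final say over how fast the car actually goes.
# All names and values here are hypothetical.

class SelfDrivingCar:
    def __init__(self, speed_limit_kmh: float) -> None:
        self.speed_limit_kmh = speed_limit_kmh
        self.current_speed_kmh = 0.0

    def request_speed(self, desired_kmh: float) -> float:
        # The passenger may ask for any speed, but requests above the legal
        # limit are silently capped -- the law is obeyed by construction.
        self.current_speed_kmh = min(desired_kmh, self.speed_limit_kmh)
        return self.current_speed_kmh


car = SelfDrivingCar(speed_limit_kmh=50.0)
print(car.request_speed(80.0))  # 50.0: the rule is enforced, not negotiated
```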

More here.

Bruno Latour, the Post-Truth Philosopher, Mounts a Defense of Science

Ava Kofman in the New York Times:

In the summer of 1996, during an international anthropology conference in southeastern Brazil, Bruno Latour, France’s most famous and misunderstood philosopher, was approached by an anxious-looking developmental psychologist. The psychologist had a delicate question, and for this reason he requested that Latour meet him in a secluded spot — beside a lake at the Swiss-style resort where they were staying. Removing from his pocket a piece of paper on which he’d scribbled some notes, the psychologist hesitated before asking, “Do you believe in reality?”

For a moment, Latour thought he was being set up for a joke. His early work, it was true, had done more than that of any other living thinker to unsettle the traditional understanding of how we acquire knowledge of what’s real. It had long been taken for granted, for example, that scientific facts and entities, like cells and quarks and prions, existed “out there” in the world before they were discovered by scientists. Latour turned this notion on its head. In a series of controversial books in the 1970s and 1980s, he argued that scientific facts should instead be seen as a product of scientific inquiry. Facts, Latour said, were “networked”; they stood or fell not on the strength of their inherent veracity but on the strength of the institutions and practices that produced them and made them intelligible. If this network broke down, the facts would go with them.

Still, Latour had never seen himself as doing anything so radical, or absurd, as calling into question the existence of reality. As a founder of the new academic discipline of science and technology studies, or S.T.S., Latour regarded himself and his colleagues as allies of science. Of course he believed in reality, he told the psychologist, convinced that the conversation was in jest.

More here.

The Demise of FilmStruck Is Part of a Bigger Pattern

David Sims at The Atlantic:

Swedish actress Bibi Andersson and Norwegian actress Liv Ullmann on the set of Persona, written and directed by Ingmar Bergman.

Turner and Warner Bros. Digital Networks’ statement on the closing of FilmStruck struck a similarly corporate tone: “While FilmStruck has a very loyal fanbase, it remains largely a niche service. We plan to take key learnings from FilmStruck to help shape future business decisions in the direct-to-consumer space and redirect this investment back into our collective portfolios.”

Those “key learnings” remain uncertain. Some, or all, of the WarnerMedia archives will likely be included in whatever new service they put together. But that project is at least a year from fruition, and it’ll lack the curation that made FilmStruck so special to subscribers trying to make a dent in its voluminous catalog. That kind of care and attention will be difficult to replicate on a larger scale. As companies work to assemble their respective streaming behemoths, FilmStruck will in retrospect feel like little more than a blip, a more specialized media moment between two eras ruled by giant networks.

More here.

How Horror Changed After WWI

W. Scott Poole at Literary Hub:

“The war has left its imprint in our souls [with] all these visions of horror it has conjured up around us,” wrote French author Pierre de Mazenod in 1922, describing the Great War. His word, horreur, appears in various forms in an incredible number of accounts of the war, written by English, German, Austrian, French, Russian, and American veterans. The years following the Great War became the first time in human history the word “horror” and its cognates appeared on such a massive scale. Images of catastrophe abounded. The Viennese writer Stefan Zweig, one of the stars in the firmament of central Europe’s decadent and demonic café culture before 1914, wrote of how “bridges are broken between today and tomorrow and the day before yesterday” in the conflict’s wake. Time was out of joint. When not describing the war as horror, the imagery of all we would come to associate with the word appeared. One French pilot passing over the ruined city of Verdun described the landscape as a haunted waste and a creature of nightmare, “the humid skin of a monstrous toad.”

More here.

The Maverick Lou Harrison

Sudip Bose at The American Scholar:

The mesmeric work of Lou Harrison (1917–2003) stands apart from so much of the music written in 20th-century America—so singular is its idiom, so striking are its borderless, cross-cultural sounds—yet despite a swell of interest coinciding with the composer’s centennial last year, his scores are all too rarely heard. He was always something of an outsider, this unrepentant free spirit and individualist. Harrison studied with Arnold Schoenberg in the 1940s, when the 12-tone master was ensconced in Los Angeles, and gave serial techniques a serious go, but his best music—lyrical, melodic, indebted to the sounds of Southeast Asia—inhabits a different world from so much of the postmodern avant-garde.

Harrison was born in Portland, Oregon, and studied a variety of instruments during a brief stint at San Francisco State College. A class he took with the modernist Henry Cowell—the subject was world music, which would one day be Harrison’s métier—proved fortuitous, leading to private lessons with the composer. Under Cowell’s tutelage, he became enthralled with the music of Charles Ives.

More here.

The Selfish Dataome

Caleb Scharf in Nautilus:

You’ve heard the argument before: Genes are the permanent aristocracy of evolution, looking after themselves as fleshy hosts come and go. That’s the thesis of a book that, last year, was christened the most influential science book of all time: Richard Dawkins’ The Selfish Gene. But we humans actually generate far more actionable information than is encoded in all of our combined genetic material, and we carry much of it into the future. The data outside of our biological selves—call it the dataome—could actually represent the grander scaffolding for complex life. The dataome may provide a universally recognizable signature of the slippery characteristic we call intelligence, and it might even teach us a thing or two about ourselves. It is also something that has a considerable energetic burden. That burden challenges us to ask if we are manufacturing and protecting our dataome for our benefit alone, or, like the selfish gene, because the data makes us do this because that’s what ensures its propagation into the future. Take, for instance, William Shakespeare.

Shakespeare died on April 23, 1616 and his body was buried two days later in Holy Trinity Church in Stratford-Upon-Avon. His now-famous epitaph carries a curse to anyone who dares “move my bones.” And as far as we know, in the past 400 years, no one has risked incurring Will’s undead wrath. But he has most certainly lived on beyond the grave. At the time of his death Shakespeare had written a total of 37 plays, among other works. Those 37 plays contain a total of 835,997 words. In the centuries that have come after his corporeal life an estimated 2 to 4 billion physical copies of his plays and writings have been produced. All of those copies have been composed of hundreds of billions of sheets of paper acting as vessels for more than a quadrillion ink-rich letters.
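
Those last figures are easy to sanity-check. A rough back-of-the-envelope calculation, assuming about five letters per English word and counting each of the roughly 3 billion physical copies as if it were a complete-works volume (both simplifications are mine, not Scharf's), does indeed land above a quadrillion:

```python
# Back-of-the-envelope check of the "more than a quadrillion letters" claim.
# Assumptions (mine, for illustration only): ~5 letters per word on average,
# and every physical copy counted as if it were a complete-works volume.

words_in_plays = 835_997            # total words across the 37 plays (from the article)
letters_per_word = 5                # rough average for English prose (assumption)
physical_copies = 3_000_000_000     # midpoint of the 2-4 billion estimate

letters_per_copy = words_in_plays * letters_per_word   # ~4.2 million letters
total_letters = letters_per_copy * physical_copies     # ~1.3e16

print(f"{total_letters:.1e} letters")  # ~1.3e+16 -- comfortably over a quadrillion (1e15)
```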

More here.

Thursday Poem

Micah’s Prophesy

Time subsides and you fall back into the hammock
of another easy truth. There are so many ways to
disguise this. One reigning idea dictates what you will
think, so you go blundering from one war to another,
one rape or abuse to another. My dream for you is clothed
with shadows. Listen, your final dawn will arrive rudely.
What became of me wasn’t worth the telling. But, I’ll say
this: the real dungeons are our own words, the real chains
the ones we use to encircle our hearts. There are letters
in my alphabet you’ll never know. I saw a whole
army collapse like a huge lung.  I saw bodies fall like
chips from a woodsman’s axe. There was a king who
believed me, and one who didn’t. You know their fates.
Your own kings pencil in their beliefs for later erasure.
After each tragedy they hand out antique apologies.
Someone shoots in a theater and soon it plays like fiction.
Someone else pulverizes symbols they don’t understand.
When you break the world it doesn’t just get fixed.
There is a truth, if you listen, but it arrives with no
postmark and no return address, no provision for revision.
Even your windows mutter things you refuse to understand.
I can say: there is little patience with your skeletal words.
I can say: you should already know this by reading
what has already been written on the dungeon walls of
your own hearts and the watermarks of your own souls.
The harp plays on, but the question is, who’s listening?

by Richard Jackson
from Echotheo Review

Fodor’s Legacy

David Lobina in Inference Review:

The cognitive revolution that gripped linguistics, psychology, and philosophy in the 1950s and 1960s owes much to Noam Chomsky and the intellectual milieu then forming around him at MIT. Chomsky’s prominence is well earned. Jerry Fodor’s contributions may prove as enduring. In On Concepts, Modules, and Language, Roberto de Almeida and Lila Gleitman have brought together some of Fodor’s former colleagues and collaborators, including Chomsky, to discuss his work. Each contributor was asked to engage critically with one of two big topics close to Fodor’s heart: the architecture of the human mind, and the language of thought.

Fodor is well known among psychologists for his thesis that there is a tripartite division to the human mind, a position he first defended in The Modularity of Mind. First, there are the sense organs that convert light and sound into signals to the nervous system. The study of these organs is best left to the psychophysicist. Then there are modules that further analyze the input. This is, Fodor argues, the domain of psychology. Visual perception and language comprehension are more or less modular: they are independent of our overall knowledge. Fodor’s stance went against the dominant view.

What Fodor had in mind can be explained in intuitive terms.

More here.

The Ghost Story Persists in American Literature. Why?

Parul Sehgal in the New York Times:

In 1960, the literary critic Leslie Fiedler delivered a eulogy for the ghost story in his classic study “Love and Death in the American Novel.” “An obsolescent subgenre,” he declared, with conspicuous relish; a “naïve” little form as outmoded as its cheap effects, the table-tapping and flickering candlelight. Ghost stories belong to — brace yourself for maximum Fiedlerian venom — “middlebrow craftsmen,” who will peddle them to a rapidly dwindling audience and into an extinction that can’t come soon enough.

Not since Herman Melville’s publishers argued for less whale and more maidens in “Moby-Dick” (“young, perhaps voluptuous,” they dared to dream) has a literary judgment been so impressively off the mark.

Literature — the top-shelf, award-winning stuff — is positively ectoplasmic these days, crawling with hauntings, haints and wraiths of every stripe and disposition. These ghosts can be nosy and lubricious, as in George Saunders’s “Lincoln in the Bardo,” which followed a group of spectral busybodies in purgatory, observing the arrival of Abraham Lincoln’s newly deceased young son. They can be confused by their fates, as in Martin Riker’s new novel, “Samuel Johnson’s Eternal Return,” in which a man is unsettled to discover that his essence has migrated into the body of the man who killed him.

More here.

Top Climate Scientists Warn Governments Of ‘Blatant Anti-Nuclear Bias’ In Latest IPCC Climate Report

Michael Shellenberger in Forbes:

Some of the scientists most often cited by the Intergovernmental Panel on Climate Change (IPCC) have taken the unusual step of warning leaders of G-20 nations that a recent IPCC report uses a double standard when it comes to its treatment of nuclear as compared to renewables.

“The anti-nuclear bias of this latest IPCC release is rather blatant,” said Kerry Emanuel, a climate scientist at the Massachusetts Institute of Technology, “and reflects the ideology of the environmental movement. History may record that this was more of an impediment to decarbonization than climate denial.”

Other signers of the letter include Tom Wigley, a widely cited climate scientist who has contributed to IPCC reports on 13 separate occasions, David Lea, professor of Earth Sciences at the University of California, Santa Barbara, and Peter Raven, winner of the National Medal of Science in 2001.

More here.

Mary Jane Doherty reads “The Snow Man” by Wallace Stevens

One must have a mind of winter
To regard the frost and the boughs
Of the pine-trees crusted with snow;

And have been cold a long time
To behold the junipers shagged with ice,
The spruces rough in the distant glitter

Of the January sun; and not to think
Of any misery in the sound of the wind,
In the sound of a few leaves,

Which is the sound of the land
Full of the same wind
That is blowing in the same bare place

For the listener, who listens in the snow,
And, nothing himself, beholds
Nothing that is not there and the nothing that is.

Amélie Nothomb’s Big Moods and Obliterating Encounters

Charlotte Shane at Bookforum:

No working writer believes in the shattering power of an encounter—with another person, with a new sensation, with possibility—more than Amélie Nothomb, the prolific Paris-based Belgian who’s published a novel a year since 1992’s Hygiène de l’assassin (rendered in English as Hygiene and the Assassin, though a more accurate title would be The Assassin’s Purity). Her first book offered an impressive blueprint of what would define her subsequent work: arrogant, infuriating personalities; vicious character clashes; childhood love so obsessive that it bleeds out over an adult’s entire history; and philosophical declarations about war. (Nothomb’s fervent worship of “war,” used to describe any grand conflict, is as distinctive a signature as her actual name.) “My books are more harmful than war,” brags the author at the center of Hygiene, “because they make you want to die, whereas war, in fact, makes you want to live.” His demeanor is so provoking that it incites murder, which is another Nothomb theme. People are always destroying one another. She’s killed a self-named avatar off on at least two separate occasions.

More here.

Adrienne Rich’s Ways of Being

Mark Ford at the NYRB:

Adrienne Rich, New York City, 1973

The poem’s most haunting moments, however, are fraught with a more confused and confessional charge—it was composed between 1958 and 1960 and seems to me to reveal the influence of Robert Lowell’s own watershed sequence, Life Studies (1959). As in Lowell, these moments explicitly dramatize what would become a 1960s slogan: The personal is political. The frustrated daughter-in-law has “let the tapstream scald her arm,/a match burn to her thumbnail,//or held her hand above the kettle’s snout/right in the woolly steam.” Her impulse toward self-harm is figured as an internalization of the culture’s instinctive violence toward women: “A thinking woman sleeps with monsters./The beak that grips her, she becomes.” This poem is Rich’s first deliberate act of resistance against these monsters and the beaks that would grip her; it’s her radical answer to Yeats’s rhetorical question at the end of his sonnet “Leda and the Swan.”

More here.

The Draw of the Gothic

Sarah Perry at The Paris Review:

To understand the literary Gothic – to even begin to account for its curious appeal, and its simultaneous qualities of seduction and repulsion – it is necessary to undertake a little time travel. We must go back beyond the builders putting the capstone on Pugin’s Palace of Westminster, and on past the last lick of paint on the iced cake of Horace Walpole’s Strawberry Hill House; back again another six hundred years past the rap of the stone-mason’s hammer on the cathedral at Reims, in order to finally alight on a promontory above the city of Rome in 410 AD. The city is on fire. There are bodies in the streets and barbarians at the gates. Pope Innocent I, hedging his bets, has consented to a little pagan worship that is being undertaken in private. Over in Bethlehem, St Jerome hears that Rome has fallen. ‘The city which had taken the whole world’, he writes, ‘was itself taken.’ The old order – of decency and lawfulness meted out with repressive colonial cruelty – has gone. The Goths have taken the Forum.

More here.

Which Agatha Christie should we read this month?

Sam Jordison in The Guardian:

It’s dark. It’s cold. As I write this the rain is lashing down outside my window and beyond that – ugh! The world. Brexit, Trump, Putin. Danger, fear and uncertainty. I want warmth, I want comfort and I want to feel that somehow, somewhere, order might be restored. I want, in other words, to read a novel by Agatha Christie. I’m hoping that the queen of cosy crime will be an excellent subject for the reading group this month, as well as a useful tonic. She’s certainly popular and productive enough. Christie put out no fewer than 66 detective novels and 14 short-story collections between the publication of her first novel (The Mysterious Affair at Styles) in 1920 and her death in 1976. More than 2bn copies of those books have been sold so far. Which means that only Shakespeare with his Complete Works, and God with his Bible, have shifted more units. Meanwhile, Agatha Christie TV and film adaptations continually enrapture audiences around the world and The Mousetrap is still playing in the West End after 66 years, making it the longest-running play in history (and I still don’t know how it ends, so don’t tell me).

More than that, Christie remains an intriguing and complex writer. So far as I can remember, anyway. The truth is that I haven’t read any of her novels since I went through a spate of borrowing them from Lancaster library when I was a schoolboy in the 1990s. The pages of those reeked of stale smoke from the elderly crime fans who had torn through them before me. It was another age. Yet, while I’m assuming the physical books will no longer be so pungently evocative, they will still speak of time past – with all the fascination and occasional discomfort that entails. I’m also hoping that they will be ripping yarns, they’ll gleefully shred the Golden Age rules of crime fiction and they’ll contain character studies sharp enough to use as murder weapons.

More here.

Happy with a 20% chance of sadness

Matt Kaplan in Nature:

In the winter of 1994, a young man in his early twenties named Tim was a patient in a London psychiatric hospital. Despite a happy and energetic demeanour, Tim had bipolar disorder and had recently attempted suicide. During his stay, he became close with a visiting US undergraduate psychology student called Matt. The two quickly bonded over their love of early-nineties hip-hop and, just before being discharged, Tim surprised his friend with a portrait that he had painted of him. Matt was deeply touched. But after returning to the United States with portrait in hand, he learned that Tim had ended his life by jumping off a bridge. Matthew Nock now studies the psychology of self-harm at Harvard University in Cambridge, Massachusetts. Even though more than two decades have passed since his time with Tim, the portrait still hangs in his office as a constant reminder of the need to develop a way to predict when people are likely to try and kill themselves. There are plenty of known risk factors for suicide — heavy alcohol use, depression and being male among them — but none serve as tell-tale signs of imminent suicidal thoughts. Nock thinks that he is getting close to solving that.

Since January 2016, he has been using wristbands and a phone application to study the behaviour of consenting patients who are at risk of suicide, at Massachusetts General Hospital in Boston. And he has been running a similar trial at the nearby Franciscan Children’s Hospital this year. So far, he says, although his results have not yet been published, the technology seems able to predict a day in advance, and with reasonable accuracy, when participants will report thinking of killing themselves.

More here.