Automated Ethics

Tom Chatfield in Aeon:

Back in August 2012, Google announced that it had achieved 300,000 accident-free miles testing its self-driving cars. The technology remains some distance from the marketplace, but the statistical case for automated vehicles is compelling. Even when they’re not causing injury, human-controlled cars are often driven inefficiently, ineptly, antisocially, or in other ways additive to the sum of human misery.

What, though, about more local contexts? If your vehicle encounters a busload of schoolchildren skidding across the road, do you want to live in a world where it automatically swerves, at a speed you could never have managed, saving them but putting your life at risk? Or would you prefer to live in a world where it doesn’t swerve but keeps you safe? Put like this, neither seems a tempting option. Yet designing self-sufficient systems demands that we resolve such questions. And these possibilities take us in turn towards one of the hoariest thought-experiments in modern philosophy: the trolley problem.

In its simplest form, coined in 1967 by the English philosopher Philippa Foot, the trolley problem imagines the driver of a runaway tram heading down a track. Five men are working on this track, and are all certain to die when the trolley reaches them. Fortunately, it’s possible for the driver to switch the trolley’s path to an alternative spur of track, saving all five. Unfortunately, one man is working on this spur, and will be killed if the switch is made.

In this original version, it’s not hard to say what should be done: the driver should make the switch and save five lives, even at the cost of one. If we were to replace the driver with a computer program, creating a fully automated trolley, we would also instruct it to pick the lesser evil: to kill fewer people in any similar situation. Indeed, we might actively prefer a program to be making such a decision, as it would always act according to this logic while a human might panic and do otherwise.
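The “lesser evil” rule in the excerpt is, at bottom, a one-line decision procedure. A minimal sketch in Python (the function name and casualty figures are hypothetical illustrations, not anyone’s actual control software):

```python
# A minimal sketch of the "lesser evil" rule described above. The
# function name and casualty estimates are hypothetical illustrations,
# not anyone's actual control software.

def choose_action(casualty_estimates):
    """Pick the action with the fewest estimated casualties."""
    return min(casualty_estimates, key=casualty_estimates.get)

# Foot's original trolley, automated:
print(choose_action({"stay_on_track": 5, "switch_to_spur": 1}))
# -> switch_to_spur
```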

More here.

Big Data: Are We Making a Big Mistake?

Tim Harford in the FT Magazine:

Cheerleaders for big data have made four exciting claims, each one reflected in the success of Google Flu Trends: that data analysis produces uncannily accurate results; that every single data point can be captured, making old statistical sampling techniques obsolete; that it is passé to fret about what causes what, because statistical correlation tells us what we need to know; and that scientific or statistical models aren’t needed because, to quote “The End of Theory”, a provocative essay published in Wired in 2008, “with enough data, the numbers speak for themselves”.

Unfortunately, these four articles of faith are at best optimistic oversimplifications. At worst, according to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at Cambridge university, they can be “complete bollocks. Absolute nonsense.”

Found data underpin the new internet economy as companies such as Google, Facebook and Amazon seek new ways to understand our lives through our data exhaust. Since Edward Snowden’s leaks about the scale and scope of US electronic surveillance, it has become apparent that security services are just as fascinated with what they might learn from our data exhaust.

Consultants urge the data-naive to wise up to the potential of big data. A recent report from the McKinsey Global Institute reckoned that the US healthcare system could save $300bn a year – $1,000 per American – through better integration and analysis of the data produced by everything from clinical trials to health insurance transactions to smart running shoes.

But while big data promise much to scientists, entrepreneurs and governments, they are doomed to disappoint us if we ignore some very familiar statistical lessons.

“There are a lot of small data problems that occur in big data,” says Spiegelhalter. “They don’t disappear because you’ve got lots of the stuff. They get worse.”
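Spiegelhalter’s warning is easy to demonstrate with a toy simulation (ours, not the article’s): a huge “found data” sample carrying a small selection bias estimates a population mean worse than a modest random sample does.

```python
# Toy demonstration that "N = all" does not retire sampling theory:
# a large sample with a mild selection bias loses to a small random
# sample. All numbers here are illustrative.
import random

random.seed(0)
population = [random.gauss(50, 10) for _ in range(1_000_000)]

def mean(xs):
    return sum(xs) / len(xs)

# A small but properly random sample...
small_random = random.sample(population, 1_000)

# ...versus huge "found data" that slightly over-represents high values.
big_biased = [x for x in population
              if random.random() < (0.6 if x >= 50 else 0.5)]

print(f"true mean:             {mean(population):.2f}")
print(f"random sample (n=1,000): {mean(small_random):.2f}")      # small noise
print(f"biased sample (n={len(big_biased):,}): {mean(big_biased):.2f}")  # persistent bias
```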

More here.

Bret Easton Ellis and Lindsay Lohan Make a Feature Film

Lili Anolik at The Believer:

Bret Easton Ellis is modern literature’s little rascal supreme. He seems to do things for no reason other than the fun of it. Take, for example, the many references in his books to his other books, references made in such a super-subtle yet obsessive way he could be doing it only to amuse himself. His minor characters are often recurring. Sean Bateman, for example, one of the protagonists in The Rules of Attraction, has, it is glancingly mentioned, an older brother, Patrick, the gifter of a brown Ralph Lauren tie about which Sean has ambivalent feelings. Patrick then lands the lead role as the Psycho who also happens to be an American in Ellis’s next work. Ellis did the same thing with Victor Johnson, Lauren Hynde’s mostly offstage boyfriend in The Rules of Attraction, moving him from the periphery of that novel (he’s backpacking through Europe for much of the narrative) to front-and-center in Glamorama. Ellis even gives him a stage name, Victor Ward—which is stronger, more macho-sounding, and, with fewer syllables, fits better on a marquee—as is commensurate with his change in status from bit player to star. What or whom, one wonders, did these characters have to do in order to secure their big breaks? If any writer would have a casting couch for his fictional creations, it would be Ellis.

more here.

Immanuel Velikovsky’s strange quest for a scientific theory of everything

Paula Findlen at The Nation:

In the 1940s, a curiously enigmatic figure haunted New York City’s great libraries, his mind afire with urgent questions whose resolution might reveal, once and for all, the most ancient secrets of the universe in their crystalline clarity. This scholar eschewed the traditional disciplinary boundaries that define the intellectual terrain of the specialist; instead, he read widely, skimming the surface of countless works of science, myth and history to craft an answer to an overwhelming question: Had our planet been altered repeatedly by cosmic catastrophes whose traces could be found in the earliest human records?

A fantastic theory began to emerge, redolent of the efforts of an earlier age to unify knowledge, yet speaking to the preoccupations of a world contemplating the chaos of another gruesome European war. The solar system, it was revealed, did not operate according to Newton’s universal laws of gravitation, nor did life on Earth evolve gradually and continuously, as Darwin had written. Instead, the cosmos was like a giant atom, periodically discharging photons whose energy disrupted and redirected the movements of celestial bodies, even causing the reversal of Earth’s magnetic poles. A planet was a kind of super-electron.

more here.

The new war literature by veterans

George Packer at The New Yorker:

Soldiers who set out to write the story of their war also have to navigate a minefield of clichés: all of them more or less true but open to qualification; many sown long before the soldiers were ever deployed, because every war is like every other war. That’s one of them. War is hell is another. War begins in illusion and ends in blood and tears. Soldiers go to war for their country’s cause and wind up fighting for one another. Soldiers are dreamers (Sassoon said that). No one returns from war the same person who went. War opens an unbridgeable gap between soldiers and civilians. There’s no truth in war—just each soldier’s experience. “You can tell a true war story by its absolute and uncompromising allegiance to obscenity and evil” (from “How to Tell a True War Story,” in O’Brien’s story collection “The Things They Carried”).

Irony in modern American war literature takes many forms, and all risk the overfamiliarity that transforms style into cliché. They begin with Hemingway’s rejection, in “A Farewell to Arms,” of the high, old language, his insistence on concreteness: “I had seen nothing sacred, and the things that were glorious had no glory and the sacrifices were like the stockyards at Chicago if nothing was done with the meat except to bury it. There were many words that you could not stand to hear and finally only the names of places had dignity.”

more here.

The need for the public practice of the humanities in India

Prashant Keshavmurthy in Chapati Mystery:

In 1892, Maulana Shibli Nu’māni, an internationally celebrated Indian Muslim historian, (Urdu-Persian) literary critic and theologian of his day, traveled by sea from Bombay to the Ottoman Empire, journeying through Cyprus, Istanbul, Syria and Egypt. Of this journey he kept a journal that he later published under the title of Safarnāma-i rūm va misr va shām (A Travel Account of Turkey, Egypt and Syria). He claims that he had not intended to write a travel account but that European prejudices with regard to the Turks had led him to do so. Even well-meaning Europeans, he observes, remain bound by the Islamophobic prejudices they are raised with. His aims in writing it are therefore corrective and pedagogical: to correct prejudiced European travel accounts of Turkey that form the basis for European histories, and to instruct Indian Muslims by documenting exemplary “progress” among Turkish Muslims. The Turkey or Ottoman state of Shibli’s time, we must remember, was the only one of the three great early modern Islamic states – the other two being Safavid Iran and Mughal India – still extant. Moreover, its emperor, Abdülhamid II (1876–1909), had only recently achieved radical advances in the movement to modernize or “reorganize” his state – “reorganization” or tanzīmāt bespeaking the bureaucratic character of this modernity – on European models. Shibli intends therefore to focus on the “developments and reforms” of the Muslim world, especially Turkey.

The turn-of-the-century preoccupation with lost Mughal sovereignty among North India’s Reformist Muslims – a sovereignty they understood as Muslim in the wake of the formal end of the Mughal state in 1857 – led them to regard the still regnant Ottoman empire with special attention: in it they saw a Muslim empire that was modeling itself through technological and institutional reforms on Europe, the very ambition of Sayyid Aḥmad Khān, the founder of what became Aligarh Muslim University, and his colleagues like Shibli Nu’māni. Shibli thus discusses formerly Ottoman Cyprus, when he passes through it, in terms of the history of its political sovereignty under Muslim and then British rule. Furthermore, everywhere in his travels he singles out educational syllabi, technology, and such empirical aspects of a society as clothing and food, treating them as indices of a polity’s development. Shibli desires and is at pains to discover signs of a continuous Muslim world. That he conflates all Arabs in the Ottoman territories with Muslims and vice versa signals this desire.

More here.

The Video Game Engine in Your Head

Joshua Hartshorne in Scientific American:

For years now, physicists and engineers have been building computer simulations of physics in order to understand the behavior of objects in the world. Want to see if a bridge would be stable during an earthquake? Enter it into the simulation, apply earthquake dynamics, and see what happens. Recently, the prestigious Proceedings of the National Academy of Sciences published work by MIT psychologists (and my labmates) Peter Battaglia, Jessica Hamrick, and Joshua Tenenbaum, arguing that all humans do roughly the same thing when trying to understand or make predictions about the physical world. The primary difference is that we run our simulations in our brains rather than in digital computers, but the basic algorithms are roughly equivalent. The analogy runs deep: To model human reasoning about the physical world, the researchers actually used an open-source computer game physics engine — the software that applies the laws of physics to objects in video games in order to make them interact realistically (think Angry Birds).

Battaglia and colleagues found that their video game-based computer model matches human physical reasoning far better than any previous theory. The authors asked people to make a number of predictions about the physical world: will a tower of blocks stand or fall over, in which direction will it fall, and where will the block that lands farthest away come to rest; which object would most likely fall off a table if the table were bumped; and so on. In each case, human judgments closely matched the prediction of the computer simulation … but not necessarily the actual world, which is where it gets interesting.
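The idea is easy to caricature in code. Below is a toy reconstruction of the noisy-simulation scheme (our sketch, not the authors’ model): perceive a tower’s block positions with a little noise, run a crude stability check, and average over many samples to get a graded judgment.

```python
# Toy reconstruction of the "noisy simulation" idea: judge whether a
# stack of unit-width blocks falls by simulating many noisy percepts.
import random

def tower_falls(x_centers, half_width=0.5):
    """Crude stability check for blocks stacked bottom-up: the tower
    falls if, at any level, the centre of mass of everything above
    overhangs the edge of the supporting block."""
    for i in range(len(x_centers) - 1):
        above = x_centers[i + 1:]
        com = sum(above) / len(above)
        if abs(com - x_centers[i]) > half_width:
            return True
    return False

def p_fall(x_centers, noise=0.2, n_samples=2000):
    """Graded judgment: fraction of noisy percepts in which it falls."""
    falls = sum(tower_falls([x + random.gauss(0, noise) for x in x_centers])
                for _ in range(n_samples))
    return falls / n_samples

print(p_fall([0.0, 0.1, 0.2]))  # nearly aligned: low probability of falling
print(p_fall([0.0, 0.4, 0.8]))  # badly offset: high probability of falling
```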

More here.

Wednesday Poem

Under our boot soles

In memory of Jim Thomas

Once you stepped out an open window onto nothing
we could see from our desks, and for a whole
long second you floated and didn't fall
through two floors of air to the earth's something.

You never fell. You were just going smoking
before class on the unseen roof.
All of us saw you make that roof when you didn't fall.
You took drags, looked down, looked up, thinking.

Then you stepped back through the open window
and read us the end of “Song of Myself”
where the spotted hawk swoops and grass grows

under a boot. You were all voice, we were all ears.
Up ahead words with hollow bones wait
once you step onto nothing. We could hear.

by Dennis Finnell
from Ruins Assembling
Shape and Nature Press, Greenfield, MA, 2014

Tuesday, April 1, 2014

How Darwinian is Cultural Evolution?

Nicolas Claidière, Thomas C. Scott-Phillips, and Dan Sperber over at the Philosophical Transactions of the Royal Society:

Abstract

Darwin-inspired population thinking suggests approaching culture as a population of items of different types, whose relative frequencies may change over time. Three nested subtypes of populational models can be distinguished: evolutionary, selectional and replicative. Substantial progress has been made in the study of cultural evolution by modelling it within the selectional frame. This progress has involved idealizing away from phenomena that may be critical to an adequate understanding of culture and cultural evolution, particularly the constructive aspect of the mechanisms of cultural transmission. Taking these aspects into account, we describe cultural evolution in terms of cultural attraction, which is populational and evolutionary, but only selectional under certain circumstances. As such, in order to model cultural evolution, we must not simply adjust existing replicative or selectional models but we should rather generalize them, so that, just as replicator-based selection is one form that Darwinian selection can take, selection itself is one of several different forms that attraction can take. We present an elementary formalization of the idea of cultural attraction.

1. Population thinking applied to culture

In the past 50 years, there have been major advances in the study of cultural evolution inspired by ideas and models from evolutionary biology. Modelling cultural evolution involves, as it would for any complex phenomenon, making simplifying assumptions; many factors have to be idealized away. Each particular idealization involves a distinct trade-off between gaining clarity and insight into hopefully major dimensions of the phenomenon and neglecting presumably less important dimensions. Should one look for the best possible idealization? There may not be one. Different sets of simplifying assumptions may each uniquely yield worthwhile insights. In this article, we briefly consider some of the simplifications that are made in current models of cultural evolution and then suggest how important dimensions of the phenomenon that have been idealized away might profitably be introduced in a novel approach that we see as complementary rather than as alternative to current approaches. All these approaches, including the one we are advocating, are Darwinian, but in different ways that are worth spelling out.

Much clarity has been gained by drawing on the analogy between cultural and biological evolution (an analogy suggested by Darwin himself: ‘The formation of different languages and of distinct species, and the proofs that both have been developed through a gradual process, are curiously parallel’). This has made it possible to draw inspiration from formal methods in population genetics with appropriate adjustments and innovations. Of course, the analogy with biological evolution is not perfect. For example, variations in human cultural evolution are often intentionally produced in the pursuit of specific goals and hence are much less random than in the biological case.
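To get a feel for what “attraction” means here, consider a toy model (ours, assuming only the general setup described above, not the paper’s formalism): transmission is a constructive transformation that on average nudges variants toward an attractor, so the population clusters there without any selection for fitness.

```python
# Toy model of cultural attraction: transmission is biased
# reconstruction, not faithful copying.
import random
from collections import Counter

ATTRACTOR = 7  # hypothetical attractor among variants 0..10

def transmit(variant):
    """Reproduce a variant with copying noise plus a constructive
    bias that nudges it toward the attractor."""
    if random.random() < 0.3 and variant != ATTRACTOR:
        step = 1 if variant < ATTRACTOR else -1   # pull of the attractor
    else:
        step = random.choice([-1, 0, 1])          # ordinary copying noise
    return min(max(variant + step, 0), 10)

population = [random.randint(0, 10) for _ in range(5_000)]
for _ in range(50):  # fifty "generations" of cultural transmission
    population = [transmit(random.choice(population)) for _ in population]

# Frequencies cluster around the attractor, with no selection for fitness.
print(sorted(Counter(population).items()))
```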

More here.

Gloomy Terrors or the Most Intense Pleasure?

Via Andrew Sullivan, Philip Schofield discusses Jeremy Bentham's writings on religion and sex, over at the Oxford University Press blog:

In 1814, just two hundred years ago, the radical philosopher Jeremy Bentham (1748–1832) began to write on the subject of religion and sex, and thereby produced the first systematic defence of sexual liberty in the history of modern European thought. Bentham’s manuscripts have now been published for the first time in authoritative form. He pointed out that ‘regular’ sexual activity consisted in intercourse between one male and one female, within the confines of marriage, for the procreation of children. He identified the source of the view that only ‘regular’ or ‘natural’ sexual activity was morally acceptable in the Mosaic Law and in the teachings of the self-styled Apostle Paul. ‘Irregular’ sexual activity, on the other hand, had many variations: intercourse between one man and one woman, when neither of them were married, or when one of them was married, or when both of them were married, but not to each other; between two women; between two men; between one man and one woman but using parts of the body that did not lead to procreation; between a human being and an animal of another species; between a human being and an inanimate object; and between a living human and a dead one. In addition, there was the ‘solitary mode of sexual gratification’, and innumerable modes that involved more than two people. Bentham’s point was that, given that sexual gratification was for most people the most intense and the purest of all pleasures and that pleasure was a good thing (the only good thing in his view), and assuming that the activity was consensual, a massive amount of human happiness was being suppressed by preventing people, whether from the sanction of the law, religion, or public opinion, from engaging in such ‘irregular’ activities as suited their taste.

Bentham was writing at a time when homosexuals, those guilty of ‘the crime against nature’, were subject to the death penalty in England, and were in fact being executed at about the rate of two per year, and were vilified and ridiculed in the press and in literature. If an activity did not cause harm, Bentham had argued as early as the 1770s and 1780s, then it should not be subject to legal punishment, and had called for the decriminalization of homosexuality. By the mid-1810s he was prepared to link the problem not only with law, but with religion. The destruction of Sodom and Gomorrah was taken by ‘religionists’, as Bentham called religious believers, to prove that God had issued a universal condemnation of homosexuality. Bentham pointed out that what the Bible story condemned was gang rape.

More here.

What slang says about us

Nicholas Shakespeare in The Telegraph:

Slang’s first compilers were chippy individualists, routinely beset by financial worries and complex marital lives. They were never grandees like the 70-odd team beavering away still on the Oxford English Dictionary in Great Clarendon Street (less than 30 yards from where I live in Oxford). They numbered Francis Grose (1731-91), the son of a Swiss jeweller, who was so fat that his servant had to strap him into bed every night; Pierce Egan (1772-1849), a boxing journalist and editor of Real Life in London; and John William Hotten (1832-73), a workaholic pornographer (The Romance of Chastisement) who died from a surfeit of pork chops, and was remembered, unfairly, by the phrase: “Hotten: rotten, and forgotten”. Even so, they shared many characteristics of lexicographers like William Chester Minor (1834-1920), one of the OED’s founding fathers, who was, quite conclusively, bonkers. As one of Jonathon Green’s mentors, Anthony Burgess, cautions: “The study of language may beget madness.”

Super-geeks (from geek, meaning fool) to a man, slang’s lexicographers tend to be self-appointed guardians who, while cheerfully plagiarising each other in their project to demonstrate the importance and scope of slang, have yet to agree on a definition of what, precisely, slang is, or was – or even its origin. Hotten believed slang to be a gipsy term for the gipsies’ secret language; the Oxford philologist Walter Skeat attributed it to the Icelandic slunginn (cunning), while Eric Partridge (1894-1979), a New Zealand ex-soldier, ex-publisher and ex-bankrupt, believed it was the past participle of the Norwegian/Old Norse verb sling, so giving the concept of a “thrown” language. Into this tradition, Green (from greens, meaning sexual intercourse, b 1948) fits seamlessly. “What goes in a slang dictionary and what does not is often a matter of individual choice,” he writes. “Ultimately slang seems to be what you think it is.”

More here.

Spite Is Good. Spite Works

Natalie Angier in The New York Times:

The “Iliad” may be a giant of Western literature, yet its plot hinges on a human impulse normally thought petty: spite. Achilles holds a festering grudge against Agamemnon (“He cheated me, wronged me … He can go to hell…”), turning down gifts, homage, even the return of his stolen consort Briseis just to prolong the king’s suffering. Now, after decades of focusing on such staples of bad behavior as aggressiveness, selfishness, narcissism and greed, scientists have turned their attention to the subtler and often unsettling theme of spite — the urge to punish, hurt, humiliate or harass another, even when one gains no obvious benefit and may well pay a cost. Psychologists are exploring spitefulness in its customary role as a negative trait, a lapse that should be embarrassing but is often sublimated as righteousness, as when you take your own sour time pulling out of a parking space because you notice another car is waiting for it and you’ll show that vulture who’s boss here, even though you’re wasting your own time, too.

Evolutionary theorists, by contrast, are studying what might be viewed as the brighter side of spite, and the role it may have played in the origin of admirable traits like a cooperative spirit and a sense of fair play. The new research on spite transcends older notions that we are savage, selfish brutes at heart, as well as more recent suggestions that humans are inherently affiliative creatures yearning to love and connect. Instead, it concludes that vice and virtue, like the two sides of a V, may be inextricably linked.

More here.

Sunday, March 30, 2014

An Attempt to Discover the Laws of Literature

Joshua Rothman profiles Franco Moretti's efforts at 'distant reading' in the New Yorker (via Andrew Sullivan):

Franco Moretti, a professor at Stanford, whose essay collection “Distant Reading” just won the National Book Critics Circle Award for criticism, fascinates critics in large part because he does want to answer the question definitively. He thinks that literary criticism ought to be a science. In 2005, in a book called “Graphs, Maps, Trees: Abstract Models for a Literary History,” he used computer-generated visualizations to map, among other things, the emergence of new genres. In 2010, he founded the Stanford Literary Lab, which is dedicated to analyzing literature with software. The basic idea in Moretti’s work is that, if you really want to understand literature, you can’t just read a few books or poems over and over (“Hamlet,” “Anna Karenina,” “The Waste Land”). Instead, you have to work with hundreds or even thousands of texts at a time. By turning those books into data, and analyzing that data, you can discover facts about literature in general—facts that are true not just about a small number of canonized works but about what the critic Margaret Cohen has called the “Great Unread.” At the Literary Lab, for example, Moretti is involved in a project to map the relationships between characters in hundreds of plays, from the time of ancient Greece through the nineteenth century. These maps—which look like spiderwebs, rather than org charts—can then be compared; in theory, the comparisons could reveal something about how character relationships have changed through time, or how they differ from genre to genre. Moretti believes that these types of analyses can highlight what he calls “the regularity of the literary field. Its patterns, its slowness.” They can show us the forest rather than the trees.

Moretti’s work has helped to make “computational criticism,” and the digital humanities more generally, into a real intellectual movement. When, the week before last, Stanford announced that undergraduates would be able to enroll in “joint majors” combining computer science with either English or music, it was hard not to see it as a sign of Moretti’s influence. Yet Moretti has critics. They point out that, so far, the results of his investigations have been either wrong or underwhelming. (A typical Moretti finding is that, in eighteenth-century Britain, for instance, the titles of novels grew shorter as the market for novels grew larger—a fact that is “interesting” only in quotes.) And yet these sorts of objections haven’t dimmed the enthusiasm for Moretti’s work.
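The network comparisons themselves are simple to sketch. A minimal example of the genre (toy data and measures of our own choosing, not the Literary Lab’s pipeline), where nodes are characters and an edge means two characters share a scene:

```python
# Compare two miniature character co-presence networks with a couple
# of simple graph statistics. Requires the networkx library.
import networkx as nx

# Hypothetical miniature networks for two plays.
antigone = nx.Graph([("Antigone", "Ismene"), ("Antigone", "Creon"),
                     ("Creon", "Haemon"), ("Creon", "Tiresias")])
hamlet = nx.Graph([("Hamlet", "Claudius"), ("Hamlet", "Gertrude"),
                   ("Hamlet", "Horatio"), ("Hamlet", "Ophelia"),
                   ("Claudius", "Gertrude"), ("Ophelia", "Polonius")])

for title, g in [("Antigone", antigone), ("Hamlet", hamlet)]:
    hub, c = max(nx.degree_centrality(g).items(), key=lambda kv: kv[1])
    print(f"{title}: {g.number_of_nodes()} characters, "
          f"density {nx.density(g):.2f}, most central: {hub} ({c:.2f})")
```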

More here. Also see this pushback from Micah Mattix in The American Conservative.

The Top of the World

Doug Henwood reviews Thomas Piketty's Capital in the Twenty-First Century, in Bookforum:

The core message of this enormous and enormously important book can be delivered in a few lines: Left to its own devices, wealth inevitably tends to concentrate in capitalist economies. There is no “natural” mechanism inherent in the structure of such economies for inhibiting, much less reversing, that tendency. Only crises like war and depression, or political interventions like taxation (which, to the upper classes, would be a crisis), can do the trick. And Thomas Piketty has two centuries of data to prove his point.

In more technical terms, the central argument of Capital in the Twenty-First Century is that as long as the rate of return on capital, r, exceeds the rate of broad growth in national income, g—that is, r > g—capital will concentrate. It is an empirical fact that the rate of return on capital—income in the form of profits, dividends, rents, and the like, divided by the value of the assets that produce the income—has averaged 4–5 percent over the last two centuries or so. It is also an empirical fact that the growth rate in GDP per capita has averaged 1–2 percent. There are periods and places where growth is faster, of course: the United States in younger days, Japan from the 1950s through the 1980s, China over the last thirty years. But these are exceptions—and the two earlier examples have reverted to the mean. So if that 4–5 percent return is largely saved rather than being bombed, taxed, or dissipated away, it will accumulate into an ever-greater mass relative to average incomes. That may seem like common sense to anyone who’s lived through the last few decades, but it’s always nice to have evidence back up common sense, which isn’t always reliable.

There’s another trend that intensifies the upward concentration of wealth: Fortunes themselves are ratcheting upward; within the proverbial 1 percent, the 0.1 percent are doing better than the remaining 0.9 percent, and the 0.01 percent are doing better than the remaining 0.09 percent, and so on. The bigger the fortune, the higher the return.
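The arithmetic of r > g is worth making concrete. A back-of-the-envelope sketch, using rates inside the ranges Henwood quotes (our numbers, not Piketty’s tables):

```python
# Back-of-the-envelope illustration of r > g: a fortune compounding at
# r = 4.5% pulls away from a national income growing at g = 1.5%.
r, g = 0.045, 0.015            # return on capital vs. income growth
wealth, income = 100.0, 100.0  # both indexed to 100 in year 0

for year in range(1, 51):
    wealth *= 1 + r            # returns fully saved, never taxed or bombed away
    income *= 1 + g
    if year % 10 == 0:
        print(f"year {year}: wealth/income ratio = {wealth / income:.2f}")

# After 50 years the ratio is (1.045/1.015)**50, roughly 4.3: the fortune
# has quadrupled relative to the income it started level with.
```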

More here. Also see these reviews of the book by Jacob Hacker, Paul Pierson, Heather Boushey, and Branko Milanovic.

“Nymphomaniac: Vol. 1”: Fishers of Men, Meaning

Lowry Pressly in The LA Review of Books:

It’s funny — and quite telling — that now that von Trier has made an unmistakably Sadean film, the majority of critical attention is focused not on the sadistic but on the allegedly pornographic aspects of the film. Though there is plenty of sex in Nymphomaniac — just not as much in the pared-down version distributed here in the US as many expected or hoped for — as in the more transgressive works of Sade, the site of the film’s eroticism is in its discourse, in the telling of the story and not intermittent montages of T&A. Thus, from Juliette: “You have killed me with voluptuousness. Let’s sit down and discuss.” If he could hear the film press titter, surely the Marquis would be rolling (with mordant laughter) in his grave. And given that he was given a full Christian burial against his express wishes, that’s probably not all he’d be doing.

The term “nymphomania” comes to us (or persists, rather) as the result of a Victorian renaming of an ancient construction of female sexuality as psychopathology, which survived even as far as a few editions of the Diagnostic and Statistical Manual of Mental Disorders. (It was finally abandoned in 1987.) As a diagnosis, nymphomania was applied to displays of female sexuality that were considered “excessive,” which could mean anything from the harboring of sexual fantasies to being attracted to men other than one’s husband. Like most diagnoses that infer a disfigurement of the subject from observations of her behavior, it tells us more about the society that came up with it than about nymphomaniacs themselves. Nymphomania reminds us that what we recognize as deviant in others unsettles us. We often find it easier, or at least psychologically safer, to posit a pathological source for the behavior rather than confront it in ourselves.

More here.