On the Notion of ‘Belief’ in Religion

Gary Gutting talks to Howard Wettstein, a professor of philosophy at the University of California, Riverside, and the author of “The Significance of Religious Experience”, over at the NYT's The Stone (image: portrait of Martin Buber, from Wikimedia Commons):

H.W.: I had a close friend in Jerusalem, the late Rabbi Mickey Rosen, whose relation to God was similarly intimate. To watch him pray was to have a glimpse of such intimacy. To pray with him was to taste it; God was almost tangible. As with Feynman, Mickey had no patience with the philosophers’ questions. God’s reality went without saying. God’s existence as a supernatural being was quite another thing. “Belief,” he once said to me, “is not a Jewish notion.” That was perhaps a touch of hyperbole. The point, I think, was to emphasize that the propositions we assent to are hardly definitive of where we stand. He asked of his congregants only that they sing with him, song being somewhat closer to the soul than assent.

This brings to mind Buber’s emphasis on the distinction between speaking to God, something that is readily available to all of us, and significant speech/thought about God, something that Buber took to be impossible.

G.G.: But you can’t in fact speak to someone who doesn’t exist — I can’t speak to Emma Bovary, although I can pretend to or think I can. Further, why would you even want to pray to someone you didn’t believe exists? On your account praying to God seems like playacting, not genuine religious commitment.

H.W.: Were I to suggest that God does not exist, that God fails to exist, then what you suggest would have real purchase. My thought is otherwise; it’s rather that “existence” is, pro or con, the wrong idea for God.

My relation to God has come to be a pillar of my life, in prayer, in experience of the wonders and the awfulness of our world. And concepts like the supernatural and transcendence have application here. But (speaking in a theoretical mode) I understand such terms as directing attention to the sublime rather than referring to some nonphysical domain. To see God as existing in such a domain is to speak as if he had substance, just not a natural or physical substance. As if he were composed of the stuff of spirit, as are, perhaps, human souls. Such talk is unintelligible to me. I don’t get it.

The theism-atheism-agnosticism trio presumes that the real question is whether God exists. I’m suggesting that the real question is otherwise and that I don’t see my outlook in terms of that trio.

More here.

Automated Ethics

Tom Chatfield in Aeon:

Back in August 2012, Google announced that it had achieved 300,000 accident-free miles testing its self-driving cars. The technology remains some distance from the marketplace, but the statistical case for automated vehicles is compelling. Even when they’re not causing injury, human-controlled cars are often driven inefficiently, ineptly, antisocially, or in other ways additive to the sum of human misery.

What, though, about more local contexts? If your vehicle encounters a busload of schoolchildren skidding across the road, do you want to live in a world where it automatically swerves, at a speed you could never have managed, saving them but putting your life at risk? Or would you prefer to live in a world where it doesn’t swerve but keeps you safe? Put like this, neither seems a tempting option. Yet designing self-sufficient systems demands that we resolve such questions. And these possibilities take us in turn towards one of the hoariest thought-experiments in modern philosophy: the trolley problem.

In its simplest form, coined in 1967 by the English philosopher Philippa Foot, the trolley problem imagines the driver of a runaway tram heading down a track. Five men are working on this track, and are all certain to die when the trolley reaches them. Fortunately, it’s possible for the driver to switch the trolley’s path to an alternative spur of track, saving all five. Unfortunately, one man is working on this spur, and will be killed if the switch is made.

In this original version, it’s not hard to say what should be done: the driver should make the switch and save five lives, even at the cost of one. If we were to replace the driver with a computer program, creating a fully automated trolley, we would also instruct it to pick the lesser evil: to kill fewer people in any similar situation. Indeed, we might actively prefer a program to be making such a decision, as it would always act according to this logic while a human might panic and do otherwise.
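
To make that “lesser evil” instruction concrete, here is a minimal sketch in Python. It is hypothetical throughout: no real vehicle exposes an interface like this, and real casualty estimates would be far less tidy.

```python
# A minimal sketch of the "lesser evil" rule described above.
# Hypothetical throughout: the action names and casualty counts
# are illustrative, not any real vehicle's control logic.

def choose_action(predicted_casualties):
    """Pick the action whose predicted death toll is lowest."""
    return min(predicted_casualties, key=predicted_casualties.get)

# The runaway-trolley case: staying on the main track kills five,
# switching to the spur kills one.
outcomes = {"stay_on_track": 5, "switch_to_spur": 1}
print(choose_action(outcomes))  # -> switch_to_spur
```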

More here.

Big Data: Are We Making a Big Mistake?

Tim Harford in the FT Magazine:

Cheerleaders for big data have made four exciting claims, each one reflected in the success of Google Flu Trends: that data analysis produces uncannily accurate results; that every single data point can be captured, making old statistical sampling techniques obsolete; that it is passé to fret about what causes what, because statistical correlation tells us what we need to know; and that scientific or statistical models aren’t needed because, to quote “The End of Theory”, a provocative essay published in Wired in 2008, “with enough data, the numbers speak for themselves”.

Unfortunately, these four articles of faith are at best optimistic oversimplifications. At worst, according to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at Cambridge university, they can be “complete bollocks. Absolute nonsense.”

Found data underpin the new internet economy as companies such as Google, Facebook and Amazon seek new ways to understand our lives through our data exhaust. Since Edward Snowden’s leaks about the scale and scope of US electronic surveillance, it has become apparent that the security services are just as fascinated with what they might learn from our data exhaust.

Consultants urge the data-naive to wise up to the potential of big data. A recent report from the McKinsey Global Institute reckoned that the US healthcare system could save $300bn a year – $1,000 per American – through better integration and analysis of the data produced by everything from clinical trials to health insurance transactions to smart running shoes.

But while big data promise much to scientists, entrepreneurs and governments, they are doomed to disappoint us if we ignore some very familiar statistical lessons.

“There are a lot of small data problems that occur in big data,” says Spiegelhalter. “They don’t disappear because you’ve got lots of the stuff. They get worse.”
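
Spiegelhalter's point is easy to demonstrate. In the toy simulation below (an illustration of the statistical lesson, not anything from the article), half a million systematically biased “found” data points estimate a population mean worse than a thousand randomly sampled ones:

```python
# A toy illustration (not from the article) of why systematic bias
# does not wash out with volume.
import random

random.seed(0)
population = [random.gauss(50, 10) for _ in range(1_000_000)]

# Hypothetical collection bias: the mechanism never records low
# values (think of a health app used only by smartphone owners).
found = [x for x in population if x > 45]

true_mean = sum(population) / len(population)
big_biased = sum(found[:500_000]) / 500_000
small_random = sum(random.sample(population, 1_000)) / 1_000

print(f"true mean:              {true_mean:.1f}")    # ~50.0
print(f"500,000 'found' points: {big_biased:.1f}")   # ~55, still off
print(f"1,000 random points:    {small_random:.1f}") # ~50, about right
```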

More here.

Bret Easton Ellis and Lindsay Lohan Make a Feature Film

Lili Anolik at The Believer:

Bret Easton Ellis is modern literature’s little rascal supreme. He seems to do things for no reason other than the fun of it. Take, for example, the many references in his books to his other books, references made in such a super-subtle yet obsessive way he could be doing it only to amuse himself. His minor characters are often recurring. Sean Bateman, for example, one of the protagonists in The Rules of Attraction, has, it is glancingly mentioned, an older brother, Patrick, the gifter of a brown Ralph Lauren tie about which Sean has ambivalent feelings. Patrick then lands the lead role as the Psycho who also happens to be an American in Ellis’s next work. Ellis did the same thing with Victor Johnson, Lauren Hynde’s mostly offstage boyfriend in The Rules of Attraction, moving him from the periphery of that novel (he’s backpacking through Europe for much of the narrative) to front-and-center in Glamorama. Ellis even gives him a stage name, Victor Ward—which is stronger, more macho-sounding, and, with fewer syllables, fits better on a marquee—as is commensurate with his change in status from bit player to star. What or whom, one wonders, did these characters have to do in order to secure their big breaks? If any writer would have a casting couch for his fictional creations, it would be Ellis.

More here.

Immanuel Velikovsky’s strange quest for a scientific theory of everything

Paula Findlen at The Nation:

In the 1940s, a curiously enigmatic figure haunted New York City’s great libraries, his mind afire with urgent questions whose resolution might reveal, once and for all, the most ancient secrets of the universe in their crystalline clarity. This scholar eschewed the traditional disciplinary boundaries that define the intellectual terrain of the specialist; instead, he read widely, skimming the surface of countless works of science, myth and history to craft an answer to an overwhelming question: Had our planet been altered repeatedly by cosmic catastrophes whose traces could be found in the earliest human records?

A fantastic theory began to emerge, redolent of the efforts of an earlier age to unify knowledge, yet speaking to the preoccupations of a world contemplating the chaos of another gruesome European war. The solar system, it was revealed, did not operate according to Newton’s universal laws of gravitation, nor did life on Earth evolve gradually and continuously, as Darwin had written. Instead, the cosmos was like a giant atom, periodically discharging photons whose energy disrupted and redirected the movements of celestial bodies, even causing the reversal of Earth’s magnetic poles. A planet was a kind of super-electron.

More here.

The new war literature by veterans

George Packer at The New Yorker:

Soldiers who set out to write the story of their war also have to navigate a minefield of clichés: all of them more or less true but open to qualification; many sown long before the soldiers were ever deployed, because every war is like every other war. That’s one of them. War is hell is another. War begins in illusion and ends in blood and tears. Soldiers go to war for their country’s cause and wind up fighting for one another. Soldiers are dreamers (Sassoon said that). No one returns from war the same person who went. War opens an unbridgeable gap between soldiers and civilians. There’s no truth in war—just each soldier’s experience. “You can tell a true war story by its absolute and uncompromising allegiance to obscenity and evil” (from “How to Tell a True War Story,” in O’Brien’s story collection “The Things They Carried”).

Irony in modern American war literature takes many forms, and all risk the overfamiliarity that transforms style into cliché. They begin with Hemingway’s rejection, in “A Farewell to Arms,” of the high, old language, his insistence on concreteness: “I had seen nothing sacred, and the things that were glorious had no glory and the sacrifices were like the stockyards at Chicago if nothing was done with the meat except to bury it. There were many words that you could not stand to hear and finally only the names of places had dignity.”

More here.

The need for the public practice of the humanities in India

Prashant Keshavmurthy in Chapati Mystery:

In 1892, Maulana Shibli Nu’māni, an internationally celebrated Indian Muslim historian, (Urdu-Persian) literary critic and theologian of his day, traveled by sea from Bombay to the Ottoman Empire, journeying through Cyprus, Istanbul, Syria and Egypt. Of this journey he kept a journal that he later published under the title of Safarnāma-i rūm va misr va shām (A Travel Account of Turkey, Egypt and Syria). He claims that he had not intended to write a travel account but that European prejudices with regard to the Turks had led him to do so. Even well-meaning Europeans, he observes, remain bound by the Islamophobic prejudices they are raised with. His aims in writing it are therefore corrective and pedagogical: to correct prejudiced European travel accounts of Turkey that form the basis for European histories, and to instruct Indian Muslims by documenting exemplary “progress” among Turkish Muslims. The Turkey or Ottoman state of Shibli’s time, we must remember, was the only one of the three great early modern Islamic states – the other two being Safavid Iran and Mughal India – still extant. Moreover, its emperor, Abdulḥamīd II (1876–1909), had only recently achieved radical advances in the movement to modernize or “reorganize” – “reorganization” or tanzīmāt bespeaking the bureaucratic character of this modernity – his state on European models. Shibli intends therefore to focus on the “developments and reforms” of the Muslim world, especially Turkey.

The turn-of-the-century preoccupation with lost Mughal sovereignty among North India’s Reformist Muslims – a sovereignty they understood as Muslim in the wake of the formal end of the Mughal state in 1857 – led them to regard the still-regnant Ottoman empire with special attention: in it they saw a Muslim empire that was modeling itself through technological and institutional reforms on Europe, the very ambition of Sayyid Aḥmad Khān, the founder of what became Aligarh Muslim University, and his colleagues like Shibli Nu’māni. Shibli thus discusses formerly Ottoman Cyprus, when he passes through it, in terms of the history of its political sovereignty under Muslim and then British rule. Furthermore, everywhere in his travels he singles out educational syllabi, technology, and such empirical aspects of a society as clothing and food, treating them as indices of a polity’s development. Shibli desires and is at pains to discover signs of a continuous Muslim world. That he conflates all Arabs in the Ottoman territories with Muslims and vice versa signals this desire.

More here.

The Video Game Engine in Your Head

Joshua Hartshorne in Scientific American:

For years now, physicists and engineers have been building computer simulations of physics in order to understand the behavior of objects in the world. Want to see if a bridge would be stable during an earthquake? Enter it into the simulation, apply earthquake dynamics, and see what happens. Recently, the prestigious Proceedings of the National Academy of Sciences published work by MIT psychologists (and my labmates) Peter Battaglia, Jessica Hamrick, and Joshua Tenenbaum, arguing that all humans do roughly the same thing when trying to understand or make predictions about the physical world. The primary difference is that we run our simulations in our brains rather than in digital computers, but the basic algorithms are roughly equivalent. The analogy runs deep: To model human reasoning about the physical world, the researchers actually used an open-source computer game physics engine — the software that applies the laws of physics to objects in video games in order to make them interact realistically (think Angry Birds).

Battaglia and colleagues found that their video-game-based computer model matches human physical reasoning far better than any previous theory. The authors asked people to make a number of predictions about the physical world: will a tower of blocks stand or fall over, and if it falls, in which direction and where will the farthest-flung block land; which object would most likely fall off a table if the table were bumped; and so on. In each case, human judgments closely matched the predictions of the computer simulation … but not necessarily the actual world, which is where it gets interesting.
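
The researchers used a full game physics engine; the toy sketch below illustrates the same idea of prediction-by-simulation in a drastically simplified form, with a static centre-of-mass check standing in for real dynamics and a made-up block layout:

```python
# A drastically simplified sketch of prediction-by-simulation.
# The actual study used a full game physics engine; this toy runs
# only a static stability check on an invented block layout.

def tower_falls(blocks):
    """blocks: list of (x_center, width) pairs, bottom block first.
    The tower topples if, for any block, the centre of mass of the
    equal-mass blocks above it overhangs that block's edge."""
    for i in range(len(blocks) - 1):
        above = blocks[i + 1:]
        com = sum(x for x, _ in above) / len(above)
        x, width = blocks[i]
        if abs(com - x) > width / 2:
            return True
    return False

aligned = [(0.0, 2.0), (0.2, 2.0), (0.1, 2.0)]
leaning = [(0.0, 2.0), (0.9, 2.0), (1.9, 2.0)]
print(tower_falls(aligned), tower_falls(leaning))  # False True
```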

More here.

Wednesday Poem

Under our boot soles

In memory of Jim Thomas

Once you stepped out an open window onto nothing
we could see from our desks, and for a whole
long second you floated and didn't fall
through two floors of air to the earth's something.

You never fell. You were just going smoking
before class on the unseen roof.
All of us saw you make that roof when you didn't fall.
You took drags, looked down, looked up, thinking.

Then you stepped back through the open window
and read us the end of “Song of Myself”
where the spotted hawk swoops and grass grows

under a boot. You were all voice, we were all ears.
Up ahead words with hollow bones wait
once you step onto nothing. We could hear.

by Dennis Finnell
from Ruins Assembling
Shape and Nature Press, Greenfield, MA, 2014

How Darwinian is Cultural Evolution?

Nicolas Claidière, Thomas C. Scott-Phillips, and Dan Sperber over at the Philosophical Transactions of the Royal Society (image via Wikimedia Commons):

Abstract

Darwin-inspired population thinking suggests approaching culture as a population of items of different types, whose relative frequencies may change over time. Three nested subtypes of populational models can be distinguished: evolutionary, selectional and replicative. Substantial progress has been made in the study of cultural evolution by modelling it within the selectional frame. This progress has involved idealizing away from phenomena that may be critical to an adequate understanding of culture and cultural evolution, particularly the constructive aspect of the mechanisms of cultural transmission. Taking these aspects into account, we describe cultural evolution in terms of cultural attraction, which is populational and evolutionary, but only selectional under certain circumstances. As such, in order to model cultural evolution, we must not simply adjust existing replicative or selectional models but we should rather generalize them, so that, just as replicator-based selection is one form that Darwinian selection can take, selection itself is one of several different forms that attraction can take. We present an elementary formalization of the idea of cultural attraction.

1. Population thinking applied to culture

In the past 50 years, there have been major advances in the study of cultural evolution inspired by ideas and models from evolutionary biology. Modelling cultural evolution involves, as it would for any complex phenomenon, making simplifying assumptions; many factors have to be idealized away. Each particular idealization involves a distinct trade-off between gaining clarity and insight into hopefully major dimensions of the phenomenon and neglecting presumably less important dimensions. Should one look for the best possible idealization? There may not be one. Different sets of simplifying assumptions may each uniquely yield worthwhile insights. In this article, we briefly consider some of the simplifications that are made in current models of cultural evolution and then suggest how important dimensions of the phenomenon that have been idealized away might profitably be introduced in a novel approach that we see as complementary rather than as alternative to current approaches. All these approaches, including the one we are advocating, are Darwinian, but in different ways that are worth spelling out.

Much clarity has been gained by drawing on the analogy between cultural and biological evolution (an analogy suggested by Darwin himself: ‘The formation of different languages and of distinct species, and the proofs that both have been developed through a gradual process, are curiously parallel’). This has made it possible to draw inspiration from formal methods in population genetics, with appropriate adjustments and innovations. Of course, the analogy with biological evolution is not perfect. For example, variations in human cultural evolution are often intentionally produced in the pursuit of specific goals and hence are much less random than in the biological case.
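
The paper presents its own formalization; the sketch below captures only the flavour of cultural attraction, with transmission that is constructive rather than replicative, so a population converges on an “attractor” without any selection. The attractor value, pull strength and noise level are arbitrary choices, not the authors' parameters:

```python
# An elementary sketch in the spirit of the paper's formalization
# (not its actual model). Transmission is constructive: every copy
# is imperfect but biased toward an "attractor", so the population
# converges with no selection step at all.
import random

random.seed(1)
ATTRACTOR = 0.8  # culturally favoured form of the item (arbitrary)
PULL = 0.3       # strength of attraction per transmission (arbitrary)
NOISE = 0.05     # copying error, standard deviation (arbitrary)

def transmit(item):
    copy = item + random.gauss(0, NOISE)     # imperfect reproduction
    return copy + PULL * (ATTRACTOR - copy)  # constructive bias

population = [random.random() for _ in range(1000)]
for _ in range(20):  # twenty "generations" of transmission
    population = [transmit(random.choice(population)) for _ in population]

print(sum(population) / len(population))  # converges near 0.8
```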

More here.

Gloomy Terrors or the Most Intense Pleasure?

Appleton-after-Sully-436x744

Via Andrew Sullivan, Philip Schofield discusses Jeremy Bentham's writings on religion and sex, over at the Oxford University Press blog:

In 1814, just two hundred years ago, the radical philosopher Jeremy Bentham (1748–1832) began to write on the subject of religion and sex, and thereby produced the first systematic defence of sexual liberty in the history of modern European thought. Bentham’s manuscripts have now been published for the first time in authoritative form. He pointed out that ‘regular’ sexual activity consisted in intercourse between one male and one female, within the confines of marriage, for the procreation of children. He identified the source of the view that only ‘regular’ or ‘natural’ sexual activity was morally acceptable in the Mosaic Law and in the teachings of the self-styled Apostle Paul. ‘Irregular’ sexual activity, on the other hand, had many variations: intercourse between one man and one woman, when neither of them were married, or when one of them was married, or when both of them were married, but not to each other; between two women; between two men; between one man and one woman but using parts of the body that did not lead to procreation; between a human being and an animal of another species; between a human being and an inanimate object; and between a living human and a dead one. In addition, there was the ‘solitary mode of sexual gratification’, and innumerable modes that involved more than two people. Bentham’s point was that, given that sexual gratification was for most people the most intense and the purest of all pleasures and that pleasure was a good thing (the only good thing in his view), and assuming that the activity was consensual, a massive amount of human happiness was being suppressed by preventing people, whether from the sanction of the law, religion, or public opinion, from engaging in such ‘irregular’ activities as suited their taste.

Bentham was writing at a time when homosexuals, those guilty of ‘the crime against nature’, were subject to the death penalty in England, and were in fact being executed at about the rate of two per year, and were vilified and ridiculed in the press and in literature. If an activity did not cause harm, Bentham had argued as early as the 1770s and 1780s, then it should not be subject to legal punishment, and had called for the decriminalization of homosexuality. By the mid-1810s he was prepared to link the problem not only with law, but with religion. The destruction of Sodom and Gomorrah was taken by ‘religionists’, as Bentham called religious believers, to prove that God had issued a universal condemnation of homosexuality. Bentham pointed out that what the Bible story condemned was gang rape.

More here.

What slang says about us

Nicholas Shakespeare in The Telegraph:

Slang’s first compilers were chippy individualists, routinely beset by financial worries and complex marital lives. They were never grandees like the 70-odd team beavering away still on the Oxford English Dictionary in Great Clarendon Street (less than 30 yards from where I live in Oxford). They numbered Francis Grose (1731-91), the son of a Swiss jeweller, who was so fat that his servant had to strap him into bed every night; Pierce Egan (1772-1849), a boxing journalist and editor of Real Life in London; and John William Hotten (1832-73), a workaholic pornographer (The Romance of Chastisement) who died from a surfeit of pork chops, and was remembered, unfairly, by the phrase: “Hotten: rotten, and forgotten”. Even so, they shared many characteristics of lexicographers like William Chester Minor (1834-1920), one of the OED’s founding fathers, who was, quite conclusively, bonkers. As one of Jonathon Green’s mentors, Anthony Burgess, cautions: “The study of language may beget madness.”

Super-geeks (from geek, meaning fool) to a man, slang’s lexicographers tend to be self-appointed guardians who, while cheerfully plagiarising each other in their project to demonstrate the importance and scope of slang, have yet to agree on a definition of what, precisely, slang is, or was – or even its origin. Hotten believed slang to be a gipsy term for the gipsies’ secret language; the Oxford philologist Walter Skeat attributed it to the Icelandic slunginn (cunning), while Eric Partridge (1894-1979), a New Zealand ex-soldier, ex-publisher and ex-bankrupt, believed it was the past participle of the Norwegian/Old Norse verb sling, so giving the concept of a “thrown” language. Into this tradition, Green (from greens, meaning sexual intercourse, b 1948) fits seamlessly. “What goes in a slang dictionary and what does not is often a matter of individual choice,” he writes. “Ultimately slang seems to be what you think it is.”

More here.

Spite Is Good. Spite Works

Natalie Angier in The New York Times:

The “Iliad” may be a giant of Western literature, yet its plot hinges on a human impulse normally thought petty: spite. Achilles holds a festering grudge against Agamemnon (“He cheated me, wronged me … He can go to hell…”), turning down gifts, homage, even the return of his stolen consort Briseis just to prolong the king’s suffering. Now, after decades of focusing on such staples of bad behavior as aggressiveness, selfishness, narcissism and greed, scientists have turned their attention to the subtler and often unsettling theme of spite — the urge to punish, hurt, humiliate or harass another, even when one gains no obvious benefit and may well pay a cost. Psychologists are exploring spitefulness in its customary role as a negative trait, a lapse that should be embarrassing but is often sublimated as righteousness, as when you take your own sour time pulling out of a parking space because you notice another car is waiting for it and you’ll show that vulture who’s boss here, even though you’re wasting your own time, too.

Evolutionary theorists, by contrast, are studying what might be viewed as the brighter side of spite, and the role it may have played in the origin of admirable traits like a cooperative spirit and a sense of fair play. The new research on spite transcends older notions that we are savage, selfish brutes at heart, as well as more recent suggestions that humans are inherently affiliative creatures yearning to love and connect. Instead, it concludes that vice and virtue, like the two sides of a V, may be inextricably linked.

More here.