The Independent Republic of New York

As New York—a city that often has more in common with Europe than with the United States—prepares to be invaded by the red-state hordes during an election that has much of the city fearing the prospect of four more years, a persistent fantasy resurfaces—should New York secede?

Jennifer Senior in New York Magazine:

New York has always felt like a nation apart. In a country that grows ever redder, it is the bluest of blue cities in one of the bluest of blue states, with the eccentrics to match. Eric Bogosian, with those three cubic feet of curls and black-leather car coat; Harvey Weinstein, with his public tantrums and highfalutin taste; Ed Koch; Lou Reed and Laurie Anderson; the Black Israelites preaching in Times Square; Mexican kitchen workers preparing sushi in Korean delis—could any of them find a home anywhere but New York? Even the New York Post: Where else could a right-wing Australian media mogul win over a left-wing, multiethnic cosmopolis with a toothsome rag of boldface names, sports scores, political scandals, tearjerkers, hectoring editorials, and front-page oopsie-daisies announcing the anointment of Dick Gephardt as John Kerry’s running mate? Only in New York, kids. Only in New York.

Psychically, then, New York already seems headed out of the union—so why not go all the way? If we’re so blue, perhaps it’s time to choose another color entirely. (Maybe black.)

More here.

Thursday, April 3, 2014

What is academic history for?

Paula A. Michaels at the Oxford University Press blog:

Writing on Saturday in The Age, popular historian Paul Ham launched a frontal assault on “academic history” produced by university-based historians primarily for consumption by their professional peers.

In his article, Ham muses on whether these writings ever “enlightened or defied anyone or just pinged the void of indifference.” Lamenting its alleged inaccessibility and narrow audience, Ham asks with incredulity:

What is academic history for?

Ham’s is only the latest in a steady stream of attacks castigating historians and other scholars for their inability to engage the general public effectively. New York Times columnist Nicholas Kristof sent American academia into a collective apoplectic fit with a February column imploring academics to make a greater contribution to policy debates as public intellectuals.

Less convinced than Ham of the purposeful obscurantism of academic writing, Kristof nonetheless met with a sharp rebuke from the academy, which defended its track record for engagement and faulted Kristof for pointing only to the highest profile venues to judge scholars’ participation in debates beyond the Ivory Tower.

As political scientist Corey Robin observes:

there are a lot of gifted historians. And only so many slots for them at The New Yorker.

More here.

Can you die of a broken heart?

Jason G Goldman at the BBC:

In 1986, a 44-year-old woman was admitted to Massachusetts General Hospital. She felt fine all day, but in the afternoon she developed extreme crushing pain in her chest, radiating through her left arm. It's a classic sign of a heart attack, but the puzzling thing was that she didn't suffer from coronary heart disease. There was no life-threatening clot in the arteries surrounding the heart.

It looked, from the outside, like a heart attack, but it wasn't. Describing the unusual case in the New England Journal of Medicine, Thomas Ryan and John Fallon suggest the apparent damage to the heart muscle was emotional rather than physiological. Earlier that day, she had been informed that her 17-year-old son had committed suicide.

Could the woman have suffered from a broken heart? The answer, it turned out, was already hiding in plain sight. The Massachusetts case was surprising to doctors – but it wasn’t news to everybody.

For many years, doctors scorned the idea of a relationship between psychology and physiology. In their book Zoobiquity, Kathryn Bowers and Barbara Natterson-Horowitz described this attitude: “Among many physicians, the idea that emotions could cause actual physical events within the architecture of the heart was viewed with nearly the same sideways glance as an interest in healing crystals or homeopathy. Real cardiologists concentrated on real problems you could see: arterial plaque, embolising blood clots, and rupturing aortas. Sensitivity was for psychiatrists.”

Despite this, the evidence that extreme emotions can impact the heart goes back decades – only not among humans. It was wildlife biologists and veterinarians who first noticed that extreme emotions can wreak havoc on body physiology.

More here.

Akhil Sharma’s ‘Family Life’

Sonali Deraniyagala in the New York Times:

“Where is Ajay? What was the point of having raised him?” an elderly woman grumbles to her husband about their adult son in the opening pages of Akhil Sharma’s semi-autobiographical new novel, “Family Life.” This book, deeply unnerving and gorgeously tender at its core, charts the young life of Ajay Mishra as he struggles to grow within a family shattered by loss and disoriented by a recent move from India to America. “Family Life” is equally the story of Ajay’s parents, whose response to grief renders them unable to find the space in which to cherish and raise him.

Sharma’s previous novel, “An Obedient Father,” was a remorseless, forceful tale of a corrupt Indian civil servant who molests his daughter and ruins lives, including his own. “Family Life,” while also about domestic torment, is gentler and of an altogether different quality.

When we first meet the Mishras, they are a young, middle-class family living in Delhi in the mid-1970s. India is under emergency rule, a time of gloom and uncertainty, but for 8-year-old Ajay and his older brother, Birju, life is playful and secure. Their mother lights their world, while their father seems so superfluous that Ajay wonders if he’s been assigned to them by the government.

More here.

The Story of Italy and Its Citrus Fruit

Jonathan Keates at Literary Review:

Goethe's 'The Apprenticeship of Wilhelm Meister', a neglected masterpiece if ever there was, is known nowadays for a single line from a ballad sung by Mignon, the daughter of a wandering musician. 'Know'st thou the land where the lemon trees bloom?' begins her mysterious song, describing an imagined world of blue skies, marble statues and thunderous waterfalls, not without a lurking menace beneath its beauty. When Wilhelm asks her where she heard it, Mignon answers, 'Italy! If thou go to Italy, take me along with thee; for I am too cold here.'

Goethe's verses encapsulate the romantic hankering for what Browning hailed as 'the land of lands' and Forster identified as 'a place that's upset people since the beginning of the world'. Citrus fruit, as Helena Attlee clearly understands, is the ultimate metaphor for Italy as an object of desire among us shivering mortals on the wrong side of the Alps. In Goethe's day, northern Europeans with enough money created elegant orangeries, where the precious trees spent a coddled winter indoors before sweating gardeners trundled them onto the terrace for a few weeks of watery sunshine. Such buildings were a fantasy Hesperides. The real golden apples grew far to the south, where the ancestral wisdom of farmers, cooks, perfumers, engineers and entrepreneurs placed citrus fruits alongside the grape and the olive as an archetype of Mediterranean fertility.

more here.

some good new poems

Dan Chiasson at the New York Review of Books:

American poets tend to want the benefits of song—its emotionality, its melodiousness—without its costs: its triviality, its obliviousness, its feyness. This conflict drives Michael Ruby’s American Songbook, whose title reminds us that we have no body of popular American poems to match the body of American songs, by the Gershwins and Irving Berlin and Cole Porter and many others, whose tunes and lyrics many people know by heart. Ruby’s book presents his own poems, some of them loosely connected with popular songs. What would “Love for Sale,” the Porter tune Ella Fitzgerald made famous, sound like as a difficult postmodern poem? Here is the opening of Ruby’s “Love for Sale,” dedicated by him to Ella Fitzgerald:

defeats sight force
feet please
lonesome
pail (of milk
I peacock throne open shop to a small group

moon of gazing down
draughts the lit tunnel
wayward town of
apricots mortals
smirk during
speeches
I peacock throne go toys to work on vanishing

This is “composed,” more in William Carlos Williams’s sense than in Porter’s, out of noirish bits of city life, rank with desire.

more here.

This Renovation Plan Will Ruin MoMA

Jerry Saltz at New York Magazine:

On January 8, Museum of Modern Art director Glenn Lowry and the architects Diller Scofidio + Renfro made public their scheme to redesign and expand MoMA. Since then, virtually no artists or architects, or art, design, or architecture critics, have lauded the plan. Nearly all the reaction has been negative. Yet no one’s raised a finger to do much of anything about it. We live in a time when power structures are impervious to and imperious about protest. Yet the Lowry–DS+R plan so irretrievably dooms MoMA to being a business-driven carnival that it feels like something really worth fighting against. Actions like this aren’t pie-in-the-sky or far-fetched. If 40 well-known artists whose work is in the collection signed a petition protesting the plans, it might have a real effect. This is MoMA’s Robert Moses moment, and five decades ago, artists were key to stopping his Lower Manhattan Expressway from being built. By the end of May, the problematic American Folk Art Museum on the MoMA site will likely be torn down, to be replaced with an even worse building for art. Then construction will begin. If this scheme is not stopped immediately, it’s going to go ahead.

So far, the public has seen a couple of drawings of the gleaming glass squash-court galleries that will replace AFAM.

more here.

Literacy Is Knowledge

Robert Pondiscio in City Journal:

“Educators, policy makers and business leaders often fret about the state of math education,” the New York Times reported in May. “But reading comprehension may be a larger stumbling block.” Indeed, schools and teachers consistently have better luck improving student skills in math than in reading. A fresh reminder of the difficulty came in August, when New York released scores from its first round of tests aligned with the Common Core State Standards, now adopted by most states. Students in schools across the state fared poorly on the tests; some of the city’s most celebrated charter schools posted disappointing results as well. The silver lining is that by adopting reading curricula aligned with the Common Core and abandoning failed approaches to literacy instruction, New York City could be poised to lead a reading renaissance in the coming years—but only if city schools also make significant shifts in classroom instruction and exercise patience.

Math is relentlessly hierarchical—you can’t understand multiplication, for example, if you don’t understand addition. Reading is mercilessly cumulative. Virtually everything a child sees and hears, in and out of school, contributes to his vocabulary and language proficiency. A child growing up in a book-filled home with articulate, educated parents who fill his early years with reading, travel, museum visits, and other forms of enrichment arrives at school with enormous advantages in knowledge and vocabulary.

More here.

First comprehensive atlas of human gene activity released

From KurzweilAI:

A large international consortium of researchers has produced the first comprehensive, detailed map of the way genes work across the major cells and tissues of the human body. The findings describe the complex networks that govern gene activity, and the new information could play a crucial role in identifying the genes involved with disease. “Now, for the first time, we are able to pinpoint the regions of the genome that can be active in a disease and in normal activity, whether it’s in a brain cell, the skin, in blood stem cells or in hair follicles,” said Winston Hide, associate professor of bioinformatics and computational biology at Harvard School of Public Health (HSPH) and one of the core authors of the main paper in Nature.

“This is a major advance that will greatly increase our ability to understand the causes of disease across the body.” The research is outlined in a series of papers published March 27, 2014, two in the journal Nature and 16 in other scholarly journals. The work is the result of years of concerted effort among 250 experts from more than 20 countries as part of FANTOM 5 (Functional Annotation of the Mammalian Genome). The FANTOM project, led by the Japanese institution RIKEN, is aimed at building a complete library of human genes.

More here.

Thursday

When You Are Old

When you are old and grey and full of sleep,
And nodding by the fire, take down this book,
And slowly read, and dream of the soft look
Your eyes had once, and of their shadows deep;

How many loved your moments of glad grace,
And loved your beauty with love false or true,
But one man loved the pilgrim soul in you,
And loved the sorrows of your changing face;

And bending down beside the glowing bars,
Murmur, a little sadly, how Love fled
And paced upon the mountains overhead
And hid his face amid a crowd of stars.

by W.B. Yeats

Wednesday, April 2, 2014

Other People’s Pathologies


Ta-Nehisi Coates in The Atlantic:

Over the past week or so, Jonathan Chait and I have enjoyed an ongoing debate over the rhetoric the president employs when addressing African Americans. Here is my initial installment, Chait's initial rebuttal, my subsequent reply, and Chait's latest riposte. Initially Chait argued that President Obama's habit of speaking about culture before black audiences was laudable because it would “urge positive habits and behavior” that are presumably found especially wanting in the black community.

Chait argued that this lack of sufficient “positive habits and behaviors” stemmed from cultural echoes of past harms, which now exist “independent” of white supremacy. Chait now concedes that this assertion is unsupportable and attempts to recast his original argument:

I attributed the enduring culture of poverty to the residue of slavery, terrorism, segregation, and continuing discrimination.

Not quite (my emphasis):

The argument is that structural conditions shape culture, and culture, in turn, can take on a life of its own independent of the forces that created it. It would be bizarre to imagine that centuries of slavery, followed by systematic terrorism, segregation, discrimination, a legacy wealth gap, and so on did not leave a cultural residue that itself became an impediment to success.

The phrase “culture of poverty” doesn't actually appear in Chait's original argument. Nor should it—the history he cites was experienced by all variety of African Americans, poor or not. Moreover, the majority of poor people in America have neither the experience of segregation nor slavery in their background. Chait is conflating two different things: black culture—which was shaped by, and requires, all the forces he named; and “a culture of poverty,” which requires none of them.

That conflation undergirds his latest column. Chait paraphrases my argument that “there is no such thing as a culture of poverty.” His evidence of this is quoting me attacking “the notion that black culture is part of the problem.” This evidence only works if you believe “black culture” and “a culture of poverty” are somehow interchangeable.

More here.

On the Notion of ‘Belief’ in Religion

Gary Gutting talks to Howard Wettstein, a professor of philosophy at the University of California, Riverside, and the author of “The Significance of Religious Experience”, over at the NYT's The Stone (image, portrait of Martin Buber, from Wikimedia Commons):

H.W.: I had a close friend in Jerusalem, the late Rabbi Mickey Rosen, whose relation to God was similarly intimate. To watch him pray was to have a glimpse of such intimacy. To pray with him was to taste it; God was almost tangible. As with Feynman, Mickey had no patience with the philosophers’ questions. God’s reality went without saying. God’s existence as a supernatural being was quite another thing. “Belief,” he once said to me, “is not a Jewish notion.” That was perhaps a touch of hyperbole. The point, I think, was to emphasize that the propositions we assent to are hardly definitive of where we stand. He asked of his congregants only that they sing with him, song being somewhat closer to the soul than assent.

This brings to mind Buber’s emphasis on the distinction between speaking to God, something that is readily available to all of us, and significant speech/thought about God, something that Buber took to be impossible.

G.G.: But you can’t in fact speak to someone who doesn’t exist — I can’t speak to Emma Bovary, although I can pretend to or think I can. Further, why would you even want to pray to someone you didn’t believe exists? On your account praying to God seems like playacting, not genuine religious commitment.

H.W.: Were I to suggest that God does not exist, that God fails to exist, then what you suggest would have real purchase. My thought is otherwise; it’s rather that “existence” is, pro or con, the wrong idea for God.

My relation to God has come to be a pillar of my life, in prayer, in experience of the wonders and the awfulness of our world. And concepts like the supernatural and transcendence have application here. But (speaking in a theoretical mode) I understand such terms as directing attention to the sublime rather than referring to some nonphysical domain. To see God as existing in such a domain is to speak as if he had substance, just not a natural or physical substance. As if he were composed of the stuff of spirit, as are, perhaps, human souls. Such talk is unintelligible to me. I don’t get it.

The theism-atheism-agnosticism trio presumes that the real question is whether God exists. I’m suggesting that the real question is otherwise and that I don’t see my outlook in terms of that trio.

More here.

Automated Ethics


Tom Chatfield in Aeon:

Back in August 2012, Google announced that it had achieved 300,000 accident-free miles testing its self-driving cars. The technology remains some distance from the marketplace, but the statistical case for automated vehicles is compelling. Even when they’re not causing injury, human-controlled cars are often driven inefficiently, ineptly, antisocially, or in other ways additive to the sum of human misery.

What, though, about more local contexts? If your vehicle encounters a busload of schoolchildren skidding across the road, do you want to live in a world where it automatically swerves, at a speed you could never have managed, saving them but putting your life at risk? Or would you prefer to live in a world where it doesn’t swerve but keeps you safe? Put like this, neither seems a tempting option. Yet designing self-sufficient systems demands that we resolve such questions. And these possibilities take us in turn towards one of the hoariest thought-experiments in modern philosophy: the trolley problem.

In its simplest form, coined in 1967 by the English philosopher Philippa Foot, the trolley problem imagines the driver of a runaway tram heading down a track. Five men are working on this track, and are all certain to die when the trolley reaches them. Fortunately, it’s possible for the driver to switch the trolley’s path to an alternative spur of track, saving all five. Unfortunately, one man is working on this spur, and will be killed if the switch is made.

In this original version, it’s not hard to say what should be done: the driver should make the switch and save five lives, even at the cost of one. If we were to replace the driver with a computer program, creating a fully automated trolley, we would also instruct it to pick the lesser evil: to kill fewer people in any similar situation. Indeed, we might actively prefer a program to be making such a decision, as it would always act according to this logic while a human might panic and do otherwise.
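As a rough illustration of the casualty-minimizing rule Chatfield describes here, the sketch below simply picks whichever available action is expected to kill the fewest people. It is not from the article; the action names and numbers are hypothetical, chosen only to mirror the trolley scenario above.

```python
# Minimal sketch of the "lesser evil" rule described above: among the
# available actions, choose the one expected to kill the fewest people.
# Action names and casualty counts are hypothetical, for illustration only.

def choose_action(options):
    """options maps an action name to the number of people expected to die."""
    return min(options, key=options.get)

if __name__ == "__main__":
    trolley_options = {"stay_on_main_track": 5, "switch_to_spur": 1}
    print(choose_action(trolley_options))  # -> "switch_to_spur"
```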

More here.

Big Data: Are We Making a Big Mistake?


Tim Harford in the FT Magazine:

Cheerleaders for big data have made four exciting claims, each one reflected in the success of Google Flu Trends: that data analysis produces uncannily accurate results; that every single data point can be captured, making old statistical sampling techniques obsolete; that it is passé to fret about what causes what, because statistical correlation tells us what we need to know; and that scientific or statistical models aren’t needed because, to quote “The End of Theory”, a provocative essay published in Wired in 2008, “with enough data, the numbers speak for themselves”.

Unfortunately, these four articles of faith are at best optimistic oversimplifications. At worst, according to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at Cambridge university, they can be “complete bollocks. Absolute nonsense.”

Found data underpin the new internet economy as companies such as Google, Facebook and Amazon seek new ways to understand our lives through our data exhaust. Since Edward Snowden’s leaks about the scale and scope of US electronic surveillance it has become apparent that security services are just as fascinated with what they might learn from our data exhaust, too.

Consultants urge the data-naive to wise up to the potential of big data. A recent report from the McKinsey Global Institute reckoned that the US healthcare system could save $300bn a year – $1,000 per American – through better integration and analysis of the data produced by everything from clinical trials to health insurance transactions to smart running shoes.

But while big data promise much to scientists, entrepreneurs and governments, they are doomed to disappoint us if we ignore some very familiar statistical lessons.

“There are a lot of small data problems that occur in big data,” says Spiegelhalter. “They don’t disappear because you’ve got lots of the stuff. They get worse.”

More here.

BRET EASTON ELLIS AND LINDSAY LOHAN MAKE A FEATURE FILM

Lili Anolik at The Believer:

Bret Easton Ellis is modern literature’s little rascal supreme. He seems to do things for no reason other than the fun of it. Take, for example, the many references in his books to his other books, references made in such a super-subtle yet obsessive way he could be doing it only to amuse himself. His minor characters are often recurring. Sean Bateman, for example, one of the protagonists in The Rules of Attraction, has, it is glancingly mentioned, an older brother, Patrick, the gifter of a brown Ralph Lauren tie about which Sean has ambivalent feelings. Patrick then lands the lead role as the Psycho who also happens to be an American in Ellis’s next work. Ellis did the same thing with Victor Johnson, Lauren Hynde’s mostly offstage boyfriend in The Rules of Attraction, moving him from the periphery of that novel (he’s backpacking through Europe for much of the narrative) to front-and-center in Glamorama. Ellis even gives him a stage name, Victor Ward—which is stronger, more macho-sounding, and, with fewer syllables, fits better on a marquee—as is commensurate with his change in status from bit player to star. What or whom, one wonders, did these characters have to do in order to secure their big breaks? If any writer would have a casting couch for his fictional creations, it would be Ellis.

more here.

Immanuel Velikovsky’s strange quest for a scientific theory of everything

Paula Findlen at The Nation:

In the 1940s, a curiously enigmatic figure haunted New York City’s great libraries, his mind afire with urgent questions whose resolution might reveal, once and for all, the most ancient secrets of the universe in their crystalline clarity. This scholar eschewed the traditional disciplinary boundaries that define the intellectual terrain of the specialist; instead, he read widely, skimming the surface of countless works of science, myth and history to craft an answer to an overwhelming question: Had our planet been altered repeatedly by cosmic catastrophes whose traces could be found in the earliest human records?

A fantastic theory began to emerge, redolent of the efforts of an earlier age to unify knowledge, yet speaking to the preoccupations of a world contemplating the chaos of another gruesome European war. The solar system, it was revealed, did not operate according to Newton’s universal laws of gravitation, nor did life on Earth evolve gradually and continuously, as Darwin had written. Instead, the cosmos was like a giant atom, periodically discharging photons whose energy disrupted and redirected the movements of celestial bodies, even causing the reversal of Earth’s magnetic poles. A planet was a kind of super-electron.

more here.

The new war literature by veterans

George Packer at The New Yorker:

Soldiers who set out to write the story of their war also have to navigate a minefield of clichés: all of them more or less true but open to qualification; many sowed long before the soldiers were ever deployed, because every war is like every other war. That’s one of them. War is hell is another. War begins in illusion and ends in blood and tears. Soldiers go to war for their country’s cause and wind up fighting for one another. Soldiers are dreamers (Sassoon said that). No one returns from war the same person who went. War opens an unbridgeable gap between soldiers and civilians. There’s no truth in war—just each soldier’s experience. “You can tell a true war story by its absolute and uncompromising allegiance to obscenity and evil” (from “How to Tell a True War Story,” in O’Brien’s story collection “The Things They Carried”).

Irony in modern American war literature takes many forms, and all risk the overfamiliarity that transforms style into cliché. They begin with Hemingway’s rejection, in “A Farewell to Arms,” of the high, old language, his insistence on concreteness: “I had seen nothing sacred, and the things that were glorious had no glory and the sacrifices were like the stockyards at Chicago if nothing was done with the meat except to bury it. There were many words that you could not stand to hear and finally only the names of places had dignity.”

more here.

The need for the public practice of the humanities in India

Prashant Keshavmurthy in Chapati Mystery:

In 1892, Maulana Shibli Nu’māni, an internationally celebrated Indian Muslim historian, (Urdu-Persian) literary critic and theologian of his day, traveled by sea from Bombay to the Ottoman Empire, journeying through Cyprus, Istanbul, Syria and Egypt. Of this journey he kept a journal that he later published under the title of Safarnāma-i rūm va misr va shām (A Travel Account of Turkey, Egypt and Syria). He claims that he had not intended to write a travel account but that European prejudices with regard to the Turks had led him to do so. Even well-meaning Europeans, he observes, remain bound by the Islamophobic prejudices they are raised with. His aims in writing it are therefore corrective and pedagogical: to correct prejudiced European travel accounts of Turkey that form the basis for European histories, and to instruct Indian Muslims by documenting exemplary “progress” among Turkish Muslims. The Turkey or Ottoman state of Shibli’s time, we must remember, was the only one of the three great early modern Islamic states – the other two being Safavid Iran and Mughal India – to still be extant. Moreover, its emperor, Abduḥamīd II (1876 – 1909), had only recently achieved radical advances in the movement to modernize or “reorganize” – “reorganization” or tanzīmāt bespeaking the bureaucratic character of this modernity – of his state on European models. Shibli intends therefore to focus on the “developments and reforms” of the Muslim world, especially Turkey.

The turn of the century preoccupation with lost Mughal sovereignty among North India’s Reformist Muslims – a sovereignty they understood as Muslim in the wake of the formal end of the Mughal state in 1857 – led them to regard the still regnant Ottoman empire with special attention: in it they saw a Muslim empire that was modeling itself through technological and institutional reforms on Europe, the very ambition of Sayyid Aḥmad Khān, the founder of what became Aligarh Muslim University, and his colleagues like Shibli Nu’māni. Shibli thus discusses formerly Ottoman Cyprus, when he passes through it, in terms of the history of its political sovereignty under Muslim and then British rule. Furthermore, everywhere in his travels he singles out educational syllabi, technology, and such empirical aspects of a society as clothing and food, treating them as indices of a polity’s development. Shibli desires and is at pains to discover signs of a continuous Muslim world. That he conflates all Arabs in the Ottoman territories with Muslims and vice versa signals this desire.

More here.

The Video Game Engine in Your Head

Joshua Hartshorne in Scientific American:

For years now, physicists and engineers have been building computer simulations of physics in order to understand the behavior of objects in the world. Want to see if a bridge would be stable during an earthquake? Enter it into the simulation, apply earthquake dynamics, and see what happens. Recently, the prestigious Proceedings of the National Academy of Sciences published work by MIT psychologists (and my labmates) Peter Battaglia, Jessica Hamrick, and Joshua Tenenbaum, arguing that all humans do roughly the same thing when trying to understand or make predictions about the physical world. The primary difference is that we run our simulations in our brains rather than in digital computers, but the basic algorithms are roughly equivalent. The analogy runs deep: To model human reasoning about the physical world, the researchers actually used an open-source computer game physics engine — the software that applies the laws of physics to objects in video games in order to make them interact realistically (think Angry Birds).

Battaglia and colleagues found that their video game-based computer model matches human physical reasoning far better than any previous theory. The authors asked people to make a number of predictions about the physical world: will a tower of blocks stand or fall over, what direction would it fall over, and where would the block that landed the farthest away land; which object would most likely fall off of a table if the table was bumped; and so on. In each case, human judgments closely matched the prediction of the computer simulation … but not necessarily the actual world, which is where it gets interesting.
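The published model is considerably richer than this, but the core idea of answering a physical question by rolling a simulation forward can be sketched briefly. The example below is illustrative only, not the authors' code: it uses the open-source pymunk 2D physics engine, and the block sizes, time horizon, and "toppled" threshold are arbitrary assumptions made for the sketch.

```python
# Illustrative sketch: predict whether a stack of blocks topples by
# simulating it forward in time, in the spirit of the simulation-based
# account described above. Uses the open-source pymunk 2D physics engine;
# sizes, friction, horizon, and the "fell" criterion are arbitrary choices.
import pymunk

def simulate_tower(block_positions, seconds=2.0, dt=1 / 60.0):
    space = pymunk.Space()
    space.gravity = (0, -981)  # gravity pointing down

    # Static floor for the blocks to rest on
    floor = pymunk.Segment(space.static_body, (-500, 0), (500, 0), 1)
    floor.friction = 0.8
    space.add(floor)

    blocks = []
    for (x, y) in block_positions:
        mass, size = 1.0, (40, 40)
        body = pymunk.Body(mass, pymunk.moment_for_box(mass, size))
        body.position = (x, y)
        shape = pymunk.Poly.create_box(body, size)
        shape.friction = 0.8
        space.add(body, shape)
        blocks.append(body)

    start_y = [b.position.y for b in blocks]
    for _ in range(int(seconds / dt)):
        space.step(dt)

    # Arbitrary criterion: the tower "fell" if any block dropped noticeably.
    return any(b.position.y < y0 - 10 for b, y0 in zip(blocks, start_y))

if __name__ == "__main__":
    # A slightly offset three-block stack (hypothetical coordinates).
    print("Tower falls:", simulate_tower([(0, 20), (15, 60), (30, 100)]))
```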

More here.