Sad movies help us bond with those around us—and alleviate pain

Emily Underwood in Science:

If you were old enough to see a PG-13 movie in 1997, chances are you went to see Titanic. And chances are you cried. You might have even seen the film multiple times, doing your part to make it the highest-grossing sob fest in movie history. Now, a new study suggests why people want to see tragedies like Titanic over and over again: Watching dramas together builds social bonds and even raises our tolerance for physical pain. “Why on Earth would we waste so much of our time and money going back to novels and films that make us cry?” evolutionary psychologist Robin Dunbar of the University of Oxford in the United Kingdom and his team asked at the beginning of the new study. In their previous investigations of group activities like dancing, laughing, and singing, they found that feel-good chemicals called endorphins were released in the brain, leading to increased pain tolerance and stronger bonds between participants. Endorphins are also released when monkeys and other nonhuman primates groom, suggesting that this mechanism has evolved to boost social ties, Dunbar says. Watching a tragic drama unfold in a theater might harness the same system, the researchers hypothesized.

So Dunbar and his colleagues recruited 169 people to watch Stuart: A Life Backwards. This made-for-TV film portrays the experiences of a disabled homeless man who was sexually abused as a child and struggles with lifelong drug use and imprisonment. He ultimately dies by throwing himself in front of a train. Based on a real man’s life, the story is “about as close to pure tragedy as Shakespeare,” Dunbar says. “People were leaving in tears.”

The researchers compared those viewers with a second group of 68 people who watched two rather sedate BBC documentaries: episode one of The Museum of Life—a behind-the-scenes look at the Natural History Museum in London—and Landscape Mysteries “In Search of Irish Gold,” which explores Irish geology and archaeology. Before and after watching the films, all participants took two tests: One measured their sense of belonging or bonding with their fellow audience members. Another was a measure of pain sensitivity, called the Roman chair, which Dunbar says is a well-established proxy for endorphin release. In it, participants brace themselves unsupported in a chairlike stance against a wall until their leg muscles burn painfully. The higher the endorphin level, the longer a person should be able to sustain the posture, Dunbar says.

Participants who had watched Stuart: A Life Backwards were able to maintain the Roman chair roughly 18% longer than they had in their initial baseline test, compared with those who had watched the documentaries, the authors report today in the journal Royal Society Open Science. They also found a parallel increase in the volunteers’ sense of social bonding that was not seen in the control group, suggesting that watching the drama—and not the duller BBC shows—had boosted group coherence.

More here.

Wednesday Poem

Then Ay Know My Horse

Then Ay know my horse,
let alive and out of days,
hide now paled, hind legs slow
to drag, lower head to lift,
hoof-split, burred and rough from the dirt.

Strange when Ay speak to him.
Tremble runs under him.
What owned him fills him.

Same horse Ay tamed are you the same?
Mane-tangled, lank, and under brow,
hims eye as from a coal half-burnt
sparked up. Ay pulled my body on-

start, rear, run-
and did not loose but stormed and shaken
held as leaf to stem. Sky could hear
the finding cry Ay made.

by Joan Houlihan
from AY
Tupelo Press, 2014

Tuesday, September 20, 2016

My Father: A Life

Justin Erik Halldór Smith in his own blog:

The dolphin fish (Coryphaena hippurus) has no inner life, so its death can only play out on the surface of its body, in a spectacular display of multicoloured flashes. But where there is cognition, memory, emotion, where there is a man, the light show sometimes happens on the inside, a fireworks display of the soul's contents, transformed and expressed in a way that the nursing staff will dismiss as hallucination, but which is in fact no less true than the life itself.

In the week leading up to Friday, September 2, 2016, I accompanied my father in his transition to death. I came back and he did not. I am not yet old, and was only there to help him across. But I am not yet fully back. I know things now that I did not know before, about him, about us, about the living and the dead, and about the category of being or of mental phantasm (what is the difference, really?) that the country folk call 'ghosts'.

I always knew I would write about him. Though it may seem too soon, too raw, against protocol, to do so is the closest thing to filial piety I have in me. To do so is to honour him, who long ago vested his own dream of writerliness in me. He set up this very website over a decade ago; his final post to Facebook, in mid-August, was a review of my most recent book in The Nation. The hard drive of his laptop, which I have taken into my possession, is filled with fragments of creative writing projects, not least a folder with hundreds of files (including a home-made cover) contributory to a novel, entitled Bananaman, that would have been about the CIA and the United Fruit Company's involvement in various Central American coups d'état, and about the creation of a certain popular peelable monoculture that my father somehow saw as key to understanding his American century. Bananaman will never see the light of day, but I think that at some point my father stopped expecting it would, and that, after some years of intergenerational competition, he could now just kick back, let me do all the work, and beam with paternal pride. So this is a coda to that, a necessary culmination of who each of us was for the other.

More here.

The Strange Second Life of String Theory

K. C. Cole in Quanta:

String theory strutted onto the scene some 30 years ago as perfection itself, a promise of elegant simplicity that would solve knotty problems in fundamental physics — including the notoriously intractable mismatch between Einstein’s smoothly warped space-time and the inherently jittery, quantized bits of stuff that made up everything in it.

It seemed, to paraphrase Michael Faraday, much too wonderful not to be true: Simply replace infinitely small particles with tiny (but finite) vibrating loops of string. The vibrations would sing out quarks, electrons, gluons and photons, as well as their extended families, producing in harmony every ingredient needed to cook up the knowable world. Avoiding the infinitely small meant avoiding a variety of catastrophes. For one, quantum uncertainty couldn’t rip space-time to shreds. At last, it seemed, here was a workable theory of quantum gravity.

Even more beautiful than the story told in words was the elegance of the math behind it, which had the power to make some physicists ecstatic.

To be sure, the theory came with unsettling implications. The strings were too small to be probed by experiment and lived in as many as 11 dimensions of space. These dimensions were folded in on themselves — or “compactified” — into complex origami shapes. No one knew just how the dimensions were compactified — the possibilities for doing so appeared to be endless — but surely some configuration would turn out to be just what was needed to produce familiar forces and particles.

More here.

I Used to Be a Human Being

Andrew Sullivan in New York Magazine:

I was sitting in a large meditation hall in a converted novitiate in central Massachusetts when I reached into my pocket for my iPhone. A woman in the front of the room gamely held a basket in front of her, beaming beneficently, like a priest with a collection plate. I duly surrendered my little device, only to feel a sudden pang of panic on my way back to my seat. If it hadn’t been for everyone staring at me, I might have turned around immediately and asked for it back. But I didn’t. I knew why I’d come here.

A year before, like many addicts, I had sensed a personal crash coming. For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. And at times, as events took over, I’d spend weeks manically grabbing every tiny scrap of a developing story in order to fuse them into a narrative in real time. I was in an unending dialogue with readers who were caviling, praising, booing, correcting. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long.

I was, in other words, a very early adopter of what we might now call living-in-the-web. And as the years went by, I realized I was no longer alone. Facebook soon gave everyone the equivalent of their own blog and their own audience.

More here.

The Unsung Hero Left Out of ‘Sully’

Clive Irving in The Daily Beast:

Fear of flying has no easy antidote. It’s no good telling sufferers that flying is so much safer than driving on the expressway to the airport. That’s statistics, not the reality of the human condition. But there is one feel-good story that may be invoked to calm the fever: the moment in January 2009 when 150 airline passengers suddenly found themselves heading not for their destination, Charlotte, North Carolina, but for the ice-cold Hudson River in New York.

This is, of course, the Miracle on the Hudson. US Airways Flight 1549 had lost power from both engines after they ingested a flock of Canada geese soon after taking off from La Guardia Airport.

Few passengers ever bother to know or even care about the names of the pilots to whom they entrust their lives. In this case none of the passengers (and much of the nation) would ever forget the name of the pilot, Captain Chesley Sullenberger, or just Sully.

Now Sully has rightly been elevated into the pantheon of those American heroes worthy of being played by Tom Hanks. The Clint Eastwood-directed movie, Sully, is a surprise box-office hit. It seems a good moment, then, to finally give credit to a hero so far unsung in this drama.

More here. [Thanks to Victoria Schindler.]

Village Atheists, Village Idiots

Sam Kriss at The Baffler:

Something has gone badly wrong with our atheists. All these self-styled intellectual titans, scientists, and philosophers have fallen horribly ill. Evolutionist faith-flayer Richard Dawkins is a wheeling lunatic, dizzy in his private world of old-fashioned whimsy and bitter neofascism. Superstar astrophysicist and pop-science impresario Neil deGrasse Tyson is catatonic, mumbling in a packed cinema that the lasers wouldn’t make any sound in space, that a spider that big would collapse under its own weight, that everything you see is just images on a screen and none of it is real. Islam-baiting philosopher Sam Harris is paranoid, his flailing hands gesticulating murderously at the spectral Saracen hordes. Free-thinking biologist PZ Myers is psychotic, screeching death from a gently listing hot air balloon. And the late Christopher Hitchens, blinded by his fug of rhetoric, fell headlong into the Euphrates.

Critics have pointed out this clutch of appalling polemical and intellectual failings on a case-by-case basis, as if they all sprang from a randomized array of personal idiosyncrasies. But while one eccentric atheist might be explicable, for all of the world’s self-appointed smartest people to be so utterly deranged suggests some kind of pattern. We need, urgently, a complete theory of what it is about atheism that drives its most prominent high priests mad.

More here.

The Struggle Behind Brexit

Jo Guldi at Boston Review:

This account of Brexit, drawing on the framework of class-consciousness, turns on the rise of a reactionary electorate outside of London. The idea, in short, is that the United Kingdom has witnessed the lumpenproletariat exact uncertain revenge upon the nation’s ruling elite. This narrative more or less parallels Marx’s account of the December 1851 coup in France in The Eighteenth Brumaire of Louis Napoleon. Marx blamed the rise of the dictatorship on the greed and disappointment of the petite bourgeoisie, who revolted against the Second Republic and the interest of the workers. This betrayal, Marx argued, precipitated an era of rule by political moron, encapsulated in the premiership of Louis-Napoléon Bonaparte (figured as a template for Boris Johnson by some and for Jeremy Corbyn by others), whom Marx memorably dubbed a “grotesque mediocrity.” Leaders such as these, several commentators have implied, are a parody of the great leadership demanded by the moment.

A closely related understanding of Brexit can be found in the accounts of political scientists who theorize a connection between class resentment and the cause of participatory democracy. Mark Blyth, for example, has argued that Brexit typifies a global moment of participatory rebellion against the structures of expert rule, and Richard Tuck avers that the Left must embrace Brexit if the EU elite is to be replaced by a participatory process.

More here.

The Ruins of Palmyra

Ingrid D. Rowland at the NY Review of Books:

In September 2015, the Getty Research Institute in Los Angeles acquired the first photographs ever taken of Palmyra, the great trading oasis in the heart of the Syrian desert. Louis Vignes was a young lieutenant in the French navy whose interest in photography earned him a place on a scientific expedition to the Dead Sea region in 1863. The twenty-nine photographs he made of Palmyra during his visit in 1864 (including two panoramic shots) were finally printed in Paris by the pioneering photographer Charles Nègre (who had taught Vignes) between 1865 and 1867.

With a history that extends back nearly four thousand years, Palmyra has risen and fallen many times. Its original name was Tadmor, which probably meant “palm tree,” an indication of the site’s renowned fertility. Both the Bible and local legend credit the city’s foundation to King Solomon in the tenth century BCE, but in fact it is already mentioned in Mesopotamian texts a millennium earlier. A spring and a wadi, or dry river bed, provided the settlement with water, making it a welcome stop for travelers and traders on the road between Central Asia and the Mediterranean Sea.

The population, almost from the outset, was a mixture of Semitic peoples from surrounding areas: Amorites, Aramaeans, Arabs, and Jews, who developed a distinctive Palmyrene language and a distinctive script expressive of their distinctive cosmopolitan culture, which drew from Persia, Greece, and Rome as well as local tradition.

More here.

How cats conquered the world (and a few Viking ships)

Ewen Callaway in Nature:

Researchers know little about cat domestication, and there is active debate over whether the house cat (Felis silvestris) is truly a domestic animal — that is, whether its behaviour and anatomy are clearly distinct from those of wild relatives. “We don’t know the history of ancient cats. We do not know their origin, we don't know how their dispersal occurred,” says Eva-Maria Geigl, an evolutionary geneticist at the Institut Jacques Monod in Paris. She presented the study at the 7th International Symposium on Biomolecular Archaeology in Oxford, UK, along with colleagues Claudio Ottoni and Thierry Grange.

A 9,500-year-old human burial from Cyprus also contained the remains of a cat [1]. This suggests that the affiliation between people and felines dates at least as far back as the dawn of agriculture, which occurred in the nearby Fertile Crescent beginning around 12,000 years ago. Ancient Egyptians may have tamed wild cats some 6,000 years ago [2], and under later Egyptian dynasties, cats were mummified by the million. One of the few previous studies [3] of ancient-cat genetics involved mitochondrial DNA (which, contrary to most nuclear DNA, is inherited through the maternal line only) for just three mummified Egyptian cats.

…Cat populations seem to have grown in two waves, the authors found. Middle Eastern wild cats with a particular mitochondrial lineage expanded with early farming communities to the eastern Mediterranean. Geigl suggests that grain stockpiles associated with these early farming communities attracted rodents, which in turn drew wild cats. After seeing the benefit of having cats around, humans might have begun to tame these cats. Thousands of years later, cats descended from those in Egypt spread rapidly around Eurasia and Africa.

More here.

Tuesday Poem

Settling for the Night

It’s a custom with my youngest
to sprinkle “sleeping dust”
over his eyes
before closing them,
combing the sleep down through his hair
and tenderly over his forehead

good night, Dad,
good night…

I listen to our children breathing the night,
their tiny heads under the covers
gone somewhere where we cannot follow
but at least they will return;
the stars of some eternal night
speckle their hair
and their faces are like clocks
in the bedroom twilight.

Their morning is afternoon to us;
their afternoon will see us settled for the night;
some quiet Sunday perhaps
the sun through the blinds will raise
its black ladder on my bedroom wall
and the child fists
will have become adult hands
that will sprinkle the sleeping dust
over my closed eyes
before combing it down
through my peppered grey hair…

Good night, Dad,
good night…

by Ifor ap Glyn
from Cerddi Map yr Underground
publisher: Gwasg Carreg Gwalch, Llanrwst, 2001

Monday, September 19, 2016

Sunday, September 18, 2016

When Analogies Fail

Alexander Stern in the Chronicle of Higher Education:

An analogy is, according to Webster’s, “a comparison of two things based on their being alike in some way.” The definition seems to capture exactly what Simmons, a sports commentator, and Dowd, a New York Times columnist, are doing in the sentences above: comparing two things and explaining how they’re alike. Being a dictionary, however, Webster’s has little to say about why we use analogies, where they come from, or what role they really play in human life.

Analogies need not, of course, all have the same aim. They’re used in different contexts to varying effect. Still, it is evident that we use analogies for mainly rhetorical reasons: to shed light, to explain, to reveal a new aspect of something, to draw out an unseen affinity, to drive home a point. As Wittgenstein wrote, “A good simile refreshes the mind.”

This, Simmons’s and Dowd’s analogies demonstrably fail to do. Our understanding of Trump is unlikely to benefit from an attentive viewing of Species. The careers of the basketball player Robert Horry and the actor Philip Baker Hall, admirable though they may be, leave Australia similarly unilluminated. This kind of analogy — which often consists of an ostensibly funny pop-culture reference or of objects between which certain equivalences can be drawn (x is the y of z’s) — has become increasingly common.

You also find it in academic writing. For example, from the journal Cultural Critique: “Attempting to define multiculturalism is like trying to pick up a jellyfish — you can do it, but the translucent, free-floating entity turns almost instantly into an unwieldy blob of amorphous abstraction.” The analogy aims not to enlighten, but to enliven, adorn, divert.

Of course there’s nothing wrong with this, as far as it goes, but its increasing prominence reflects more general changes in the way we relate to the world around us.

More here.

Every cognitive bias exists for a reason—primarily to save our brains time or energy

Buster Benson in Quartz:

I’ve spent many years referencing Wikipedia’s list of cognitive biases whenever I have a hunch that a certain type of thinking is an official bias but I can’t recall the name or details. But despite trying to absorb the information of this page many times over the years, very little of it seems to stick.

I decided to try to more deeply absorb and understand this list by coming up with a simpler, clearer organizing structure. If you look at these biases according to the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.

Four problems that biases help us address: Information overload, lack of meaning, the need to act fast, and how to know what needs to be remembered for later.

Problem 1: Too much information

There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely going to be useful in some way.
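Benson's four-way grouping lends itself to a simple lookup structure. The sketch below is only illustrative: the bucket names follow the excerpt's wording, but the particular biases placed in each bucket are my own rough picks from Wikipedia's list, not assignments made in the article.

```python
# Illustrative only: Benson's four problem buckets as a lookup table
# mapping each problem to a few well-known biases. The specific
# bias-to-bucket assignments here are my own rough examples.

PROBLEM_BUCKETS = {
    "information overload": [
        "availability heuristic",
        "anchoring",
        "confirmation bias",
    ],
    "lack of meaning": [
        "clustering illusion",
        "halo effect",
    ],
    "need to act fast": [
        "overconfidence effect",
        "sunk cost fallacy",
    ],
    "what to remember": [
        "peak-end rule",
        "serial-position effect",
    ],
}

def bucket_for(bias):
    """Return the problem a named bias helps address, or None if unlisted."""
    for problem, biases in PROBLEM_BUCKETS.items():
        if bias in biases:
            return problem
    return None

print(bucket_for("anchoring"))  # prints: information overload
```

Organizing the list this way makes Benson's point concrete: each bias stops looking like an isolated quirk and instead reads as one of several strategies for the same underlying problem.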

More here.

On the Celibate Love Affair of Nora Ephron and Mike Nichols

Richard Cohen in Literary Hub:

“Marlon Brando’s gay, everybody knows that.”

Nora said that one night in my house in Washington. I can’t remember how Brando’s name came up, but there it was, this startling (at the time) piece of information, so inside, so unknown to the general public, who considered him—fools that they were—a womanizer of great repute. I can remember exactly where I was at the time. In the living room. Standing in front of the sofa and to the right. The remark hit with the force of a dumdum bullet. Marlon Brando’s gay? Who knew?

Everybody, it turned out. Everybody knew. And whether they did or they didn’t, whether it was true or not, was totally beside the point. When Nora said one of these things—and she said them quite often—she did not do so with any sort of tentativeness, with hesitation, with the suggestion that this might be the rawest gossip and possibly wrong, but with a firmness and robust confidence that transformed the gossamer of hearsay into something chiseled into the frieze of a Greek temple. It was beyond dispute. Behold what she knew and behold what you didn’t. You knew some things. She knew everything.

More here.

Don’t believe the rumours: Universal Grammar is alive and well

Dan Milway writes:

As I write this I am sitting in the Linguistics Department lounge at the University of Toronto. Grad students and Post-doctoral researchers are working, chatting, making coffee. Faculty members pop in every now and then, taking breaks from their work.

It’s a vibrant department, full of researchers with varied skills and interests. There are those who just got back into town from their summer fieldwork, excited to dig into the new language data from indigenous Canadian, Amazonian, or Australian languages. There are those struggling to find a way to explain the behaviour of some set of prefixes or verbs in Turkish, or Spanish, or Niuean. There are those designing and running experiments to test what young children know about the languages they are acquiring. There are those sifting through massive databases of speech from rural farmers, or lyrics of local hip-hop artists, or the emails of Enron employees, to hopefully gain some understanding of how English varies and changes. And there are those who spend their time thinking about our theories of language and how they might be integrated with theories of Psychology, Neurology, and Biology.

What unites these disparate research agendas is that they are all grounded in the hypothesis, generally attributed to Noam Chomsky, that the human mind contains innate structures, a Universal Grammar, that allows us to acquire, comprehend, and use language.

According to a recent article in Scientific American, however, the community I just described doesn’t exist, and maybe couldn’t possibly exist in linguistics today, because the kind of work that I just described has long since shown the Universal Grammar hypothesis (UG) to be flat-out wrong. But such a community does exist, and not just here at UofT, or in Chomsky’s own department at MIT, but in Berkeley and Manhattan, in Newfoundland and Vancouver, in Norway and Tokyo. Communities that collectively groan whenever someone sounds the death knell of the UG hypothesis or the enterprise of Generative Linguistics it spawned. We groan, not because we’ve been exposed for the frauds or fools that these pieces say we are, but because we are always misrepresented in them. Sometimes the misrepresentation is laughable, but more often it’s damn frustrating.

More here.

Once we blamed Yoko Ono. Now we blame refugees

Shazia Mirza in New Statesman:

Hate and lies are all the rage. Everyone’s at it. “Obama is the founder of ISIS!” and everyone believes the Trump. “We’re at breaking point” – look, here’s a poster of lots of brown men that look just like your dad, cousin and brother. If you vote out, all these scavengers that come over here for the good life, they’ll be gone in the morning.

Hate is fashionable. It’s flourishing in the comments section of the Daily Mail, Facebook, and this morning I saw some photographs on Twitter of Madonna’s cellulite with the comment: “I thought grandma had died of AIDS.”

When people are unhappy, discontent and disillusioned with their own life, they want someone to blame. Once we blamed Yoko Ono. Now we blame refugees. They caused Brexit; they are destroying the NHS, housing, transport, and education. They can’t seem to do any good or make any valuable contribution.

Well, some people might be surprised to hear that Syrian refugees are not coming to Britain for the food, weather and £65.45 a week, which couldn’t even get you a night out at the cinema. They’re not coming because they want to find Harry Potter, drink tea and watch drunk people rolling down every high street looking for their teeth on a Friday night. No. These people want to live. When I was a teacher, I taught in some very challenging schools where a lot of the children had difficult and unstable parents. But no matter how awful their parents, the child always wanted to remain with them. They would rather be with their own volatile parent than with a kind, caring stranger. Refugees would love to remain in their homeland. The place they know and where their family life has been. But they are forced to leave out of desperation.

More here.

What It Feels Like to Die

Jennie Dear in The Atlantic:

For many dying people, “the brain does the same thing that the body does in that it starts to sacrifice areas which are less critical to survival,” says David Hovda, director of the UCLA Brain Injury Research Center. He compares the breakdown to what happens in aging: People tend to lose their abilities for complex or executive planning, learning motor skills—and, in what turns out to be a very important function, inhibition. “As the brain begins to change and start to die, different parts become excited, and one of the parts that becomes excited is the visual system,” Hovda explains. “And so that’s where people begin to see light.”

Recent research points to evidence that the sharpening of the senses some people report also seems to match what we know about the brain’s response to dying. Jimo Borjigin, a neuroscientist at the University of Michigan, first became intrigued by this subject when she noticed something strange in the brains of animals in another experiment: Just before the animals died, neurochemicals in the brain suddenly surged. While scientists had known that brain neurons continued to fire after a person died, this was different. The neurons were secreting new chemicals, and in large amounts. “A lot of cardiac-arrest survivors describe that during their unconscious period, they have this amazing experience in their brain,” she says. “They see lights and then they describe the experience as ‘realer than real.’” She realized the sudden release of neurochemicals might help to explain this feeling.

Borjigin and her research team tried an experiment. They anesthetized eight rats, and then stopped their hearts. “Suddenly, all the different regions of the brain became synchronized,” she says. The rats’ brains showed higher power in different frequency waves, and also what is known as coherence—the electrical activity from different parts of the brain working together.

More here.