Friday, May 29, 2015

Academic debate has huge implications for the future of world peace


Zack Beauchamp in Vox:

A key part of Pinker's work is the notion of the “long peace” — an idea that Pinker actually borrows from a historian, John Lewis Gaddis. It refers to the fact that in the past 70 years, wars between great powers have basically gone away. Because situations like the Cold War never escalated to direct conflict, we've managed to avoid the type of warfare that devastated societies in the early 20th century and, indeed, much of human history.

If the causes of that are, as Pinker suggested in a lecture, “the pacifying forces” of “democracy, trade, and international society,” then we should expect this trend to continue. So long as we continue to maintain the trends of the world we live in, including growing international trade, strengthening of international institutions like the UN, and strong diplomatic ties between democratic states, then we might actually be able to keep making the world a better place.

Enter NYU professor Nassim Nicholas Taleb, who is best known as the author of The Black Swan, a book on rare events. He thinks all of this is starry-eyed nonsense. In his opinion, proponents of the “war is declining” argument are over-interpreting evidence of a good trend in the same way people used to argue that the stock market could go up forever without crashes. He wrote a stinging critique of Pinker's work, which Pinker replied to, and then Taleb replied to again.

Taleb's new paper, co-authored with Delft University's Pasquale Cirillo, is the latest volley in that ongoing intellectual war. It's probably the most statistically sophisticated argument to date that war isn't declining — and that we're still every bit as much at risk of a major conflict as we always were.
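
As an illustrative aside (not from either paper): a toy simulation, assuming the waiting times between great-power wars follow a heavy-tailed Pareto distribution, shows why a seventy-year lull is weak evidence on its own. The tail exponent and scale below are invented for illustration, not estimates from Cirillo and Taleb.

```python
import random

random.seed(0)

# Toy illustration: waiting times between great-power wars drawn from a
# heavy-tailed Pareto distribution. Even with no change in the underlying
# process, multi-decade "peaces" show up by chance.
ALPHA = 1.5        # assumed tail exponent (heavier tail = rarer, longer gaps)
SCALE = 5.0        # assumed minimum gap between such wars, in years
HORIZON = 70       # length of the "long peace" we are asking about
TRIALS = 100_000

def random_gap():
    """One simulated waiting time (years) between great-power wars."""
    return SCALE * random.paretovariate(ALPHA)

long_gaps = sum(random_gap() >= HORIZON for _ in range(TRIALS))
print(f"share of gaps lasting {HORIZON}+ years: {long_gaps / TRIALS:.3f}")
```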

More here.

I’m a black ex-cop, and this is the real truth about race and policing


Redditt Hudson in Vox:

On any given day, in any police department in the nation, 15 percent of officers will do the right thing no matter what is happening. Fifteen percent of officers will abuse their authority at every opportunity. The remaining 70 percent could go either way depending on whom they are working with.

That's a theory from my friend K.L. Williams, who has trained thousands of officers around the country in use of force. Based on what I experienced as a black man serving in the St. Louis Police Department for five years, I agree with him. I worked with men and women who became cops for all the right reasons — they really wanted to help make their communities better. And I worked with people like the president of my police academy class, who sent out an email after President Obama won the 2008 election that included the statement, “I can't believe I live in a country full of ni**er lovers!!!!!!!!” He patrolled the streets in St. Louis in a number of black communities with the authority to act under the color of law.

That remaining 70 percent of officers are highly susceptible to the culture in a given department. In the absence of any real effort to challenge department cultures, they become part of the problem. If their command ranks are racist or allow institutional racism to persist, or if a number of officers in their department are racist, they may end up doing terrible things.

It is not only white officers who abuse their authority. The effect of institutional racism is such that no matter what color the officer abusing the citizen is, in the vast majority of those cases of abuse that citizen will be black or brown. That is what is allowed.

And no matter what an officer has done to a black person, that officer can always cover himself in the running narrative of heroism, risk, and sacrifice that is available to a uniformed police officer by virtue of simply reporting for duty. Cleveland police officer Michael Brelo was recently acquitted of all charges against him in the shooting deaths of Timothy Russell and Malissa Williams, both black and unarmed. Thirteen Cleveland police officers fired 137 shots at them. Brelo, having reloaded at some point during the shooting, fired 49 of the 137 shots. He took his final 15 shots at them after all the other officers stopped firing (122 shots at that point) and, “fearing for his life,” he jumped onto the hood of the car and shot 15 times through the windshield.

More here.

My Title IX Inquisition


Laura Kipnis on the backlash to her earlier piece, in The Chronicle of Higher Education (Chronicle Review illustration by Scott Seymour):

When I first heard that students at my university had staged a protest over an essay I’d written in The Chronicle Review about sexual politics on campus — and that they were carrying mattresses and pillows — I was a bit nonplussed. For one thing, mattresses had become a symbol of student-on-student sexual-assault allegations, and I’d been writing about the new consensual-relations codes governing professor-student dating. Also, I’d been writing as a feminist. And I hadn’t sexually assaulted anyone. The whole thing seemed symbolically incoherent.

According to our campus newspaper, the mattress-carriers were marching to the university president’s office with a petition demanding “a swift, official condemnation” of my article. One student said she’d had a “very visceral reaction” to the essay; another called it “terrifying.” I’d argued that the new codes infantilized students while vastly increasing the power of university administrators over all our lives, and here were students demanding to be protected by university higher-ups from the affront of someone’s ideas, which seemed to prove my point.

The president announced that he’d consider the petition.

Still, I assumed that academic freedom would prevail. I also sensed the students weren’t going to come off well in the court of public opinion, which proved to be the case; mocking tweets were soon pouring in. Marching against a published article wasn’t a good optic — it smacked of book burning, something Americans generally oppose. Indeed, I was getting a lot of love on social media from all ends of the political spectrum, though one of the anti-PC brigade did suggest that, as a leftist, I should realize these students were my own evil spawn. (Yes, I was spending a lot more time online than I should have.)

Being protested had its gratifying side — I soon realized that my writer friends were jealous that I’d gotten marched on and they hadn’t. I found myself shamelessly dropping it into conversation whenever possible. “Oh, students are marching against this thing I wrote,” I’d grimace, in response to anyone’s “How are you?” I briefly fantasized about running for the board of PEN, the international writers’ organization devoted to protecting free expression.

Things seemed less amusing when I received an email from my university’s Title IX coordinator informing me that two students had filed Title IX complaints against me on the basis of the essay and “subsequent public statements” (which turned out to be a tweet), and that the university would retain an outside investigator to handle the complaints.

I stared at the email, which was under-explanatory in the extreme. I was being charged with retaliation, it said, though it failed to explain how an essay that mentioned no one by name could be construed as retaliatory, or how a publication fell under the province of Title IX, which, as I understood it, dealt with sexual misconduct and gender discrimination.

More here.

The Mystery of Kangaroo Adoptions

Carl Zimmer in his excellent blog, The Loom:

Scientists have observed adoption occurring in 120 species of mammals. Other species that are harder to study may be adopting, too. As for kangaroos, scientists have long known that if they put a joey in an unrelated female’s pouch, she will sometimes keep it. But King and her colleagues have now discovered that kangaroos will voluntarily adopt joeys in the wild. All told, they found that 11 of the 326 juveniles were adopted over their five-year study–a rate of about three percent. Given the commitment adoption demands from a mammal mother–a kangaroo mother needs a full year to raise a single joey to weaning–this discovery cries out for an explanation.

Over the years, researchers have proposed a number of different explanations for adoption. Some have suggested that mammals adopt young offspring of their relatives because they are genetically similar. By rearing the offspring of their kin, this argument goes, adoptive parents can ensure that some of their own genes get passed down to future generations.

According to another explanation, unrelated adults may adopt each other’s young because this kind of quid-pro-quo benefits everyone involved. And according to a third explanation, young adults adopt orphaned juveniles as a kind of apprenticeship. They learn some important lessons about how to raise young animals, which they can apply later to raising their own offspring.

These explanations share something in common. They all take adoption to have an evolutionary benefit. In the long run, the genes that make animals willing to adopt become more common thanks to natural selection.

But in the case of kangaroos–and perhaps other species, too–evolution may instead have made a mess of things. Adoption may not be an adaptation. It may be a maladaptation.

More here.

‘Primates of Park Avenue: A Memoir,’ by Wednesday Martin

Vanessa Grigoriadis in the New York Times:

A few pages into “Primates of Park Avenue,” I raised an eyebrow as high as a McDonald’s arch. Was Wednesday Martin, a ­Midwestern-born Ph.D., trying to explain the rites of the Upper East Side to me, an autochthonous Manhattanite schooled at one of the neighborhood’s top “learning huts”? She was a late transfer to the New York troop — a particularly vicious troop, at that — and it’s a weak position to be in throughout the primate kingdom, whether human or monkey.

I underestimated Martin with few repercussions, but the SoulCycled, estrogen-dimmed and ravenously hungry young mothers who similarly exhibited New York’s inbred superciliousness have done so at their peril, because now she’s gone and told the world their tricks. “I was afraid to write this book,” Martin confesses, but I guess she got over it. Instead, she obsessively deconstructs the ways of her new tribe, from the obvious — “No one was fat. No one was ugly. No one was poor. Everyone was drinking” — to the equally obvious but narratively rich: “It is a game among a certain set to incite the envy of other women.”

The result is an amusing, perceptive and, at times, thrillingly evil takedown of upper-class culture by an outsider with a front-row seat. The price of the ticket, a newly purchased Park Avenue condop in the 70s with a closet designated exclusively for her handbags, wisely goes unmentioned, the better to establish rapport with readers in Des Moines.

More here.

Addy Walker, American Girl

Brit Bennett in Paris Review:

In 1864, a nine-year-old slave girl was punished for daydreaming. Distracted by rumors that her brother and father would be sold, she failed to remove worms from the tobacco leaves she was picking. The overseer didn’t whip her. Instead, he pried her mouth open, stuffed a worm inside, and forced her to eat it. This girl is not real. Her name is Addy Walker; she is an American Girl doll, one of eight historical dolls produced by the Pleasant Company who arrive with dresses, accessories, and a series of books about their lives. Of all the harrowing scenes I’ve encountered in slave narratives, I remember this scene from Meet Addy, her origin story, most vividly. How the worm—green, fat, and juicy—burst inside Addy’s mouth. At eight years old, I understood that slavery was cruel—I knew about hard labor and whippings—but the idea of a little girl being forced to eat a worm stunned me. I did not yet understand that violence is an art. There’s creativity to cruelty. What did I know of its boundaries and edges?

An American Girl store is designed like a little girl’s fantasyland, or what the Pleasant Company, owned by Mattel, imagines that to be. Pink glows from the walls; yellow shelves hold delicate dolls in display cases. Nurses tend to a hospital for defunct toys and a café hosts tea parties for girls and their dolls. The company has retired many of the historical American Girls from my childhood—the colonist Felicity, the frontierswoman Kirsten, and the World War II–era Molly, all among the original set of dolls, released in 1986—but Addy remains. Against the store’s backdrop of pink tea parties, her story seems even more harrowing. Addy escapes to the north with her mother, forced to leave her baby sister behind because her cries might alert slave-catchers. In Philadelphia, Addy struggles to adjust and dreams of her family reuniting. They do, it turns out, find each other eventually—a near impossibility for an actual enslaved family—but at no small cost. Her brother loses an arm fighting in the Civil War. Her surrogate grandparents die on the plantation before she can say goodbye. Other American Girls struggle, but Addy’s story is distinctly more traumatic. For seventeen years, Addy was the only black historical doll; she was the only nonwhite doll until 1998. If you were a white girl who wanted a historical doll who looked like you, you could imagine yourself in Samantha’s Victorian home or with Kirsten, weathering life on the prairie. If you were a black girl, you could only picture yourself as a runaway slave.

More here.

Lust and the Turing test

Christof Koch in Nature:

By and large, we watch movies to be entertained, not to be provoked into deep thought. Occasionally, a film does both. This year’s Ex Machina is one such gem. It prompted me to reflect upon the evolution of the idea of machine sentience over the past three decades of science fiction on film. I am a long-time student of the mind-body problem — how consciousness arises from the brain. There is a conundrum at the heart of this ancient dilemma, challenging both brain science and AI; and it is well captured by Ex Machina and two other SF movies. In essence, it lies in how we can ever be certain that a machine feels anything, is conscious.

…Enter Ex Machina, directed by Alex Garland. This intelligent and thoughtful mix of psycho-drama and SF thriller centers on a strange ménage à trois. Ava is a beauty with a difference (a phenomenal performance by Alicia Vikander); Caleb is a nerdy young programmer (Domhnall Gleeson); Nathan is a beastly, brilliant inventor and immensely rich tech-entrepreneur (Oscar Isaac). Caleb is selected by Nathan, a recluse, to spend a week at his live-in Arctic laboratory. He introduces Caleb to Ava, an advanced cyborg whose semi-transparent skull and body reveal inner workings, including a brain that is quasi-organic in some unspecified way. It’s a twist on Blade Runner: if Caleb interacts with Ava as he would with an alluring woman – while seeing clearly that she is not flesh and blood – that would testify to Ava’s ability to convince him she has real feelings. Ava and Caleb hit it off at first sight. Unlike Her, Ex Machina soon becomes a game of smoke and mirrors. Ava hints to Caleb that she doubts Nathan’s purely scientific motives; there are bizarre scenes such as Nathan doing a synchronized dance routine with a mute servant. Nathan’s lab becomes Bluebeard’s Castle, complete with locked rooms and heavy psychosexual undertones. Ex Machina’s ending, invoking the trope of the femme fatale, is logical, surprising and darker than Blade Runner’s. All three films showcase how the psychology of desire can be exploited to forge a powerful empathic response in their protagonists, sweeping away doubts about the object of their longing having sentience. It’s a Turing test based on lust, each movie an excursion into human social psychology and the attendant gender power politics.

More here.

Manhood: Badly educated men in rich countries have not adapted well to trade, technology or feminism

In The Economist:

For those at the top, James Brown’s observation that it is a man’s, man’s, man’s world still holds true. Some 95% of Fortune 500 CEOs are male, as are 98% of the self-made billionaires on the Forbes rich list and 93% of the world’s heads of government. In popular films fewer than a third of the characters who speak are women, and more than three-quarters of the protagonists are men. Yet the fact that the highest rungs have male feet all over them is scant comfort for the men at the bottom.

Technology and trade mean that rich countries have less use than they once did for workers who mainly offer muscle. A mechanical digger can replace dozens of men with spades; a Chinese steelworker is cheaper than an American. Men still dominate risky occupations such as roofer and taxi-driver, and jobs that require long stints away from home, such as trucker and oil-rig worker. And, other things being equal, dirty, dangerous and inconvenient jobs pay better than safe, clean ones. But the real money is in brain work, and here many men are lagging behind. Women outnumber them on university campuses in every region bar South Asia and sub-Saharan Africa. In the OECD men earn only 42% of degrees. Teenage boys in rich countries are 50% more likely than girls to flunk all three basic subjects in school: maths, reading and science.

The economic marginalisation this brings erodes family life. Women who enjoy much greater economic autonomy than their grandmothers did can afford to be correspondingly pickier about spouses, and they are not thrilled by husbands who are just another mouth to feed.

If the sort of labour that a man like Mr Redden might willingly perform with diligence and pride is no longer in great demand, that does not mean there are no jobs at all. Everywhere you look in Tallulah there are women working: in the motels that cater to passing truckers, in the restaurants that serve all-you-can-eat catfish buffets, in shops, clinics and local government offices. But though unskilled men might do some of those jobs, they are unlikely to want them or to be picked for them.

Read the rest here.

Friday Poem

Different Rose
.
I gave birth to an incredibly beautiful daughter, her teeth,
her hair as though from the Song of Songs. And I
felt beautiful myself, thank you. Whereas she –
that's a completely different beauty,
that's beauty I want to protect.
If I had some sort of beauty I'd blush,
anyhow I probably do have some, guys
wouldn't chase after me as much if I didn't,
but I don't like my beauty, because guys
chase after it. My daughter’s beauty
is something else. My daughter’s beauty, I believe,
is the only hope
for this world.
.

by Justyna Bargielska
from Bach for my baby
publisher: Biuro Literackie, Wrocław, 2013
translation: Maria Jastrzębska
from: Versopolis.com, 2015

Read more »

Thursday, May 28, 2015

What scared Hitchcock?

David Trotter at the London Review of Books:

Even the biographers, watching the life ‘start at zero’, have struggled to establish where the motivation for the inventiveness came from. The most popular hypothesis, not least because Hitchcock himself promoted it so vigorously, concerns timidity. ‘The man who excels at filming fear is himself a very fearful person,’ Truffaut observed, ‘and I suspect that this trait of his personality has a direct bearing on his success.’ The most substantial biography to date, by Patrick McGilligan, includes plenty of anecdotes about fear, but supplies little by way of evidence of its ultimate cause, and draws no conclusions. Peter Ackroyd, however, is firmly of the Truffaut school. His Hitchcock trembles from the outset: ‘Fear fell upon him in early life.’ At the age of four (or 11, or …), his father had him locked up for a few minutes in a police cell, an episode that became, as Michael Wood puts it, the ‘myth of origin’ for his powerful distrust of authority. Ackroyd rummages dutifully for further evidence. Was young Alfred beaten at school by a ‘black-robed Jesuit’? Or caught out in the open when the Zeppelins raided London in 1915? Did he read too much Edgar Allan Poe? It doesn’t really add up to very much. And yet – or therefore – the strong conviction persists. Fear is the key; and not just to the life. Interview the films, he once told an inquisitive journalist. Those who have interviewed the films often conclude that, like their creator, they too tremble. ‘Hitchcock was a frightened man,’ Wood writes, ‘who got his fears to work for him on film.’

more here.

Eileen Chang’s ‘Naked Earth’

Stefany Anne Golberg at The Smart Set:

Eileen Chang started writing early, completing her first novel at the age of 12. By the time the Communist government came to power, Eileen Chang was a well-known writer in China (now considered by many to be China’s first modernist). Dubious about her role in this new society, however, Chang chose self-imposed exile. She moved to Hong Kong, then Japan, then back to Hong Kong, and eventually to Los Angeles, where she died alone in her apartment in 1995. Chang never again returned to mainland China. In Hong Kong, America became Eileen Chang’s patron. For three years, she worked as a translator for the United States Information Service. Then the USIS hired Chang to write anti-Communist propaganda in the form of two novels: The Rice Sprout Song and Naked Earth. Wanting propaganda, the Information Service encouraged Chang to be unsparing in her depiction of China’s confessional spectacles. And so she was.

In another early scene from Naked Earth, a Mass Meeting is held in the vacant lot in front of an ancestral temple. Tang, a Middling Farmer (neither unfortunately rich nor blessedly poor), is brought to a platform for his session. Schoolchildren wave paper flags and sing loudly. They are accompanied by militiamen, members of the Farm Workers Association, the Women’s Association, the Youth Vanguard Corps.

more here.

New poems from John Ashbery

Dan Chiasson at The New Yorker:

John Ashbery’s latest book of poems—his twenty-sixth, not counting various compilations and re-issues—is “Breezeway” (Ecco). As with most of Ashbery’s work, its medium is composed partly of language foraged from everyday American speech. The effect is sometimes unnerving, as though somebody had given you your own garbage back as a gift, cheerfully wrapped. Ashbery is nearly eighty-eight; more than ever, his style is a net for the weirdest linguistic flotsam. Few others of his generation would think to put “lemon telenovela” or “texasburger” in a poem, or write these lines: “Thanks / to a snakeskin toupee, my grayish push boots / exhale new patina / prestige. Exeunt the Kardashians.” He has gone farther from literature within literature than any poet alive. His game is to make an intentionally frivolous style express the full range of human feeling, and he remains funnier and better at it, a game he invented, than his many imitators.

It’s common for people to prefer a prior Ashbery, though few can agree on which one. There is the noncompliant poet of “The Tennis Court Oath,” his 1962 book, giddy in his defiance of meaning; the poet of childhood and its longueurs whom we encounter in his seven-hundred-and-thirty-nine-line poem, “The Skaters” (1966); the sublime meditative poet of “Self-Portrait in a Convex Mirror” (1975); the elegist of “Your Name Here” (2000).

more here.

Fitzgerald’s greatest novel: It’s not the one everyone thinks it is

Taki in Spectator:

Scott Fitzgerald was famously obsessed with the mysteries of great wealth, but back then wealth was something new among Americans. Poor old Scott wrote more about the ruinous effects of wealth, which is a very large theme even today. I recently read a couple of articles on Fitzgerald, one claiming that he wrote Gatsby in Great Neck, Long Island, where the action takes place, the other that he wrote the greatest of American novels in Antibes. I believe both writers are correct. Fitzgerald started the novel in Long Island and finished it in Antibes. Detective Taki solves the riddle in one short declarative sentence.

Scott and Zelda’s two granddaughters, their mother being Scottie, the couple’s only issue, are very much with us and recently visited Juan-les-Pins and the hotel that was once the home of their grandparents. The cruel irony is that Scott died broke and forgotten, and his granddaughters are very rich because of his immortal work. Although The Great Gatsby is considered Scott’s greatest work, the greatest literary critic of our time, Taki, thinks otherwise. He gives the nod to Tender Is the Night. When Fitzgerald showed Papa the manuscript of Gatsby, Hemingway did not brood, he sprang into action. He came up with The Sun Also Rises, not a bad response. It was almost America versus Europe. Great stuff. Which brings me to Hollywood. The best that degenerate place has managed in filming either writer’s works was — in my not so humble opinion — The Short Happy Life of Francis Macomber, Papa’s short story about grace under pressure.

More here.

Don’t Overthink It, Less Is More When It Comes To Creativity

Jessica Schmerler in Scientific American:

Most of us have experienced writer’s block at some point, sitting down to write, paint or compose only to find we can’t get the creative juices flowing. Most frustrating of all, the more effort and thought we put into it, the harder it may become. Now, at least, neuroscientists might have found a clue about why it is so hard to force that creative spark. Researchers at Stanford University recently set out to explore the neural basis of creativity and came up with surprising findings. Their study, published May 28 in Scientific Reports, suggests the cerebellum, the brain region typically associated with movement, is involved in creativity. If so, the discovery could change our understanding of the neurological mechanisms behind some thought processes.

There is a scientific belief that the cerebral cortex is the part of the brain that “makes us human,” and that the two hemispheres of the cortex differentiate the creative thinkers from the logical thinkers (the “right-brained” from the “left-brained”). This has fostered the view that “neurological processes can be divided into ‘higher’ cognitive functions and ‘lower’ basic sensory-motor functions,” says Robert Barton, an evolutionary biologist at Durham University in England who was not involved in this study—but the latest research calls that understanding into question.

More here.


Warp Drives and Scientific Reasoning


Sean Carroll in Preposterous Universe:

The more recent “news” is not actually about warp drive at all. It’s about propellantless space drives — which are, if anything, even less believable than the warp drives. (There is a whole zoo of nomenclature devoted to categorizing all of the non-existent technologies of this general ilk, which I won’t bother to keep straight.) Warp drives were at least inspired by some respectable science — Miguel Alcubierre’s energy-condition-violating spacetime. The “propellantless” stuff, on the other hand, just says “Laws of physics? Screw ’em.”

You may have heard of a little thing called Newton’s Third Law of Motion — for every action there is an equal and opposite reaction. If you want to go forward, you have to push on something or propel something backwards. The plucky NASA engineers in question aren’t hampered by such musty old ideas. As others have pointed out, what they’re proposing is very much like saying that you can sit in your car and start it moving by pushing on the steering wheel.
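
To make the momentum bookkeeping concrete, here is a minimal back-of-the-envelope sketch; every number in it is invented for illustration.

```python
# Back-of-the-envelope momentum bookkeeping (all numbers invented):
# a closed system that starts at rest must keep total momentum at zero.
craft_mass = 1000.0     # kg, mass of the spacecraft after the burn
exhaust_mass = 10.0     # kg of reaction mass thrown backwards
exhaust_speed = 3000.0  # m/s, exhaust speed in the starting frame

# Forward recoil that keeps the total at zero:
craft_speed = exhaust_mass * exhaust_speed / craft_mass
total = craft_mass * craft_speed - exhaust_mass * exhaust_speed
print(f"craft gains {craft_speed:.1f} m/s; total momentum = {total:.1f} kg*m/s")

# A "propellantless" drive sets exhaust_mass to zero; the only way to keep
# total momentum at zero is then craft_speed = 0 -- no push, no motion.
```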

I’m not going to go through the various claims and attempt to sort out why they’re wrong. I’m not even an engineer! My point is a higher-level one: there is no reason whatsoever why these claims should be given the slightest bit of credence, even by complete non-experts. The fact that so many media outlets (with some happy exceptions) have credulously reported on it is extraordinarily depressing.

Now, this might sound like a shockingly anti-scientific attitude. After all, I certainly haven’t gone through the experimental results carefully. And it’s a bedrock principle of science that all of our theories are fundamentally up for grabs if we collect reliable evidence against them — even one so well-established as conservation of momentum. So isn’t the proper scientific attitude to take a careful look at the data, and wait until more conclusive experiments have been done before passing judgment? (And in the meantime make some artist’s impressions of what our eventual spaceships might look like?)

No. That is not the proper scientific attitude. For a very scientific reason: life is too short.

More here.

Thursday Poem

Tonight

You are being born. Feels good.
Something enormous kisses you.
Its eye surveys your revolutions.
Relaxed in your new nudity,

you work your labyrinthine ears,
those perfect disciples,
registering all that hums, ticks.
O you encyclopedia you,

you do not know what I know,
how blank the cold world can grow.
But let the addendums come later.
I listen to the dust from the city

gather on the necks of the saints
at the hospital’s exits I exit.
And so I say to you yes you:
everyone’s a fugitive. Everyone.
.

by Spencer Reece
from The Clerk’s Tale

What We Really Need is a Slice of Humble Pie

Daniel Nexon, over at Duck of Minerva:

We now have a lot of different meta-narratives about alleged fraud in “When Contact Changes Minds: An Experiment in the Transmission of Support for Gay Equality.” These reflect not only different dimensions of the story, but the different interests at stake.

One set concerns confirmation bias and the left-leaning orientations of a majority of political scientists. At First Things, for example, Matthew J. Franck contrasts the reception of the LaCour and Green study (positive) with that of Mark Regnerus’ finding of inferior outcomes for children of gay parents (negative). There’s some truth here. Regnerus’ study was terminally flawed. LaCour and Green’s study derived, most likely, from fraudulent data. Still, one comported with widespread ideological priors in the field, while the other did not. That surely shaped their differential reception. But so did the startling strength of the latter’s findings, as well as the way they cut against conventional wisdom on the determinants of successful persuasion.

We might describe another as “science worked.”

This narrative sometimes strays into the triumphalist: rather than exposing problems with the way political science operates, the scandal shows how the discipline is becoming more scientific and thus more able to catch—and correct—flawed studies. Again, there’s something to this. To the extent that political scientists utilize, say, experiments, then that opens up the possibility of creating fraudulent experimental data but also of uncovering such fraud.

More here.

Hive consciousness


Peter Watts in Aeon (Illustration by Richard Wilkinson):

Rajesh Rao (of the University of Washington's Center for Sensorimotor Neural Engineering) reported what appears to be a real Alien Hand Network – and going Pais-Vieira one better, he built it out of people. Someone thinks a command; downstream, someone else responds by pushing a button without conscious intent. Now we're getting somewhere.

There’s a machine in a lab in Berkeley, California, that can read the voxels right off your visual cortex and figure out what you’re looking at based solely on brain activity. One of its creators, Kendrick Kay, suggested back in 2008 that we’d eventually be able to read dreams (also, that we might want to take a closer look at certain privacy issues before that happened). His best guess was that this might happen a few decades down the road – but it took only four years for a computer in a Japanese lab to predict the content of hypnagogic hallucinations (essentially, dreams without REM) at 60 per cent accuracy, based entirely on fMRI data.
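
For readers curious what reading the voxels off a visual cortex involves in practice, here is a toy sketch of the general decoding idea: a simple linear classifier on simulated voxel responses. This is illustrative only, not the Berkeley group's actual encoding-model method, and all sizes, labels, and noise levels are made up.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy voxel-decoding sketch: simulate visual-cortex responses to two image
# categories, then ask a linear classifier to recover the category per trial.
rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 500

labels = rng.integers(0, 2, size=n_trials)   # 0 or 1: two hypothetical image categories
pattern = rng.normal(size=n_voxels)          # category-specific voxel pattern
voxels = 0.3 * np.outer(labels, pattern) + rng.normal(size=(n_trials, n_voxels))  # signal + noise

scores = cross_val_score(LogisticRegression(max_iter=1000), voxels, labels, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")
```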

When Moore’s Law shaves that much time off the predictions of experts, it’s not too early to start wondering about consequences. What are the implications of a technology that seems to be converging on the sharing of consciousness?

It would be a lot easier to answer that question if anyone knew what consciousness is. There’s no shortage of theories. The neuroscientist Giulio Tononi at the University of Wisconsin-Madison claims that consciousness reflects the integration of distributed brain functions. A model developed by Ezequiel Morsella, of San Francisco State University, describes it as a mediator between conflicting motor commands. The panpsychics regard it as a basic property of matter – like charge, or mass – and believe that our brains don’t generate the stuff so much as filter it from the ether like some kind of organic spirit-catchers. Neuroscience superstar V S Ramachandran (University of California in San Diego) blames everything on mirror neurons; Princeton’s Michael Graziano – right here in Aeon – describes it as an experiential map.

I think they’re all running a game on us. Their models – right or wrong – describe computation, not awareness. There’s no great mystery to intelligence; it’s easy to see how natural selection would promote flexible problem-solving, the triage of sensory input, the high-grading of relevant data (aka attention).

But why would any of that be self-aware?

More here.