Christian America is an invention: Big business, right-wing politics and the religious lie that still divides us

Kevin M. Kruse in Salon:

The idea of “one nation under God” is a modern one — and does not date back to the Founding Fathers.

When he ran for the White House, Texas governor George W. Bush took a similarly soft approach, though one that came from the right. A born-again Christian, he shared Bill Clinton’s ability to discuss his faith openly. When Republican primary candidates were asked to name their favorite philosopher in a 1999 debate, for instance, Bush immediately named Christ, “because He changed my heart.” Despite the centrality of faith in his own life, Bush assured voters that he would not implement the rigid agenda of the religious right. Borrowing a phrase from author Marvin Olasky, Bush called himself a “compassionate conservative” and said he would take a lighter approach to social issues including abortion and gay rights than culture warriors such as Pat Buchanan. But many on the right took issue with the phrase. For some, the “compassionate” qualifier implicitly condemned mainstream conservatism as heartless; for others, the phrase seemed an empty marketing gimmick. (As Republican speechwriter David Frum put it, “Love conservatism but hate arguing about abortion? Try our new compassionate conservatism—great ideological taste, now with less controversy.”) But the candidate backed his words with deeds, distancing himself from the ideologues in his party. In a single week in October 1999, for instance, Bush criticized House Republicans for “balancing the budget on the backs of the poor” and lamented that all too often “my party has painted an image of America slouching toward Gomorrah.”

In concrete terms, Bush’s “compassionate conservatism” constituted a promise to empower private religious and community organizations and thereby expand their role in the provision of social services. This “faith-based initiative” became the centerpiece of his campaign. In his address to the 2000 Republican National Convention, Bush heralded the work of Christian charities and called upon the nation to do what it could to support them. After his inauguration, Bush moved swiftly to make the proposal a reality. Indeed, the longest section of his 2001 inaugural address was an expansive reflection on the idea. “America, at its best, is compassionate,” he observed. “Church and charity, synagogue and mosque lend our communities their humanity, and they will have an honored place in our plans and in our laws.” Bush promoted the initiative at his first National Prayer Breakfast as well. But it was ill-fated. Hamstrung by a lack of clear direction during the administration’s first months, it was quickly overshadowed by a new emphasis on national security after the terrorist attacks of 9/11.

More here.



Why Almost Everything Dean Ornish Says about Nutrition Is Wrong

Melinda Wenner Moyer in Scientific American:

Last month, an op-ed in The New York Times argued that high-protein and high-fat diets are to blame for America’s ever-growing waistline and incidence of chronic disease. The author, Dean Ornish, founder of the nonprofit Preventive Medicine Research Institute, is no newcomer to these nutrition debates. For 37 years he has been touting the benefits of very low-fat, high-carbohydrate, vegetarian diets for preventing and reversing heart disease. But the research he cites to back up his op-ed claims is tenuous at best. Nutrition is complex, but there is little evidence our country’s worsening metabolic ills are the fault of protein or fat. If anything, our attempts to eat less fat in recent decades have made things worse.

Ornish begins his piece with a misleading statistic. Despite being told to eat less fat, he says, Americans have been doing the opposite: They have “actually consumed 67 percent more added fat, 39 percent more sugar and 41 percent more meat in 2000 than they had in 1950 and 24.5 percent more calories than they had in 1970.” Yes, Americans have been eating more fat, sugar and meat, but we have also been eating more vegetables and fruits (pdf)—because we have been eating more of everything.

What’s more relevant to the discussion is this fact: During the time in which the prevalence of obesity in the U.S. nearly tripled, the percentage of calories Americans consumed from protein and fat actually dropped, whereas the percentage of calories Americans ingested from carbohydrates—one of the nutrient groups Ornish says we should eat more of—increased. Could it be that our attempts to reduce fat have in fact been part of the problem? Some scientists think so. “I believe the low-fat message promoted the obesity epidemic,” says Lyn Steffen, a nutritional epidemiologist at the University of Minnesota School of Public Health. That’s in part because when we cut out fat, we began eating foods that were worse for us.

Ornish goes on to argue that protein and saturated fat increase the risk of mortality and chronic disease. As evidence for these causal claims, he cites a handful of observational studies. He should know better. These types of studies—which might report that people who eat a lot of animal protein tend to develop higher rates of disease—“only look at association, not causation,” explains Christopher Gardner, a nutrition scientist at the Stanford Prevention Research Center. They should not be used to make claims about cause and effect; doing so is considered by nutrition scientists to be “inappropriate” and “misleading.” The reason: People who eat a lot of animal protein often make other lifestyle choices that increase their disease risk, and although researchers try to make statistical adjustments to control for these “confounding variables,” as they’re called, it’s a very imperfect science. Other large observational studies have found that diets high in fat and protein are not associated with disease and may even protect against it. The point is, it’s possible to cherry-pick observational studies to support almost any nutritional argument.
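
A quick bit of arithmetic makes the percentage-versus-absolute distinction concrete. The figures below are invented round numbers for illustration only, not Ornish's statistics or survey data; they simply show how fat intake can rise in grams while falling as a share of total calories once total calories rise even faster.

```python
# Illustrative arithmetic only: the gram and calorie figures are made up,
# not taken from Ornish's op-ed or from consumption surveys.
# Point: if total calories grow faster than fat calories, absolute fat
# intake can rise while fat's *share* of the diet falls.

def fat_share(total_kcal, fat_grams):
    """Percentage of daily calories coming from fat (9 kcal per gram of fat)."""
    return 100 * (fat_grams * 9) / total_kcal

# Hypothetical "then" vs. "now" daily diets
then_total_kcal, then_fat_g = 2000, 80   # 80 g fat in a 2,000 kcal diet
now_total_kcal,  now_fat_g  = 2700, 90   # more of everything, including fat

print(f"Then: {then_fat_g} g fat = {fat_share(then_total_kcal, then_fat_g):.0f}% of calories")
print(f"Now:  {now_fat_g} g fat = {fat_share(now_total_kcal, now_fat_g):.0f}% of calories")
# Then: 80 g fat = 36% of calories
# Now:  90 g fat = 30% of calories   <- grams up, share down
```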

More here.

Friday Poem

The Child Who Was Shot Dead

The child is not dead
the child raises his fists against his mother
who screams Africa screams the smell
of freedom and heather
in the locations of the heart under siege

The child raises his fists against his father
in the march of the generations
who scream Africa scream the smell
of justice and blood
in the streets of his armed pride

The child is not dead
neither at Langa nor at Nyanga
nor at Orlando nor at Sharpeville
nor at the police station in Philippi
where he lies with a bullet in his head

The child is the shadow of the soldiers
on guard with guns saracens and batons
the child is present at all meetings and legislations
the child peeps through the windows of houses and into the hearts of mothers
the child who just wanted to play in the sun at Nyanga is everywhere
the child who became a man treks through all of Africa
the child who became a giant travels through the whole world

Without a pass

by Ingrid Jonker
from Rook en Oker
publisher: Afrikaanse Pers, Johannesburg, 1963

Thursday, April 23, 2015

How the Truth About Palestine Won Netanyahu the Israeli Election


Omri Boehm in The Boston Review (Image: Wikimedia Commons):

The universal deceit exposed by Netanyahu just before Election Day had two main lies for pillars. First, of course, his own lie, the 2009 “Bar Ilan Speech,” in which the Prime Minister, facing European and American pressure, pretended to renounce everything that he himself and the Likud Party had ever believed in, and publicly endorsed the two-state solution:

The truth is that in the area of our homeland, in the heart of our Jewish Homeland, now lives a large population of Palestinians. We do not want to rule over them. We do not want to run their lives. We do not want to force our flag and our culture on them. In my vision of peace, there are two free peoples living side by side in this small land, with good neighborly relations and mutual respect, each with its flag, anthem and government, with neither one threatening its neighbor's security and existence.

Non-experts may be unable to appreciate how dramatic this statement was. Ever since the 1930s, the ideological difference dividing Ben-Gurion’s mainstream Labor Zionism from Jabotinsky’s nationalist-revisionist alternative has turned on the question of the land’s partition. In 1947, Ben-Gurion enthusiastically supported the United Nations Partition Plan—encouraging the establishment of a two-state solution within today’s ’48 borders—while Jabotinsky’s successor as revisionist leader, Menachem Begin, fiercely objected.

True: some thirty years later, as Israel’s Prime Minister, Begin would hand over to Sadat the entire Sinai Peninsula. But this was a territorial compromise to the Arab Republic of Egypt, not a political-territorial concession to the Palestinian people, whose existence revisionists have always denied. Also true: Ariel Sharon, as Prime Minister, handed over occupied land to the Palestinian Authority in Gaza, and to that end evacuated thousands of settlers. But in order to do this, Sharon had to leave the Likud party, his natural home, and establish a new party, Kadima, with Labor leaders such as Shimon Peres.

So when Netanyahu stood in Bar-Ilan University and announced, as Israel’s Prime Minister and Likud leader, that the Palestinians deserve to get “their own flag, anthem and government,” he did something genuinely new in Zionist history. For die-hard revisionists such as Rubi Rivlin, Israel’s current friendly-looking president, Netanyahu’s two-state concession came as a shock. If the Likud Party kept relatively calm, it was because everybody knew that Netanyahu was lying.

More here.

The Function of Criticism at the Present Time


Virginia Jackson in the LA Review of Books:

LAUREN BERLANT is a critic’s critic, a feminist’s feminist, and a thinker’s friend. This is most simply true because of the number, depth, and influence of her abundant authored and co-authored and edited and co-edited books, her ever more numerous articles, essays, interviews, dialogues and monologues, and especially her proliferating collaborations; she always seems to be writing yet another book with yet another interesting someone else. Lots of people think with and because of Lauren Berlant. But academic “productivity” (that ubiquitous and ugly word, itself a symptom of the corporate manufacture of a crisis in the humanities) isn’t the most important reason that my first proposition — that Berlant is a critic’s critic — is just true. The reason that Lauren Berlant occupies this moment in critical theory so capaciously is that what she really always thinks about is genre.

Once upon a time, or so the story goes, the genre system was hierarchical and taxonomic (though not so fixed that at least as early as Aristotle’s Poetics it wasn’t open to debate), with “tragedies” clearly separated from “comedies,” for example. Later, in modernity (the novel is usually considered both the origin and result of this shift), genres became modes of recognition — complex forms instantiated in popular discourse, relying on what we could or would recognize collectively, in common — and so subject to historical change and cultural negotiation. Once genres became historical, the story continues, it then became the critic’s job to manage and translate those emerging forms of recognition for the benefit of readers who experienced them without knowing exactly what it was they were seeing and feeling.

Genre seems like an old-fashioned, belletristic frame to impose on Berlant’s political, cultural, and affective range. I may be wrong, but I’m betting that if I asked you to think of a queer theorist, Berlant is one of the first critics you would name, and if I asked you to think of a theorist of public culture or affect or performativity or media publics or marginal aesthetics or crisis, Berlant would be one of the first critics you would name, but if I asked you to think of a genre theorist, Berlant would not be the first critic to come to mind. No one would accuse Lauren Berlant of being a purely literary critic.

More here.

Scientists seek rare muon conversion that could signal new physics

Diana Kwon in Symmetry:

This weekend, members of the Mu2e collaboration dug their shovels into the ground of Fermilab's Muon Campus for the experiment that will search for the direct conversion of a muon into an electron in the hunt for new physics.

For decades, the Standard Model has stood as the best explanation of the subatomic world, describing the properties of the basic building blocks of matter and the forces that govern them. However, challenges remain, including unifying gravity with the other fundamental forces and explaining the matter-antimatter asymmetry that allows our universe to exist. Physicists have since developed new models, and detecting the direct conversion of a muon to an electron would provide evidence for many of these alternative theories.

“There's a real possibility that we'll see a signal because so many theories beyond the Standard Model naturally allow muon-to-electron conversion,” said Jim Miller, a co-spokesperson for Mu2e. “It'll also be exciting if we don't see anything, since it will greatly constrain the parameters of these models.”

Muons and electrons are two different flavors in the charged-lepton family. Muons are 200 times more massive than electrons and decay quickly into lighter particles, while electrons are stable and live forever. Most of the time, a muon decays into an electron and two neutrinos, but physicists have reason to believe that once in a blue moon, muons will convert directly into an electron without releasing any neutrinos. This is physics beyond the Standard Model.
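
For readers who want the contrast written out, the two processes can be sketched as below. This is a schematic notation based on the article's description, not material from the Mu2e collaboration itself; N stands for the capturing nucleus, which is left intact in the conversion.

```latex
% Ordinary muon decay, overwhelmingly dominant in the Standard Model:
% the muon's lepton flavor is carried away by the neutrinos.
\mu^{-} \longrightarrow e^{-} + \bar{\nu}_{e} + \nu_{\mu}

% Neutrinoless muon-to-electron conversion, the signal Mu2e will hunt for:
% a muon bound to a nucleus N turns directly into an electron,
% with no neutrinos emitted.
\mu^{-} + N \longrightarrow e^{-} + N
```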

Under the Standard Model, the muon-to-electron direct conversion happens too rarely to ever observe. In more sophisticated models, however, this occurs just frequently enough for an extremely sensitive machine to detect.

The Mu2e detector, when complete, will be the instrument to do this.

More here. [Thanks to Farrukh Azfar.]

Kiss goodbye to freedom

Raymond Geuss foresees a future of strict controls or war over resources. Matthew Reisz meets the radical philosopher and traces his intellectual development.

Matthew Reisz in The Times:

Raymond Geuss would like his fellow philosophers (and many of his fellow citizens) to think about politics in a radically different way. His latest book, Philosophy and Real Politics, is a quietly ferocious broadside against much received wisdom.

Among its targets are “the highly moralised tone in which some public diplomacy is conducted, at any rate in the English-speaking world, and … the popularity among political philosophers of the slogan 'Politics is applied ethics'.” What this slogan usually means in practice is starting off with abstract general principles or intuitions about fairness, justice, equality or rights, and then applying them to specific political situations. Such an approach, Geuss believes, is unlikely to tell us anything much about the real world of power struggles and messy compromises.

Some of his arguments have their roots in his earliest educational experiences. In a tribute to the philosopher Richard Rorty, Geuss offers an entertaining account of his Roman Catholic upbringing and how it had granted him “relative immunity to nationalism” – and in particular the “patently absurd” notion that there was something “special” about the United States.

For the Irish and Irish-American nuns who taught him from the ages of five to 12, only the Roman Catholic Church was truly universal and international. They knew that all the popes had been Italian, but that was for purely fortuitous reasons. “Only an Italian could stand to live in Rome: it was hot, noisy and overcrowded, and the people there ate spaghetti for dinner every day rather than proper food, ie potatoes, so it would be too great a sacrifice to expect someone who had not grown up in Italy to tolerate life there.”

More here.

How Fellini made his modernist masterpiece

Ian Thomson at The Spectator:

Federico Fellini’s La Dolce Vita was a box-office triumph in Italy in 1960. It made $1.5 million at the box office in three months — more than Gone With the Wind had. ‘It was the making of me,’ said Fellini. It was also the making of Marcello Mastroianni as the screen idol with a curiously impotent sex appeal. No other film captured so memorably the flashbulb glitz of Italy’s postwar ‘economic miracle’ and its consumer boom of Fiat 500s and Gaggia espresso machines. Unsurprisingly, the Vatican objected to the scene where Mastroianni makes love to the Swedish diva Anita Ekberg (who died earlier this year at the age of 83) in the waters of the Trevi fountain. Sixties Rome became a fantasy of the erotic ‘sweet life’ thanks in part to that scene.

After La Dolce Vita, Fellini found himself at a creative loss and hung a sign above his desk: ‘NOW WHAT?’ Sophia Loren’s movie-mogul husband, Carlo Ponti, persuaded him to contribute to Boccaccio ’70, a collection of lewd short films inspired by the medieval Decameron, but it was a critical failure. At the end of 1961, still in ‘creative limbo’, Fellini began work on 8½, his eighth and a half film; it was to be a signpost in his development as a magician-director.

more here.

Depictions of insanity through history

Andrew Scull at The Paris Review:

Modern psychiatry seems determined to rob madness of its meanings, insisting that its depredations can be reduced to biology and nothing but biology. One must doubt it. The social and cultural dimensions of mental disorders, so indispensable a part of the story of madness and civilization over the centuries, are unlikely to melt away, or to prove no more than an epiphenomenal feature of so universal a feature of human existence. Madness indeed has its meanings, elusive and evanescent as our attempts to capture them have been.

Western culture throughout its long and tangled history provides us with a rich array of images, a remarkable set of windows into both popular and latterly professional beliefs about insanity. The sacred books of the Judeo-Christian tradition are shot through with stories of madness caused by possession by devils or divine displeasure. From Saul, the first king of the Israelites (made mad by Yahweh for failing to carry out to the letter the Lord’s command to slay every man, woman, and child of the Amalekite tribe, and all their animals, too), to the man in the country of the Gadarenes “with an unclean spirit” (maddened, naked, and violent, whose demons Christ casts out and causes to enter a herd of swine, who forthwith rush over a cliff into the sea to drown), here are stories recited for centuries by believers, and often transformed into pictorial form. None proved more fascinating than the story of Nebuchadnezzar, the mighty king of Babylon, the man who captured Jerusalem and destroyed its Temple, carrying the Jews off into captivity, all apparently without incurring divine wrath.

more here.

The True Story of Rupert Brooke

Joanna Scutts at The New Yorker:

For a long time, the history of the First World War has been understood via the symbolic transition from Brooke to Wilfred Owen, from posh idiot nationalist to heroic witness. That simple narrative obscures the extent to which Owen worshipped Brooke in the early days and just how long Brooke remained the war’s most famous poet. Until the nineteen-sixties, when the “left-wing myths” about the war gained purchase, Brooke’s sonnets and his image still seemed to represent something true and comforting. In the mid-nineties, an anthology of the nation’s hundred favorite poems included three each by Brooke and Owen, although only one of Brooke’s, “The Soldier,” was a war poem. At the hundredth anniversary of his death, as more letters and lovers are revealed, Brooke is making more headlines as a famous playboy than as a poet or patriot.

It’s impossible to know what Brooke might have written if he had seen what the other war poets saw, or what he might have become if he’d survived his own golden age. In his small body of work, the war sonnets are anomalous—his verse is usually more playful, less po-faced.

more here.

Neuroscientists create the sensation of invisibility

From PhysOrg:

The power of invisibility has long fascinated man and inspired the works of many great authors and philosophers. In a study from Sweden's Karolinska Institutet, a team of neuroscientists now reports a perceptual illusion of having an invisible body, and shows that the feeling of invisibility changes our physical stress response in challenging social situations.

The history of literature features many well-known narrations of invisibility and its effect on the human mind, such as the myth of Gyges' ring in Plato's dialogue The Republic and the science fiction novel The Invisible Man by H.G. Wells. Recent advances in materials science have shown that invisibility cloaking of large-scale objects, such as a human body, might be possible in the not-so-distant future; however, it remains unknown how invisibility would affect our brain and body perception.

In an article in the journal Scientific Reports, the researchers describe a perceptual illusion of having an invisible body. The experiment involves the participant standing up and wearing a set of head-mounted displays. The participant is then asked to look down at her body, but instead of her real body she sees empty space. To evoke the feeling of having an invisible body, the scientist touches the participant's body in various locations with a large paintbrush while, with another paintbrush held in the other hand, exactly imitating the movements in mid-air in full view of the participant.

“Within less than a minute, the majority of the participants started to transfer the sensation of touch to the portion of empty space where they saw the paintbrush move and experienced an invisible body in that position,” says Arvid Guterstam, lead author of the present study.

More here.

Assange: How ‘The Guardian’ Milked Edward Snowden’s Story

WikiLeaks founder Julian Assange investigates the book behind Snowden, Oliver Stone's forthcoming film starring Joseph Gordon-Levitt, Shailene Woodley, Nicolas Cage, Scott Eastwood and Zachary Quinto. According to leaked Sony emails, movie rights for the book were bought for $700,000.

Julian Assange in Newsweek:


The Snowden Files: The Inside Story of the World's Most Wanted Man (Guardian/Faber & Faber, 2014) by Luke Harding is a hack job in the purest sense of the term. Pieced together from secondary sources and written with minimal additional research to be the first to market, the book's thrifty origins are hard to miss.

The Guardian is a curiously inward-looking beast. If any other institution tried to market its own experience of its own work nearly as persistently as The Guardian, it would surely be called out for institutional narcissism. But because The Guardian is an embarrassingly central institution within the moribund “left-of-center” wing of the U.K. establishment, everyone holds their tongue.

In recent years, we have seen The Guardian consult itself into cinematic history—in the Jason Bourne films and others—as a hip, ultra-modern, intensely British newspaper with a progressive edge, a charmingly befuddled giant of investigative journalism with a cast-iron spine.

The Snowden Files positions The Guardian as central to the Edward Snowden affair, elbowing out more significant players like Glenn Greenwald and Laura Poitras for Guardian stablemates, often with remarkably bad grace.

“Disputatious gay” Glenn Greenwald's distress at the U.K.'s detention of his husband, David Miranda, is described as “emotional” and “over-the-top.” My WikiLeaks colleague Sarah Harrison—who helped rescue Snowden from Hong Kong—is dismissed as a “would-be journalist.”

I am referred to as the “self-styled editor of WikiLeaks.” In other words, the editor of WikiLeaks. This is about as subtle as Harding's withering asides get. You could use this kind of thing on anyone.

Read the full piece here.

Wednesday, April 22, 2015

Decline of the West II: The Dysoning


Scott McLemee on Michael Eric Dyson's profile of Cornel West in TNR:

The mutual-admiration arrangement lasted until sometime near the end of the first Obama administration, when West turned up the heat on his criticisms of the president as (among other things) a “black mascot of Wall Street oligarchs” and “the head of the American killing machine.” A number of black liberals took issue with West’s hard left turn. But it was Dyson’s defenses of the president that seemed especially to rankle West. In August 2013, West singled out Dyson by name as one of the people “who’ve really prostituted themselves intellectually in a very ugly and vicious way.”

Similar pleasantries followed. Dyson’s response was muted until earlier this month, when he made some not very subtle allusions to West at a meeting of the National Action Network, the civil rights organization founded by Al Sharpton. “Be honest and humble in genuine terms,” Dyson said, “not the public performance of humility masquerading a huge ego. No amount of hair can cover that.” His more expansive remarks in print run to more than 9,000 words, accompanied by a drawing in which West appears to have a very bad case of dandruff.

One assessment now making the rounds is that it’s a lamentable case of the white establishment turning two formidable African-American minds against one another when otherwise they might be uniting against all that merits ruthless critique. I doubt a more inane judgment is possible. A pretty thoroughgoing ignorance of African-American intellectual history would be required to assume that black thinkers can’t or won’t do battle without there being some Caucasian fight promoter involved. Richard Wright never entirely recovered from James Baldwin’s essay “Everybody’s Protest Novel.” The great but long-neglected black sociologist Oliver C. Cox was scathing about the work of his colleague E. Franklin Frazier.

Such conflicts can be psychobabbled into meaninglessness, of course. Cox’s remarks were attributed to jealousy (Frazier became the first African-American president of the American Sociological Association in 1948, the same year Cox published his overlooked masterpiece Caste, Class, and Race), while Baldwin’s critique of Wright seems like a perfect example of the Oedipal conflict between authors that Harold Bloom calls “the anxiety of influence.” And yes, the ego will take its revenge, given a chance. But real differences in understanding of American society or the role of the artist were involved in those disputes. Those who profess to favor a vigorous intellectual life, and yet deprecate polemic, want crops without plowing up the ground.

But in moving from Baldwin/Wright and Cox/Frazier to Dyson/West, we descend a hundred miles in conceptual altitude.

More here.

What is the Self? Watch Philosophy Animations Narrated by Stephen Fry on Sartre, Descartes & More

Colin Marshall in Open Culture:

If you’ve followed our recent philosophy posts, you’ve heard Gillian Anderson (The X-Files) speak on what makes us human, the origins of the universe, and whether technology has changed us, and Harry Shearer speak on ethics — or rather, you’ve heard them narrate short educational animations from the BBC scripted by Philosophy Bites’ Nigel Warburton. Now another equally distinctive voice has joined the series to explain an equally important philosophical topic. Behold Stephen Fry on the Self.

More here.

Student Course Evaluations Get An ‘F’


Anya Kamenetz in NPR (image LA Johnson/NPR):

Recently, a number of faculty members have been publishing research showing that the comment-card approach may not be the best way to measure the central function of higher education.

Philip Stark is the chairman of the statistics department at the University of California, Berkeley. “I've been teaching at Berkeley since 1988, and the reliance on teaching evaluations has always bothered me,” he says.

Stark is the co-author of “An Evaluation of Course Evaluations,” a new paper that explains some of the reasons why.

For one thing, there's response rate. Fewer than half of students complete these questionnaires in some classes. And, Stark says, there's sampling bias: Very happy or very unhappy students are more motivated to fill out these surveys.

Then there's the problem of averaging the results. Say one professor gets “satisfactory” across the board, while her colleague is polarizing: Perhaps he's really great with high performers and not too good with low performers. Are these two really equivalent?
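
To see why simple averaging can mislead, here is a tiny illustrative sketch; the two sets of ratings are made up, not data from Stark's paper. Both instructors end up with the same mean score on a 1-to-5 scale, but one of them splits the class into delighted and miserable halves.

```python
# Made-up ratings on a 1-5 scale, purely for illustration -- not data
# from "An Evaluation of Course Evaluations". Identical averages can
# hide very different distributions of student experience.
from statistics import mean, pstdev

steady = [3, 3, 3, 3, 3, 3, 3, 3]       # "satisfactory" across the board
polarizing = [5, 5, 5, 5, 1, 1, 1, 1]   # loved by half, loathed by half

for name, scores in [("steady", steady), ("polarizing", polarizing)]:
    print(f"{name:>10}: mean = {mean(scores):.1f}, spread = {pstdev(scores):.1f}")

# Output:
#     steady: mean = 3.0, spread = 0.0
# polarizing: mean = 3.0, spread = 2.0
```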

Finally, there's the simple fact that faculty interactions with students and the student experience in general vary widely across disciplines and types of class. Whether they're in an upper-division seminar, a studio or lab, or a large lecture course, students are usually asked to fill out the same survey.

Stark says his paper is unlikely to surprise most faculty members: “I think that there's general agreement that student evaluations of teaching don't mean what they claim to mean.” But, he says, “there's fear of the unknown and inertia around the current system.”

Michele Pellizzari, an economics professor at the University of Geneva in Switzerland, has a more serious claim: that course evaluations may in fact measure, and thus motivate, the opposite of good teaching.

More here.

Yes, You Can Catch Insanity

Andrew Curry in Nautilus:

PANDAS represents a striking branch of medical research that has been gaining acceptance in recent years, though not without controversy. In a field known as immunopsychiatry, researchers are exploring the possibility that inflammation, or an overactive immune system, is linked to mental disorders that include depression, schizophrenia, and Alzheimer’s disease.

A host of recent genetic and epidemiological studies “have shown that when people are depressed or have psychotic episodes, inflammatory markers are found in their blood,” says Golam Khandaker, a senior clinical research associate at the University of Cambridge, in England, who studies inflammation and the brain.

In the case of PANDAS, when the body reacts to strep infection, parts of the brain that help regulate motion and behavior wind up caught in the crossfire, mistaken for bacterial invaders by cells bent on destroying them. Eliminate the inflammation, some doctors say, and you signal the immune system to stand down, restoring normal brain function.

The emergence of immunopsychiatry is a story of rediscovery, reflecting the twists and turns of mental health treatment over the last century. In the 19th century, mental illness and infectious disease were closely linked. That connection came uncoupled in the 20th century and immunopsychiatry’s argument that infection and inflammation can have a profound impact on the brain has struggled against psychiatric and neurological dogma. Yet emerging insights into mental illness unite the brain, body, and environment in ways that doctors and therapists are finally beginning to understand.

More here.

Ian McEwan on Books That Have Helped Shape His Novels

Alec Ash in Five Books:

Let’s start on your book selection. Your first choice is What Science Offers the Humanities, by Edward Slingerland. Tell us a little about the book first.

It’s a rather extraordinary and unusual book. It addresses some fundamental matters of interest to those of us whose education has been in the humanities. It’s a book that has received very little attention as far as I know, and deserves a lot more. Edward Slingerland’s own background is in Sinology. Most of us in the humanities carry about us a set of assumptions about what the mind is, or what the nature of knowledge is, without any regard to the discoveries and speculations within the biological sciences in the past 30 or 40 years. In part the book is an assault on the various assumptions and presumptions of postmodernism – and its constructivist notions of the mind.

Concepts that in neuroscience and cognitive psychology are now taken for granted – like the embodied mind – are alien to many in the humanities. And Slingerland addresses relativism, which is powerful and pervasive within the humanities. He wants to say that science is not just one more thought system, like religion; it has special, even primary, status because it’s derived from empiricism, or it’s predictive and coherent and does advance our understanding of the world. So rather than just accept at face value what some French philosopher invents about the mirror stage in infant development, Slingerland wants to show us where current understanding is, and where it’s developing, in fields such as cognition, or the relationship between empathy and our understanding of evil. Slingerland believes that there are orthodox views within the humanities which have been long abandoned by the sciences as untenable and contradictory.

More here.

Does the return of the mercenary mean a world with more war?

Increasingly, America, Britain and even Australia are relying on ‘private security contractors’ to fight their wars. It’s a multi-billion dollar industry, but it’s also largely unregulated. Are we heading towards a world where armies and navies are available to the highest bidder?

Antony Funnell at the Australian Broadcasting Corporation:

At the northern end of the Arlington National Cemetery in Washington DC there’s a statue that symbolises the way America sees its military.

Six WWII marines are set in bronze, frozen in time as they hoist the Stars and Stripes over the Japanese island of Iwo Jima.

The memorial was created from a real-life photograph.

At the bottom of the statue there’s an inscription set into the pedestal in gold lettering. It reads: ‘Uncommon valour was a common virtue.’

This is the way most Americans still like to think of their soldiers: men and women determined in their duty and confirmed in their patriotism.

Whether that ideal was ever a universal reality is impossible to know, but it’s fair to say that for most of America’s history the stated motivation of the average American soldier has been national service.

Something unexpected happened after the invasion of Iraq in 2003, however.

More here.