The rise of the medical humanities

Belinda Jack in Times Higher Education:

The cynical account for the rise of the medical humanities – a newish interdisciplinary area that explores the social, historical and cultural dimensions of medicine – would be an economic one. At a time of retrenchment in some subjects at some universities, disciplines are under pressure to demonstrate their practical value. Recent research that claims to show that reading novels promotes empathy would be an example of literature’s utility, particularly for medical students. There’s money in medicine and not so much in the humanities. But how new is this field or set of fields? The ancient Greek physician Hippocrates claimed that “wherever the art of Medicine is loved, there is also a love of Humanity”, suggesting both that medicine is an “art” and that there is a crucial association between medicine and the “human” dimension of the humanities.

In terms of literature, as soon as the novel rose to prominence in the 18th century a good many doctors more than dabbled in writing, often fiction. Oliver Goldsmith (1730-1774) trained as a doctor and wrote the best-selling novel The Vicar of Wakefield (1766), and Keats (1795-1821) turned to poetry in part because of the trauma he suffered from the experience of physically restraining fully conscious patients in order to perform surgery without anaesthesia. Friedrich von Schiller (1759-1805), German writer, poet, essayist, dramatist and friend of Goethe, was an army surgeon before achieving fame as a writer.

More here.

A Modern Clinical Trial: 7 Years, 1,000 Patients, and Plenty of Questions About Cost

Paul Basken in The Chronicle of Higher Education:

It was my ritual for seven years. Every day, take two sets of pills—one labeled, the other a mystery. Every three months, take three sets of blood-pressure readings, twice a day for a week. Once a year, collect urine for 24 straight hours, lug it everywhere in an ice pack, then get it through airport security for a flight from Washington to Boston. For me and about 1,000 other participants in our medical trial, the payoff for such tedious detail came back last month: The combination of the two common types of blood-pressure drugs being tested didn’t make any significant difference in the progression of our inherited kidney disease. That was disappointing. But it didn’t necessarily mean that the trial was a failure, a waste of the time I spent on it, or a poor use of the $40-million in taxes that paid for it. The trial’s participants got top-notch medical attention for our polycystic kidney disease, and our records will almost certainly help others with PKD, now and in the future.

…All of that logistical structure can mean a huge financial cost. Randomized trials now account for about 20 percent of the $30-billion annual budget of the National Institutes of Health. Private drug companies spend more than $30-billion on them. Yet drug trials fail at a rate of about 90 percent. That level of failure has attracted serious attention now that U.S. medical research has entered a period of tighter budgets, accelerating technological advances, and extensive procedural reassessments. In that light, much about our trial’s design and execution illustrates a system of human experimentation that’s ripe for overhaul.

More here.

Saturday, January 24, 2015

I Served in Iraq, and American Sniper Gets It Right. But It’s Still Not the War Film We Need.

Brian Turner in Vulture:

This isn’t the defining film of the Iraq War. After nearly a quarter century of war and occupation in Iraq, we still haven’t seen that film. I’m beginning to think we’re incapable as a nation of producing a film of that magnitude, one that would explore the civilian experience of war, one that might begin to approach so vast and profound a repository of knowledge. I’m more and more certain that, if such a film ever arrives, it’ll be made by Iraqi filmmakers a decade or more from now, and it’ll be little known or viewed, if at all, on our shores. The children of Iraq have far more to teach me about the war I fought in than any film I’ve yet seen — and I hope some of those children have the courage and opportunity to share their lessons onscreen. If this film I can only vaguely imagine is ever made, it certainly won’t gross $100 million on its opening weekend.

The biggest problem I have with American Sniper is also a problem I have with myself. It’s a problem I sometimes find in my own work, and it’s an American problem: We don’t see, or even try to see, actual Iraqi people. We lack the empathy necessary to see them as fully human. In American Sniper, Iraqi men, women, and children are known and defined only in relation to combat and the potential threat they pose. Their bodies are the site and source of violence. In both the film and our collective imagination, their humanity is reduced in ways that, ultimately, define our own narrow humanity. In American Sniper, Iraqis are called “savages,” and the “streets are crawling” with them. Eastwood and his screenwriter Jason Hall give Iraqis no memorable lines. Their interior lives are a blank canvas, with no access points to let us in. I get why that is: If Iraqis are seen in any other light, if their humanity is recognized, then the construct of our imagination, the ride-off-into-the-sunset-on-a-white-horse story we tell ourselves to push forward, falls apart.

More here.

‘Sophia: Princess, Suffragette, Revolutionary,’ by Anita Anand

Suzanne Berne at the New York Times:

Part of a biographer’s job is to rescue forgotten figures, and in “Sophia: Princess, Suffragette, Revolutionary” Anita Anand has salvaged an extraordinary one. Sophia Duleep Singh was a Punjabi princess and Queen Victoria’s goddaughter, a bucktoothed “docile little thing” who went on to become a celebrated London fashion plate and then a steely suffragist.

Her father, Maharajah Duleep Singh, was 11 when the British seized his vast Sikh empire and 15 when he was sent into exile in England. Victoria doted on him, remarking that he “was beautifully dressed and covered with diamonds” (though not the famed Koh-i-Noor, now among her crown jewels), adding, “I always feel so much for these poor deposed Indian princes.” Graciously, she granted him a stipend, which he overspent remodeling an East Anglian pile into the fabulous “Mogul palace” where he installed his bride, the daughter of a German businessman and an Abyssinian slave.

Born in 1876, Sophia spent much of her early years in the English countryside playing with her brothers and sisters “amidst enclosures filled with ostriches, rare parrots and monkeys,” an idyll that ended when the maharajah pillaged the estate to pay his creditors.

more here.

An impressive account of TS Eliot’s formative years

John Sutherland at the Financial Times:

Within the bosom of every old man, said the philosopher William James, there is a dead young poet. TS Eliot, as Robert Crawford suggests in his opening sentence, “was never young”. He’s the Benjamin Button of poets. His first mature work, “The Love Song of J Alfred Prufrock”, was written when he was 22. It contains the couplet:

I grow old . . . I grow old . . . 
I shall wear the bottoms of my trousers rolled.

A later poem, “Gerontion” (in Greek, “wizened old man”), opens:

Here I am, an old man in a dry month,
Being read to by a boy, waiting for rain.

The author was barely 30 at the time but already “Old Possum”. Crawford’s endeavour, brilliantly achieved, is to disinter the dead young poet buried in the prematurely aged TS Eliot.

When Eliot died in January 1965 it was, for the literate classes, a passing of the same magnitude as Winston Churchill’s, three weeks later. One genuflected, humble in the face of literary greatness. But Eliot’s reputation, over the next half-century, was to become sadly chipped.

more here.

nick hornby’s ‘funny girl’

David L. Ulin at The LA Times:

Hornby has written about other female protagonists: Annie in “Juliet, Naked,” Katie Carr in “How to Be Good.” There's something more expansive, though, in “Funny Girl,” which is as sedate a work as he has produced. What I mean is that this is a book that takes the long view, that seeks to give us a broad sense of its characters' circumstances. In that regard, its 1960s setting serves a double purpose — first, to engage us in the energy of the era's burgeoning youth culture, and second, to remind us of the speed with which time eclipses all.

Sophie is an appropriate signifier: “Here was everything they wanted to bring to the screen,” Hornby writes of the production team that discovers her, “in one neat and beautifully gift-wrapped package, handed to them by a ferocious and undiscovered talent who looked like a star. The class system, men and women and the relationships between them, snobbery, education, the North and the South, politics, the way that a new country seemed to be emerging from the dismal old one that they'd all grown up in.”

The members of that team are the novel's other central players — Clive, the leading man who becomes Sophie's faithless fiancé; Dennis, the producer-director who loves her from a distance; the writers, Bill and Tony — one gay, the other married but (perhaps) closeted. It adds up to the portrait of a culture in transition, in which “[w]hat was once both pertinent and laudably impertinent became familiar and sometimes even a little polite.”

more here.

The World in 2030: We asked 15 of the smartest people we know for their most out-there predictions

Susan B. Glasser in Politico:

Genes as commerce

By Alec Ross, senior fellow at the Columbia University School of International & Public Affairs

Fifteen years from now, everybody reading this will live, on average, two years longer than their current life expectancy because of the commercialization of genomics. The price of mapping an individual’s genetic material has fallen from $2.7 billion to below $10,000, and it continues to fall.

Omniscience into the makeup and operation of the 3 billion base pairs of genetic code in each of our bodies will allow for tests to be developed that will find cancer cells at 1 percent of the size of what can be detected by an MRI today. It will allow for personalized prevention and treatment programs for nearly every illness, and will make today’s medical practices look medieval by comparison.

Of course, all of this will benefit the wealthy before it becomes affordable and available to everybody. That is the cruel reality of many of the innovations to come. They will make people live longer, healthier lives—but not everybody, and not all at once.

More here.

The virtue of scientific thinking

Steven Shapin in Boston Review:

Can science make you good?

Of course it can’t, some will be quick to say—no more than repairing cars or editing literary journals can. Why should we think that science has any special capacity for moral uplift, or that scientists—by virtue of the particular job they do, or what they know, or the way in which they know it—are morally superior to other sorts of people? It is an odd question, maybe even an illogical one. Everybody knows that the prescriptive world of ought—the moral or the good—belongs to a different domain than the descriptive world of is.

The ideas and feelings informing the tendency to separate science from morality do not go back forever. Underwriting it is a sensibility close to the heart of the modern cultural order, brought into being by some of the most powerful modernity-making forces. There was a time—not long ago, in historical terms—when a different “of course” prevailed: of course science can make you good. It should, and it does.

A detour through this past culture can give us a deeper appreciation of what is involved in the changing relationship between knowing about the world and knowing what is right. Much is at stake. Shifting attitudes toward this relationship between is and ought explain much of our age’s characteristic uncertainty about authority: about whom to trust and what to believe.

Read the rest here.

Life beyond memory

Tomas Hachard in National Post:

When discussing a disease that is expected to double in prevalence over the next two decades, it is hard to countenance a silver lining; currently Alzheimer’s afflicts 5 percent of Canadians over 65, and the only existing treatment is a series of drugs that, at best, alleviate symptoms for a year. Even what little hope there is for avoiding the disease seems feeble, at best. In The End of Memory, a wide-ranging book on the history of Alzheimer’s, Jay Ingram lists a handful of lifestyle choices that apparently help prevent the disease. Exercise and education are two—the most proven. Learning a second language is another. And then there’s “conscientiousness,” an umbrella term, Ingram explains, for goal setting, determination, efficiency, organization, thoroughness, self-discipline, and reliability. According to some studies, the more we exhibit these traits, the less susceptible we are to Alzheimer’s. A responsible life, it seems, might actually afford us a peaceful death.

…The underlying reality of their topic, however, brings an unavoidable bleakness, and not just because of the currently far-off hopes for a cure. Scientists generally agree today that Alzheimer’s differs from normal aging. But precisely what distinguishes the two is still unclear. However unlikely a conclusion it is at this point, Peter Whitehouse’s suggestion that “in some sense we would all get Alzheimer’s if we live long enough”—posited in his 2008 book The Myth of Alzheimer’s—is still with us and signals an important fact about the disease: it’s impossible to detach our fear of it from our more general anxieties about growing old.

More here.

science as a force for good

Seth Shulman in The Washington Post:

“The arc of the moral universe is long, but it bends toward justice,” the Rev. Martin Luther King Jr. told a crowd of protesters in Montgomery, Ala., in March 1965. King’s use of that quote stands as one of history’s more inspiring pieces of oratory, acknowledging that victories in the fight for social justice don’t come as frequently as we might like, while offering hope that progress will come eventually. But is the contention empirically true? Michael Shermer, a professor, columnist for Scientific American, and longtime public champion of reason and rationality, takes on this question and more. In “The Moral Arc,” Shermer aims to show that King is right so far about human civilization and that, furthermore, science and reason are the key forces driving us to a more moral world. It is at once an admirably ambitious argument and an exceedingly difficult one to prove. First, Shermer — defining moral progress as “improvement in the survival and flourishing of sentient beings” — needs to make a case that we humans are, in fact, moving toward such an improvement despite terrorist attacks on cartoonists, Islamic State beheadings, Taliban massacres of schoolchildren and police shootings of innocent civilians, among other seemingly daily atrocities. As he notes in the preface, when they heard he was working on a book about moral progress, “most people thought I was hallucinatory. A quick rundown of the week’s bad news would seem to confirm the diagnosis.”

If that weren’t tough enough, Shermer also needs to show that science and scientific reasoning are responsible for bettering our lot. Given science’s role in everything from the development of the atomic bomb to pervasive government surveillance, it’s hard to know which of his self-appointed tasks is more daunting. To his credit, Shermer tackles this broad agenda with an abundance of energy, good cheer and anecdotes on everything from “Star Trek” episodes and the reasoning of Somali pirates to the demise of the Sambo’s restaurant chain. The anecdotes provide leavening but don’t alter the fact that this is a work of serious and wide-ranging scholarship with a bibliography that runs to nearly 30 pages. The effect can be kaleidoscopic and even a bit scattershot at times, but that doesn’t detract from the truly impressive array of data Shermer assembles.

More here.

Saturday Poem

Crimson

—“Darkening Red”
a painting by Mark Rothko

To explain crimson, Darkening red
the grotesque danger,
the acute beauty
and commotion of it,

how it commands recollection,
even after every trace
is vanished, I describe
our small faces

smeared crimson
sweet and sour cherry pits
stacked in front of us
like small cannonballs

the first stain gleaming
inside my teenage thighs,
seen down below
through new breasts,

my cousin’s cheek
after the rake hit
the bony part near her eye
forming a fork-shaped wound,

or at the butcher’s shop,
watching as his thick fingers
kept streaking
his long white apron.

I know there is no forgetting.
Years after my butterflied chest
(the surgeon’s cache) is splayed
under a blaze of lights, I relive red

nightmares that darken
long after the scar that ropes my ribs
turns silvery,
like birch.

by Jim Culleny
from Alehouse 2011

Painting by Mark Rothko

Friday, January 23, 2015

Summer without End

Wayne Scott in The Millions:

When I was on a vacation in the Virgin Islands with my two brothers and my 70-year-old mother — an exceptional hiatus from our lives with family and children, just the four of us, to celebrate my mother’s milestone birthday, our good fortune that we had had her in our lives for such a long time — I happened upon a collection of essays by E.B. White, a book that the house owners had left on the shelf. I had read White’s autobiographical piece, “Once More to the Lake” in college, but here I was, a man in his late-40s, again under its spell. Throughout our time at that lovely house under the clear skies, overlooking the deep-blue Atlantic Ocean, I kept returning to his rumination on summer memories.

Written in August 1941 and published originally in Harper’s, the story is deceptively simple. White takes his son to a camp for a short vacation. It is the same camp, by Belgrade Lake in Maine, where his father had taken him many times when he was growing up, over 30 years before. He writes, “I wondered how time would have marred this unique, this holy spot.” Except for the sound of outboard motors on boats, a mid-century technological advance — a “petulant, irritable sound” that “whined about one’s ears like mosquitoes” — he found it to be the same place. “Once More to the Lake” is not a psychological exploration, except for one recurring detail. As White sees his son engage in activities that he himself used to do — baiting a fish hook, pulling on a bathing suit — he transposes identities, imagining himself as his father to his younger self. The jarring illusion keeps returning.

More here.

Lies, All Lies

Clancy Martin in the Chronicle of Higher Education:

Practically speaking, I’ve always been interested in lying. But I remember when the subject first caught my intellectual attention: I was 11 or 12, in a Waldenbooks, and the shelves of the philosophy section—I’ve walked straight to that aisle since I was a kid, with my dad, who loved philosophy, though he was kicked out of college after only one semester—were lined with copies of Sissela Bok’s best-selling Lying. I was nervous even to pick it up, fearing, as many people do, that taking an interest in lies would expose that I was a liar.

This is one of the curious facts about lying. It’s treated a lot like the subject of masturbation was at around the same time. Among my friends, everyone suspected that all of us masturbated, but when one kid, my closest buddy—now a respected psychiatrist—tried to bring it up honestly, we laughed at him and nervously changed the subject.

This is how we handle embarrassing open secrets about popular “vices.” And we lie even more often (a lot more often) than we masturbate. In Dallas G. Denery’s excellent new history of Western thinking on deception, The Devil Wins, he cites a recent study that shows that “during every 10 minutes of conversation, we lie three times and even more frequently when we use email and text messaging.”

More here.

Walzer on Islamism and the Left

Justin E. H. Smith in his blog:

I finally read Michael Walzer's influential article on “Islamism and the Left,” after being told a number of times that I had inadvertently been echoing his opinion when I sided unconditionally with the caricaturists against the assassins who came to kill them. I find that I do agree with an early, fairly obvious point Walzer makes, but then disagree with most of the rest.

The obvious point is that the American left has for the most part failed to provide any serious analysis of the phenomenon of political Islamism, and moreover that it has failed to do so for very bad reasons, including notably the groundless presumption of common cause with the Islamists. Where my disagreement begins is with Walzer's central assertion that Islam presents a particular problem in the current global order. It seems to me that this claim is at odds with his own further assertion that religion in general is functioning as a stimulant to violence throughout the world in the post-secular age.

To ward off in advance any suspicion of Islamophobia on his own part, Walzer invokes the Christian crusades in the Levant of the Middle Ages to show that there is nothing eternal or essential about Islamic violence, but that in different times and places the same violence can be done in the name of other religions, sometimes targeting Muslims. A Muslim in the 12th-century Levant would have been justified to suppose that the Christians have a problem with violence, Walzer observes. But why time-travel, when we can just travel? We don't have to go to the 12th-century Levant, when we can go directly to 21st-century India, where the Muslim minority, right now, is very justified to suppose that Hindus have a 'violence problem'. The same thing for Muslims in Burma being massacred by rampaging Buddhist monks.

More here.

the cave: thoughts on pregnancy, privacy, and pain

Dawn Herrera-Helphand at The Point:

Arendt calls the private realm “the realm of necessity.” The language is hers, but it’s a variation on an old binary theme, the song of necessity and freedom. Figured variously as chaos, the animal, the feminine and the shadow realm, human necessity is the umbrella term for those aspects of life not subject to the rational will. In Arendt’s understanding, it especially signifies the immediate reality of embodied life, the thick stuff of it, the part that’s been squicking out Western squares from Plato to the present. To the chagrin of the Platonist, it is an irreducible aspect of our living being.

In its most mundane iterations, necessity is a driving and an equalizing force that compels everyone. We all eat and drink, we shit, we sleep and probably try to get off—you, yes you. With luck, the resources for doing so are reasonably secure and we can meet these demands with dignity, securely and without fear of opprobrium at the salience of our appetites and drives. Fussing over particulars aside, there is not a lot of room for reason-giving or reason-having in this realm of experience. Bodies drive us in some things. We do them because we are essentially beholden—we have to. And, having to do them, we prefer to do them in private.

Pain is the most intense manifestation of this phenomenon. As Elaine Scarry puts it in The Body in Pain (1985), pain brings about “a state anterior to language, to the sounds and cries a human being makes before language is learned.”

more here.

virginia woolf on e.m. forster

Virginia Woolf at berfrois (originally from: The Death of the Moth, And Other Essays):

We look then, as time goes on, for signs that Mr. Forster is committing himself; that he is allying himself to one of the two great camps to which most novelists belong. Speaking roughly, we may divide them into the preachers and the teachers, headed by Tolstoy and Dickens, on the one hand, and the pure artists, headed by Jane Austen and Turgenev, on the other. Mr. Forster, it seems, has a strong impulse to belong to both camps at once. He has many of the instincts and aptitudes of the pure artist (to adopt the old classification)— an exquisite prose style, an acute sense of comedy, a power of creating characters in a few strokes which live in an atmosphere of their own; but he is at the same time highly conscious of a message. Behind the rainbow of wit and sensibility there is a vision which he is determined that we shall see. But his vision is of a peculiar kind and his message of an elusive nature. He has not great interest in institutions. He has none of that wide social curiosity which marks the work of Mr. Wells. The divorce law and the poor law come in for little of his attention. His concern is with the private life; his message is addressed to the soul. “It is the private life that holds out the mirror to infinity; personal intercourse, and that alone, that ever hints at a personality beyond our daily vision.” Our business is not to build in brick and mortar, but to draw together the seen and the unseen. We must learn to build the “rainbow bridge that should connect the prose in us with the passion. Without it we are meaningless fragments, half monks, half beasts.” This belief that it is the private life that matters, that it is the soul that is eternal, runs through all his writing.

more here.

Ritual as an Urban Design Problem

Sarah Perry at Front Porch Republic:

The Benedictine monk Aidan Kavanagh, who straddled two worlds as both a monk and a Yale divinity professor, proposes that we understand the Church as originally and centrally an urban phenomenon. He translates civitas as “workshop” and “playground,” the space in which social, philosophical, and even scientific questions are worked out by humans in contact with their God, “the locale of human endeavor par excellence.”

By the fifth century A.D., Christian worship in the great cities of Jerusalem, Antioch, Alexandria, Rome, and Constantinople had become not just one service, but an “interlocking series of services” that began at daybreak with laudes and ended at dusk with lamp-lighting and vespers. Only the most pious participated in all the services, but everyone participated in some. The rites “gave form not only to the day itself but to the entire week, the year, and time itself,” says Kavanagh.

Perhaps just as important as the transformation of time was the transformation of space, for the mid-morning assemblages and processions appropriated the entire neighborhood as space for worship.

more here.