Kapuscinski: autobiography by other means

Taking stock of his habit of reporting from dangerous places, Ryszard Kapuscinski once told an interviewer, “Mine is not a vocation, it’s a mission. I wouldn’t subject myself to these dangers if I didn’t feel that there was something overwhelmingly important—about history, about ourselves—that I felt compelled to get across. This is more than journalism.” Kapuscinski’s celebrated chronicles of war and revolution in Africa, Asia, Latin America, and points elsewhere made him a darling of literary circles, and you hear a lot from his admirers about how he transcended the limits of journalism, how he was a practitioner of “a kind of magic journalism,” as fellow journalist Adam Hochschild has put it. Academics and others have questioned his facts and methods, but Kapuscinski, as he freely admitted, was after something different; literalism wasn’t the point.

more from Bookforum here.

How does a nation devoted to nonintervention become a global power?

Embarking upon a 12-day tour of Africa earlier this year, Chinese President Hu Jintao likely expected a warm welcome. Over the past five years China had emerged as a new power on the continent: Trade between China and Africa is growing by nearly 50 percent annually, Beijing may soon become Africa’s top aid donor, and in the winter China hosted nearly every African leader for a historic summit in Beijing. On Hu’s previous trip to the continent, in 2004, African leaders basked in China’s new interest.

But this time around, Hu found a far different welcome. Though he received polite applause from leaders across Africa, he had to cancel part of his trip to Zambia amid fears of street protests over poor safety records at a Chinese-owned mine there. In South Africa and other countries, he faced condemnation in the media for China’s human rights abuses, while across the continent African opinion leaders wondered why China was not doing more to help stop the genocide in Darfur. Before Hu’s visit, Nigerian militants had kidnapped Chinese workers; in April, Ethiopian militants killed nine Chinese oil workers.

more from Boston Globe Ideas here.

The Biology of the Imagination

Simon Baron-Cohen in Entelechy:

In what sense might something as intrinsically human as the imagination be biological? How could the products of the imagination – a novel, a painting, a sonata, a theory – be thought of as the result of biological matter? After all, such artefacts are what culture is made of. So why invoke biology? In this essay, I will argue that the content of the imagination is of course determined more by culture than biology. But the capacity to imagine owes more to biology than culture.

Let’s start with a few definitional issues. What do we mean by ‘imagination’? I do not mean mere imagery, though clearly the imagination may depend on the manipulation of imagery. Imagery is usually the product of one of the five senses (though it can also be generated without any sensory input at all, from the mere act of thinking or dreaming). Imagery typically comprises a mental representation of a state of affairs in the outside, physical world. I don’t want to put you off from reading this essay by littering it with jargon, so let’s just think of a mental representation as a picture in your head. That is what we are going to be calling an image, but that is not the same as imagination. Consider why not.

When we create a visual image of a specific object in our mind, the image as a picture of the object has a more or less truthful relationship to that object or outside state of affairs. If the image is a good, faithful representation, it depicts the object or state of affairs accurately in all its detail. So, mental images typically have ‘truth relationships’ to the outside world.

More here.

Gabriel Garcia Marquez at last journeys home

John Otis in the Houston Chronicle:

When a passenger train crawled into the station bringing Gabriel Garcia Marquez back to his hometown for the first time in 24 years, tears welled up in the Colombian writer’s eyes.

So many well-wishers showed up at the white-washed terminal in Aracataca — the setting for the surreal village of Macondo in his epic novel One Hundred Years of Solitude — that police officers struggled to clear a path for him. Everywhere the Nobel Prize-winning author paraded under overcast skies in a horse-drawn carriage, people chanted his nickname, breaking it into two distinct syllables, “Ga-bo! Ga-bo! Ga-bo!”

But the full-throated welcome for the man people here reverently call “El Nobel” also may prove to be a final farewell.

Now a silver-haired 80-year-old who was diagnosed with lymphatic cancer in 1999, Garcia Marquez is clearly slowing down. Although friends say he continues to write for two or three hours a day, it’s uncertain when he’ll finish his next book, the second part of a planned three-volume memoir.

More here.  [Thanks to Ruchira Paul.]

How People Come to Believe They Were Kidnapped by Aliens

Gustav Jahoda reviews Abducted by Susan A. Clancy, in Metapsychology:

Half a century ago, during the cold war, the social psychologist Leon Festinger and his colleagues studied a millennial sect who believed that the earth was going to be destroyed, but that they would be saved by extra-terrestrials. The book by Susan Clancy deals, in lively semi-popular fashion, with a similar topic. It opens with a refreshingly candid account of how she came to embark on such unusual research. She presents extracts from interviews with people of varying backgrounds who shared a set of — to us — weird beliefs. For instance, a number of them were convinced that they had been taken aboard a space ship where they became the objects of sexual or medical experimentation.

The question asked is how it is possible for 21st-century Americans to have such strange thoughts. Clancy’s main approach was connected with her special interest in ‘false memories’, a phenomenon extensively investigated in relation to alleged child sex abuse. Such memories were often elicited by psychiatrists or other therapists, and similarly she found that most of her ‘abduction’ cases had ‘either sought out or fell into the hands of an abduction researcher [who was a believer] or therapist.’ These ‘experts’ apparently tended to reinforce the beliefs, if not actually shape them.

More here.

Bringing the War Home

Laura Hanna and Astra Taylor in The Nation:

Pointing imaginary guns and roughing up “Iraqi civilians”, a group of antiwar veterans brought the realities of the Iraq debacle to Manhattan, in a Memorial Day protest that briefly turned the streets of the city into a combat zone. In “Operation First Casualty,” a half-dozen members of Iraq Veterans Against the War employed the tactics of street theater to stage mini-dramas in Times Square, Union Square and the World Trade Center site, simulating sniper fire and staging mock arrests of fellow protesters who portrayed Iraqis. The group plans to take Operation First Casualty to the streets of Chicago June 17.

Why Islamic Hijab

Jahanshah Rashidian in Butterflies and Wheels:

With the arrival of spring, the Islamic Republic of Iran’s police have launched this year’s round of their traditional crackdown on women’s dress. Such crackdowns have become a regular feature of life for Iranian women. The crackdown is meant to force women to respect the strict Islamic dress code.

Under Iran’s Islamic laws (Sharia) women are obliged to cover their body from head to toe with a black chador or at least long, loose-fitting clothes to disguise their whole figures. The Islamic dress code is severely imposed at this time. Violators can receive lashes, fines or imprisonment.

Since the IRI came into existence, not a day has passed without attack, physical assault, arrest, acid-throwing, harassment and psychological pressure on women in Iran. The IRI has clearly specified that for women no other sort of dress is permitted except the Islamic hijab.

The first question is: why does the IRI since 1979 stubbornly impose Islamic hijab on women of different social backgrounds, ethnic groups, and religious minorities?

More here.

Bush’s Mistake and Kennedy’s Error: Self-deception proves itself to be more powerful than deception

Michael Shermer in Scientific American:

The war in Iraq is now four years old. It has cost more than 3,000 American lives and has run up a tab of $200 million a day, or $73 billion a year, since it began. That’s a substantial investment. No wonder most members of Congress from both parties, along with President George W. Bush, believe that we have to “stay the course” and not just “cut and run.” As Bush explained in a speech delivered on July 4, 2006, at Fort Bragg, N.C.: “I’m not going to allow the sacrifice of 2,527 troops who have died in Iraq to be in vain by pulling out before the job is done.”

We all make similarly irrational arguments about decisions in our lives: we hang on to losing stocks, unprofitable investments, failing businesses and unsuccessful relationships. If we were rational, we would just compute the odds of succeeding from this point forward and then decide if the investment warrants the potential payoff. But we are not rational–not in love or war or business–and this particular irrationality is what economists call the “sunk-cost fallacy.” The psychology underneath this and other cognitive fallacies is brilliantly illuminated by psychologist Carol Tavris and University of California, Santa Cruz, psychology professor Elliot Aronson in their book Mistakes Were Made (But Not by Me) (Harcourt, 2007). Tavris and Aronson focus on so-called self-justification, which “allows people to convince themselves that what they did was the best thing they could have done.”

More here.

Slam Dancing for Allah

From Newsweek:

June 11, 2007 issue – It’s near midnight in a small Fairfax, Va., bar, and Omar Waqar stands on a makeshift stage, brooding in a black tunic and brown cap. He stops playing his electric guitar long enough to survey the crowd—an odd mix of local punks and collared preps—before screaming into the microphone: “Stop the hate! Stop the hate!” Stopping hate is a fairly easy concept to get behind at a punk-rock show, and the crowd yells and pumps its fists right on cue. But it’s safe to say that Waqar and his band, Diacritical, aren’t shouting about the same kind of hate as the audience. Waqar wants to stop the kind that made people call him “sand flea” as a kid and throw rocks through the windows of the Islamic bookstore he worked at on 9/11. Waqar, 26, the son of a Pakistani immigrant, is a Muslim—a punk-rock Muslim.

More here. (Thanks to Hassan Usmani.)

Sir Edward Elgar: Allegro vivace e nobilmente

Australian poet and author Peter Nicholson writes 3 Quarks Daily’s Poetry and Culture column (see other columns here). There is an introduction to his work at peternicholson.com.au and at the NLA.

The one hundred and fiftieth anniversary of Edward Elgar’s birth fell on June 2, 2007. This anniversary has been the spur for some strange commentaries in Britain: Elgar’s music isn’t modern enough; its tub-thumping pomp and circumstance state music doesn’t reflect contemporary, multicultural Britain. How symbolic, some say, that Elgar’s face has just disappeared from the twenty pound note to be replaced by that of sensible Scottish economist Adam Smith.

This is political correctness gone barking. It is sobering to see that true greatness can still be blithely consigned, by some, to an imaginary junk heap of artistic detritus. Sensibly, the general public won’t have a bar of these musings, and no doubt Elgar’s music will continue to be comprehensively celebrated and performed.

Australian artists, for many years on the receiving end of condescension from points north, can relate to Elgar’s long struggle to establish himself as composer and artist in a hostile cultural environment. Anyone who tries this face-to-face on Australians these days is in for a rude surprise, though there really isn’t any cure for parochialism and snobbery, which are without end. However, in some parts hereabouts, there is now, somewhat inevitably, an Australia-first campaign whereby anything English is hammered—they think—into the ground: psychic punishment for past colonial sins of omission. Percy Grainger, for example, is held up as a preferable alternative to Elgar. As a republican, I have no desire to linger in antique realms, but I’m not going to be bludgeoned into rejecting greatness when it is there before me. English critics are partly to blame for Elgar’s reputation abroad with their endless references to Elgar’s Englishness. Elgar’s music belongs to the world, not just to England. (There is a specific Australian connection in Elgar’s music. He sets Adam Lindsay Gordon’s poem ‘The Swimmer’ as the last song in Sea Pictures—‘O brave white horses! you gather and gallop, / The storm sprite loosens the gusty reins . . .’) If I stand on Bondi beach on a wild afternoon, hearing the surge of Alassio in my head, or, in mourning, recall that last restatement of the ‘Spirit of Delight’ theme at the end of the Second Symphony (‘the passionate pilgrimage of a soul’), this is not reflux nostalgia. Here is the music equal to the depth of life. If you want, there are certainly plenty of alternatives to choose from.

Grandeur of spirit, and passion, in art, will never be consigned to a use-by date. Elgar’s story is a remarkable one of persistence through the awfulness of the English class system to the creation of great music, the first Britain had experienced since the time of Purcell. Elgar had a large chip on his shoulder because he, and his wife Alice, had to pay heavy dues in getting to this position of eminence. If, as I read, Elgar tried to wangle a peerage for himself, it was only what he deserved. Such splendour in the Malvern grass—the symphonies, the Violin Concerto, the Cello Concerto, the Enigma Variations, The Dream of Gerontius, Falstaff, the mass of ceremonial, occasional and salon music: all this music speaks of the seriousness and loveliness of the world, often with nobility, sometimes with wistfulness and melancholy.

You don’t have to be Roman Catholic to enjoy The Dream of Gerontius. What sort of mindset is it that can’t enjoy music of this kind because it doesn’t fit the listener’s prescriptive personal agenda? There are people who claim to be living, mentally, always in the present moment. The radical, the cutting edge, are what they crave. When you have the pile of bicycle parts waiting for you at the Whitney Biennial, why go mooning over some Degas? How tedious to sit through hours of Tristan when some hipster rap group is about to let loose at the latest ‘in’ venue. But what if Elgar or Degas or Wagner are, emotionally and creatively, more radical and cutting edge than they have ever begun to conceive? I live only in, and for, the present moment. Too bad if the present moment is dullness embalmed and then overhyped by the usual organs of capitalist increase.

Thus do some go their weary way, unaware of the marvels about them, forever out of reach because of ideological posturing or just plain ignorance. Well, I’m not forsaking Elgar for any whim of contemporary fashion and, if you don’t know the music of this composer, do yourself a favour that will repay you in kind one hundredfold.

Knowing he had composed a masterpiece, Elgar wrote at the end of the score of Gerontius the following words of Ruskin: ‘This is the best of me; for the rest, I ate, and drank, and slept, loved and hated, like another; my life was as a vapour, and is not; but this I saw and knew; this, if anything of mine, is worth your memory.’ However much these words apply to Gerontius, they also apply to the whole. The grocer’s daughter who became Prime Minister of the United Kingdom once told the nation, in very different circumstances, to rejoice; and we may well say of the piano tuner’s son who became a composer of world renown: rejoice that such a person may triumph, and that such music can be. 

Below the Fold: Is There a Doctor in the House?

Michael Blim

I lost my family doctor last week. Or rather, he has decided to undertake a boutique practice, and I can’t afford the annual fee. So, in a way, he is leaving me, and I am not happy about it.

Last week a friend of mine received a “Dear John” letter from her family physician. He is cutting 400 patients from his list, and he had selected her to be one of them. He is leaving her, and she is not happy either.

My friend and I are medically well connected. Both my partner and my friend work at a major research hospital in Boston. They know lots of doctors who know lots of doctors. And still, both of us are scrambling to find a family physician. Many have closed their practices to new patients, and even a well-placed word doesn’t always unlock their availability.

There must be primary care doctors who are not being overrun by patient demand, but in my quest to find a new doctor, I haven’t come across any in Boston with a good track record who isn’t.

There are a lot of reasons why my friend and I have lost our doctors. First, it would seem pretty obvious that some doctors, particularly primary care doctors, would like to make more money. The average salary for an internist in 2005, according to the US Department of Labor, was $166,000, while surgeons, for instance, made $282,000. My now former family doctor and his new partner plan to serve only 800 patients at a time. As minimum annual membership in the practice costs $3500, this means that they begin each year with $2.8 million in income from their patients that is in addition to insurance reimbursements they will receive for services rendered. Clearly, money is an important motivation.

(For the record, it should be noted that the federal government limits the number of doctors that can be trained, and the medical profession has not founded new medical faculties that could produce more doctors. As the number of doctors has actually declined 17% since 1983, according to the federal government, this would seem to be an odd set of decisions given the rising demand for medical care.)

Second, if we consider the doctor reducing his patient load, money is not likely his concern. He sold his practice some time ago to a large research hospital that covers all office expenses and pays him a very good salary, much more I would guess than the national average. The load-reducing doctor easily earns his keep by providing the hospital with millions of dollars in “billables” through referrals for treatment.

The load-reducing doctor doubtless wants to return some sanity to his professional life, and the same could probably be said for my doctor starting up the boutique practice. My doctor going boutique is doing it with money by increasing his yield per patient, and in effect lowering the demand for his services. He is also no longer subject to the possible productivity demands of hospital overlords who seek in turn to boost hospital profits. He is regaining some professional autonomy, though at a very high price for his patients.

The rub, it seems to me, lies with the load-reducing doctor who has remained with the hospital. It likely makes good sense to slim down his patient list, unless his hospital is paying a per capita bonus, or it pays him for exceeding a goal for the number of patients he sees per year. He probably will not be paid less because of his market value and reputation. His practice has been closed for years, and his remaining patients are likely to cling to him for his competence and for the security of having a family doctor. He will not want for patients, though the hospital might miss some referrals.

In all though, I bet he will barely feel the difference. It might work out a bit better for his patients, whose access to him might marginally increase. But his workload and work routine probably won’t change much, as the remaining patients will take advantage of the opportunity to get more care.

This is because our health is so valuable to us that we will seek as much care as possible because we can never tell what is enough. The more opportunities that are offered, the more care we seek. As our demand for care increases, more care is offered.

Here are a few examples of how our consumption of medical care is growing. In 2004, according to the Centers for Disease Control, Americans made 1 billion doctor visits, and the rate of increases in doctor visits is running about three times the rate of our population growth. We made 35 million visits to the hospital in 2004, and the number of stays is growing by about 3% a year.

Services such as diagnostic radiology and commodities such as prescription drugs are growing much faster, both in quantity and cost. Radiology billings are increasing at the rate of 20% a year, hitting $100 billion in 2006. Americans now consume an average of 11 drug prescriptions yearly, and their costs are rising faster than the rate of inflation.

We spent $2 trillion on health care in 2005, a figure that amounts to 16% of our gross domestic product. By 2015, we will be spending $4 trillion a year, thus devoting 20% of our gross domestic product to health care.

We are definitely consuming more health care, even though we cannot determine how much health care is enough. In addition, as the population ages, and our collective health faces more challenges, we are likely to seek more care. And as one national scheme or another covers millions of uninsured Americans, they can finally and fully meet their health needs, and will consume more care. The good news for the currently uninsured is that, if Medicare is any guide, their health status will improve measurably, as did the health status of the elderly.

Thus, there is surely unmet need, even in the face of galloping demand. This anomaly dissolves once one recognizes that there are many in America who don’t get enough, and many who can’t get enough. For people who just can’t get enough, their desire for more is transforming health care into a “quality experience,” from boutique medical practices to boutique wings of hospitals.

In the past, rank and wealth surely had its privileges in towns and cities where the rich could support society doctors and special hospital accommodations. The new, more numerous class of well-paid managers and professionals that has grown up since World War II has recreated American health care as a growth industry which they run and from which they profit. They have pushed the medical profession to provide care as Henry Ford pushed his workers to make Fords. Doctors, once accorded elite status by virtue of their profession, now pursue entrepreneurial projects via boutique practices, incorporation, and the refusal of insurance. Many are converting medical practice into a business model.

Even as some doctors rebel, escape, or go out on their own in some out-of-the-way place, others, like the load-reducer looking for sanity for himself and better service for his patients, are becoming cogs in the wheel of large vertically integrated firms. They refer clients to a capital-intensive medical machine run by managers and doctors with profit-based business plans. Every hospital caught up in the race believes that it will soak up the growing demand by providing an ever growing supply of machines, beds, day surgeries, and, importantly, innovative cures for the very sick.

As the supply grows, so in turn does demand once more, fed by our unquenchable desire for more health and more well being. We return once again to the question: What is enough? The answer is presently unknowable for three reasons.

First, enough is defined now in terms of differential resources. If you have money or even proxy money such as insurance, private or federal, you can answer the question much more robustly. The lack of money or insurance for others defines in effect what is enough for them. The fact that persons making less than $20,000 a year spend 15% of their income on medical care and those making more than $70,000 spend just 3% of their income on medical care suggests how much more low income individuals are affected by medical costs than those with high incomes. I suspect, though I cannot prove it here, that high income individuals not only are better protected by insurance, but that they have more disposable income to devote to more health care consumption. This last might explain why hospitals are installing those hotel-like hospital wings complete with chefs and concierges, and thus making health care into a “quality experience.”

Second, the private system of medical care today is driven by the profit motive, under which expanding our notion of what is enough in part creates greater demand for providers’ products. For them, more is better, particularly as professional medical knowledge and ethics are being subjected to a business model. They can answer for their own interests, but their opinions of necessity are partial, and would probably boil down to the argument that we can never have enough, as health is more generally treated as an immeasurable human good.

Third, self-preservation, however keenly desired, does not necessarily encourage rational choices. To the question of what is enough, for many, the answer is simply the egoistic reply of what is best for them.

Our two practitioners, like us, are struggling to answer the same question in their ethically loosened and bureaucratized world. Their choices – one to ration care by making it expensive, and the other to ration care by eliminating patients – are the unfortunate products of a system incapable of rationalizing itself.

When we think of national health insurance, I believe that we think largely of satisfying the unmet demands of a near majority of Americans for quality medical care. However, we often fail to realize that any national health system will come to pass as a descendant of the one we have now, one in which the question of what is enough is answered by an anxious, insatiable demand for care in an environment of relative indifference to the needs of others. As so often happens in America, it is hard to note the suffering of others at the same time resources are expanding for the things that other individuals value.

A new national system must not only provide medical care equitably for the first time in American history, but it also must develop a collective answer to the question of what is enough. It involves answering the practical question of how much of our national income we want to devote to medical care as against other goods and services. It also involves re-setting the moral terrain through collective agreement based upon an ongoing investigation of what care is necessary for a decent life.

Random Walks: Tunnel Vision

James Cameron’s 1997 blockbuster movie Titanic broke box office records and garnered bushels of awards; it remains one of the top-grossing films of all time. A large part of its appeal lay in the central (fictional) story of the doomed young lovers, socialite Rose DeWitt Bukater (Kate Winslet) and impoverished American artist Jack Dawson (Leonardo DiCaprio), who ultimately sacrifices his own life to save hers. A good tragic love story is a time-tested recipe for cinematic success.

But even people whose heartstrings remained untugged by the tearjerker tale couldn’t help but be entranced at the spectacle of the great Titanic brought to life on the big screen, and the lavish re-enactment of its tragic sinking makes for stellar cinema. There’s just something about the Titanic that resonates with us on a deep, subconscious level, and it is that element that ultimately raises Cameron’s film above mere Hollywood bathos.

It’s tough to put one’s finger precisely on just what that “something” is, but sci-fi author Connie Willis managed to do just that in her 2001 novel, Passage, which I recently re-read for the umpteenth time while recovering from a virus. I’ve been a Willis fan for years, ever since I read her short stories “At the Rialto” and “Even the Queen” (included in the collection Impossible Things). She has the skeptical mind of a scientist — her husband is a physicist, which probably helps — and the soul of a poet, mixing science fact, science fiction, literary allusion, and metaphor with memorable characters and terrific story-telling. Her broad popular appeal is evidenced by her winning nine Hugo Awards and six Nebula Awards (thus far), making her “one of the most honored sci-fi writers of the 1980s and 1990s.”

Willis has tackled time travel, chaos theory, and the sociology of fads (Doomsday Book, To Say Nothing of the Dog, Bellwether), but in Passage, she immerses herself in what is for many the Ultimate Question: is there life after death, a part of our consciousness — a soul, if you will — that continues even after the body dies? And to explore her theme, she delineates the science of Near-Death Experiences (NDEs): the proverbial light at the end of the tunnel, at least if the estimated 7 million people who claim to have experienced an NDE are to be believed. Written accounts of NDEs date back some two thousand years, and hail from all over the world. There’s even a research foundation devoted to the study of NDEs.

The man responsible for coining the term “near-death experience” is Raymond Moody, an MD who has written several books on the afterlife based on patient testimonials, and who believes NDEs are evidence of a soul (consciousness that exists separately from the brain) and of the existence of an afterlife. He’s even boiled the typical NDE down to a few key features. First, there’s a strange kind of noise, alternately described as a ringing or a buzzing. There is a sense of blissful peace, and often an out-of-body experience (feeling as if one is floating above one’s body and observing it from that vantage point). There’s that light at the end of the tunnel, being met by loved ones, angels, or other religious figures, and a kind of “life review” — seeing one’s life flash before one’s eyes. But as the Skeptic’s Dictionary helpfully points out, Moody’s books ignore the fact that as many as 15% of NDEs are outright hellish experiences.

It’s not all mystical New Age spiritualism, or even overt religiosity. There have been some solid scientific studies of what happens to the brain during such events, most notably a 2001 Dutch study published in the prestigious British medical journal, The Lancet. The researchers examined 344 patients who were resuscitated after suffering cardiac arrest, and interviewed them within a week afterwards about what — if anything — they remembered. The results were a bit startling: about 18% reported being able to recall some portion of what happened when they were clinically dead, and between 8 and 12 percent said they experienced some form of an NDE.

Neurochemistry offers some convincing alternative explanations, namely, that NDEs aren’t evidence of an afterlife, but illusions created by a dying (oxygen-deprived) brain. Cardiac arrest and the anesthetics used in ERs are capable of triggering NDE-like brain states. The Dutch researchers found that “similar experiences can be induced through electrical stimulation of the temporal lobe,” for instance, as can neurochemicals such as endorphins and serotonin, and hallucinogenic drugs like LSD and mescaline. Last October, an article in New Scientist described a new theory by University of Kentucky neurophysiologist Kevin Nelson attributing NDEs to a kind of “REM intrusion”: “Elements of NDE bear uncanny similarity to the REM state,” he told the magazine. He describes REM intrusion as “a glitch in the brain’s circuitry that, in times of extreme stress, may flip it into a mixed state of awareness where it is both in REM sleep and partially awake at the same time.” Something similar might be happening with NDEs, he reasons, although the jury is still out on that particular hypothesis.

Karl Jansen has managed to induce NDEs with ketamine, a hallucinogen related to PCP, but far less destructive; it’s an anesthetic that works not just by dulling pain, but by creating a dissociative state. According to Jansen, the conditions that give rise to NDEs — low oxygen, low blood flow, low blood sugar, and so forth — can kill brain cells, and the brain often responds by triggering a flood of chemicals very similar to ketamine to protect those cells, which would produce “out of body” sensations and possibly even hallucinations. Jansen claims his approach can reproduce all the main elements Moody attributes to NDEs: the dark tunnel with a light at the end, out of body experiences, strange noises, communing with god, and so on.

Why do so many people see a light at the end of the tunnel? Susan Blackmore, a psychology professor at the University of the West of England in Bristol, thinks she might have an explanation: neural noise. During cardiac arrest, in the throes of death, the brain is deprived of oxygen, causing brain cells to fire rapidly and quite randomly in the visual cortex. There are lots of cells firing in the middle, and fewer towards the outer edge, producing white light in the center fading into dark at the outer edges. That feeling of peace and well-being might be due to the fact that the brain is pumping out endorphins in response to pain, which can produce a dream-like state of euphoria. That same cerebral anoxia might also cause the strange buzzing or ringing sound people claim to hear when they enter an NDE.

Willis takes that tiny bit of factual thread and spins it into a complex scientific mystery, skewering cheap spiritualism in the bargain. She got the idea for the book when a friend urged her to read a book on NDEs, insisting she would find it inspiring. Instead, Willis loathed it. In fact, it made her angry: “I thought it was not only pseudoscience, but absolutely wicked in the way it preyed on people’s hopes and fears of death, telling them comforting fictions.” She channeled that anger into the novel, creating the character of Dr. Maurice Mandrake, a physician who has abandoned all pretense of scientific objectivity, prompting his “witnesses” to say what he expects to hear — a strategy that has made him a best-selling author of just the sort of book that triggered Willis’ creative rage. (The savvy reader will note some striking likenesses to Moody.)

Mandrake’s foil in the novel is the main protagonist, Joanna Lander, also a doctor studying NDEs, but her approach is far more rational, firmly grounded in the scientific method — which puts her at odds with Mandrake and his minions. She finds an ally (and potential love interest) in neuroscientist Richard Wright, who has contrived a way to induce NDEs using a psychoactive drug called dithetamine. (I guess one could say he’s the fictional counterpart to Jansen.) His theory is that the NDE is a survival mechanism, part of a series of strategies the brain employs whenever the body is seriously injured. The NDE is a side effect of neurochemical events.

While NDEs seem to have some striking similarities in the various recorded accounts, what specific form the NDE will take depends on the individual, and for Willis, this provides an opportunity for a clever twist. Recall the now-famous final scene in Cameron’s Titanic, when the elderly Rose, having lived a long, full life, dies quietly in her sleep. The camera follows her “soul’s” journey out of her body, towards a white light, then down into the ocean depths until she reaches the present-day wreck. As the camera moves into the main dining hall, we see the shipwreck morph back into its former unsunken glory, and all those who perished are on hand to welcome Rose back to the fold. Waiting at the top of the grand staircase is Jack himself, who takes Rose’s hand, her youth restored, and the lovers are reunited in eternity. (Cue violins and a thousand sobbing fans, followed by that overblown — and overplayed — Celine Dion ballad.)

But what if Rose hadn’t willingly gone into the light? Then she might have popped back to her reality as an elderly woman in the middle of the ocean, on a complimentary junket to re-visit the sunken ship because she alone might know the secret of the fabulous lost diamond necklace. And her “vision” would have qualified as a classic NDE. Joanna would definitely have wanted to interview her. Not only would Rose have made an excellent witness, but the two share a common NDE framework. When Joanna consents to let Richard induce NDEs in herself as a subject, she also finds herself on the Titanic — the very day of its sinking.

Of course, it isn’t really the Titanic: she knows that, even though the experience feels uncannily real, nothing like a typical dream state. But Joanna is convinced there’s a reason her subconscious has picked this particular framework in which to place her NDE. The Titanic is the perfect metaphor for what is happening as the brain strives to make sense of things even as it is dying (or pseudo-dying, in the case of induced NDEs). The body is a sinking ship, the chemical signals and electrical impulses are SOS messages trying to find some form of rescue, some way of jump-starting the body before brain death sets in, within four to six minutes after the onset of oxygen deprivation. The metaphor of the Titanic, says Joanna’s former English teacher, is “the very mirror image of death.”

Willis knows a little something about death, having lost her mother quite suddenly when she was 12, an experience she has described as “a knife that cut across my life and chopped it in two. Everything changed.” But she didn’t take refuge in the popular Hallmark sentiments that often pass for profundity in this country. “[O]ur American culture is especially in denial about death,” she said in a 2003 interview, and while writing Passage, “I wanted to make sure some reader who had just had somebody die… would say, ‘Thank you for telling the truth and trying to help me understand this whole process.’”

And what a brutal truth it is. The novel ends by taking us right into the dying brain of one of the characters, quite conscious and very aware of what these visions represent: synapses firing randomly, memories falling away, even language being lost, before the visual cortex shuts down completely. The depiction is unflinching, and all the more powerful because it never resorts to easy platitudes. It’s not an easy book to read; it’s decidedly unsettling, yet I return to it again and again. Maybe it’s because Willis stops short of letting everything go dark; in the end, she acknowledges that perhaps some things cannot  — and need not — be known. By leaving things open-ended, Passage reassures us that it’s okay for this to remain unknown. Our only job, as human beings, is to make the life journey as rich and meaningful as possible — in whatever way we choose to do so.

It’s safe to say that a vast majority of the world’s population professes to believe in some form of life after death, even if the form it takes differs from culture to culture; such belief is far more comforting than the alternative. No doubt many of those believers resist any attempt by scientists to explain their experiences from a rational perspective. Like Maurice Mandrake and his fawning acolytes in Willis’ novel, they want their mystical/spiritual trappings, and consider alternative scientific explanations to be a kind of cheapening of such beliefs and experiences.

Even before I became a science writer, I never understood that attitude. How much more magical and wondrous to discover that there is a rational reason why we experience such things. “[P]eople are trying to validate their experience by making these paranormal claims, but you don’t need to do that,” Blackmore told ABC News when the Lancet study was published. “They’re valid experiences in themselves, only they’re happening in the brain and not in the world out there.”

When not taking random walks at 3 Quarks Daily, Jennifer Ouellette blogs about science and culture at Cocktail Party Physics. Her most recent book is The Physics of the Buffyverse (Penguin, 2007).

Blair’s Legacy and Brown’s Prospects

by Matthias Matthijs

The curtain is finally falling on Tony Blair’s tenure in 10 Downing Street. After ten years as prime minister, Blair will step down on June 27. He will be succeeded by his stoic chancellor, Gordon Brown, although many Labour Party faithful are far from enthusiastic about the pending “coronation.” After almost ten years of Blair in Number 10, there is a widespread sense of disappointment – across the political spectrum – with the man who promised to create a “new Britain” and pledged to forever change the face of British society. On the left, there is frustration with New Labour’s slow progress in poverty reduction and the unacceptable levels of income inequality, while on the right there is criticism of excessive micromanagement and endless tinkering with the tax system, which supposedly harms the country’s international competitiveness. An overwhelming majority of the British population disapproved of his foreign policy, seen as too close to the Bush administration and as often actively harming fundamental British interests.

What started out as a potentially brilliant premiership in 1997 was halted by excessive caution in domestic politics during the first term and his controversial decision to join US forces in what most Britons saw as an illegal war in Iraq during his second term. Where Thatcher was able to fuel nationalistic fervor and pride in 1982, and mask her initial failure in economic policy with a resounding military victory over Argentina in the Falklands, Blair’s disastrous adventure in Iraq would soon overshadow his successes in the domestic economy. History will judge Blair as a rather weak consolidator of Thatcherism, both in domestic and foreign policy. During his first term, most of Thatcher’s economic reforms were made all but irreversible by formally institutionalizing them (Bank of England independence forever banished the goal of full employment to the dustbin of history, and Brown’s “golden rule” made a virtue of fiscal austerity). Market mechanisms have been infused into all parts of government, something even Thatcher would not have dreamt of. In foreign affairs, Blair’s instinctive approach to relations with Europe and America has been the same as Thatcher’s. However, Thatcher was much more independent from Washington than Blair, more ready to take up the plight of the Palestinians and criticize Israeli policy if she felt it justified. And whatever has been said about the Falklands, it could be defended in terms of the British national interest, which is not the case for Blair’s participation in the Iraq war.

Given the vague ideas of New Labour and Blair’s internal Party reforms while in opposition, one should probably not be so surprised that Blair and Brown acted the way they did once in power. It was never their intention to attack the emerging Thatcherite consensus in economic policy, most of which they saw as inevitable in an increasingly globalized world – even though they could have made the case against it and certainly had the overwhelming majority to do so. From the very beginning, Blair’s goals in government were much more modest than Thatcher’s were in 1979, and consisted in merely tweaking what he saw as a fundamentally sound economic system inherited from Major and Clarke. The fact that New Labour had such a large majority in the House of Commons, the widespread disillusion with the Tories after eighteen years in power, together with the ceaseless rhetoric and promise of “a New Settlement for a New Britain,” created excessively high hopes and unrealistic expectations – especially on the left – for the incoming government.

After ten years in power, it is clear that Blair is not in the same league as Attlee or Thatcher, and will never be seen as a politician who “changed the weather.” Maybe he was unlucky in this respect. Both Attlee and Thatcher came to power supported by a changing tide of the dominant ideology. It should have been clear from the start that The Third Way – or at least the interpretation given to it by New Labour – was ideationally too close to Thatcherism to create widespread enthusiasm and significant change. In the end, Mrs. Thatcher herself is perhaps the best judge. During her 81st birthday party in London last October, a guest asked her what she thought of new Tory leader David Cameron. With a smile, she answered: “I still prefer Tony Blair.”

But what are the prospects for Gordon Brown? Maybe he is secretly hoping that he can repeat the stunt of John Major in the general election of 1992. He is wrong for two reasons. Firstly, Major – for all his shortcomings – was a relatively new face that Britain seemed to trust. He had only been in high politics for about a year and a half when he got the top job. Brown is as much associated with New Labour as Tony Blair – if not more so. Secondly, the opinion polls in October 1990 showed that 34.3% intended to vote Conservative at the next general election against 46.4% for Labour. Just two months later, in December 1990, after the defenestration of Margaret Thatcher, 44.6% of the electorate told Gallup they would vote Tory compared to 39.1% for Labour. It is unlikely that Brown can bring about the same dramatic change in public perceptions. One recent poll by The Guardian gives Cameron’s Conservatives a lead of 10 points if Brown were to face an election as Prime Minister. The similarities with Al Gore in 2000 become more striking by the day.

The author is a Professorial Lecturer in Economics at Johns Hopkins University in Washington, DC.

Andrew Sullivan’s Quote of the Day

From The Daily Dish:

“The effects of isolation, anxiety, fatigue, lack of sleep, uncomfortable temperatures, and chronic hunger produce disturbances of mood, attitudes and behavior in nearly all prisoners. The living organism cannot entirely withstand such assaults. The Republicans [Communists] do not look upon these assaults as ‘torture.’ But all of them produce great discomfort, and lead to serious disturbances of many bodily processes; there is no reason to differentiate them from any other form of torture…

The CIA [KGB] hardly ever uses manacles or chains, and rarely resorts to physical beatings. The actual physical beating is, of course, repugnant to overt Republican [Communist] principles and is contrary to C.I.A. [K.G.B.] regulations…

Prisoners are tried before “military tribunals,” which are not public courts. Those present are only the interrogator, the state prosecutor, the prisoner, the judges, a few stenographers, and perhaps a few officers of the court…

In typical Republican [Communist] legalistic fashion, the O.L.C. [N.K.V.D.] rationalized its use of torture and pressure in the interrogation of prisoners of war. When it desired to use such methods against a prisoner or to obtain from him a propaganda statement or ‘confession,’ it simply declared the prisoner an enemy combatant [a “war-crimes suspect”] and informed him that, therefore, he was not subject to international rules governing the treatment of prisoners of war,” – “Communist Interrogation,” The Annals of Neurology and Psychology, 1956.

The Heraclitus of New Hampshire

Eric Ormsby in The New Criterion:

Not surprisingly for “one acquainted with the night,” Robert Frost cultivated a lifelong penchant for dark sayings. These sayings included aphorisms and maxims, apothegms and proverbs, wise saws and the occasional bon mot, alongside interjections, exclamations, and guffaws, interrupted thoughts and broken utterances. They were dark because they riddled, sometimes as much by their sound as by their content. Many, of course, made their way into his finest poems. “Good fences make good neighbors” is the obvious example, but the closer you look the more you find. So strong is this tendency in Frost’s poetry that even his less aphoristic lines have taken on a lapidary sheen. “And miles to go before I sleep,” though hardly an aphorism, is often intoned as though it were. These dark sayings of our own Heraclitus of New Hampshire have by now become so familiar as to appear immemorial folk wisdom. And yet, clad in cunning homespun though they are, they conceal contradictory flashes of wit as well as mischief. Like the milkweed pod with its “bitter milk,” of which he wrote so memorably, their rough husks hold hidden, and sometimes ticklish, silks.

Now, with the publication of his Notebooks, we can gauge just how fundamental such fragmented wisdom was to Frost’s own peculiar cast of mind.

More here.

The Other Einstein

Lee Smolin in the New York Review of Books:

In his new book, Einstein: His Life and Universe, Walter Isaacson explains that

studying Einstein can be worthwhile [because] it helps us remain in touch with that childlike capacity for wonder…as the sagas of [science’s] heroes reminds us…. These traits are…vital for this new century of globalization, in which our success will depend on our creativity….

As he elaborates in a recent interview with Thomas Friedman, “If we are going to have any advantage over China, it is because we nurture rebellious, imaginative free thinkers, rather than try to control expression.”[1]

Noble sentiments, and certainly sufficient justification for continuing to promulgate uplifting myths about science and its heroes. But what does this have to do with the actual character and life of the real person who happened to be the most important physicist of the last two hundred years? There is no doubt that any attempt to understand who Einstein actually was and what he actually did is hampered by a smokescreen that was created by his executors, his colleagues, his biographers, and perhaps even Einstein himself. The myth of Einstein presents us with an elderly sage, a clownish proto-hippy with long hair, no socks, and a bumbling, otherworldly manner. As Isaacson writes:

Adding to his aura was his simple humanity. His inner security was tempered by the humility that comes from being awed by nature. He could be detached and aloof from those close to him, but toward mankind in general he exuded a true kindness and gentle compassion.

This certainly describes a role that the older Einstein might plausibly have chosen to play as a defense against the onslaught of fame and responsibility. But what Isaacson is describing is a role, not a human being. Who was the person behind that role, and what were his reasons for playing the endearing sage?

More here.

Death is such a heavy subject, it would be good to make something that laughed in the face of it

It’s particularly fitting that the title of Damien Hirst’s new headline-grabbing work came from an exasperated exclamation of his mother’s: “For the love of God, what are you going to do next?”

The answer, pictured here, is a life-size platinum skull set with 8,601 high-quality diamonds. If, as expected, it sells for around $100 million this month, it will become the single most expensive piece of contemporary art ever created. Or the most outrageous piece of bling.

At home in Devon, Hirst insists it’s absolutely the former. “I was very worried for a while, because if it looked like bling — tacky, garish and over the top — we would have failed. But I’m very pleased with the end result. I think it’s ethereal and timeless.”

more from The NY Times Magazine here.

I Believe In Evolution, Except For The Whole Triassic Period

We can look at the fossil record and trace many of our genetic traits back to ancient species. In fact, scientific reasoning can explain nearly every stage of life from the Big Bang to the present day. I say “nearly” because the period that scientists claim lasted from roughly 205 to 250 million years ago, commonly known as the Triassic period, was quite obviously the work of the Lord God Almighty.

Don’t get me wrong: I’m not one of those religious nut cases who denies that evolution is real. Of course evolution is real, just not during the “Triassic period.”

This so-called Triassic period saw the formation of scleractinian corals and a slight changeover from warm-blooded therapsids to cold-blooded archosauromorphs. Clearly, such breathtakingly subtle modifications could only have been achieved by an active intelligence.

more from The Onion here.