Vlatko Vedral recommends five books

Tom Dannet in Five Books:

Your first book is Quantum Physics: Illusion or Reality? by Alastair Rae.

This is a completely popular book about quantum physics: there is not a single equation in there, I think. What he does is to go through all the major ways in which we try to understand quantum physics, all the major interpretations. It’s extremely good in that he writes in a very objective way and it’s very difficult to tell which one he supports. It’s very passionately argued as well, and it’s a beautiful exposition, very philosophical. I think it’s the best, probably my favourite, popular account of all the things we argue about on the fundamental side of quantum physics.

There are all kinds of strange views on what quantum physics actually is.

Right. There are connections with religion, then there are extremes saying it’s all in the mind: basically that nothing becomes real until we measure it and look at it and consciously record it. On the other side there is a point of view that it’s as real as anything else, out there independently of us and so on. He talks about these two extreme views and what quantum physics tells us about this very old question: whether the world is ideal or real.

Does he resolve it?

He really leaves it open because, to be completely honest about these issues, I don’t think we have something that’s universally accepted as the view: each has lots of positive points but also something that makes it a not completely plausible view to hold. That’s a really nice book.

More here.



Tuesday, July 20, 2010

Tuesday Poem

Mating Chain

When three or more sea slugs mate in unison, the first animal in the chain acts exclusively
as female, the last as male, and the others as male/female simultaneously.

Learning the difference takes so long. Of being demeaned or being
taught to navigate the seafloor. It’s a language of stoplights
and dark folds you never saw creasing. For example, left is actually
below your stomach and to the right is a reef of indigo. Patches of grey
and pink fondle me to sleep. I want to be one of the species
that pins down the other, circling two or more lovers. To push
my flimsy heart forward in the currents. Lithe as eelgrass,
drunk on endorphins. The best a body can do
is fold itself in half, flapping flail, repetition
of loneliness. But what’s the difference between this hunger
and parasitic tendency? I twist and steer each tentacle,
tying knots against the stillness. This one to symbolize love and the other,
savagery. I’m learning the subtlety, braiding between them.

by Kelly Anne Noftle
from Blackbird, Vol. 9 No. 1

Two Books about Noise

From The Telegraph:

Loudness isn’t all bad and silence can be deathly. A padded cell would be more terrifying than the roar over Heathrow, and deafness is well known to be more alienating than blindness. Noise is invigorating: it wakes us up and warns us of danger. There are millions like Spinal Tap’s Nigel Tufnel who delight in turning the volume up to 11 as they listen to heavy metal and hip hop.

Mother Nature seems to agree: she abhors silence. The explosion of Krakatoa was heard 3,000 miles away, and the peace of the countryside is an urban myth: rushing water, the dawn chorus, mating animals or a storm are far noisier than the hum of traffic. Another myth is the peace of the past. Blacksmiths, horses’ hooves on cobbles and hawkers made a filthy racket in the pre-industrial city. Whenever I am tempted to rip that wretched iPod from a teenage ear, I remind myself of the ghetto-blasters and transistor radios a generation ago. Manual typewriters clattered far louder than computer keyboards. Yet noise is a terrible problem in the modern world, and one salutes both George Prochnik and Garret Keizer for proselytising on behalf of a bit more hush. Although they both write from the United States, the noisiest country in the world, and inevitably cover a lot of the same ground, their approaches are different and complementary.

More here.

Earliest Steps to Find Breast Cancer Are Prone to Error

From The New York Times:

Monica Long had expected a routine appointment. But here she was sitting in her new oncologist’s office, and he was delivering deeply disturbing news. Nearly a year earlier, in 2007, a pathologist at a small hospital in Cheboygan, Mich., had found the earliest stage of breast cancer from a biopsy. Extensive surgery followed, leaving Ms. Long’s right breast missing a golf-ball-size chunk. Now she was being told the pathologist had made a mistake. Her new doctor was certain she never had the disease, called ductal carcinoma in situ, or D.C.I.S. It had all been unnecessary — the surgery, the radiation, the drugs and, worst of all, the fear. “Psychologically, it’s horrible,” Ms. Long said. “I never should have had to go through what I did.” Like most women, Ms. Long had regarded the breast biopsy as the gold standard, an infallible way to identify cancer. “I thought it was pretty cut and dried,” said Ms. Long, who is a registered nurse.

As it turns out, diagnosing the earliest stage of breast cancer can be surprisingly difficult, prone to both outright error and case-by-case disagreement over whether a cluster of cells is benign or malignant, according to an examination of breast cancer cases by The New York Times. Advances in mammography and other imaging technology over the past 30 years have meant that pathologists must render opinions on ever smaller breast lesions, some the size of a few grains of salt. Discerning the difference between some benign lesions and early stage breast cancer is a particularly challenging area of pathology, according to medical records and interviews with doctors and patients.

More here.

no living room


What exactly is a living room? Is it a formal room for special occasions, or a casual space for everyday life? The meaning has been unclear ever since the late 17th century, when architects first considered what “living” in the home meant. In 1691, in the first edition of what was to become a hugely influential architectural manual, “Lessons of Architecture,” Charles Augustin d’Aviler drew a distinction between formal display spaces and a new kind of room, spaces that were “less grand.” D’Aviler used an unusual phrase to describe these new rooms: “le plus habité” — literally the most lived in. This marked the first time that an architect discussed the notion of living rooms, rooms intended for everyday life. Before this, anyone who could afford an architect-designed residence wanted it to serve as proof of status and wealth; almost all rooms were display spaces. But once d’Aviler opened the door, French architects began making rooms for specific activities of daily life integral to the design of the home: initially the bedroom, then dressing rooms and bathrooms. These “less grand” rooms were the original living rooms.

more from Joan DeJean at The Opinionator here.

a secret Plato


It may sound like the plot of a Dan Brown novel, but an academic at the University of Manchester claims to have cracked a mathematical and musical code in the works of Plato. Jay Kennedy, a historian and philosopher of science, described his findings as “like opening a tomb and discovering new works by Plato.” Plato is revealed to be a Pythagorean who understood the basic structure of the universe to be mathematical, anticipating the scientific revolution of Galileo and Newton by 2,000 years. Kennedy’s breakthrough, published in the journal Apeiron this week, is based on stichometry: the measure of ancient texts by standard line lengths. Kennedy used a computer to restore the most accurate contemporary versions of Plato’s manuscripts to their original form, which would consist of lines of 35 characters, with no spaces or punctuation. What he found was that within a margin of error of just one or two percent, many of Plato’s dialogues had line lengths based on round multiples of twelve hundred.

more from Julian Baggini at The Guardian here.

Solve poverty by simply giving out money


There are all sorts of things very poor people living in poor countries don’t have. They lack secondary-school educations, usually, and good medical care. They lack steady work and life insurance, bank accounts and competent legal representation, adequate fertilizer for their crops, adequate protein in their diets, reliable electricity, clean water, indoor plumbing, low-interest loans, incubators for their premature babies, vaccinations and good schools for their children. But the central thing they lack is money. That is what makes them, by definition, poor: International aid organizations define the “very poor” as those who live on less than a dollar a day. Despite this, the global fight that governments and nongovernmental organizations have waged against poverty in the developing world has focused almost entirely on changing the conditions in which the poor live, through dams and bridges and other massive infrastructure projects to bring commerce and electricity to the countryside, or the construction and staffing of schools and clinics, or subsidizing fertilizer and medicine, or giving away mosquito nets or cheap portable water filters. In the last decade, however, the governments of the nations where most of the world’s poorest actually live have begun to turn to an idea that seems radical in its simplicity: Solve poverty and spur development by simply giving out money.

more from Drake Bennett at The Boston Globe here.

Tariq Ali on the recent killings in Kashmir

Tariq Ali in the London Review of Books:

A Kashmiri lawyer rang me last week in an agitated state. Had I heard about the latest tragedies in Kashmir? I had not. He was stunned. So was I when he told me in detail what had been taking place there over the last three weeks. As far as I could see, none of the British daily papers or TV news bulletins had covered the story; after I met him I rescued two emails from Kashmir informing me of the horrors from my spam box. I was truly shamed. The next day I scoured the press again. Nothing. The only story in the Guardian from the paper’s Delhi correspondent – a full half-page – was headlined: ‘Model’s death brings new claims of dark side to India’s fashion industry’. Accompanying the story was a fetching photograph of the ill-fated woman. The deaths of (at that point) 11 young men between the ages of 15 and 27, shot by Indian security forces in Kashmir, weren’t mentioned. Later I discovered that a short report had appeared in the New York Times on 28 June and one the day after in the Guardian; there has been no substantial follow-up. When it comes to reporting crimes committed by states considered friendly to the West, atrocity fatigue rapidly kicks in. A few facts have begun to percolate through, but they are likely to be read in Europe and the US as just another example of Muslims causing trouble, with the Indian security forces merely doing their duty, if in a high-handed fashion. The failure to report on the deaths in Kashmir contrasts strangely with the overheated coverage of even the most minor unrest in Tibet, leave alone Tehran.

On 11 June this year, the Indian paramilitaries known as the Central Reserve Police Force fired tear-gas canisters at demonstrators, who were themselves protesting about earlier killings. One of the canisters hit 17-year-old Tufail Ahmad Mattoo on the head. It blew out his brains. After a photograph was published in the Kashmiri press, thousands defied the police and joined his funeral procession the next day, chanting angry slogans and pledging revenge. The photograph was ignored by the mainstream Indian press and the country’s celebrity-trivia-obsessed TV channels. As I write, the Kashmiri capital, Srinagar, and several other towns are under strict military curfew.

More here. [Thanks to Yousaf Hyat.]

Herbie Hancock’s secret of great musicianship: Do your math and science homework!

Reed Johnson in the Los Angeles Times:

Hancock has placed his mark on modern music like few other performers.

Now he's got a new gig as creative jazz chair of the Los Angeles Philharmonic. In that capacity, he'll be responsible for programming jazz concerts at the Walt Disney Concert Hall and the Hollywood Bowl, bringing in guest artists and possibly commissioning new pieces.

So what are some keys to the professional success and longevity of Hancock, who turned 70 in April and seems as occupied as ever?

Well, one of them is: Study your math and science. As a self-described “techie,” Hancock says his lifelong embrace of electronic experimentation has helped him stay on top as well as take advantage of evolving musical developments.

Here's part of what he had to say on the subject during a recent interview at his Westside home:

“I've always been interested in science. I used to take watches apart and clocks apart, and there's little screws, and a little this and that, and I found out if I dropped one of them, that thing ain't gonna work. When I was a kid, I put things back together and they never worked anyway! But just, like, going into those details, it's kind of a scientist's thing. And I have that kind of [mind], it's part of my personality.”

“I'm one of the people who helped push it in the beginning. It was easier for me because I was an engineering major in college for two years. So when synthesizers came in, they used terminology I knew.”

More here.

Darwin’s Method

Kamil Ahsan in The Box Move:

This essay takes the view that Darwin never worked either purely inductively or deductively. It will demonstrate how Darwin often worked on a hunch, and thus collected his facts not blindly as one might be inclined to believe, but essentially searched for the evidence that could support his hunch of evolution by natural selection. It will further argue that Darwin’s method did not involve mere wide-eyed observation but instead was based on hypotheses that he had already clearly thought about and on analogies from social thought as varied as that of Thomas Malthus and Adam Smith. In order to assess Darwin’s methodology, two levels of analysis will be used. A: Using the Notebooks, Darwin’s recorded thought process will be traced chronologically, marking important occurrences such as his meeting with ornithologist John Gould, and demonstrating through the early effect of Lyell’s geology and Darwin’s unsuccessful hypotheses, that he could not have proceeded inductively. B: Using Darwin’s letters and the Origin, the general themes in Darwin’s collection of evidence to support a work that was two decades or more in preparation will be propounded upon. The themes will thus demonstrate how Darwin selectively chose information to suit his needs especially in the context of Malthusian ideas, and that the best analysis can be made by approaching On the Origin of Species primarily as a work of synthesis and not merely as Darwin’s extrapolation following a great deal of objective observation. When viewing the Origin as a cumulative work, it will also be stressed that Darwin did not simply string together facts from observations in the field of biology, but drew from analogies across disciplines including geology and economics.

More here.

Monday, July 19, 2010

Academic War About War

by Frans de Waal

[Film by The Department of Expansion.]

For many years, anthropologists and biologists have been comparing the aggression of animals with human warfare. It started with Konrad Lorenz in the 1960s, and remains a popular endeavor. The claim is that we have an aggressive instinct that leads to warfare, hence war will always be with us. This message was a bit hard to accept from Lorenz, an Austrian who served in the German army during WWII, but the debate continues, as seen in the video above featuring interviews with Steven Pinker, Richard Wrangham, and myself.

Part of the problem is that modern warfare seems to have little to do with the raw aggressive instinct. Modern warfare rests on a tight hierarchical structure of many parties, not all of which are driven by aggression. In fact, most are just following orders. The decision to go to war is typically made by older men in the capital. When I look at a marching army, I don’t see aggression in action. I see the herd instinct: thousands of men in lock-step, willing to obey superiors.

In recent history, we have seen so much war-related death that we imagine that it must always have been like this, that warfare is written into our DNA. In the words of Winston Churchill: “The story of the human race is War. Except for brief and precarious interludes, there has never been peace in the world; and before history began, murderous strife was universal and unending.” But is Churchill’s warmongering state-of-nature any more plausible than Rousseau’s noble savage? Although archeological signs of individual murder go back hundreds of thousands of years, we lack similar evidence for warfare (such as graveyards with weapons embedded in a large number of skeletons) from before the agricultural revolution. Even the walls of Jericho — considered one of the first pieces of evidence of warfare and famous for having come tumbling down in the Old Testament — may have served mainly as protection against mudflows.

Long before this, our ancestors lived on a thinly populated planet, with altogether only a couple of million people. Earlier still, about 70,000 years ago, our lineage was at the edge of extinction, living in scattered small bands. A study of mitochondrial DNA by genographer Doron Behar suggests: “Tiny bands of early humans developed in isolation from each other for as much as half of our entire history as a species.” These are hardly the sort of conditions to promote continuous warfare. My guess is that for our ancestors war was always a possibility, but that they followed the pattern of present-day hunter-gatherers, who do exactly the opposite of what Churchill surmised: they alternate long stretches of peace and harmony with brief interludes of violent confrontation.

Comparisons with apes hardly resolve this issue. Since it has been found that chimpanzees sometimes raid their neighbors and take their enemies’ lives, these apes have edged closer to the warrior image that we have of ourselves. Like us, chimps wage violent battles over territory. Genetically speaking, however, our species is exactly equally close to another ape, the bonobo, which does nothing of the kind. Bonobos can be unfriendly to their neighbors, but soon after a confrontation has begun, females often rush to the other side to have sex with both males and other females. Since it is hard to have sex and wage war at the same time, the scene rapidly turns into a peaceful gathering. Lethal aggression among bonobos has been unheard of.

Read more »

Seriously, What About Cousin Marriage?

Justin E. H. Smith

*

Books consulted for this essay:

John Boswell, Christianity, Social Tolerance, and Homosexuality: Gay People in Western Europe from the Beginning of the Christian Era to the Fourteenth Century. 8th Edition. University of Chicago Press, 2005.

Robin Fox, Kinship and Marriage: An Anthropological Perspective. Cambridge University Press, 1967.

Maurice Godelier, Les métamorphoses de la parenté. Paris, Fayard, 2004.

Lewis Henry Morgan, Systems of Consanguinity and Affinity of the Human Family. London, 1871.

Martha C. Nussbaum, From Disgust to Humanity: Same-Sex Marriage and Constitutional Law. Oxford University Press, 2010.

Andrew Sullivan, Virtually Normal: An Argument About Homosexuality. Vintage, 1996.

Göran Therborn, Between Sex and Power: Family in the World, 1900-2000. Routledge, 2004.

*

I recently spelled out some of the reasons why I remain doubtful about the prospects for transforming marriage, worldwide, into a gender-indifferent institution. (It is only the worldwide perspective that interests me.) I have not heard, in reply, any substantive arguments against the reasons I give for my doubts, and I have therefore decided that it might be a good idea to try one more time, and this time to make my call for serious engagement more explicit. I would sincerely like to know whether there is something I am missing.

I have been alarmed to see a sort of orthodoxy emerge as if out of nowhere over just the past few years (many of you will be old enough to remember when, in the not-so-distant past, Andrew Sullivan was condemned as a betrayer and a domesticator of the gay spirit for his powerful defense of same-sex marriage in Virtually Normal; I hope no one will try to tell me that everyone who condemned him at the time was, wittingly or un-, an enemy of human rights). This orthodoxy, like its opposite and indeed like all orthodoxies, presumes that any questioning of it amounts to hostility. There is no room in either of the prevailing orthodoxies that have formed around the controversy over same-sex marriage for someone like me: someone who supports marriage equality, but doubts, based on a thorough but admittedly incomplete reading of historical and anthropological scholarship, that the concept of marriage is in fact flexible enough to ever be transformed in such a way that marriage will cease to be heterosexual by presumption.

That is, I believe that we are right to decide to make same-sex unions equal before the law, but that it is not up to us to decide that the primary meaning of 'marriage' will cease to be 'basic unit of kinship, involving the monogamous pair-bonding of a male and a female'. This meaning will remain primary not only because other-sex couples are, as everyone agrees, statistically more common than same-sex couples, but because there is a fairly rigid system of organization in societies throughout the world that continues to be based on a presumption of gender dimorphism, and that continues to take cross-gender pairings as the elementary units of social reality. This is not what I want (I personally couldn't be less interested in 'defending' traditional marriage, though as it happens I don't think it's going to need defending), but rather what I believe to be the case.

I also believe that the movement for marriage equality misunderstands its contingency and ignores the historical forces that brought it into being. One of the triggers of my coming-out as a skeptic occurred a few months ago, when I happened to be speaking with a group of acquaintances who are also outspoken defenders of marriage equality. When quite unexpectedly the topic of first-cousin marriage came up, they began snickering like little boys: like little boys I might add, who in the not so distant past found mirth in every occurrence of the word 'gay'. This caused me to note that there is a certain selectiveness in what counts among educated Western liberals as 'doing the right thing' (a phrase we hear so often, and have heard most recently in connection with the legalization of same-sex marriage in Argentina).

Read more »

The Minangkabau: Mixing Islam and Matriarchy

By Usha Alexander

“In your marriage, who is the boss?” our driver, Arman, asked in a playfully provocative tone, like he was setting up the punchline of a joke.

My partner and I looked at each other, laughed, and shrugged. Arman belonged to the Minangkabau, the society recognized among anthropologists as the world’s largest and most stable surviving matriarchy* (though some prefer to call it a gylany, matrix, matrifocal or matricentric society, or something else to avoid conjuring images of mythical Amazons). Knowing this, I presumed his question was part of a routine entertainment for tourists.

“For us it is the woman who is boss,” he continued, predictably. “The woman has all the privileges; she owns everything. The men, we own nothing.”

I knew that the Minangkabau, like most Indonesians, are Muslims. In May 2009, one of the first things I noticed upon arriving in their homeland—a stretch of volcanic highlands running along the western coast of Sumatra—was that a higher percentage of women here wear the hijab (here called jilbab) than do women further north, near Medan and around Lake Toba. In fact, well over half of the adult women covered their hair in public. But here, as elsewhere on Sumatra, the headscarf appears to be as much a fashion statement as a covering for modesty. It’s often brightly colored or festooned with beads, sequins, rhinestones, small brooches, lace, or shimmery ribbons. Many women sport styles with a dainty sun visor in the front. Pretty much anything you can do to a hat is done to the Sumatran jilbab.

These observations, and Arman’s good humor at his lack of patrimony, made me wonder how I should understand the Minangkabau matriarchaat (their word, borrowed from the Dutch). What truce had been struck between Islam and matriarchy?

§

In the small villages surrounding Mount Marapi, which lies at the center of the Minangkabau creation myths, Arman led us down tangled lanes lined with traditional Minangkabau homes. Many of these were great wooden structures, some as much as 300 years old, tattered or rotting in places, patched or expanded upon over the years. These traditional homes are long, each enclosing a broad rectangular hall over an empty ground floor, once used to house livestock. Their roofs are arched to suggest the horns of a buffalo. There were newer dwellings too, without the ground floor for livestock, with SUVs parked out front, satellite dishes growing like mushrooms of modernity from balconies and awnings.

Read more »

The Techno-Future and Pre-History of Toes

by Aditya Dev Sood

I was riding the 2/3 to Brooklyn the first couple of days I was back, when I saw this guy in a baggy pair of shorts, a T-shirt, and these kinda shoes I’d never seen before. They wrapped around each toe, exposing the toes basically, through the thin skin of the shoe. Years ago, I remember reading a children’s encyclopedia on Surrealist Art, where I saw a charcoal drawing of an empty pair of boots with laces whose burnished, buffeted folds drew further and further down to reveal toes. There was something spectral and scary about the catch in the mind, which confused shoe for feet, with the after-image of the even grosser idea that the skin of one’s feet might someday serve as the boots of another. These bizarre shoe-things with toes brought all that to mind and more. The mind understood sandals, it understood shoes, but these things were total genre busters – like the Sporks of footwear. They were somehow unseemly, uncanny, desirable. I had to have ‘em!

I got online and found myself bang in the middle of a cultural revolution, where running is the leitmotif for a responsible and contemporary lifestyle. As many readers will already know, recent studies have suggested that the human form emerged as a result of endurance running, whereby our distant ancestors ran and walked their prey to exhaustion and ultimate death. While we humans can easily be outclassed in a sprint and overwhelmed in a full frontal attack at close quarters, our intellect and genius for tracking were able to manifest a potentially overwhelming evolutionary advantage at long distances and over longer periods of time. Also relevant are recent pop-anthropological studies of Meso-American tribes who can still be observed running and hunting over long distances barefoot, perhaps evidence that we humans truly are born to run.

While there’s a small and growing sub-culture of barefoot runners these days, there’s also the view that this is a sure track to contracting Hepatitis C. This is because enough people have it, and enough of them are urinating out and about the city, so it is only a matter of time and chance before you have a cut on the sole of your foot that becomes infected. But even in rural and remote regions of the world, walking or running barefoot can be a high-risk activity, exposing the body to hookworm, podoconiosis, and other neglected tropical diseases. Seen from this perspective, the shoe is a prophylactic, protecting the body from the diseases that may be locked into the loam of the earth. The goal of further design and innovation in shoes, therefore, should be to afford the flexibility and sensation of going bareback, while still ensuring that users enjoy safe sports.

Read more »

Blame the Victims and Make Them Feel Guilty – Part 2

by Norman Costa

Part 1 of “Blame the Victims and Make Them Feel Guilty” can be found HERE.

{Synopsis of Part 1}

Benedict XVI, Supreme Pontiff of the Catholic Church, visited the United States in April of 2008. He addressed the sexual abuse of children in the American Catholic Church, but never once, in his public homily at Nationals Park in Washington, D.C., did he say or indicate that the abuse was committed by members of the clergy and religious congregations.

Two months later, George Weigel, Catholic theologian, public intellectual, and official biographer of Pope John Paul II, gave an interview on Book TV's “In Depth,” aired on C-SPAN 2.

I was not so much disappointed with Weigel as bewildered by his complete lack of understanding of the nature and consequences of child sex abuse; he does not understand what is involved in treating victims of child sex crimes; and he doesn't have any semblance of insight into the psychology of the perpetrators of child sex crimes.

Weigel failed to see that what he calls “grave errors of judgment” and “irresponsibility” on the part of many bishops “…are really manifestations of criminal behavior, psychopathy, behavioral and mental disorders, narcissism, selfishness, a sociopath's belief that rules don't apply to them, sinful disregard for the spiritual well-being of the faithful, sinful failure as shepherds who should protect their flock from harm, and pure self-interest.”

He goes on to say, with little subtlety, that victims of clergy sex abuse are crippling the Catholic Church in America, driving it toward bankruptcy, and will bring about the end of all Catholic education, hospitals, and social programs in the United States. The victims may very well end up burdening U.S. taxpayers with huge social costs or may cause national social programs to reduce services.

George Weigel doesn't stop there. He burdens the victims with more guilt, because they are helping their undeserving attorneys get rich. He would like victims to feel guilty about using the U.S. civil tort justice system in order to get compensation for their losses. He says the victims are using an unfair justice system that doesn't work because citizen juries (the conscience of the court) do not work. He suggests that it is typical for millions of dollars to be awarded for frivolous claims, and cites a complete untruth and fabrication to support his view.

Weigel makes a not-too-veiled and sickening proposal that some victims may not be worth the money, and shouldn't get a monetary damage award, if society determines that they are so damaged they can't be 'fixed' by a monetary judgment.

I did not say this in Part 1, but I say it here: Weigel seemed to me to prefer that the Church's efforts to help victims, particularly financial efforts, be reserved for those who still love the Church. In my view, this is offering help only to those who pass a loyalty test, and it discards those so ravaged by the clergy that they lost their faith in the Church and in their religion. The most severely injured get the least help – maybe none.

{End Synopsis of Part 1}

Read more »

Five days with David Foster Wallace: Colin Marshall talks to author and journalist David Lipsky

David Lipsky is a contributing editor at Rolling Stone and the author of Although of Course You End Up Becoming Yourself: A Road Trip with David Foster Wallace. Crafted out of transcripts of a five-day conversation between Lipsky and Wallace at the tail end of the publicity tour for Wallace’s breakthrough novel Infinite Jest, the book reveals facets of the beloved author that have never before been seen publicly. Colin Marshall originally conducted this interview on the public radio program and podcast The Marketplace of Ideas. [MP3] [iTunes]

I want to tell you one thing I imagine about the creation of this book. Tell me if it's right or wrong. As the listener probably knows by now, this book is made out of transcripts of tapes you recorded while you were on the road with David Foster Wallace for five days during his publicity tour for his big novel of '96, Infinite Jest.

Yeah, it was a lot of fun.

It sounds like it. You didn't end up writing the article that these notes were for, a Rolling Stone profile. That got canceled. So you had these laying around, I presume, stored somewhere. I would imagine, after David Foster Wallace's untimely death in 2008, your mind went immediately to these materials, all this conversation you had with Wallace. I imagine a huge, crushing sense of responsibility. You're thinking, “I've got to do something with them, but what?” Is that accurate at all?

Well, no — it's interesting, but when I first heard that he had died, like a lot of people, I didn't think it was true. I got an e-mail from a friend, and I assumed it was a prank. Spending time with David, what you have a sense of is just how mentally healthy he was. If you had asked me in the summer of 2008 to name the most mentally healthy American writer, I would have, without any hesitation, said David Wallace. He just seemed like he'd gone through something when he was younger, but he seemed healed. He seemed like someone who had a wise, funny, sharp way of looking at life, which would tend to make you live longer, not less long. I was shocked. My first response was just tremendous surprise.

You saw this health in him. Is that just from your experience with him in '96, traveling for a few days, getting the first-person encounter, or was that from his work as well?

It was from both. I only knew him for those five days, and in the five days what you read us talking about is just how he'd gone through a very hard time when he was in his late twenties, and had found a way to experience the world after that. That was what I had been reading in his work, and what I'd then read in his work afterwards. The person who writes a story like “Good Old Neon”, the person who writes nonfiction like “A Supposedly Fun Thing I'll Never Do Again” or “Consider the Lobster”, is not somebody who hasn't had hardships or wouldn't know how to go through them. Somebody who has, in the full way of a life, tested themselves against hardship and come out with a kind of warm comic knowledge. That was one of the things you love about his work. That's one of the things readers always feel: he has seen all the crap stuff, all the hard stuff they've seen, but he's also still incredibly aware, incredibly alive and incredibly funny.

The story you mention, “Good Old Neon” — it's gotten a lot of re-reading in the wake of Wallace's death simply because of the character it describes. There's this character that goes toward an end by his own hand in the story, and it even holds up a character called David Wallace who has avoided that. You think of other stories like “The Depressed Person”, an illustration of this phenomenon of depression that it's now revealed he suffered from himself.

There seems to be so much there that indicates David Wallace understands all these problems and has somehow transcended them. I think of that as a big paradox of his life and how he wound up. Is that the same way you think about it? There's all this understanding, but he ultimately did succumb to the same thing it seemed he had a grasp on.

I did, and when I read “Good Old Neon” when it came out in book form in 2005 — I'm not a crying reader, but that's one of the only short stories I read and cried at the end of, because of this beautiful line when the narrator becomes David and says, “David Wallace emerging from years of literally indescribable war with himself, won with considerably more intellectual firepower than he had in high school in 1982.” I felt that.

That's one of the nice things of spending time with someone: I knew what he was talking about. I felt this great sense of power and health in that line. As a reader, I felt that thing of what a life is, which is that someone who is awake and aware — the kinds of people who like to read, the kinds of people who turn to books to find a little bit more about their lives — they've all gone through that kind of internal, internecine conflict. To see him saying that — I hadn't seen him, then, for almost ten years — I felt very warm for him.
Read more »

Sunday, July 18, 2010

Netanyahu admits he deceived U.S. to destroy Oslo accord

Jonathan Cook in The National:

The contents of a secretly recorded video threaten to gravely embarrass not only Benjamin Netanyahu, the Israeli prime minister, but also the US administration of Barack Obama.

The film was shot, apparently without Mr Netanyahu’s knowledge, nine years ago, when the government of Ariel Sharon had started reinvading the main cities of the West Bank to crush Palestinian resistance in the early stages of the second intifada.

At the time Mr Netanyahu had taken a short break from politics but was soon to join Mr Sharon’s government as finance minister.

On a visit to a home in the settlement of Ofra in the West Bank to pay condolences to the family of a man killed in a Palestinian shooting attack, he makes a series of unguarded admissions about his first period as prime minister, from 1996 to 1999.

Seated on a sofa in the house, he tells the family that he deceived the US president of the time, Bill Clinton, into believing he was helping implement the Oslo accords, the US-sponsored peace process between Israel and the Palestinians, by making minor withdrawals from the West Bank while actually entrenching the occupation. He boasts that he thereby destroyed the Oslo process.

He dismisses the US as “easily moved to the right direction” and calls high levels of popular American support for Israel “absurd”.

He also suggests that, far from being defensive, Israel’s harsh military repression of the Palestinian uprising was designed chiefly to crush the Palestinian Authority led by Yasser Arafat so that it could be made more pliable for Israeli diktats.

More here.