Our First Expatriate President

by James McGirk

Pundits on the right and left have described President Barack Obama as having a distant attitude toward the United States – on the right they call it narcissism and hint at secret agendas and question his patriotism, while on the left they wonder darkly whether he might be “too brainy to be president.” I think it is something else. I have never met President Obama, but our lives have converged in unusual ways. Perhaps unpacking my own intense and complex relationship with the United States might shed some light on what might at first seem like an aloof and distant attitude toward our homeland.

Mr. Obama spent his formative years as an outsider and that estrangement shaped his view of the United States in a profound way. At school he was peculiar: he had lived overseas and was a jangled mixture of races and cultures. His father was Kenyan, his stepfather was Indonesian and his mother was a Caucasian expatriate academic. And Hawaii, particularly in the 1960s and 1970s, was more like a forward operating base or an embassy than a state. Men and matériel were flowing through it en route to Vietnam and the federal government had a far more pervasive and sinister presence in Hawaii than it did elsewhere. The United States wasn’t a fundamental part of himself – not in the unambiguous, automatic way it would be for someone born in Detroit, Michigan – rather, his sense of belonging to the United States was something that had to be negotiated.

My early life was equally jangled. My parents were journalists and my grandfather was a petroleum prospector for Texaco, which meant that our family was estranged from the United States for more than 70 years. Growing up, I found the U.S. a highly abstract concept that was paradoxically close and accessible to me. My information about the country came mostly from headline news, and was highly polarized; this was the era of Ronald Reagan and Margaret Thatcher, and more often than not the news contained tale-of-the-tape comparisons of the American and Soviet arsenals. What little I saw of the real U.S. came in brief glimpses during visits to embassies or when we visited relatives in Southern California. In comparison to the bleakness of the United Kingdom, Spain and India, the U.S. was a technologically advanced paradise where everyone looked and sounded the way I did.

Read more »

Happy 60th Birthday to 3QD friend David Byrne

by S. Abbas Raza

David is one of the rare rock stars who continues to innovate and surprise in several artistic fields decades after his rise to fame as lead singer of the Talking Heads in the 1970s. Today, even as he enters his seventh decade, he shows no signs of slowing down. Last week I got the following email from him:

Last year I did some songs for a film by Italian director Paolo Sorrentino (check out his previous film Il Divo). This new film, This Must Be The Place, stars Sean Penn as a kind of present-day Robert Smith (The Cure). I wrote the music, Will Oldham wrote the lyrics and Michael Brunnock sang the songs. In the film, these songs appear on a demo CD that the Sean Penn character is handed. Now, the “Italian Academy Awards” have awarded us for best score and best original song!

If you want to get it, the soundtrack is available now on Amazon and iTunes and for people in the UK, there is also a vinyl collectors edition available.

I constantly listen to David's collaboration with Brian Eno, Everything That Happens Will Happen Today, and you can hear it too, here.

Five years ago I wrote about the amazing experience of seeing David perform live at Carnegie Hall. You can read that here.

We hope that David will continue to enrich our lives in his particular and quirky ways for decades more to come!

[Photo from last.fm.]

Two Hundred Years of Surgery

Atul Gawande in the New England Journal of Medicine (via brainiac):

The first volume of the New England Journal of Medicine and Surgery, and the Collateral Branches of Science, published in 1812, gives a sense of the constraints faced by surgeons, and the mettle required of patients, in the era before anesthesia and antisepsis. In the April issue for that year, John Collins Warren, surgeon at the Massachusetts General Hospital and son of one of the founders of Harvard Medical School, published a case report describing a new approach to the treatment of cataracts.1 Until that time, the prevalent method of cataract treatment was “couching,” a procedure that involved inserting a curved needle into the orbit and using it to push the clouded lens back and out of the line of sight.2 Warren's patient had undergone six such attempts without lasting success and was now blind. Warren undertook a more radical and invasive procedure — actual removal of the left cataract. He described the operation, performed before the students of Harvard Medical School, as follows:

The eye-lids were separated by the thumb and finger of the left hand, and then, a broad cornea knife was pushed through the cornea at the outer angle of the eye, till its point approached the opposite side of the cornea. The knife was then withdrawn, and the aqueous humour being discharged, was immediately followed by a protrusion of the iris.

Into the collapsed orbit of this unanesthetized man, Warren inserted forceps he had made especially for the event. However, he encountered difficulties that necessitated improvisation:

The opaque body eluding the grasp of the forceps, a fine hook was passed through the pupil, and fixed in the thickened capsule, which was immediately drawn out entire. This substance was quite firm, about half a line in thickness, a line in diameter, and had a pearly whiteness.

A bandage was applied, instructions on cleansing the eye were given, and the gentleman was sent home. Two months later, Warren noted, inflammation required “two or three bleedings,” but “the patient is now well, and sees to distinguish every object with the left eye.”

The implicit encouragement in Warren's article, and in others like it, was to be daring, even pitiless, in attacking problems of an anatomical nature. As the 18th-century surgeon William Hunter had told his students, “Anatomy is the Basis of Surgery, it informs the Head, guides the hand, and familiarizes the heart to a kind of necessary inhumanity.”3 That first volume of the Journal provided descriptions of a remarkable range of surgical techniques, including those for removing kidney, bladder, and urethral stones; dilating the male urethra when strictured by the passage of stones; tying off aneurysms of the iliac artery and infrarenal aorta; treating burns; and using leeches for bloodletting. There were articles on the problem of “the ulcerated uterus” and on the management of gunshot and cannonball wounds, not to mention a spirited debate on whether the wind of a passing cannonball alone was sufficient to cause serious soft-tissue injury.

Surgery, nonetheless, remained a limited profession.

Could a Single Pill Save Your Marriage?

George Dvorsky in io9:

Your relationship is on the rocks. Begrudgingly, you and your significant other visit a marriage counselor in the hopes that there's still something left to salvage in your relationship. You both spill your guts and admit that the love is gone. The counselor listens attentively, nodding her head every now and then in complete understanding. At the end of the session she offers the two of you some practical words of advice and sees you on your way. Oh, but before you leave she fills out a prescription for the two of you. Your marriage, it would seem, has been placed on meds.

Now, as messed up as this scenario might seem, this could very well be the future of marriage counseling. At least that's what Oxford neuroethicists Julian Savulescu and Anders Sandberg believe. In their paper, “Neuroenhancement of Love and Marriage: The Chemicals Between Us,” they argue that such a possibility awaits us in the not-too-distant future, and that a kind of ‘love potion' could eventually be developed to strengthen pair bonding. In fact, most of the compounds required to make such a concoction are already within our grasp. It's just a matter of doing it.

It's no secret that divorce rates are going up. Most people would agree that the end of a relationship is a tragic and undesirable thing. Modern couples tend to break up around the five-to-nine-year mark, a time when the initial honeymoon phase is long gone and the hard realities of managing a long-term relationship really start to kick in.

And while economic and social factors can often play a part in the disintegration of a marriage, neuroscience is increasingly showing that love is in the brain.

Can Physics and Philosophy Get Along?

Gary Gutting in the New York Times:

Physicists have been giving philosophers a hard time lately. Stephen Hawking claimed in a speech last year that philosophy is “dead” because philosophers haven’t kept up with science. More recently, Lawrence Krauss, in his book, “A Universe From Nothing: Why There Is Something Rather Than Nothing,” has insisted that “philosophy and theology are incapable of addressing by themselves the truly fundamental questions that perplex us about our existence.” David Albert, a distinguished philosopher of science, dismissively reviewed Krauss’s book: “all there is to say about this [Krauss’s claim that the universe may have come from nothing], as far as I can see, is that Krauss is dead wrong and his religious and philosophical critics are absolutely right.” Krauss — ignoring Albert’s Ph.D. in theoretical physics — retorted in an interview that Albert is a “moronic philosopher.” (Krauss somewhat moderates his views in a recent Scientific American article.)

I’d like to see if I can raise the level of the discussion a bit. Despite some nasty asides, Krauss doesn’t deny that philosophers may have something to contribute to our understanding of “fundamental questions” (his “by themselves” in the above quotation is a typical qualification). And almost all philosophers of science — certainly Albert — would agree that an intimate knowledge of science is essential for their discipline. So it should be possible to at least start a line of thought that incorporates both the physicist’s and the philosopher’s sensibilities.

More here.

Israel’s Spy Revolt

Natan Sachs in Foreign Policy:

Something has gone very wrong with Israel's posture on Iran's nuclear program. While Prime Minister Benjamin Netanyahu and Defense Minister Ehud Barak lead a confrontational approach — including dramatic interviews and speeches to U.S. audiences that have convinced many that Israel might soon strike Iran's nuclear facilities — the former heads of Israel's intelligence agencies have come out publicly against the government's position. First, Meir Dagan — who headed the Mossad until late 2010 and coordinated Israel's Iran policy — called an attack on Iran “the most foolish thing I've heard.” In April, Yuval Diskin — the previous head of the domestic intelligence service, the Shin Bet — voiced a scathing and personal critique of Netanyahu and Barak. Diskin questioned not only the leaders' policy, but also their very judgment and capacity to lead, warning against their “messianic” approach to Iran's nuclear program.

Given these differences, should the United States — and Iran — fear an Israeli strike more, or should they relax as Israel busies itself with internal arguments? Although it may be tempting to think that the Dagan-Diskin campaign lessens the chance of confrontation, in truth it raises two dire possibilities. First, if the former spy chiefs are correct about Netanyahu's and Barak's lack of judgment, this is hardly cause for comfort. If, however, Dagan and Diskin are mistaken and Israeli strategy is in fact calculated and sober, then undermining Israel's credibility — as they themselves have done — makes an Israeli strike more likely, not less. The less credible the Israeli threat, the more likely Iran is to try to call an Israeli bluff, and thus the more likely Israel is to try to back up its words with deeds.

More here.

The Case For A Presidential Science Debate

Transcript from NPR:

IRA FLATOW, HOST:

This is SCIENCE FRIDAY. I'm Ira Flatow. Every member of the House of Representatives and a few from the Senate, about a third of the Senate, I believe, is up for re-election this year. There will be hundreds of debates, local and national. Candidates will be asked questions about unemployment, the deficit, gay marriage, budget cutting. But will any of them be asked about their opinions or their knowledge of science and technology?

We have politicians who claim global warming is a hoax, others who don't believe in evolution. Shouldn't we want to know what the candidates know about the basic things in science? Will any moderators of the inevitable presidential debates even ask one question about science?

These are some of the reasons that a grassroots coalition of scientists, engineers and science advocates is calling for a televised presidential science debate. Their goal: for candidates to give us more than canned responses and for voters to make an informed decision in November, informed meaning knowing something about the candidates' views about science.

Joining me now to delve into some of these questions: Shawn Otto is the CEO and co-Founder of ScienceDebate.org, the group trying to organize a presidential science debate. He's also author of “Fool Me Twice: Fighting the Assault on Science in America.” He joins us from Minnesota Public Radio in St. Paul. Welcome back to the program, Shawn.

SHAWN OTTO: Thanks, Ira.

FLATOW: Dr. John Allen Paulos is a professor of mathematics at Temple University and author of several books, including “Innumeracy” and “A Mathematician Reads the Newspaper.” He joins us from Philadelphia. Welcome back to SCIENCE FRIDAY, Dr. Paulos.

DR. JOHN ALLEN PAULOS: Thanks much.

FLATOW: And former Congressman Vern Ehlers is a Republican, former Republican congressman from Michigan. He's also a physicist. He joins us from Grand Rapids. Welcome to SCIENCE FRIDAY.

More here.

Why Mother’s Day founder came to hate her creation

From The Washington Post:

Your friendly Census Bureau has provided the following facts about mothers and children, and there is some surprising history behind the annual ritual we call Mother’s Day (and one that some people see as a gift from greeting card companies to themselves). First of all, where did Mother’s Day originate, and how is it that the founder of the day eventually came to be arrested for protesting a Mother’s Day carnation sale?

Says the bureau:

“The driving force behind Mother’s Day was Anna Jarvis, who organized observances in Grafton, W.Va., and Philadelphia on May 10, 1908. As the annual celebration became popular around the country, Jarvis asked members of Congress to set aside a day to honor mothers. She finally succeeded in 1914, when Congress designated the second Sunday in May as “Mother’s Day.” As it turns out, her mother, Ann, had started Mother’s Day Work Clubs in five cities to improve health and sanitary conditions during the Civil War; soldiers from both sides were cared for equally. After her mother died, Anna Jarvis organized memorials, which ultimately led to the congressional action on Mother’s Day. But, according to Biography.com and other sources, Anna Jarvis eventually came to resent the commercialization of the holiday — so much so that she campaigned for its abolition — to no avail. She is said to have complained that she wanted it to be “a day of sentiment, not profit,” but that it had instead become a bonanza for greeting cards, which she saw as “a poor excuse for the letter you are too lazy to write.” She and her sister spent the family assets trying to end it — and she was once arrested for protesting a sale of carnations for Mother’s Day after florists and greeting card companies realized in the early 1920s that the holiday could be a bonanza for them.”

More here.

The other 1971: A Review of “Of Martyrs and Marigolds”

Javed Jabbar in the Express Tribune:

Of Martyrs and Marigolds by Aquila Ismail is a poignant and evocative portrayal of the so-far largely untold aspects of a sad saga. In a novelised form, the book depicts the shattered dreams and dilemmas of the Urdu-speaking Bihari-origin residents of East Pakistan, particularly in the years 1971 and 1972.

There has been patchy coverage of the roughly 200,000 Biharis living in refugee camps post-1971, who want to move to a Pakistan which is no longer willing to accept them. But news media in general and non-news media in particular have devoted little attention to the paradoxical plight of those Bihari East Pakistanis who genuinely loved the land and the people they had adopted. Many of them condemned the postponement of the National Assembly session by General Yahya Khan on March 1, 1971. They were grieved by the use of excessive military force against the Awami League from March 25, 1971 onwards. And they did not support the pro-Pakistan militias that were pitched against the Bengali militias.

When these innocent, non-combatant Biharis and other Urdu-speaking residents of East Pakistan began to be indiscriminately targeted by Bengali Awami League extremists to settle scores against General Yahya Khan’s policies and the actions initiated by General Tikka Khan, tens of thousands of these persons became victims overnight at the hands of fellow countrymen.

More here.

Bird color variations speed up evolution

From PhysOrg:

Researchers have found that bird species with multiple plumage colour forms within the same population evolve into new species faster than those with only one colour form, confirming a 60-year-old evolutionary theory. The global study used information from birdwatchers and geneticists accumulated over decades and was conducted by University of Melbourne scientists Dr Devi Stuart-Fox and Dr Andrew Hugall (now based at the Melbourne Museum) and is published in the journal Nature.

The link between having more than one colour variation (colour polymorphism), like the iconic red-, black- or yellow-headed Gouldian finches, and the faster formation of new species was predicted in the 1950s by famous scientists such as Julian Huxley, but this is the first study to confirm the theory. “By confirming a major theory in evolutionary biology, we are able to understand a lot more about the processes that create biodiversity,” said Dr Devi Stuart-Fox from the University's Zoology Department. “We found that in three families of birds of prey, the hawks and eagles, the owls and the nightjars, the presence of multiple colour forms leads to rapid generation of new species,” Dr Stuart-Fox said. “Well-known examples of colour polymorphic species in these families include the Australian grey goshawk, which has a grey and pure white form, the North American eastern screech owl and the Antillean nighthawk, each with grey and red forms.”

More here.

Grand Flattery: The Yale Grand Strategy Seminar

Thomas Meaney and Stephen Wertheim in The Nation:

In 1909 a group of men met on an estate in Wales to save Western civilization. Troubled by the erosion of British world power, they believed the decline could be reversed if statesmen turned away from the mundane tasks of modern diplomacy and channeled the wisdom of ancient Greece instead. The Greeks, in reconciling rulership with freedom, had made the West great, and supplied a model for their Anglo-Saxon heirs. No longer should the empire run itself; members of the group, including Lloyd George and Lord Milner, would train men of penetrating insight to direct imperial affairs more self-consciously than ever before. Drawing protégés from Oxbridge, the Round Table, as the group called itself, aimed to impart the lessons of enlightened leadership to a new generation. They produced countless articles and monographs. Chapters of the society flourished all over the empire. Ten years later, they had disappeared: nationalism had swept away their plans to knit the colonies closer together. British ascendancy ended sooner than any of them could have imagined.

The mantle of world leadership soon passed to the United States, and it’s here, where the ruling class is now experiencing its own crisis of confidence, that the Round Table is having something of a second act. Anxiety about America’s place in the world intensified after 9/11 but first became acute in the late 1990s, when the ills of the post–cold war world no longer appeared transient and seemed to demand concerted US leadership in response. This was the moment when liberal interventionism and neoconservatism ascended to the political mainstream and the grand narrative of “globalization” entered into wide circulation. In New Haven, historians John Lewis Gaddis and Paul Kennedy put forth a different response. Opposed to the Clinton administration’s ad hoc policy-making, they conceived a series of “grand strategy” seminars at Yale that aspired to train the next generation of leaders.

More here.

Testosterone on My Mind and In My Brain

Over at Edge, a conversation with Simon Baron-Cohen:

Many of you know that the topic of sex differences in psychology is fraught with controversy. It's an area where people, for many decades, didn't really want to enter because of the risks of political incorrectness, and of being misunderstood.

Perhaps of all of the areas in psychology where people do research, the field of sex differences was kind of off limits. It was taboo, and that was partly because people believed that anyone who tried to do research into whether boys and girls, on average, differ, must have some sexist agenda. And so for that reason a lot of scientists just wouldn't even touch it.

By 2003, I was beginning to sense that that political climate was changing, that it was a time when people could ask the question — do boys and girls differ? Do men and women differ? — without fear of being accused of some kind of sexist agenda, but in a more open-minded way.

First of all, I started off looking at neuroanatomy, to look at what the neuroscience is telling us about the male and female brain. If you just take groups of girls and groups of boys and, for example, put them into MRI scanners to look at the brain, you do see differences on average. Take the idea that the sexes are identical from the neck upwards, even if they are very clearly different from the neck downwards: the neuroscience is telling us that that is just a myth, that there are differences, even in terms of brain volume, the number of connections between nerve cells, and the structure of the brain, on average, between males and females.

I say this carefully because it's still a field which is prone to misunderstanding and misinterpretation, but just giving you some of the examples of findings that have come out of the neuroscience of sex differences, you find that the male brain, on average, is about eight percent larger than the female brain. We're talking about a volumetric difference. It doesn't necessarily mean anything, but that's just a finding that's consistently found. You find that difference from the earliest point you can put babies into the scanner, so some of the studies are at two weeks old in terms of infants.

The Really Nice Guy Materialist

Julian Baggini interviews Patricia Churchland, in The Philosophers' Magazine (for Abbas):

[Y]ou can understand the trepidation felt by many at the thought of Patricia [Churchland] tackling the issue of morality head-on. But her recent book, Braintrust: What Neuroscience Tells Us about Morality, defies such expectations, due largely to the fact that the answer to the question implied by the subtitle is very far from “everything”. This contrasts starkly with what many see as the scientific hubris of Sam Harris in his recent The Moral Landscape.

“Sam Harris has this vision that once neuroscience is much more developed then neuroscientists will be able to tell us what things are right or wrong, or at least what things are conducive to well-being and not. But even if you cast it in that way, that’s pretty optimistic – or pessimistic, depending on your point of view. Different people even within a culture, even within a family, have different views about what constitutes their own well-being. Some people like to live out in the bush like hermits and dig in the ground and shoot deer for resources, and other people can’t countenance a life that isn’t in the city, in the mix of cultural wonderfulness. So people have fundamentally different ideas about what constitutes well-being.

“I think Sam is just a child when it comes to addressing morality. I think he hasn’t got a clue. And I think part of the reason that he kind of ran amuck on all this is that, as you and I well know, trashing religion is like shooting fish in a barrel. If Chris Hitchens can just sort of slap it off in an afternoon then any moderately sensible person can do the same. He wrote that book in a very clear way although there were lots of very disturbing things in it. I think he thought that, heck, it’s not that hard to figure these things out. Morality: how hard can that be? Religion was dead easy. And it’s just many orders of magnitude more difficult.”

What Churchland believes science can do is describe the “neural platform” for ethics. What does she mean by this?

The Rumour Machine: On the Dismissal of Bo Xilai

Wang Hui in the LRB (image from Wikipedia):

‘March 14’ used to be shorthand in China for the 2008 unrest in Tibet; now it stands for the 2012 ‘Chongqing incident’. It is unusual for municipal policy to have national impact, and rarer still for the removal of a city leader to become international news. Some observers have argued that the dismissal of Bo Xilai, the party secretary of Chongqing, is the most important political event in China since 1989.

Stories began to circulate on 6 February, when Wang Lijun, Chongqing’s police chief, fled to the US consulate in the nearby city of Chengdu. Neither the Chinese nor the American authorities have revealed anything about what followed, the US saying only that Wang had an appointment at the consulate and left the next day of his own accord. Since then he has been in the custody of the Chinese government. Reports in the foreign media fuelled online speculation, with the result that all sorts of rumours began to spread – some of them later shown to be true. There were stories about a power struggle between Bo and Wang; about the corruption of Bo’s family (how could they afford to send their son to Harrow, Oxford and Harvard?); about coup attempts by Bo and Zhou Yongkang, the head of China’s security forces; about business deals and spying; about a connection between Bo and the mysterious death of the British businessman Neil Heywood in November. Even supporters of what has been called the Chongqing experiment – the reforms implemented under Bo, who became party secretary there in 2007 – were unwilling to say that no corruption or malfeasance took place. In today’s China, these offer a convenient pretext for an attack on a political enemy.

As the stories multiplied, two main interpretations emerged. The first – supported by a good deal of leaked information – saw the Chongqing case as merely a matter of a local leader who had broken the law. The second linked the incident to political differences. With a population of 32 million, Chongqing is one of the PRC’s four centrally governed municipalities (the others are Beijing, Shanghai and Tianjin). In the 1930s and 1940s, the city was an important arms-manufacturing centre for the Kuomintang, and today serves as a hub for much of south-west China. The Chongqing model operated within China’s existing political institutions and development structures, which emphasise attracting business and investment, but involved quite distinctive social reforms. Large-scale industrial and infrastructural development went hand in hand with an ideology of greater equality – officials were instructed to ‘eat the same, live the same, work the same’ as the people – and an aggressive campaign against organised crime. Open debate and public participation were encouraged, and policies adjusted accordingly. No other large-scale political and economic programme has been carried out so openly since the reform era began in 1978, soon after Mao’s death.

Memory Displaced: Re-reading Jean Améry’s “Torture”

Dan Diner in Eurozine:

Back in the 1960s, Jean Améry's “Torture” (1965) was required reading. The line “Whoever has succumbed to torture can no longer feel at home in the world” had an exceptional persuasive intensity. In his iconic text, composed twenty-two years after being subjected to the torment of torture, Améry reflects on the pain inflicted on his body and soul. He develops a sort of anthropology of agony, scrutinizing the modes and manners of a deliberate and maliciously executed violence, one that rips through the layers of the flesh, wilfully rends the body's limbs, engendering abysmal suffering. The procedure starts with a first and definitive blow, shattering the human being's elementary trust in the world. Reflecting on its foundationally destructive impact in devastating man's personality, Améry writes: “The first blow brings home to the prisoner that he is helpless, and thus it already contains in the bud everything that is to come.”

Améry's essay presents a philosophy of pain remembered, of pain brutally inflicted on his body. In July 1943, after being arrested as an anti-Nazi resister, he was removed to Fort Breendonck, located between Antwerp and Brussels. There he was subjected to torture – a site of memory emblemized by W.G. Sebald in his novella Austerlitz, in allusion to Améry's existential experience there.

“There I experienced it: torture,” writes Améry. Yet the pain was not inflicted upon him randomly. The purpose of the torment directed at his body was to break his will as a political opponent to the regime, to compel him to betray his associates, his comrades. The torture carried out on the prisoner Améry was inflicted on a person who had chosen to resist – something that happens all the time and everywhere. Améry writes: “Somewhere, someone is crying out under torture. Perhaps in this hour, this second.” For him, the pain was exemplary, but by far not exceptional. All the more resolute and ultimate, therefore, was his judgment: “Torture is the most horrible event a human being can retain within himself”.

The Original Colonist

From The New York Times:

This is not a humble book. Edward O. Wilson wants to answer the questions Paul Gauguin used as the title of one of his most famous paintings: “Where do we come from? What are we? Where are we going?” At the start, Wilson notes that religion is no help at all — “mythmaking could never discover the origin and meaning of humanity” — and contemporary philosophy is also irrelevant, having “long ago abandoned the foundational questions about human existence.” The proper approach to answering these deep questions is the application of the methods of science, including archaeology, neuroscience and evolutionary biology. Also, we should study insects.

…In “The Social Conquest of Earth,” he explores the strange kinship between humans and some insects. Wilson calculates that one can stack up log-style all humans alive today into a cube that’s about a mile on each side, easily hidden in the Grand Canyon. And all the ants on earth would fit into a cube of similar size. More important, humans and certain insects are the planet’s “eusocial” species — the only species that form communities that contain multiple generations and where, as part of a division of labor, community members sometimes perform altruistic acts for the benefit of others. Wilson’s examples of insect eusociality are dazzling. The army ants of Africa march in columns of up to a million or more, devouring small animals that get in their way. Weaver ants “form chains of their own bodies in order to pull leaves and twigs together to create the walls of shelters. Others weave silk drawn from the spinnerets of their larvae to hold the walls in place.” Leafcutter ants “cut fragments from leaves, flowers and twigs, carry them to their nests and chew the material into a mulch, which they fertilize with their own feces. On this rich material, they grow their principal food, a fungus belonging to a species found nowhere else in nature. Their gardening is organized as an assembly line, with the material passed from one specialized caste to the next.”
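As a rough check on Wilson's cube, the arithmetic can be run in a few lines of Python. This is only a back-of-the-envelope sketch: the world population figure and the average per-person body volume below are assumptions for illustration, not numbers taken from the book or the review.

    # Back-of-the-envelope check of Wilson's "humans in a cube" figure.
    # Assumed inputs (not from the book): ~7 billion people alive in 2012,
    # and an average body volume of roughly 0.065 cubic metres per person.
    population = 7.0e9          # people (assumption)
    body_volume_m3 = 0.065      # m^3 per person (assumption)

    total_m3 = population * body_volume_m3
    side_m = total_m3 ** (1.0 / 3.0)   # edge of a cube holding that volume
    MILE_M = 1609.34

    print(f"total volume: {total_m3 / 1e9:.2f} cubic km")
    print(f"cube edge: {side_m:.0f} m, or {side_m / MILE_M:.2f} miles")
    # Prints roughly 0.46 cubic km and an edge of about 770 m (half a mile),
    # comfortably within "about a mile on each side".

Under these assumptions the cube comes out well under a mile on a side, so Wilson's figure reads as a comfortable upper bound rather than a tight estimate.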

More here.

The Age of Insight: How Our Brain Perceives Art

From Columbia Magazine:

Many strands of Eric Kandel’s life come together in his latest work, The Age of Insight: The Quest to Understand the Unconscious in Art, Mind, and Brain, from Vienna 1900 to the Present. The 82-year-old University Professor and co-director of the Mind Brain Behavior Initiative was born in Vienna, where, as a boy of 8, he witnessed the Nazis march into the Austrian capital. Decades later, he recalls how much his own intellectual interests were shaped not only by the Holocaust that followed, but by the cosmopolitan city that in the early 1900s served as an extraordinary incubator for creativity and thought that shaped the world we live in today. From the Modernist painters Gustav Klimt, Oskar Kokoschka and Egon Schiele to the pioneering psychoanalyst Sigmund Freud, a new view emerged of the human mind. Indeed, before Kandel plunged into research on the neurobiology of memory, which would win him the Nobel Prize in Physiology or Medicine in 2000, he had aspired to be a psychoanalyst himself. As he pondered the mission of Columbia’s Mind Brain Behavior Initiative—to connect biomedical sciences with the arts, humanities and social sciences—he came to see that our contemporary understanding of human behavior can be traced directly back to fin de siècle Vienna 1900, particularly in its emphasis on the unconscious and irrational aspects of the human mind.

And it didn’t hurt that he and his wife Denise, who survived the Holocaust as a child in France, have long been art collectors who own small works by Klimt, Kokoschka and Schiele, as well as by other German Expressionists. Indeed, Kandel’s 600-page work is dedicated to his wife, a professor of sociomedical sciences in the Department of Psychiatry at the Mailman School of Public Health. “I discuss beauty and I say everyone gets pleasure out of looking at a beautiful face, and I use a photograph of Denise when I first met her,” he says. “I mean she is, and she was, for me remarkably beautiful. And one of the things I enjoyed in our friendship when I first met her is that, in addition to our very wonderful relationship, she was just so pleasant to look at.”

Q. What made you decide to turn your attention to the neurobiology of how we perceive art?

There are many motivating factors. One was my long-term interest in Klimt, Kokoschka and Schiele, the three Austrian Modernists, my fascination with Vienna 1900 and with Freud. I wanted to become a psychoanalyst and I’m Viennese, so I sense a shared intellectual history, particularly with turn-of-the-century Vienna. But the immediate stimulus actually came from [Columbia President] Lee Bollinger. The idea behind the Mind Brain Behavior Initiative is to try to understand the human mind in biological terms and to use these insights to bridge the biology of the brain with other areas of the humanities. Lee expressed the belief that the new science of the mind could have a major impact on the academic curriculum, that in a sense everyone at the University works on the human mind. I felt I was doing this for personal reasons, but isn’t it wonderful that it is also in line with one of the missions of the University?

More here.

Let the wild rumpus start!

Over the years, Sendak expressed frustration that “Where the Wild Things Are,” published in 1963, tended to overshadow his other books, which include many children’s classics: “Pierre,” “Chicken Soup With Rice” and the phantasmagoric “In the Night Kitchen,” often challenged because it showed its boy protagonist, Mickey, in the nude. He also disputed the description of himself as a children’s writer; as he told Stephen Colbert in January, “I don’t write for children. … I write. And somebody says, ‘That’s for children.’ I didn’t set out to make children happy. Or make life better for them, or easier for them.” That’s a fair point, and it suggests why his work remains resonant, because he was writing not for a specific audience but to express himself. This is the essence of art, the essence of what books and stories have to offer, that sense of reaching out across the void. And yet, as Sendak recognized — indeed, as he encoded into his very narratives — such connections go only so far.

more from David L. Ulin at the LA Times here.