Did human evolution favor individualists or altruists? Ayn Rand vs. the Pygmies

From Slate:

Black-and-white colobus monkeys scrambled through the branches of Congo’s Ituri Forest in 1957 as a small band of Mbuti hunters wound cautiously through the undergrowth, joined by anthropologist Colin Turnbull. The Mbuti are pygmies, about 4 feet tall, but they are powerful and tough. Any one of them could take down an elephant with only a short-handled spear. Recent genetic evidence suggests that pygmies have lived in this region for about 60,000 years. But this particular hunt reflected a timeless ethical conflict for our species, and one that has special relevance for contemporary American society. The Mbuti employed long nets of twined liana bark to catch their prey, sometimes stretching the nets for 300 feet. Once the nets were hung, women and children began shouting, yelling, and beating the ground to frighten animals toward the trap. As Turnbull came to understand, Mbuti hunts were collective efforts in which each hunter’s success belonged to everybody else.

But one man, a rugged individualist named Cephu, had other ideas. When no one was looking, Cephu slipped away to set up his own net in front of the others. “In this way he caught the first of the animals fleeing from the beaters,” explained Turnbull in his book The Forest People, “but he had not been able to retreat before he was discovered.” Word spread among camp members that Cephu had been trying to steal meat from the tribe, and a consensus quickly developed that he should answer for this crime.

At an impromptu trial, Cephu defended himself with arguments for individual initiative and personal responsibility. “He felt he deserved a better place in the line of nets,” Turnbull wrote. “After all, was he not an important man, a chief, in fact, of his own band?” But if that were the case, replied a respected member of the camp, Cephu should leave and never return. The Mbuti have no chiefs; they are a society of equals in which redistribution governs everyone’s livelihood. The rest of the camp sat in silent agreement. Faced with banishment, a punishment nearly equivalent to a death sentence, Cephu relented. “He apologized profusely,” Turnbull wrote, “and said that in any case he would hand over all the meat.” This ended the matter, and members of the group pulled chunks of meat from Cephu’s basket. He clutched his stomach and moaned, begging that he be left with something to eat. The others merely laughed and walked away with their pound of flesh. Like the mythical figure Atlas from Greek antiquity, condemned by vindictive gods to carry the world on his shoulders for all eternity, Cephu was bound to support the tribe whether he chose to or not.

Meanwhile, in the concrete jungle of New York City, another struggle between the individual and the group was unfolding. In October of 1957, Ayn Rand published her dystopian novel Atlas Shrugged, in which a libertarian hero named John Galt condemns his collectivist society because of its failure to support individual rights. “By the grace of reality and the nature of life, man—every man—is an end in himself,” Galt announced, “he exists for his own sake, and the achievement of his own happiness is his highest moral purpose.” Unlike Cephu, Galt had the means to end his societal bondage. By withdrawing his participation and convincing others to do the same, he would stop the motor of the world. Atlas would shrug. “Every living species has a way of survival demanded by its nature,” Galt insisted. “I swear by my life, and my love of it, that I will never live for the sake of another man, nor ask another man to live for mine.”

Ayn Rand’s defense of a human nature based on rationality and individual achievement, with capitalism as its natural extension, became the rallying cry for an emerging libertarian stripe in conservative American politics. Paul Ryan cites Atlas Shrugged as forming the basis of his value system and says it was one of the main reasons he chose to enter politics. Other notable admirers include Rush Limbaugh, Alan Greenspan, and Clarence Thomas, as well as Congressional Tea Party Caucus members Steve King, Mick Mulvaney, and Allen West.

More here.

Lack of Education Widens Gap in Life Expectancy

From Columbia Magazine:

The MacArthur Research Network on Aging, chaired by Dr. John W. Rowe, has published its latest research showing a widening gap in life expectancy between Americans with higher education and those without a high school diploma. The gap has increased dramatically among whites, with those who lack a high school diploma suffering sharp declines in life expectancy. The biggest gap, however, persists between college-educated whites and blacks who don't complete high school. The provocative paper was published in the August issue of the journal Health Affairs and was the lead story in today's New York Times. Dr. Rowe, Professor of Health Policy and Management, and Dr. Linda P. Fried, Mailman School Dean, are co-authors.

The research looked at life expectancy by race, sex, and education and examined trends in disparities from 1990 through 2008. The study cautions that failure to complete high school takes a heavy toll on longevity among all groups, essentially negating the effects of recent healthcare advances and longevity gains. “It's as if Americans with the least education are living in a time warp,” says S. Jay Olshansky, professor of epidemiology at the University of Illinois at Chicago School of Public Health and lead author of the study. “The least educated black men are living in 1954, black women in 1962, white women in 1964, and white men in 1972.”

More here.

Tuesday, October 2, 2012

Why do people love to say that correlation does not imply causation?

Daniel Engber in Slate:

Depressed people send more email. They spend more time on Gchat. Researchers at the Missouri University of Science and Technology recently assessed some college students for signs of melancholia then tracked their behavior online. “We identified several features of Internet usage that correlated with depression,” they said. Sad people use IM and file-share. They play video games. They surf the Web in their own, sad way.

Not everyone found the news believable. “Facepalm. Correlation doesn't imply causation,” wrote one unhappy Internet user. “That's pretty much how I read this too… correlation is NOT causation,” agreed a Huffington Post superuser, seemingly distraught. “I was surprised not to find a discussion of correlation vs. causation,” cried someone at Hacker News. “Correlation does not mean causation,” a reader moaned at Slashdot. “There are so many variables here that it isn't funny.”

And thus a deeper correlation was revealed, a link more telling than any that the Missouri team had shown. I mean the affinity between the online commenter and his favorite phrase—the statistical cliché that closes threads and ends debates, the freshman platitude turned final shutdown. “Repeat after me,” a poster types into his window, and then he sighs, and then he types out his sigh, s-i-g-h, into the comment for good measure. Does he have to write it on the blackboard? Correlation does not imply causation. Your hype is busted. Your study debunked. End of conversation. Thank you and good night.
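To make the cliché concrete, here is a minimal, self-contained Python sketch of the kind of calculation behind a claim like the Missouri team's: a Pearson correlation between a depression-screen score and one usage feature. The numbers and variable names are invented for illustration; they are not the study's data or code.

```python
from math import sqrt

# Made-up example data: a depression-screen score and hours of file-sharing
# per week for ten students. Purely illustrative; not the study's data.
depression_score = [2, 5, 1, 7, 4, 6, 3, 8, 2, 5]
filesharing_hours = [1, 4, 0, 6, 3, 5, 2, 7, 1, 3]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

r = pearson_r(depression_score, filesharing_hours)
print(f"r = {r:.2f}")  # a strong positive correlation in this toy data
```

A large r in a sample like this is equally consistent with depression driving the behavior, the behavior driving depression, or a third factor driving both, which is exactly the gap the commenters' refrain gestures at.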

More here.

The deeply disturbing Israel court ruling on Rachel Corrie

Cindy Corrie [Rachel's mother] in The Seattle Times:

The home Rachel and her friends from the International Solidarity Movement defended was eventually demolished with hundreds more in mass-clearing operations to create a buffer along Gaza’s southern border.

Our lawsuit was not a solution, but rather a symptom of a broken system of accountability within Israel and our own U.S. government. Despite a promise from Israeli Prime Minister Ariel Sharon for a “thorough, credible, and transparent” investigation and repeated calls from the highest levels of our government for such an investigation to occur, there was no diplomatic resolution. According to the U.S. State Department, its calls “have gone unanswered or ignored.”

Court testimony also confirmed a credible investigation did not occur. Investigators failed to question key military witnesses, including those recording communications; failed to secure the military video, allowing it to be taken for nearly a week by senior commanders with only segments submitted to court; failed to address conflicting soldiers’ testimonies; and ignored damning statements in the military log confirming a “shoot to kill” order and command mentality to continue work in order not to create a precedent with activists.

I had no illusions about the uphill battle we faced in Israeli court, but as I sat with my family in a packed courtroom awaiting the verdict, I held hope that, like so many observing the trial, the judge would see that evidence warranted some criticism of the military’s actions.

The room was filled with human-rights observers, U.S. Embassy officials, family supporters and a throng of media. Judge Oded Gershon surveyed the scene before reading his decision. From the halting tone of my translator and friend, and audible groans around us, I knew it was bad.

He ruled that Rachel was killed as an act of war, which, according to Israeli law, absolves the military of responsibility. He added that she alone was to blame for her own killing and then went on to commend the military police for their professionalism in carrying out such a credible investigation. The courtroom heard the judge parrot the state prosecuting attorneys’ original claims in the case, nearly verbatim.

More here.

The Myths of Muslim Rage

Kenan Malik in Pandaemonium:

The Rushdie affair is shrouded in a number of myths that have obscured its real meaning. The first myth is that the confrontation over The Satanic Verses was primarily a religious conflict. It wasn’t. It was first and foremost a political tussle. The novel became a weapon in the struggle by Islamists with each other, with secularists and with the West. The campaign began in India where hardline Islamist groups whipped up anger against Rushdie’s supposed blasphemies to win concessions from politicians nervous about an upcoming general election and fearful of alienating any section of the Muslim community. The book subsequently became an issue in Britain, a weapon in faction fights between various Islamic groups.

Most important was the struggle between Saudi Arabia and Iran for supremacy in the Islamic world. From the 1970s onwards Saudi Arabia had used oil money to fund Salafi organisations and mosques worldwide to cement its position as spokesman for the umma. Then came the Iranian Revolution of 1979 that overthrew the Shah, established an Islamic republic, made Tehran the capital of Muslim radicalism, and Ayatollah Khomeini its spiritual leader, and posed a direct challenge to Riyadh. The battle over Rushdie’s novel became a key part of that conflict between Saudi Arabia and Iran. Saudi Arabia made the initial running, funding the campaign against the novel. The fatwa was an attempt by Iran to wrest back the initiative. The campaign against The Satanic Verses was not a noble attempt to defend the dignity of Muslims, nor even a theological campaign to protect religious values. It was part of a sordid political battle to promote particular sectarian interests.

The second myth is that most Muslims were offended by the novel. They weren’t.

More here.

And see also this article “Muslim Rage is About Politics, Not Religion” by Hussain Haqqani in Newsweek.

indian summer


Gradually, the term Indian summer has spread beyond its American origins. First to England, replacing a bevy of poetic names—All-Hallown Summer, in Shakespeare’s day; St. Luke’s little summer, St. Martin’s Summer—with that single term. Then to France, capturing the popular imagination with the success of Joe Dassin’s classic homage, “L’été indien.” (Now that I think about it, I likely heard the name in French before I ever did in English; Joe Dassin—himself American born—was always popular in Russia, and I’d hummed the tune many a time before its meaning actually sunk in.) And the Indians and old women aren’t alone. Over the years, many others have laid claim to those days of waning heat. In the southern Slavic countries, it’s known as gypsy summer. I’d like to think that has something to do with the colorful vibrancy of the gypsy music and the sound of guitar strings by the open fire. In Italy, it’s a time of year owned by San Martino, or St. Martin.

more from Maria Konnikova at Paris Review here.

slave castle


Of the dozens of trade castles and forts dotting Ghana’s three-hundred-mile coastline, I’d chosen to visit the one in Elmina because it was the most notorious. Being the first permanent European settlement in Africa, it was also the oldest. The Portuguese began construction on São Jorge da Mina in 1482 with stones imported from Portugal. It was designed to defend against attacks from the local people and from other Europeans, but in the seventeenth century it was captured by the Dutch. In the nineteenth century it was purchased by the British. Now it is a World Heritage monument. In the castle’s early days, Europeans didn’t think of themselves as European any more than Africans thought of themselves as African. The Dutch weren’t betraying a European bond when they captured the Portuguese castle, just as the Mandinke weren’t betraying an African bond when they captured Ayuba. Nor, in the castle’s early days, was the castle a slave castle.

more from Emily Raboteau at The Believer here.

Eric Hobsbawm (1917-2012)


In the foothills of Hampstead Heath, where Karl Marx and Friedrich Engels used to take their afternoon strolls, stands the home of Eric and Marlene Hobsbawm. To enter the Nassington Road drawing room for a conversation with Hobsbawm was to be transported back to the great ideological struggles of the extreme 20th century. Here was where ideas mattered, history had a purpose, and politics was important. And one could have no more generous, humane, rigorous, and involved a guide than the late Eric Hobsbawm. The breadth of his work and the reach of his intellect were always startling. Right to the end of his days, he stayed up to date with scholarship, never failed to flay an opponent, and continued to write. Afternoon tea with Hobsbawm could range from the achievements of President Lula of Brazil to the limitations of Isaiah Berlin as an historian, the unfortunate collapse of the Communist party in West Bengal to what Ralph Miliband would have made of his boys, David and Ed.

more from Tristram Hunt at The Guardian here.

Tuesday Poem

34. Not What Makes Tao Tick

The Tao flows everywhere
so everything is of it

uncreated

Every thing receives Tao’s work
but it claims nothing for itself

It feeds all worlds
but never enslaves them

The Tao and each thing are merged
—the heart of all things
are filled with Tao’s humility

Each thing comes and goes
from it and to it but
Tao endures

Call it great

But being great is not what makes
Tao tick
……………..…..……which
…………………….. makes it
…………………….. great

from the Tao Te Ching by Lao Tzu
Adaptations by R.Bob

Cinema has changed us all: The birth of alienation

David Thomson in The Independent:

In his autobiography, The Words (1964), Jean-Paul Sartre described his discovery of cinema as a child. He would have been 10 years old in 1915 when The Birth of a Nation opened. But he hardly noticed particular films at first. What he saw or felt was something he called “the frenzy on the wall”. That could have been a reaction to the brilliant battle scenes in Griffith's films, but it also covers the still face of Garbo absorbing romantic loss, or the stoic blankness of Buster Keaton baffled by the physical chaos around him. The frenzy was in the whirl with which projected film ran at 16 or 24 frames a second, a passage of time that seethed on the wall – and, paradoxically, the serenity of another reality. That was the inherent madness and the magic in cinema: that we watch the battle but never risk hurt, and spy on Garbo without having her notice us.

At first, the magic was overwhelming: in 1895, the first audiences for the Lumière brothers' films feared that an approaching steam engine was going to come out of the screen and hit them. That gullibility passed off like morning mist, though observing the shower in Psycho (1960) we still seem to feel the impact of the knife. That scene is very frightening, but we know we're not supposed to get up and rescue Janet Leigh. In a similar way, we can watch the surreal imagery of the devastation at Fukushima, or wherever, and whisper to ourselves that it's terrible and tragic, but not happening to us.

More here.

High Stress Can Make Insulin Cells Regress

From The New York Times:

For years, researchers have investigated how the body loses the ability to produce enough insulin, a hallmark of diabetes. Now an intriguing theory is emerging, and it suggests a potential treatment that few scientists had considered. The hormone insulin helps shuttle glucose, or blood sugar, from the bloodstream into individual cells to be used as energy. But the body can become resistant to insulin, and the beta cells of the pancreas, which produce the hormone, must work harder to compensate. Eventually, the thinking goes, they lose the ability to keep up. “We used to say that the beta cells poop out,” said Alan Saltiel, director of the Life Sciences Institute at the University of Michigan. In reality, he added, this shorthand meant “we have no idea what’s going on.” Some evidence suggested that large numbers of these cells died through a process of programmed cell death called apoptosis. But that was at best a partial explanation.

Now, researchers at Columbia University have put forth a surprising alternative. In mice with Type 2 diabetes, the researchers showed that beta cells that had lost function were not dead at all. Most remained alive, but in a changed form. They reverted to an earlier developmental, “progenitor,” state. It’s as if these cells are “stepping back in time to a point where they look like they might have looked during their development,” said Dr. Domenico Accili, director of the Columbia University Diabetes and Endocrinology Research Center, who led the new work.

…A range of physiological stresses, including obesity, pregnancy and aging, all tend to increase demand on beta cells to produce more insulin, Dr. Accili said. It may be that they are “taking a little rest,” he said, in returning to a less active state. Although it’s not yet clear why this might happen, the finding may lend support to the view that doctors should focus on relieving stress on the beta cells rather than pushing them to produce more insulin, which may speed the progression of diabetes, Dr. Accili said.

More here.

Sunday, September 30, 2012

The Golden Age: Keynesian Utopianism

John Quiggin in Aeon:

I first became an economist in the early 1970s, at a time when revolutionary change still seemed like an imminent possibility and when utopian ideas were everywhere, exemplified by the Situationist slogan of 1968: ‘Be realistic. Demand the impossible.’ Preferring to think in terms of the possible, I was much influenced by an essay called ‘Economic Possibilities for our Grandchildren,’ written in 1930 by John Maynard Keynes, the great economist whose ideas still dominated economic policymaking at the time.

Like the rest of Keynes’s work, the essay ceased to be discussed very much during the decades of free-market liberalism that led up to the global financial crisis of 2007 and the ensuing depression, through which most of the developed world is still struggling. And, also like the rest of Keynes's work, this essay has enjoyed a revival of interest in recent years, promoted most notably by the Keynes biographer Robert Skidelsky and his son Edward.

The Skidelskys have revived Keynes’s case for leisure, in the sense of time free to use as we please, as opposed to idleness. As they point out, their argument draws on a tradition that goes back to the ancients. But Keynes offered something quite new: the idea that leisure could be an option for all, not merely for an aristocratic minority.

Writing at a time of deep economic depression, Keynes argued that technological progress offered the path to a bright future. In the long run, he said, humanity could solve the economic problem of scarcity and do away with the need to work in order to live. That in turn implied that we would be free to discard ‘all kinds of social customs and economic practices, affecting the distribution of wealth and of economic rewards and penalties, which we now maintain at all costs, however distasteful and unjust they may be in themselves, because they are tremendously useful in promoting the accumulation of capital’.

Poll Averages Have No History of Consistent Partisan Bias

Nate Silver in the NYT's FiveThirtyEight:

The analysis that follows is quite simple. I’ll be taking a simple average of polls conducted each year in the final 21 days of the campaign and comparing it against the actual results. There are just two restrictions.

First, I will be looking only at polls of likely voters. Polls of registered voters, or of all adults, typically will overstate the standing of Democratic candidates, since demographic groups like Hispanics that lean Democratic also tend to be less likely to turn out in most elections. (The FiveThirtyEight forecast model shifts polls of registered voters by 2.5 percentage points toward Mr. Romney for this reason.)

Second, the averages are based on a maximum of one poll per polling firm in each election. Specifically, I use the last poll that each conducted before the election. (Essentially, this replicates the methodology of the Real Clear Politics polling average.)
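As a concrete illustration of those two restrictions, here is a minimal sketch in Python of the averaging rule as described: likely-voter polls only, a 21-day window, and one poll (the last) per firm. The poll records, field names, and the simple_average helper are hypothetical stand-ins, not FiveThirtyEight's actual data or code.

```python
from datetime import date

# Hypothetical poll records; field names are illustrative, not FiveThirtyEight's.
# Each poll: (firm, end_date, population, dem_margin), where dem_margin is
# the Democratic candidate's lead in percentage points.
polls = [
    ("Firm A", date(2008, 11, 1), "likely voters", 7.0),
    ("Firm A", date(2008, 10, 20), "likely voters", 9.0),      # superseded by Firm A's later poll
    ("Firm B", date(2008, 10, 30), "likely voters", 8.0),
    ("Firm C", date(2008, 10, 29), "registered voters", 10.0),  # excluded: not likely voters
]

def simple_average(polls, election_day=date(2008, 11, 4), window_days=21):
    """Average the final likely-voter poll from each firm in the last N days."""
    latest_by_firm = {}
    for firm, end, population, margin in polls:
        if population != "likely voters":
            continue  # restriction 1: likely-voter polls only
        if (election_day - end).days > window_days:
            continue  # only polls from the final stretch of the campaign
        # restriction 2: keep only each firm's last poll before the election
        if firm not in latest_by_firm or end > latest_by_firm[firm][0]:
            latest_by_firm[firm] = (end, margin)
    margins = [m for _, m in latest_by_firm.values()]
    return sum(margins) / len(margins)

print(simple_average(polls))  # 7.5 with the toy data above
```

With the toy numbers above, the registered-voter poll is dropped and Firm A's earlier poll is superseded by its final one, so the average is taken over just two margins.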

Let’s begin by looking at the results of national polls for the presidential race.

In the 10 presidential elections since 1972, there have been five years (1976, 1980, 1992, 1996 and 2004) in which the national presidential polls overestimated the standing of the Democratic candidate. However, there were also four years (1972, 1984, 1988 and 2000) in which they overestimated the standing of the Republican. Finally, there was 2008, when the average of likely voter polls showed Mr. Obama winning by 7.3 percentage points, his exact margin of victory over John McCain, to the decimal place.

Arthur Ochs Sulzberger, 1926-2012

Clyde Haberman in the NYT:

Arthur Ochs Sulzberger, who guided The New York Times and its parent company through a long, sometimes turbulent period of expansion and change on a scale not seen since the newspaper’s founding in 1851, died early Saturday at his home in Southampton, N.Y. He was 86.

His death, after a long illness, was announced by his family.

Mr. Sulzberger’s tenure, as publisher of the newspaper and as chairman and chief executive of The New York Times Company, reached across 34 years, from the heyday of postwar America to the twilight of the 20th century, from the era of hot lead and Linotype machines to the birth of the digital world.

The paper he took over as publisher in 1963 was the paper it had been for decades: respected and influential, often setting the national agenda. But it was also in precarious financial condition and somewhat insular, having been a tightly held family operation since 1896, when it was bought by his grandfather Adolph S. Ochs.

By the 1990s, when Mr. Sulzberger passed the reins to his son, Arthur Sulzberger Jr., first as publisher in 1992 and then as chairman in 1997, the enterprise had been transformed. The Times was now national in scope, distributed from coast to coast, and it had become the heart of a diversified, multibillion-dollar media operation that came to encompass newspapers, magazines, television and radio stations and online ventures.

The expansion reflected Mr. Sulzberger’s belief that a news organization, above all, had to be profitable if it hoped to maintain a vibrant, independent voice.

I’m Sorry, Steve Jobs: We Could Have Saved You

Siddhartha Mukherjee in Newsweek:

We are failing to treat and prevent cancer—even as the promise of life-saving remedies awaits us. On the anniversary of Steve Jobs’s death, leading oncologist and the author of The Emperor of All Maladies Siddhartha Mukherjee explains how we failed to save an icon and why we will lose so many more lives if we do not give cancer research the funding it deserves. On Oct. 5, the night that Steve Jobs died, I ascended 30,000 feet into the thin air above New York on a flight to California. On my lap was a stash of scientific papers. I was reading and taking notes—where else?—on an iPad.

Jobs’s death—like a generational Rorschach test—had provoked complex reactions within each of us. There was grief in abundance, of course, admixed with a sense of loss, with desolation and nostalgia. Outside the Apple store in SoHo, New York, that evening, there were bouquets of white gerberas and red roses. Someone had left a bushel of apples by the doorstep and a sign that read “I-miss …” I missed Jobs, too—but I also felt a personal embarrassment in his death. I am an oncologist and a cancer researcher. I felt as if my profession, my discipline, and my generation had let him down. Steve Jobs had promised—and then delivered—life-altering technologies. Had we, in all honesty, given him any such life-altering technologies back? I ask the question in all earnestness. Jobs’s life ended because of a form of pancreatic cancer called pancreatic neuroendocrine tumor, or PNET. These tumors are vanishingly rare: about five in every million men and women are diagnosed with PNETs each year. Deciphering the biology of rare cancers is often challenging. But the past five years have revealed extraordinary insights into the biology of some rare cancers—and PNETs, coincidentally enough, have led part of that charge. By comparing several such tumors, scientists are beginning to understand their peculiar biology.

More here.

The Art of Stem Cells

From Harvard Magazine:

Jennifer Quick, a Ph.D. candidate in the department of the history of art and architecture, stood before her audience in the main gallery of the Carpenter Center for the Visual Arts. Behind her, on a free-standing wall running down the middle of the gallery, hung more than a dozen works by Michael Wang ’03, a visual artist who creates micrograph images of artificially produced stem cells.

But as Quick began to lecture on Wang’s works, another voice could be heard from behind the wall. It belonged to cellular biologist Gabriella Boulting, Ph.D. ’12, who was speaking to her own audience about pieces Wang had created that hung on her side of the wall. Dueling gallery tours? Wang, an artist who aims to bring together art and science in interesting and thought-provoking ways, prefers to think of two lecturers working as one. “I wanted to literally stage an encounter between two disciplines, the artistic community on one hand and the scientific on the other,” Wang says. “I don’t think of artworks as ending with the individual artistic object. There is an expanded field that includes context and discourse around [that object] so I really wanted to make sure that could be an expanded part of the work. I wanted people from within the University to provide two very different insights at the exact same time.” His latest work, Differentiation Series, is a sequence of micrograph images of artificially produced stem cells that have been hand-tinted using a system that matches a unique color to every specific cell type that can potentially be produced from these initially undifferentiated cells.

More here.

The Britishisation of American English

Cordelia Hebblethwaite at the BBC:

There is little that irks British defenders of the English language more than Americanisms, which they see creeping insidiously into newspaper columns and everyday conversation. But bit by bit British English is invading America too.

“Spot on – it's just ludicrous!” snaps Geoffrey Nunberg, a linguist at the University of California at Berkeley.

“You are just impersonating an Englishman when you say spot on.”

“Will do – I hear that from Americans. That should be put into quarantine,” he adds.

And don't get him started on the chattering classes – its overtones of a distinctly British class system make him quiver.

But not everyone shares his revulsion at the drip, drip, drip of Britishisms – to use an American term – crossing the Atlantic.

“I enjoy seeing them,” says Ben Yagoda, professor of English at the University of Delaware, and author of the forthcoming book, How to Not Write Bad.

“It's like a birdwatcher. If I find an American saying one, it makes my day!”

Last year Yagoda set up a blog dedicated to spotting the use of British terms in American English.

So far he has found more than 150 – from cheeky to chat-up via sell-by date, and the long game – an expression which appears to date back to 1856, and comes not from golf or chess, but the card game whist. President Barack Obama has used it in at least one speech.

More here.