In the LRB, Tariq Ali on Musharraf:
Musharraf tells us he agreed to become Washington’s surrogate because the State Department honcho, Richard Armitage, threatened to bomb Pakistan back to the Stone Age if he didn’t. What really worried Islamabad, however, was a threat Musharraf doesn’t mention: if Pakistan refused, the US would have used Indian bases.
Musharraf was initially popular in Pakistan and if he had pushed through reforms aimed at providing an education (with English as a compulsory second language) for all children, instituted land reforms which would have ended the stranglehold of the gentry on large swathes of the countryside, tackled corruption in the armed forces and everywhere else, and ended the jihadi escapades in Kashmir and Pakistan as a prelude to a long-term deal with India, then he might have left a mark on the country. Instead, he has mimicked his military predecessors. Like them, he took off his uniform, went to a landlord-organised gathering in Sind and entered politics. His party? The evergreen, ever available Muslim League. His supporters? Chips off the same old corrupt block that he had denounced so vigorously and whose leaders he was prosecuting. His prime minister? Shaukat ‘Shortcut’ Aziz, formerly a senior executive of Citibank with close ties to the eighth richest man in the world, the Saudi prince Al-Walid bin Talal. As it became clear that nothing much was going to change, a wave of cynicism engulfed the country.
Musharraf is better than Zia and Ayub in many ways, but human rights groups have noticed a sharp rise in the number of political activists who are being ‘disappeared’: four hundred this year alone, including Sindhi nationalists and a total of 1200 in the province of Baluchistan, where the army has become trigger-happy once again. The war on terror has provided many leaders with the chance to sort out their opponents, but that doesn’t make it any better.
From BBC News:
Scientists report two cases in which female Komodo dragons have produced offspring without male contact.
Tests revealed their eggs had developed without being fertilised by sperm – a process called parthenogenesis, the team wrote in the journal Nature. Lizards could make use of the ability to reproduce asexually when, for example, a lone female was washed up alone on an island with no males to breed with.
Because of the genetics of this process, her offspring would always be male. This is because Komodo dragons have W and Z sex chromosomes – females have one W and one Z, males have two Zs. The egg from the female carries a single sex chromosome, either a W or a Z, and when parthenogenesis takes place, that chromosome is duplicated. This leads to eggs which are WW or ZZ. WW eggs are not viable, but ZZ eggs are, and develop into male baby Komodo dragons.
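The chromosome logic above can be sketched in a few lines of code. This is an illustrative toy (the function name and return format are my own, not from the BBC report): a WZ mother's egg carries one sex chromosome, duplication yields WW or ZZ, and only ZZ embryos, which are male, are viable.

```python
def parthenogenesis_offspring(mother="WZ"):
    """Enumerate the genotypes a WZ Komodo dragon mother can produce
    via parthenogenesis, where the egg's single sex chromosome is
    duplicated. Returns (genotype, fate) pairs."""
    outcomes = []
    for chromosome in mother:          # the egg carries either W or Z
        genotype = chromosome * 2      # duplication gives WW or ZZ
        if genotype == "ZZ":
            outcomes.append(("ZZ", "male"))       # ZZ embryos are viable males
        else:
            outcomes.append(("WW", "inviable"))   # WW embryos do not develop
    return outcomes

print(parthenogenesis_offspring())
# [('WW', 'inviable'), ('ZZ', 'male')]
```

Running it makes the paper's point concrete: a lone female can found a population, but every parthenogenetic hatchling is male.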
More here.
From Science:
If the scale has tipped too far in the wrong direction, perhaps you should blame the bugs living in your gut. Some microbes are better at wringing calories out of those holiday meals than others, researchers report in two papers in today’s Nature. Transferring such high-octane bugs into lean mice causes the rodents to plump up, suggesting a microbial contribution to obesity.
Genetics certainly play a role in obesity, which is on the rise in many countries. But there’s more to the problem. In 2004, Jeffrey Gordon from Washington University in St. Louis, Missouri, and his colleagues demonstrated that intestinal bacteria could also contribute to weight gain in mice. A year later, microbial ecologist Ruth Ley, a postdoctoral fellow working in Gordon’s lab, discovered that lean and obese mice have different microbial communities in their gut. Now Gordon and his colleagues have shown this difference exists in people as well, and that diets can shift the microbial balance.
More here.
Wednesday, December 20, 2006

The contributions of a historical tradition of religious writing are just as essential as the natural operations of the human brain. While Tremlin systematically overemphasizes the latter, van Huyssteen’s postfoundationalism avoids exclusive claims for either. What is interesting is that both authors resist the temptation to make hasty inferences from their observations about the naturalness of religious beliefs to a conclusion about either the truth or the falsity of those beliefs. The implication, but not the explicit conclusion, of Tremlin’s reductionist account is that religious beliefs can be not only explained, but effectively explained away by cognitive science. Van Huyssteen tends towards the opposite view – that the naturalness of religious beliefs argues, if anything, in favour of their plausibility and rationality. Of course most of us assume that all our beliefs – the true ones as well as the false ones – are, among other things, products of an evolved brain. The fact that many writers about science and religion no longer assume that such an observation is a knock-down argument either for or against religious faith is surely a sign of progress in the field of science and religion.
more from the TLS here.

In 1988, to commemorate Austria’s annexation by Adolf Hitler fifty years earlier, a new play was commissioned from Thomas Bernhard. The author of eleven novels and more than twenty plays, Bernhard had a well-deserved reputation as the country’s most provocative postwar writer: he spent his career alternately mocking and mourning Austria’s Nazi legacy, which, with typical bluntness, he once represented as a pile of manure on the stage. At first, he declined to participate in the commemoration, saying with caustic humor that a more appropriate gesture would be for all the shops once owned by Jews to display signs reading “Judenfrei.” But the author of plays like “The German Lunch Table,” in which family members gathered for a meal discover Nazis in their soup, could not resist such a rich opportunity to needle Austria’s political and cultural élite. “All my life I have been a trouble-maker,” he once wrote. “I am not the sort of person who leaves others in peace.”
more from The New Yorker here.
John Londregan offers a right-wing argument against Pinochet, in The Weekly Standard.
Despite Pinochet’s initial declaration that he was the temporary leader of a temporary government, he managed to push aside the other heads of the armed forces, and to remain in power for the next 16 and a half years, longer than any other ruler, elected or otherwise, in the post-independence history of Chile. During the long years of military rule, Pinochet remorselessly sought control. He outlawed political parties and had opponents murdered. The butcher’s bill for his time in power included the lives of over 3,000 of his fellow citizens (in a country of 15 million), not counting the many thousands more who were tortured by the government, and the thousands driven into exile. Pinochet sought to transform Chilean society, and he incorporated a series of free-market economic reforms as a part of his recipe for success.
His embrace of economic reform seems unlikely to have sprung from a commitment to freedom, given the overarching contempt for liberty that characterized the rest of his government. Rather, in order to insulate himself from the consequences of his murderous seizure of power, Pinochet sought out political allies, and his free market reforms helped him to garner support domestically on the right, and also among members of the international community. One must be careful not to fall into Pinochet’s trap–accepting his brutal seizure of power and tyrannical rule as a natural accompaniment of free market reforms. Propagandists on the left lost no time in seeking to discredit economic freedom by associating it with Pinochet. To this day, we hear from Moscow that it takes a Pinochet to implement economic reforms successfully; Vladimir Putin seems all too willing to have Pinochet’s uniform taken in a few sizes so he can try it on.
In the Harvard International Review blog, Stephen Wertheim comments on Alex Motyl’s answer to the question whether America is an empire. (Now if Dan Nexon would weigh in.)
Is America an empire? In the midst of much academic debate, political scientist Alexander J. Motyl asks a practical question: what does it matter?
“Imagine,” he writes, “that policy analysts and scholars stopped applying the label to the United States. Would it make any difference? I think not. The challenges facing the country—war in Iraq, nuclear weapons in Iran and North Korea, rising authoritarianism in Russia, growing military power in China, the Israeli-Palestinian conflict, terrorism, avian flu, climate change, and so forth—would be exactly the same, as would US policy options…Life would go on, and no one—except for scholars of empire—would notice the difference.”
Motyl is undeniably right that challenges and policy options would be exactly the same. What he misses is that policymakers might never think of them or take them seriously. Here are two recent examples from Motyl’s own list.
If policymakers thought of America as an empire, they might have thought it prudent to plan for postwar occupation of Iraq. “I don’t think our troops ought to be used for what’s called nation-building,” George W. Bush declared in the 2000 presidential debates. Evidently he believed his rhetoric. When deciding to invade Iraq, the Bush administration found little need to draw up long-term plans to rule and reconstruct the country. Nor did Democrats in Congress press the point. Nor is the US military equipped, in doctrine or manpower, to do large-scale nation-building. Why prepare for what America by nature “doesn’t do”?
If policymakers thought of America as an empire, they might have been quicker to grasp Islamist terrorism as a major threat before 9/11. Policymakers were focused on state actors. And rightly, if America is solely a nation-state capable of being threatened solely by nation-states. By contrast, stateless tribal fighters are the age-old enemies of empire. They sacked Rome until Rome fell. They raided China from the north, conquering the realm several times despite the Great Wall built to keep them out. Pirates harassed Britain at sea. A clear lesson of empire is to beware the barbarian on the frontier. But if there is no empire, there is no frontier and no barbarian to beware.
In ScienceNOW Daily News:
School officials in Cobb County, Georgia, yesterday agreed to drop their 4-year attempt to tell high school biology students that evolution is only a “theory.” Local school officials had fought a ruling by a federal judge to remove stickers that they had placed on textbooks, but yesterday, they threw in the towel, pledging to adhere to the state science curriculum and also to pay $167,000 in legal fees to the plaintiffs. In return, the five parents who brought the suit agreed to drop any further legal action against the school district.
“The case is done, and they have agreed never again to put stickers in the textbooks,” says Debbie Seagrave, executive director of the American Civil Liberties Union’s Georgia affiliate, which represented the parents in Selman v. Cobb County. School board chair Teresa Plenge said the district decided to forgo “the distraction and expense of starting all over with more legal actions and another trial.”
The legal battle began after the school board embraced the arguments of parents who felt the teaching of evolutionary theory unfairly neglected the biblical story of creation. The board voted in September 2002 to apply stickers to 35,000 textbooks warning that “evolution is a theory, not a fact, regarding the origin of living things” and that “this material should be approached with an open mind, studied carefully and critically considered.” In 2004, several parents sued the school board in federal court, and last year, District Judge Clarence Cooper ordered the stickers removed on the grounds that the language amounted to an unconstitutional endorsement of a religious belief (ScienceNOW, 14 January 2005). An appellate court rejected the school board’s appeal, saying it lacked sufficient information to issue a ruling, and remanded the case to the district court.
In EurekaAlert!:
By repeating Stanley Milgram’s classic experiment from the 1960s on obedience to authority – which found that people would administer apparently lethal electric shocks to a stranger at the behest of an authority figure – in a virtual environment, the UCL (University College London) led study demonstrated for the first time that participants reacted as though the situation was real.
The finding, which is reported in the inaugural edition of the journal PLoS ONE, demonstrates that virtual environments can provide an alternative way of pursuing laboratory-based experimental research that examines extreme social situations.
Professor Mel Slater, of the UCL Department of Computer Science, who led the study, says: “The line of research opened up by Milgram was of tremendous importance in the understanding of human behaviour. It has been argued before that immersive virtual environments can provide a useful tool for social psychological studies in general, and our results show that this applies even in the extreme social situation investigated by Stanley Milgram.”
Stanley Milgram originally carried out the series of experiments in an attempt to understand events in which people carry out horrific acts against their fellows. He showed that in a social structure with recognised lines of authority, ordinary people could be relatively easily persuaded to give what seemed to be even lethal electric shocks to another randomly chosen person. Today, his results are often quoted to help explain how people become embroiled in organised acts of violence against others; for example, they have recently been cited to explain prisoner abuse and even suicide bombings.
From Scientific American:
The mind is not as agile as it once was, even at the ripe old age of 34. Names elude me, statistics slip away, memory fades. This is just the first step on a long journey into senescence; and by 74, if I make it that far, I might remember practically nothing. That age is the average of a cohort of 2,802 seniors who recently participated in a long-term study to see if anything can be done to reverse this age-related mind decline. The good news: there is.
Sherry Willis of Pennsylvania State University led a team of scientists that followed this group of adults, aged 65 and older, still living independently between 1998 and 2004. The seniors came from all walks of life, races, and parts of the country, including Birmingham, Ala., Detroit, Boston and other major cities. They all had one thing in common when the study commenced: no signs of cognitive impairment.
More here.
From despardes:
(Commencement lecture by Pervez Hoodbhoy at the Indus Valley School of Art and Architecture, Karachi, December 9, 2006.) To help us along, let’s imagine a film like “Jinnah”. You die and fly off to the arrival gate in heaven where an angel of the immigration department screens newcomers from Pakistan. Admission these days is even tougher than getting a Green Card to America. You have to show proofs of good deeds, argue your case, and fill out an admission form. One section of the form asks you to specify three attitudinal traits that you want fellow Pakistanis, presently on earth, to have. As part of divine fairness, all previous entries are electronically stored and publicly available and so you learn that Mr. Jinnah, as the first Pakistani, had answered – as you might guess – “Faith, Unity, Discipline”. This slogan was in all the books you had studied in school, and was emblazoned even on monuments and hillsides across the country. Since copying won’t get you anywhere in heaven, you obviously cannot repeat this.
What would your three choices be? As you consider your answer, I’ll tell you mine. First, I wish for minds that can deal with the complex nature of truth. My second wish is for many more Pakistanis who accept diversity as a virtue. My third, and last, wish is that Pakistanis learn to value and nurture creativity.
More here.
Tuesday, December 19, 2006
Does this mean that the attention on Bosnia and Rwanda was also just about feeling good? Alexander Cockburn in Counterpunch.
As a zone of ongoing, large-scale bloodletting Darfur in the western Sudan has big appeal for US news editors. Americans are not doing the killing, or paying for others to do it. So there’s no need to minimize the vast slaughter with the usual drizzle of “allegations.” There’s no political risk here in sounding off about genocide in Darfur. The crisis in Darfur is also very photogenic.
When the RENAMO gangs, backed by Ronald Reagan and the apartheid regime in South Africa, were butchering Mozambican peasants, the news stories were sparse and the tone usually tentative in any blame-laying. Not so with Darfur, where moral outrage on the editorial pages acquires the robust edge endemic to sermons about inter-ethnic slaughter where white people, and specifically the US government, aren’t obviously involved.
Since March 1 the New York Times has run seventy news stories on Darfur (including sixteen pieces from wire services), fifteen editorials and twenty-one signed columns, all but one by Nicholas Kristof. Darfur is primarily a “feel good” subject for people here who want to agonize publicly about injustices in the world but who don’t really want to do anything about them. After all, it’s Arabs who are the perpetrators and there is ultimately little that people in this country can do to effect real change in the policy of the government in Khartoum.
In response to the NYC ban on trans fats, Ampersand noted, “Banning trans fats in restaurants, but not in grocery stores, doesn’t make sense. I guess the supermarket lobby is more powerful than the fast-food and donut lobby.” The Economist’s blog takes on the question:
I’d guess that it has more to do with public choice theory than ardent lobbying. Since national food producers are unlikely to reformulate their entire line for the benefit of a few million New Yorkers, a trans-fat ban would sweep large categories of food off the supermarket shelves, in a way that would be directly and obviously attributable to the ban (since they would disappear from every supermarket shelf at once). Banning them in restaurants, on the other hand, will merely make some of the food taste worse, other food more expensive, and so forth, in a thoroughly idiosyncratic way. Consumers are unlikely to connect thousands of subtle shifts in their local restaurant fare to the ban, as they surely would if glazed donut holes suddenly vanished from the shelves of the city. The legislators do not need to be paid to act in their own self-interest.
Although for me, this answer suggests Marx (on obscured causal pathways as a mechanism in ideology, where effects are attributed not to their real inputs but to other factors) as much as it does public choice, which is not to say that the answer is correct.
In Naked Punch, an interview with Arthur Danto on art and philosophy:
I: So would you say then that now you need, really, a talent to notice or a talent to think, rather than a talent to craft?
ACD: Exactly. That’s it. I mean, Barbara and I just visited her nephew, who’s a sculpture student at Rutgers, and he’s building these intricate house-of-cards structures with tiles painted with nail polish. He gets all the girls to give him their old nail polish. It was really quite beautiful and he has this incredible patience. So you know it seems to be a transformation of a really remarkable sort. I felt with the Whitney Biennial of 2002, where you had all of these artists that no one ever heard of, that they were all working. Here was a beautiful work: it was a little collaborative called Praxis, just a man and a woman and they had this little storefront down in the East Village. You could go in on Saturdays and get one of three things: you could get a hug, you could have a band-aid put on and they would kiss it, or you could ask for a dollar and they’d give you a dollar. Simple things, but people would go in, they’d line up, get hugged, ask for a band-aid—she’d put it on and make them feel better—or they’d get a dollar. And I thought, God how simple can life get, but there was something very moving about that work. It was interactive, people just came in, the couple were being artists in this kind of way. I thought it had a lot of meaning.
The news story of the day may just be the tragic verdict of the show trial (I suppose all show trial verdicts are tragic) of six foreign medical workers in Libya accused of deliberately infecting 400 children with HIV. In news@nature.com:
A Libyan court today condemned to death six foreign health professionals accused of infecting over 400 children with HIV in 1998. The court refused to take into account a swathe of independent scientific evidence indicating that the outbreak had begun several years before the accused began working there and was caused by poor hygiene at the hospital.
The defence say they intend to appeal to the Supreme Court, which would be the last chance that the medics would have of being acquitted. Emmanuel Altit, the head of the international defence team, says the international community can help by insisting that scientific evidence be taken into account.
Behind-the-scenes discussions are also ongoing between the European Union, Bulgaria, the United States and Libya to find a diplomatic solution (under Islamic law, victims’ relatives may withdraw death sentences in return for compensation) but so far this has proved elusive.
The verdict has prompted widespread international condemnation. “We are appalled by the decision of the Libyan court to sentence the five Bulgarian nurses and the Palestinian doctor to death,” says a statement from The World Medical Association and the International Council of Nurses. They emphasize that the denial of health problems that can promote the accidental spread of HIV, such as the use of dirty needles, is an ongoing, dangerous situation in Libya. “How many children will go on dying in Libyan hospitals while the Government ignores the root of the problem?”
In the American Scientist, Per Brandtzaeg tells us about how we develop food allergies.
The story of food allergy is a story about how the development of the immune system is tightly linked to the development of our digestive tract or, as scientists and physicians usually refer to it, our gut. A human being is born with an immature immune system and an immature gut, and they grow up together. The immune system takes samples of gut contents and uses them to inform its understanding of the world—an understanding that helps safeguard the digestive system (and the body that houses it) against harmful microorganisms.
The many-layered defenses of the immune system are designed to guard against invaders while sparing our own tissues. Food represents a special challenge to this system: an entire class of alien substances that needs to be welcomed rather than rebuffed. An adult may pass a ton of food through her gut each year, nearly all of it distinct at the molecular level from her own flesh and blood. In addition, strains of normal, or commensal, bacteria in the gut help with digestion and compete with pathogenic strains; these good microbes need to be distinguished from harmful ones. The body’s ability to suppress its killer instinct in the presence of a gut-full of innocuous foreign substances is a phenomenon called oral tolerance. It requires cultivating a state of equilibrium, or homeostasis, that balances aggression and tolerance in the immune system. Intolerance, or failure to suppress the immune response, results in an allergic reaction, sometimes with life-threatening consequences.
From Lens Culture:
In this age when many famous fine art photographers and photojournalists strive to capture the mundane, the banal, the everyday reality of our existence, it is like a breath of fresh air to come upon this unique collection of inexpensive snapshots taken by inexperienced camera operators.
These are truly delightful photos of ordinary day-in-the-life experiences taken by the men and women who deliver the mail throughout Great Britain. This project — conceived, managed and edited by the young photographer Stephen Gill — offered the free use of disposable cameras to every member of the Royal Mail. Hundreds took him up on the offer, and as a result, Gill painstakingly reviewed over 30,000 images to end up with the best of the best. It is apparent that everyone had fun in the process.
The goal was to create intimate documentary views of the UK that are rarely seen except by postal carriers, utility workers, garbage collectors, and so on.
More here.
From Science:
Does a good mood help when doing your job? Not always, a new study suggests. Happy thoughts can stimulate creativity, but for mundane work such as plowing through databases, being cranky or sad may work better. The study is the first to suggest that a positive frame of mind can have opposite effects on productivity depending on the nature of a task.
Stress, anxiety, and a bad mood are notorious for narrowing people’s attention and making them both think and see only what’s right in front of them; for example, a person held at gunpoint usually recalls nothing but the weapon itself. Well-being, on the other hand, is known to broaden people’s thinking and make them more creative. But whether a good mood also expands people’s attention to visual details was unknown.
More here.
Monday, December 18, 2006
Banksy. Withus Oragainstus. 2005
Dead beetle with glued-on Sidewinder missiles and a satellite dish.
More on this irreverent British artist here and here.
Sunday, December 17, 2006
From Nature:
If you think only hounds can track a scent trail, think again: people can follow their noses too, a new study says. And they do so in a way very similar to dogs, suggesting we’re not so bad at detecting smells — we’re just out of practice.
Scientists have found that humans have far fewer genes that encode smell receptors than do other animals such as rats and dogs. This seemed to suggest that we’re not as talented at discerning scents as other beasts, perhaps because we lost our sense of smell when we began to walk upright, and lifted our noses far away from the aroma-rich earth. A team of neuroscientists and engineers, led by Noam Sobel of the University of California, Berkeley, and the Weizmann Institute of Science in Rehovot, Israel, decided to test this conventional wisdom.
The team first laid down a 10-metre-long trail of chocolate essential oil in a grass field (the scent was detectable but not strong or overpowering). Then they enlisted 32 Berkeley undergraduates, blindfolded them, blocked their ears and set them loose in the field to try to track the scent. Each student got three chances to track the scent in ten minutes; two-thirds of the subjects finished the task. And when four students practiced the task over three days, they got better at it.
Next, the team tested how the students were following the trails. They counted how many whiffs of air each student took while tracking the scent trail, and tested the effect of blocking one nostril at a time. The scientists found that humans act much like dogs do while tracking a scent, sniffing repeatedly to trace the smell’s source. They didn’t do so well with one blocked nostril, suggesting that the stereo effect of two nostrils helps people to locate odours in space.
The study proves that humans aren’t so bad at smelling after all, says neuroscientist Gordon Shepherd of Yale University in New Haven, Connecticut.
More here.