John Tierney in the New York Times:
As promised in my column today, here’s a survey you can take to see where you fall on the Tightwad-Spendthrift scale developed by economists at Carnegie Mellon University. I took it, and the result independently confirms the shopping brain-scan experiment I describe in the Science Times column: I’m a spendthrift.
I’m not proud of that, but I don’t yearn to be a tightwad, either, not as it’s defined by the economists. They find that tightwads aren’t any happier than spendthrifts, and suffer more in some ways. “It’s like the old line about a hero dying once and a coward dying a thousand deaths,” says George Loewenstein, one of the CMU economists. “A spendthrift suffers after he buys something. A tightwad suffers while he buys it and then again afterwards.”
Loewenstein and his colleagues, Scott Rick and Cynthia Cryder, distinguish “tightwadism” from “frugality,” which is measured (naturally) on a whole separate scale developed by other social scientists. Being frugal means you enjoy saving money, and people who are more frugal tend to be happier than average. Tightwads are less happy. They pass up purchases not because they enjoy saving money or are sensibly calculating the benefits, but because they hate to part with cash. They do without things they could afford that would genuinely improve their lives.
The happiest people are the “unconflicted” consumers who fall in the middle of the scale…
More here.
From The New Republic (via Powell’s Review-a-Day):
American conservatism is in crisis. That much is almost universally clear. But the next period in American politics will be determined not least by how clearly we understand the crisis of the right. For it may be that the remarkably successful Republican coalition of the last three decades is not at all doomed at the polls. A Giuliani or Romney candidacy, especially up against a Clinton candidacy, could well eke out a victory in 2008. Nor is it quite the case that the familiar fault lines within the movement — libertarians versus social conservatives, neoconservatives versus realists, economic internationalists versus populists — have somehow come to a head all at once. The strains are there, all right, and they have been made much more acute in the Bush years under the weight of massive spending increases, evangelical overreach, abuse of executive power, conventional corruption, and (most disastrously) a mismanaged war. But the reflexive sense of cohesion on the right still manages to keep the rickety coalition together — if only because of the palpable weakness of the alternatives, at least so far.
The crisis, rather, is of a different kind. It is intellectual, and it is deeper than anything captured by the conventional categories. The sole merit of Dinesh D’Souza’s new book is that it acknowledges this intellectual collapse, even as it is itself a document of that collapse; and it proposes a new way forward. Whatever else may be said about The Enemy at Home — and the maledictions from left and right have been ferocious — it has at least the courage to pursue the logic of Bush-era conservatism all the way to its end. In this sense, it is a mainstream conservative book, in its own way even a visionary one, expanding on the direction that American conservatism has taken and daring it to continue aggressively on that very path.
More here.
On the 23rd of March, from Wikipedia:
The Lahore Resolution, commonly known as the Pakistan Resolution, was a formal political statement adopted by the All India Muslim League at its three-day general session on 22–24 March 1940, calling for greater Muslim autonomy in British India. It has been largely interpreted as a demand for a separate Muslim state, Pakistan. The resolution was presented by A. K. Fazlul Huq.
Although the idea of founding the state had been introduced by Allama Iqbal in 1930, and the name Pakistan had been proposed by Choudhary Rahmat Ali in 1934, Muhammad Ali Jinnah and other leaders had maintained a firm belief in Hindu–Muslim unity. However, the volatile political climate and religious hostilities gave the idea stronger backing.
More here. Adil Najam has a nice post at All Things Pakistan:
To me, the 23rd of March is a day to reflect on the message of Mohammad Iqbal, just as the 14th of August is a day to ponder the legacy of Mohammad Ali Jinnah.
We, as Pakistanis, have not really been kind to the legacy of either man. We turned both into idols. And once we convinced ourselves that these were ‘supermen’, we conveniently absolved ourselves of the responsibility to learn from – let alone emulate – either. We are fond of celebrating but incapable of incorporating either the actions of Mr. Jinnah or the thoughts of Mohammad Iqbal.
More here.
Thursday, March 22, 2007
Also in the Boston Review, Abhijit Banerjee on a new development economics:
[T]he only theories that we hold onto with some confidence are disaster warnings—banning all trade is bad, as is banning all private enterprise and printing money to pay everyone. With anything more nuanced, or less negative, there are too many doubts and differences.
It is perhaps natural that the reaction to this kind of uncertainty is to be pessimistic about the possibility of taking any constructive action. William Easterly, the most articulate of the pessimists, in his 2006 book The White Man’s Burden, comes very close to suggesting that there are no recipes for growth that can be brought in from the outside, other than the recipe of giving people within the country incentives to find a recipe on their own.
But this is not what the evidence is telling us. All it is saying is that the cross-country data we are using is not up to answering the kinds of questions that are being asked of it. It does not mean that these are the only useful questions to ask, or that there is no other kind of data that can help us.
In the new issue of the Boston Review, Nancy Birdsall on inequality and globalization:
In 1993 I left the World Bank to become the executive vice president at the Inter-American Development Bank. By then I was persuaded that Latin America’s high inequality was an economic problem, slowing its growth, as well as a social problem. I advocated more research on the issue. By that time—soon after the fall of the Berlin Wall had liberated the mainstream from the taboo of Marxian thought—academic economists were also beginning to study inequality as a possible cause of low growth, and thus as a phenomenon that mattered, at least for understanding growth itself.
Subsequent work by many economists has strengthened my conviction that while inequality may be constructive in the rich countries—in the classic sense of motivating individuals to work hard, innovate, and take productive risks—in developing countries it is likely to be destructive. That is especially true in Latin America, where conventional measures of income inequality are high. It also may well apply in other parts of the developing world, where our conventional indicators are not so high but there are plentiful signs of other forms of inequality: injustice, indignity, and lack of equal opportunity.
A new floodgate system should protect the city from high tides—unless climate change interferes.
Eric Jaffe in Smithsonian Magazine:
Fabio Carrera has been studying the Venice lagoon since 1988, so when he heard a high tide siren one evening in 2002, it wasn’t the first time. But it might have been the strangest.
The sirens warn Venetians that the tide has reached roughly 43 inches—enough to spread shallow water across 12 percent of the city. These alarms typically sound in fall or winter. But here stood Carrera in early June, and the tide had reached more than 47 inches, the only summer tide above 43 inches since modern records began in 1923.
To Carrera, a Venice native and urban information scientist at Worcester Polytechnic Institute in Massachusetts, the event was an early symptom of the impact climate change is having on sea levels in Venice. “Things seem to be off,” he says. “Things like a weird summer high tide—those are the best indicators that something’s happening in the lagoon.”
More here.
Carl Zimmer in his blog, The Loom:
When Craig Venter and his colleagues published their rough draft of the human genome in 2001 they identified 26,588 human genes. They then broke those genes down by their functions. Some were involved in building DNA, some in relaying signals, and so on. Remarkably, though, they classified 12,809 genes—almost half—as “molecular function unknown.” Last week I wanted to know if those numbers still hold. I’ve been working on a book on Escherichia coli, and I wanted to contrast just how well scientists understand that microbe with just how poorly we understand ourselves (biologically, in this case). I wanted some numbers to make my case.
They weren’t so easy to find. In 2003 some reports came out to the effect that the genome had shrunk down to 21,000 genes. But I couldn’t turn up much news in the past four years. I wondered what sort of artificial milestone I would have to wait for in order to get some fresh numbers.
Fortunately there are now some rivals to the milestone model of science. There are web sites where you can observe works in progress, such as the human genome. One of those sites is called PANTHER. I contacted the top scientist behind it, Paul D. Thomas, with my question, and he sent me a link. When I clicked on the link, I got the pie chart I’ve posted here (click on the image to go to the original page if it’s hard to read).
The pie shows that we’re now down to just 18,308 genes.
More here.
Click here to see these videos by Laurie McGuinness.
Steve Coll in The New Yorker:
In October, 2005, a radiation sensor at the Port of Colombo, in Sri Lanka, signalled that the contents of an outbound shipping container included radioactive material. The port’s surveillance system, installed with funds from the National Nuclear Security Administration, an agency within the Department of Energy, wasn’t yet in place, so the container was loaded and sent to sea before it could be identified. After American and Sri Lankan inspectors hurriedly checked camera images at the port, they concluded that the suspect crate might be on any one of five ships—two of which were steaming toward New York.
Sri Lanka is a locus of guerrilla war and arms smuggling. It is not far from Pakistan, which possesses nuclear arms, is a haven for Al Qaeda, and has a poor record of nuclear security. The radiation-emitting container presented at least the theoretical danger of a “pariah ship,” Vayl Oxford, the director of the Domestic Nuclear Detection Office, which is part of the Department of Homeland Security, said. It seemed plausible, if unlikely, that Al Qaeda or rogue Pakistani generals might load a bomb onto a cargo vessel. Within days, American satellites located the five suspect ships and intelligence analysts scrutinized their manifests; a team at the National Security Council took charge. One ship, it learned, was bound for Canada, and another for Hamburg, Germany. The White House decided to call in its atomic-bomb squad, known as NEST, the Nuclear Emergency Support Team—scientists who are trained to search for nuclear weapons. One team flew to Canada and a second to Europe, where it intercepted one of the ships at sea before it could reach Hamburg. They found nothing.
More here.
Roxanne Khamsi in New Scientist:
Mr Spock, the fictional Vulcan famed for his logic and lack of emotion, sacrificed himself for his comrades in the movie Star Trek II: The Wrath of Khan with the following words to Captain Kirk: “The needs of the many outweigh the needs of the few, or the one…”
Now, revealing new research shows that people with damage to a key emotion-processing region of the brain also make moral decisions based on the greater good of the community, unclouded by concerns over harming an individual.
It is the first study to demonstrate how emotion impacts moral judgement and sheds light on why people often act out of respect for an individual rather than choosing to act in a more logical, utilitarian way. The findings could cause a rethink in how society determines a “moral good”, and challenge the 18th-century philosophies of Immanuel Kant and David Hume.
More here.
‘Are we beasts?’ asked Winston Churchill one night in 1943 after watching a film of the bomb damage done to Germany. The question was probably rhetorical: Churchill had authorised the bombing campaign from its puny beginnings in 1940 to the massive Combined Offensive launched with the American air forces in the last two years of war. His language was always intemperate and flowery – ‘extermination’, ‘annihilation’ and so on. Did he mean it? Did the British military machine set out deliberately in the Second World War on a path to the genocide of the German people?
This issue lies at the heart of Jörg Friedrich’s searing account of the bombing of around 150 German cities between 1940 and 1945. In Germany his book sold half a million copies. He is the first German historian to expose in remorseless, almost unreadable detail just what the millions of tons of high explosive and incendiary bombs did to Germany’s people and its cultural heritage. Most British readers will be familiar with Dresden, which has come to symbolise the awful horror of a ruthless total war. What they will not know is the fate of a host of other small cities – Kassel, Paderborn, Aachen, Swinemünde, and many more – which were all but obliterated by the bombing, or of the many large cities such as Cologne or Essen, which experienced more than 250 raids each, so many that at the end the bombers were simply turning ruins into ruins.
more from Literary Review here.
The best of Rachel Harrison’s smartly snarky, heavily attitudinal, sometimes traditional-looking new free-standing sculptural amalgamations—some of which sport stuffed chickens and diet drinks—look as if Robert Rauschenberg, Jean Dubuffet, Louise Nevelson, and John Chamberlain made sculptures together and had Renoir and Hans Hofmann paint them while Jessica Stockholder, Isa Genzken, and Franz West kibitzed. As you circumnavigate her angular cubistic twists and craggy abstract caryatids and columns, you get reverberations of these artists as well as histories and -isms gone by.
more from The Village Voice here.
Khlebnikov’s ideas, not Marinetti’s, dominate Russian futurism and its most sensational creation, the opera Victory Over the Sun. In the same year Stravinsky stormed Paris, the St Petersburg futurists staged an opera about a group of astronauts who wage war on the sun, kill it and bury it. With the sun dead, a new reality is defined by what became Russian modern art’s fundamental image: a pure black square – painted by artist Kasimir Malevich onto the opera’s backcloth. This is the end point, the birthplace of a new cosmos. Malevich announced this new art when he exhibited a painting of a black square in 1915 in a show, called The Last Futurist Exhibition, in St Petersburg.
more from The Guardian here.
From The Washington Post:
Baby Love: Choosing Motherhood After a Lifetime of Ambivalence by Rebecca Walker.
Rebecca Walker comes to her ambivalence by birth. The biracial daughter of divorced parents, she spent her childhood moving between two households on opposite coasts — and between two radically different ways of life. She is also a product of 1970s feminism, a member of “the first generation of women to grow up thinking of children as optional.” Her mother, the novelist Alice Walker, has written of her own mixed feelings about having a child; now it is Rebecca’s turn. Her new memoir is a thoughtful and amusing play-by-play of pregnancy and birth, investigating the difference between the theory surrounding motherhood and the scary, messy, snuggly practice of it.
She barely got beyond the theory phase. During her eight-year relationship with the musician Meshell Ndegeocello, the two women had asked a male friend to serve as birth father — “the natural way, no turkey basters.” They considered moving as a group to Europe, “where I could write and be cared for by the thriving holistic midwifery and healing network. I could learn French, and the baby could be bilingual, and we could live in one of those charming villages in Switzerland.” The arrangement fell apart after a first failed try at conception.
But that’s just backstory. The 30-something Walker who learns she is pregnant on page 1 of Baby Love is somewhat more grounded, thanks in no small part to her new partner, Glen, the baby’s father, seemingly a model of well-adjusted, nurturing manhood.
More here.
From BBC News:
There are now 61 complementary medicine courses at UK universities, of which 45 are science degrees, the journal Nature reported. University College London Professor David Colquhoun urged watchdogs to act, as complementary medicine was not based on scientific evidence. But supporters of the approach said his views were a “sweeping generalisation”. Professor Colquhoun, of the university’s department of pharmacology, cited the example of homeopathy.
He said it had barely changed since the start of the 19th Century and was “more like religion than science”. He also pointed out that some supporters of nutritional therapy have been known to claim that changes in diet can cure Aids. He said the teaching of complementary medicine under a science banner was worse than “Mickey Mouse” degrees in golf management and baking that have sprung up in recent years, as “they do what it says on the label”. “That is quite different from awarding BSc degrees in subjects that are not science at all, but are positively anti-science. Yet this sort of gobbledygook is being taught in some UK universities as though it were science.”
He suggested it would be better if courses in aromatherapy, acupuncture, herbal medicine, reflexology, naturopathy and traditional Chinese medicine were taught as part of a cultural history or sociological course.
More here.
Wednesday, March 21, 2007
By giving tumors their right names, scientists gain power over them.
Robert Dorit in American Scientist:
Pity taxonomy. When it is not being mistaken for the craft of making dead things look alive, the science of naming things seems, in this age of scientific razzle-dazzle, hopelessly old-fashioned.
And yet the act of naming is, in many ways, the fundamental task of our intellect. The world, as William James suggested, appears “a blooming, buzzing confusion.” As scientists, our ability to parse that confusion—to group objects into meaningful categories and give those categories names—is both the prerequisite to and the culmination of our understanding of the world. The way we name things, however, inevitably affects how we perceive those things.
Nowhere is the importance of naming more obvious than in the ways we describe breast cancer, a disease that evokes faint anxiety every time its name is uttered. Descriptions of this disease go back 3,000 years; over the past 30 years, it has become one of the most intensively studied diseases, not to mention the focus of promotional and educational campaigns. Yet despite this long history and our relentless scrutiny, we are not yet sure what “breast cancer” is, or even whether it is a single disease. The more we learn of this condition and its underlying mechanisms, the more complex and multifaceted this disease appears: We are making progress in our understanding of this disease, but sometimes the very name impedes us.
More here.
Daniel Hahn reviews The Cat Orchestra and the Elephant Butler by Jan Bondeson, in The Guardian:
Bondeson has collected stories that span centuries and continents. As well as the incredible performing animals – Hear the amazing cat orchestra! See the learned pig! – he cites natural marvels never adequately explained: showers of fish, toads surviving for years completely encased in rock. For the former, he occasionally suggests some explanation, revealing the crafty showman’s trick, the devices used to teach the pig to spell or the horse to count money; for the latter he wheels out scientific arguments against, and (often more interesting) scientific explanations for, the existence of these seemingly impossible phenomena. And between the performing beasts and the zoological anomalies come the legends – the vegetable lamb, the geese grown from barnacles (this particular tall tale, he says, is behind our use of the word canard).
More here.
Geoffrey Nunberg:
In 1569, an Antwerp physician and naturalist named Johannes Goropius Becanus published a book arguing that the language spoken in the Garden of Eden must have been Flemish — or more specifically, the Flemish of Antwerp — and that all other languages could be derived from that tongue. According to Becanus, for example, the name Eve came from the Flemish words eu-vat — “people barrel” or “barrel of generations” — since all of humanity had its origin in Eve’s womb.
Not surprisingly, Becanus’s theories were congenial to many of his countrymen, though others found them loopy — Ben Jonson ridiculed him in his play The Alchemist, and the philosopher Leibniz turned his name into a verb meaning “to speculate foolishly about language.” But Becanus’s spiritual descendants have flourished over the centuries. Scarcely a day goes by that the group of linguists I post with at the LanguageLog blog isn’t debunking some claim about language no less absurd than Becanus’s. So we decided to create the annual Johannes Goropius Becanus award, or Becky for short, awarded to the promulgator of the single most ridiculous or misleading bit of linguistic nonsense that somebody manages to put over in the media.
The year 2006 was rich in contenders…
More here. [Thanks to Neeraj Kayal.]
George Packer in The New Yorker:
Millions of Iraqis, spanning the country’s religious and ethnic spectrum, welcomed the overthrow of Saddam Hussein. But the mostly young men and women who embraced America’s project so enthusiastically that they were prepared to risk their lives for it may constitute Iraq’s smallest minority. I came across them in every city: the young man in Mosul who loved Metallica and signed up to be a translator at a U.S. Army base; the DVD salesman in Najaf whose plans to study medicine were crushed by Baath Party favoritism, and who offered his services to the first American Humvee that entered his city. They had learned English from American movies and music, and from listening secretly to the BBC. Before the war, their only chance at a normal life was to flee the country—a nearly impossible feat. Their future in Saddam’s Iraq was, as the Metallica fan in Mosul put it, “a one-way road leading to nothing.” I thought of them as oddballs, like misunderstood high-school students whose isolation ends when they go off to college. In a similar way, the four years of the war created intense friendships, but they were forged through collective disappointment. The arc from hope to betrayal that traverses the Iraq war is nowhere more vivid than in the lives of these Iraqis. America’s failure to understand, trust, and protect its closest friends in Iraq is a small drama that contains the larger history of defeat.
More here. [Thanks to Elan Reisner.]