Geoffrey Miller is a man with a theory that, if true, will change the way people think about themselves. His idea is that the human brain is the anthropoid equivalent of the peacock’s tail. In other words, it is an organ designed to attract the opposite sex. Of course, brains have many other functions, and the human brain shares those with the brains of other animals. But Dr Miller, who works at the University of New Mexico, thinks that mental processes which are uniquely human, such as language and the ability to make complicated artefacts, evolved originally for sexual display.
One important difference between peacocks’ tails and human minds, of course, is that the peahen’s accoutrement is a drab affair. No one could say the same of the human female psyche. That, Dr Miller believes, is because people, unlike peafowl, bring up their offspring in families where both sexes are involved in parenting. It thus behoves a man to be as careful about choosing his wife as a woman is about choosing her husband.
Both sexes, therefore, have reason to show off. But men and women will have different criteria for making their choices, and so the sexual-display sides of their minds may differ in detail.
Testing this hypothesis will be a long haul. But in a paper he has just published in the Journal of Personality and Social Psychology, in collaboration with Vladas Griskevicius of Arizona State University, Dr Miller goes some way towards it. He, Dr Griskevicius and their colleagues look into two activities—conspicuous consumption and altruism towards strangers—to see if these support the “mating mind” hypothesis, as Dr Miller has dubbed his idea. Their conclusion is that they do.
Via DeLong, David Rees in The Huffington Post, too funny, can’t breathe…
Hello everyone! Personal message to all the New Yorkers out there: Did you read Michael Ignatieff’s essay in the NY Times Magazine? If so, contact me ASAP to let me know you’re OK. I put your flyer up at Grand Central Station, but have heard no response.
Myself, I’m just making my way out of the debilitating Level-Five Mind Fog that came from reading the thing. Even my “Second Life avatar” has a headache! (Hey young people, did I get that right? Hope so! See you in “Warcraft Worlds!”)
The essay is called “Getting Iraq Wrong.” And baby, if Michael Ignatieff got Iraq wrong, I don’t want him to be right! Because this essay can MAKE LEMONADE IN YOUR MIND.
I wrote a “cyber-essay” on the Huffington Post a couple years ago about Ignatieff. (Oh, sorry, you didn’t know I blog on the Huffington Post? Yeah, not to brag or nothing, but I totally do. Also, my friend has a flickr.) My cyber-essay concerned itself with a masterpiece of foreign-policy fan fiction: Ignatieff’s 2005 NY Times Magazine essay justifying the Iraq war. Ignatieff’s essay was called “Let’s All Fly Up In Space Together and Smoke Dope.” (That was the vibe, anyway.)
In that 2005 essay, you’ll recall, Ignatieff said the reason the American public wanted to invade Iraq was to spread “The Ultimate Task of Thomas Jefferson’s Dream.” (I am not making a joke. This is for real.) And, he implied, anyone who opposed the invasion of Iraq did so because they hated Thomas Jefferson, and they didn’t believe in the Ultimate Tasks of Dreams!
So far, so GREAT, right?
Ignatieff’s latest essay is what Latin people call a “mea culpa,” which is Greek for “Attention publishers: I am ready to write a book about the huge colossal mistake I made.”
The Freakonomics blog has moved over to the NYT. Levitt asks “If You Were a Terrorist, How Would You Attack?”:
I’d start by thinking about what really inspires fear. One thing that scares people is the thought that they could be a victim of an attack. With that in mind, I’d want to do something that everybody thinks might be directed at them, even if the individual probability of harm is very low. Humans tend to overestimate small probabilities, so the fear generated by an act of terrorism is greatly disproportionate to the actual risk.
Also, I’d want to create the feeling that an army of terrorists exists, which I’d accomplish by pulling off multiple attacks at once, and then following them up with more shortly thereafter.
Of course, if a terrorist cell wanted to provoke a massive retaliation in a foreign land in order to win adherents rather than inspire fear, then Levitt’s approach may not be optimal.
Christy Lynn Freeman, a 37-year-old Ocean City, Md., woman, was recently charged with murder after delivering a stillborn child under Maryland’s as-yet-untested Viable Fetus Act of 2005.
Worcester County prosecutor Joel J. Todd charged Freeman with murder and a district court judge held her without bail for allegedly performing her own late-term abortion. Though these charges were eventually dropped, Freeman’s case illustrates the coercive potential of legislation that gives fetuses rights at the expense of women.
Freeman arrived at Atlantic General Hospital by ambulance on July 26, bleeding profusely. She denied that she had ever been pregnant, but doctors found a placenta and an umbilical cord inside her body. Later, she admitted that she had given birth to a stillborn fetus at home.
[T]hrough the Red Mosque confrontation, Pakistan’s Talibanized Islamist movements have taken on the Pakistani state, casting it in the same pit as the pro-American governments of Iraq, Afghanistan and Mahmoud Abbas’s Palestinian Authority in the West Bank.
In the past five days more than 120 people have been killed by suicide attacks, mostly in Pakistan’s Federally Administered Tribal Areas (FATA) and North-West Frontier Province (NWFP) but also, on July 17, in Islamabad, where seventeen were killed at an opposition rally for Pakistan’s suspended Chief Justice, Iftikhar Mohammed Chaudhry. The safe money is that the Taliban or pro-Taliban groups were behind these attacks, though in the Islamabad blast the suspicion cannot be ruled out that Pakistan’s lethal intelligence service may have been trying to rid its leader of a judge who has proved so adept at mobilizing the nation against him.
At the same time, the Taliban has announced it is scrapping a ten-month peace accord with the Pakistani government in the North Waziristan tribal agency bordering Afghanistan, invoking the specter of a full-fledged insurgency. Thousands of tribespeople are fleeing, as many soldiers are being rushed in.
Rarely has Pakistan felt so much like Iraq and Afghanistan. Is it heading the same way?
Imagine that you’re a writer and you have decided to offer your readers a first-hand account of the politically correct primate, the idol of the left, known for its “gay” relations, female supremacy, and pacific life-style. Your focus is the bonobo: a relative of the chimpanzee, and genetically as close to us as the chimpanzee. You go all the way to a place called the Democratic Republic of the Congo to see these darling apes frolic in their natural habitat, hoping to come back with new and exciting material.
Alas, you barely get to see any bonobos. You watch a few of them quietly sitting in the trees, eating nuts. That’s all. This is what happened to Ian Parker, who nevertheless managed to write thirteen pages of carefully crafted prose as a “far-flung correspondent” for The New Yorker. We learn about the “hot, soupy air,” the rainstorms, the mud streams, the sound of falling fruit shells, and his German host, Gottfried Hohmann, who is described as rather unsympathetic.
The main message of Parker’s piece could of course have been that fieldwork is no picnic, but instead he went for profound revelation: bonobos are not nearly as nice and sexual as they have been made out to be. Given that the bonobo’s reputation has been a thorn in the side of homophobes as well as Hobbesians, the right-wing media jumped with delight. The bonobo “myth” could finally be put to rest. Parker’s piece was gleefully picked up by The Wall Street Journal and Dinesh D’Souza (yes, the same one who blamed 9/11 on the left), who accused “liberals” of having fashioned the bonobo into their mascot. D’Souza urged them to stick with the donkey.
An Indian woman in a flowing beaded gown glides through a pond. A mosquito net brushes over her lover. And to insistent drumbeats of the Indian heartland, a dancing phalanx of tunic-clad women twirls. These images appear in the trailer of a forthcoming movie, “Saawariya.” The melodramatic film, whose Hindi title means “beloved,” has an Indian director and cast. Its characters speak Hindi and burst into eight song-and-dance numbers. It is, in other words, vintage Bollywood — but for one thing.
It is brought to you by Hollywood.
The studio behind “Saawariya,” Sony Pictures Entertainment, is the first in a wave of American studios to produce their own kaleidoscopic Bollywood musicals. The American studios are keen to make money in India, but in a nation where $19 of every $20 spent at the box office goes to indigenous films, the studios are deciding to join Bollywood, not conquer it with their American-made fare. (Picture shows Salman Khan, the hero of Saawariya)
Buyer beware: Videos aimed at improving infant and toddler language skills are not as beneficial for language learning as they claim to be, according to a new study. Rather than helping youngsters, such products may actually hurt their vocabularies.
Videos like Brainy Baby and Baby Einstein have been marketed to parents since 1997. The researchers interviewed the parents of more than 1000 U.S. children between the ages of 8 and 16 months, gathering information on the children’s vocabulary and how frequently they watched videos like Baby Einstein. When the team controlled for factors such as socioeconomic status, race, and parental education, it found that Baby Einstein and his ilk are not the geniuses they’re cracked up to be. For every hour per day spent watching the videos, children understood an average of six to eight fewer words than did those of the same age who did not watch them, a 17-percentile drop in vocabulary, the team reports online tomorrow in the Journal of Pediatrics. “There is no clear evidence of a benefit coming from these videos, and there is some suggestion of harm,” says Zimmerman.
There are two ways to look at the explosive growth of the Internet: One is to celebrate the fact that in the 15 years since it became commercially available, what began as an obscure military technology morphed into a global phenomenon that is regularly accessed by over a billion people. The other is to ask why the world’s other five billion folks aren’t online yet.
Biswas says his goal, and that of Meraki, is to “connect the next billion people.” Biswas and his engineers are almost exclusively programmers, yet Meraki doesn’t sell software. Instead it sells Wi-Fi hardware—relatively cheap, commodity hardware built by outside vendors. It’s a combination of this hardware and Meraki’s software that yields a kind of magic that Biswas believes will go viral the way few things have. His business model depends on it.
“We now have more than 1,000 networks around the world,” Biswas says, “and all that growth was through word of mouth.” Meraki doesn’t advertise, in part because Biswas’s team has been too busy to bother. “Our focus has been to create the best thing possible, and then trust that people will run with it.”
Mohammed’s interrogation was part of a secret C.I.A. program, initiated after September 11th, in which terrorist suspects such as Mohammed were detained in “black sites”—secret prisons outside the United States—and subjected to unusually harsh treatment. The program was effectively suspended last fall, when President Bush announced that he was emptying the C.I.A.’s prisons and transferring the detainees to military custody in Guantánamo. This move followed a Supreme Court ruling, Hamdan v. Rumsfeld, which found that all detainees—including those held by the C.I.A.—had to be treated in a manner consistent with the Geneva Conventions. These treaties, adopted in 1949, bar cruel treatment, degradation, and torture. In late July, the White House issued an executive order promising that the C.I.A. would adjust its methods in order to meet the Geneva standards. At the same time, Bush’s order pointedly did not disavow the use of “enhanced interrogation techniques” that would likely be found illegal if used by officials inside the United States. The executive order means that the agency can once again hold foreign terror suspects indefinitely, and without charges, in black sites, without notifying their families or local authorities, or offering access to legal counsel.
In a moment of frustration yesterday at the slowness of a major research and exchange program I’ve been developing over the past year or so, at its twists and turns, at uncreative bureaucratic inertia and unimaginative and fearful academic doings, at serious political differences and narrow ideologies, and at the seemingly constant deferment of the payoff, I decided to work on the perfect crime. This won’t be a full-time project, but a pastime, which will make the crime’s perfection all the more delicious through the insouciance of its effort. Ideal perfection would be the crime that is not a crime, and effort that is not effort. I imagine sitting here doing nothing while the crime bathes its brilliance around me (or radiates from me, the guiltless criminal?).
Although seemingly unwise in a public forum such as this, I don’t mind telling you of my intent to develop the perfect crime because it is a crime for which there does not yet exist any penalty, so perfect is it. In fact, this perfect crime itself does not yet exist, since “crime” necessarily implies concomitant moral and/or legal sanction. The perfect crime is not simply one of great efficiency at its execution and its circumventing of moral norms and law. That’s the “perfect” crime of slaves. My perfect crime is a Nietzschean “transvaluation of value,” a criminal mastery so superb that it involves the very creation of the sanction at the same time as the crime. The sanction will depend entirely on my whim. If I require the thrill of transgression, I will write up moral laws to transgress. Otherwise, the essence of the crime’s perfection will be in its crimelessness, for only the criminal can determine the perfect crime to be a crime or not.
Early one freezing January morning in 1896, a massive explosion ripped through the Tylorstown Colliery in the Rhondda Valley. The force of the explosion blew the roof off pitshaft number 7 and sent a “black tornado of dust up through the shafts”. A quick count of the missing miners’ lamps suggested that more than 100 men were below. In addition, there were the boys, known as “the trappers”, employed to open and close the thick wooden doors in the pitch-black tunnels.
Alerted by a Home Office telegram, Dr J. S. Haldane arrived as quickly as the train connections from his Oxford laboratory would allow. The 35-year-old physiologist had been instructed to determine the cause of death and to test a theory. It was generally believed that deaths from such explosions were caused by blast injuries.
Haldane thought differently. He was convinced that three quarters of fatalities were due to suffocation from gases seeping into tunnels. But as Haldane talked to the Tylorstown rescue party, he found that the toxic gas produced after the explosion – known as “afterdamp” (damp from the German for steam, Dampf) – had not extinguished the miners’ lamps. This meant oxygen was present and therefore suffocation must be ruled out.
Even though it threatened his theory, the man The Times once described as a “medical detective” was excited by this intriguing new evidence. In order to explain what had happened, Haldane had the unenviable task of visiting the bereaved families to examine the bodies of the dead miners. Here he found the evidence he needed. From the pink and red appearance of their skin, he suddenly understood why so many miners were dying in pit explosions: carbon-monoxide poisoning. A distinctly carmine-red blood sample provided the final proof.
Dwayne Carter, the twenty-four-year-old rapper from New Orleans known as Lil Wayne, hasn’t released an album or a single in months, though he has appeared as a guest on songs by other artists. But he is indisputably the rapper of the year. He has been recording songs constantly—sometimes three or four a night. Many are astonishingly good, and most eventually find their way onto the Internet, where they can be downloaded for free, apparently with his blessing. Among hip-hop fans, discussion has been dominated by talk of Lil Wayne—his place in the hip-hop canon, his romantic status, his sexual orientation, and the release date of his sixth album, “Tha Carter III.” Last week, MTV’s “hip-hop brain trust”—ten staffers—voted Lil Wayne the “hottest m.c. in the game.”
In four years, Lil Wayne has evolved from a fairly predictable Southern gangsta rapper into an artist who may actually deserve the bragging rights to “best rapper alive,” his current motto.
CONSIDER this frank greeting in William Gibson’s “Spook Country”: “I’ve just checked the number of your Google hits, and read your Wikipedia entry.” This is what translates as fame today: a foothold in the ether, an identity composed by a faceless committee of unknown size. Gibson famously coined the term “cyberspace” in his reality-crashing, paradigm-shifting 1984 debut, “Neuromancer,” and his conception of its “consensual hallucination” rings truer now, more than two decades later, as we pursue terminally framed existences teeming with hyperlinks and blogs, worlds of Warcraft and second lives.
The Googlee in question is Hollis Henry, singer in a defunct 1990s cult band, the Curfew. She’s now a journalist working on a story for a shadowy magazine, Node, that hasn’t published an issue yet. (It’s variously and hilariously described as a would-be Wired, generating sub-rosa buzz by its very anti-buzz.) Cults, shadows, secrets: in other words, Gibson country.
LESSING: Listen, I feel — I’m sure I’ve said this before but I’ll say it again — there’s a kind of problem between critics and writers. A writer falls in love with an idea and gets carried away. A critic looks at the finished product and ignores the rush of a river that went into the writing, which has nothing to do with the kind of temperate thoughts you have about it.
If you can imagine the sheer bloody pleasure of having an idea and taking it! It’s one of the great pleasures in my life. My god, an idea!
IDEAS: You’ve written that past the age of 60, “you float away from the personal. You have received the great gift of getting older — detachment, impersonality.” Isn’t it possible to get too detached?
LESSING: I can’t judge. You do become more distant from the passions of your youth, thank God. How would we be able to live, if we’re always in a state of turmoil?
Does my photo on the front cover remind you of anyone? Can you imagine my tongue flicking the last drops of a strawberry lassi from my lips as my dark brown eyes fix your gaze? Yes. I am the Nigella of Indian food. I am the woman who can ooze sex into a cucumber raita and persuade the chattering classes that a curry is not just a fumbling Friday-night drunken grope, but also a Sunday-morning, Agent Provocateur smooch fest.
Indian food is often held to be unhealthy – full of cream, ghee and nut pastes. But the cuisine is so much more than overweight proles burying their faces in a tub of chicken tikka masala. It’s food for the thin, the glamorous, the middle classes. People like you. The secret is in the ingredients. Get one thing wrong and a whole dish can be spoiled. Now I know how intimidating a different culture can be. I was brought up in London and Switzerland and when I visit my relatives in Delhi, I, too, find the noise and the smell overpowering. But I never let this stop me from sending the servants out to the market to track down the best spices. And you must do the same. If it means sending the au pair in a cab to Southall, so be it.
For thousands of years, most people on earth lived in abject poverty, first as hunters and gatherers, then as peasants or laborers. But with the Industrial Revolution, some societies traded this ancient poverty for amazing affluence.
Historians and economists have long struggled to understand how this transition occurred and why it took place only in some countries. A scholar who has spent the last 20 years scanning medieval English archives has now emerged with startling answers for both questions. Gregory Clark, an economic historian at the University of California, Davis, believes that the Industrial Revolution — the surge in economic growth that occurred first in England around 1800 — occurred because of a change in the nature of the human population. The change was one in which people gradually developed the strange new behaviors required to make a modern economy work. The middle-class values of nonviolence, literacy, long working hours and a willingness to save emerged only recently in human history, Dr. Clark argues. Because they grew more common in the centuries before 1800, whether by cultural transmission or evolutionary adaptation, the English population at last became productive enough to escape from poverty, followed quickly by other countries with the same long agrarian past.